Lovable is great at building features fast. But the less it knows about your existing codebase, the more you end up correcting — wrong patterns, missing conventions, code that doesn't fit your architecture.
ArcLume solves this by giving Lovable structured context about your actual system. Through the Model Context Protocol (MCP), your AI coding tool can query ArcLume directly — searching your codebase semantically, understanding service boundaries, and generating implementation prompts grounded in real code.
This guide walks through the full workflow: connecting ArcLume to your IDE, using MCP tools to explore your codebase, generating context bundles, and handing them to Lovable for implementation.
What is MCP?
The Model Context Protocol is an open standard that lets AI tools connect to external data sources. Instead of copying and pasting context manually, MCP lets your AI assistant call tools that fetch exactly the information it needs — in real time.
ArcLume ships an MCP server that exposes your codebase intelligence to any MCP-compatible client: Claude Code, VS Code with Copilot, Cursor, or any tool that supports the protocol.
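Under the hood, an MCP tool call is a JSON-RPC 2.0 message with the method `tools/call`. Your IDE builds this envelope for you; the sketch below only illustrates the request model, using ArcLume's `search_code` tool as the example:

```typescript
// Minimal sketch of the JSON-RPC 2.0 envelope an MCP client sends
// when invoking a tool. Shown for illustration only -- MCP clients
// construct and transport this for you.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const req = buildToolCall(1, "search_code", {
  query: "payment webhook handler",
  limit: 10,
});
```

The server replies with a matching JSON-RPC response containing the tool's result, which the client feeds back to the model.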
Prerequisites
- An ArcLume account with at least one connected and indexed repository
- An API key (generated in Settings → MCP Setup)
- An MCP-compatible AI coding tool (Claude Code, VS Code, or Cursor)
- A Lovable account for implementation
Step 1: Connect ArcLume to your IDE
ArcLume provides an MCP server as an npm package. You configure it once in your IDE settings, and it stays available across all sessions.
Claude Code
Add the following to your ~/.claude/settings.json:
{
"mcpServers": {
"arclume": {
"command": "npx",
"args": ["-y", "@arclume/mcp-server"],
"env": {
"ARCLUME_API_KEY": "efk_your_api_key",
"ARCLUME_ORG_ID": "your-org-id",
"ARCLUME_API_URL": "https://arclume.dev/api/"
}
}
}
}
VS Code / Cursor
Add the same configuration to .vscode/mcp.json or .cursor/mcp.json in your project root:
{
"servers": {
"arclume": {
"command": "npx",
"args": ["-y", "@arclume/mcp-server"],
"env": {
"ARCLUME_API_KEY": "efk_your_api_key",
"ARCLUME_ORG_ID": "your-org-id",
"ARCLUME_API_URL": "https://arclume.dev/api/"
}
}
}
}
You can generate your API key and find your organization ID from the Settings → MCP Setup page in ArcLume. The key is shown once on creation — copy it immediately.
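If you manage IDE config across several projects, a small helper can template the server entry shown above. This is a hypothetical convenience function, not part of ArcLume; the env var names match the documented configuration:

```typescript
// Hypothetical helper that emits the "arclume" server entry for an
// MCP config file. Only the env var names and package name come from
// ArcLume's docs; the helper itself is illustrative.
function arclumeServerEntry(apiKey: string, orgId: string) {
  return {
    arclume: {
      command: "npx",
      args: ["-y", "@arclume/mcp-server"],
      env: {
        ARCLUME_API_KEY: apiKey,
        ARCLUME_ORG_ID: orgId,
        ARCLUME_API_URL: "https://arclume.dev/api/",
      },
    },
  };
}

const entry = arclumeServerEntry("efk_your_api_key", "your-org-id");
// JSON.stringify(entry, null, 2) produces the block to paste under
// "mcpServers" (Claude Code) or "servers" (VS Code / Cursor).
```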
Step 2: Explore your codebase with MCP tools
Once connected, your AI coding tool has access to seven tools that query your ArcLume knowledge graph. Here are the ones most relevant to the Lovable workflow:
search_code — Semantic code search
Search your indexed codebases by intent, not just keywords. This uses vector embeddings to find relevant code even when the exact terms don't appear in the source.
// Example: find payment-related handlers across all repos
search_code({ query: "payment webhook handler", limit: 10 })
// Filter by language or symbol type
search_code({ query: "user authentication", language: "typescript", chunk_type: "function" })
Results include file paths, line numbers, function signatures, and dependency information (imports and importers) — giving your AI tool full structural context.
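Those result fields make the hits easy to post-process before handing them to the assistant. The shape below is an assumption inferred from the fields described above (path, line numbers, signature, imports and importers); the actual payload may differ:

```typescript
// Assumed result shape for a search_code hit, based on the fields
// the docs describe. Not ArcLume's actual schema.
interface SearchHit {
  path: string;
  startLine: number;
  endLine: number;
  signature: string;
  imports: string[];
  importers: string[];
}

// Example post-processing: keep only hits from one service directory.
function hitsInService(hits: SearchHit[], serviceDir: string): SearchHit[] {
  return hits.filter((h) => h.path.startsWith(serviceDir + "/"));
}

// Illustrative data, not real search output.
const hits: SearchHit[] = [
  {
    path: "services/payments/webhook.ts",
    startLine: 12,
    endLine: 48,
    signature: "handleStripeWebhook(req, res)",
    imports: ["stripe"],
    importers: ["services/payments/router.ts"],
  },
  {
    path: "services/orders/create.ts",
    startLine: 5,
    endLine: 30,
    signature: "createOrder(input)",
    imports: [],
    importers: [],
  },
];

const paymentHits = hitsInService(hits, "services/payments");
```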
ask_codebase — Natural language Q&A
Ask questions about your codebase in plain English. ArcLume uses RAG (retrieval-augmented generation) to ground its answers in actual code.
ask_codebase({ question: "How does the order service communicate with inventory?" })
ask_codebase({ question: "What middleware runs before the /api/users endpoint?" })
get_interfaces — Cross-repo interface mapping
Query detected API interfaces — REST endpoints, message queues, gRPC services, WebSocket handlers — and see which services produce and consume them.
// Find all REST endpoints related to payments
get_interfaces({ search: "payment", interface_type: "rest" })
// See all BullMQ job producers and consumers
get_interfaces({ interface_type: "bullmq" })
get_codebase_overview — High-level stats
Get a quick summary of your indexed codebases: repo count, total embeddings, file type distribution, and symbol type breakdown. Useful as a starting point before diving into specific searches.
Step 3: Generate a context bundle
Context bundles are the bridge between ArcLume and Lovable. A bundle is a structured markdown prompt that contains everything Lovable needs to implement a story correctly:
- Story details — title, description, acceptance criteria, complexity estimate
- Implementation context — notes about approach, affected services, dependencies
- Relevant code — actual code snippets retrieved via RAG, with file paths, line numbers, and dependency metadata
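The three parts above combine into one markdown prompt. As a rough sketch of that assembly (field names are illustrative, not ArcLume's schema):

```typescript
// Sketch of assembling the bundle sections described above into a
// single markdown prompt. Field names are hypothetical.
interface Story {
  title: string;
  description: string;
  acceptanceCriteria: string[];
}

function renderBundle(
  story: Story,
  implementationNotes: string,
  codeSnippets: string[]
): string {
  return [
    `# ${story.title}`,
    story.description,
    "## Acceptance criteria",
    ...story.acceptanceCriteria.map((c) => `- ${c}`),
    "## Implementation context",
    implementationNotes,
    "## Relevant code",
    ...codeSnippets,
  ].join("\n\n");
}

const bundle = renderBundle(
  {
    title: "Add refund flow",
    description: "Allow support agents to refund an order.",
    acceptanceCriteria: ["Refund posts to the payment provider"],
  },
  "Reuse the existing payments service client.",
  ["// services/payments/client.ts (lines 10-40)\n..."]
);
```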
Generating from the ArcLume UI
Navigate to any story in your map and open the Code Workspace. Click Generate Bundle to create a raw context bundle. Optionally toggle AI Synthesize to have Claude rewrite the raw bundle into a prompt optimized for Lovable — restructuring the information into a natural, instruction-oriented format.
Once generated, you can:
- Copy to clipboard — paste directly into a Lovable session
- Download as .md — attach to a Lovable session or save for reference
Generating via MCP
If you're working in your IDE, you can use MCP tools to gather the same context programmatically. Ask your AI assistant to:
- Search for relevant code with search_code
- Understand the architecture with ask_codebase and get_interfaces
- Assemble the context into a structured prompt for Lovable
This approach is flexible — you can tailor the context to exactly what the implementation needs, adding or removing detail as you go.
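The three steps above can be sketched as one orchestration function. Here `callTool` is injected so the sketch works with any MCP client; the tool names match ArcLume's, but the result handling is an assumption (real tool results may be structured, not plain strings):

```typescript
// Orchestration sketch: gather context via the three MCP tools and
// fold it into one prompt for Lovable. callTool abstracts over the
// MCP client; treating results as strings is a simplification.
type CallTool = (
  name: string,
  args: Record<string, unknown>
) => Promise<string>;

async function gatherContext(
  callTool: CallTool,
  feature: string
): Promise<string> {
  const code = await callTool("search_code", { query: feature, limit: 10 });
  const architecture = await callTool("ask_codebase", {
    question: `How is ${feature} currently implemented?`,
  });
  const interfaces = await callTool("get_interfaces", { search: feature });
  return [
    `## Relevant code\n${code}`,
    `## Architecture notes\n${architecture}`,
    `## Interfaces\n${interfaces}`,
  ].join("\n\n");
}
```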
Step 4: Hand off to Lovable
With your context bundle ready, open a new Lovable session and paste the bundle as your initial prompt. Because the bundle includes:
- Specific file paths and function signatures from your codebase
- Actual code patterns and conventions your team uses
- Dependency information (what imports what, what calls what)
- Clear acceptance criteria and implementation notes
Lovable generates code that fits your existing architecture instead of inventing its own patterns. The result: fewer corrections, faster iterations, and implementation that engineers can actually merge.
Example: full workflow
Here's what the end-to-end workflow looks like for a typical feature:
- Product manager uploads a meeting transcript to ArcLume. AI generates a structured epic with 8 stories, each with acceptance criteria and complexity estimates.
- Team reviews and refines the stories in ArcLume. Adjusts scope, splits a large story, adds implementation notes.
- Engineer picks up a story and opens the Code Workspace. Clicks "Generate Bundle" with AI Synthesize enabled.
- Engineer pastes the bundle into Lovable. Lovable generates a Vue component that follows the team's existing Tailwind conventions, uses the correct Pinia store pattern, and imports from the right API module.
- Engineer reviews and iterates. Because the initial output is architecturally correct, iteration is about refinement — not starting over.
Tips for better results
- Keep repos indexed — ArcLume auto-indexes on push via the GitHub App webhook. If you've made significant changes, you can trigger a re-index with the MCP tool trigger_index or from the repo settings page.
- Use AI Synthesize — The raw bundle is comprehensive but dense. The synthesized version restructures it into a natural prompt that Lovable processes more effectively.
- Add implementation context to stories — The more specific your implementation notes, the better the bundle. Mention specific files, services, or patterns you want the implementation to follow.
- Use ask_codebase for exploration — Before generating a bundle, use the MCP tool to ask questions about how the relevant parts of your system work. This helps you write better implementation notes.
- Iterate on the bundle — You can regenerate bundles after editing a story's acceptance criteria or implementation context. The RAG retrieval adapts to the updated content.
What's next
ArcLume is in beta, and MCP support is actively evolving. Upcoming improvements include:
- Direct Lovable integration — generate and push context bundles without leaving ArcLume
- Multi-story bundles — generate context across related stories for larger features
- Custom prompt templates — define your team's preferred bundle format
Join the beta to try the MCP workflow with your own codebase.
Ready to try ArcLume?
ArcLume is currently in beta. Connect your repos, build a knowledge graph, and start generating codebase-aware epics and stories.
Join the Beta