Give your AI assistant
your entire architecture.
Connect ArcLume to Cursor, Claude Code, or VS Code via MCP. Your AI assistant gets live, structural context across every repo — so it can answer questions that single-repo tools can't.
Your AI sees one repo. Your system spans twenty.
Cursor, Claude Code, and Copilot are powerful — but they only see the repo you're working in. That means wrong answers at every service boundary.
Blind to downstream impact
You change an endpoint's response shape. Your AI says it looks fine. Three downstream services break in staging because it couldn't see them.
Similarity, not structure
Your AI finds code that looks related by text similarity. It can't trace actual call graphs, consumer relationships, or shared interface contracts across repos.
Manual archaeology
Before every cross-service change, you're grepping through repos, reading Confluence pages, and asking teammates. The context exists — it's just scattered.
What ArcLume MCP gives your AI assistant
A live, structural model of your entire system — every repo, every dependency, every cross-service connection — queryable directly from your editor.
Blast radius analysis
"If I change this endpoint's response shape, what breaks?" ArcLume traces downstream consumers across services and tells your AI exactly what's affected — before you commit.
Cross-repo dependency queries
"What services consume this Kafka topic?" "Which repos import this shared interface?" Answered by traversing the actual dependency graph, not guessing from text.
Interface point mapping
REST endpoints, Kafka topics, BullMQ queues, gRPC services — every cross-service interface point detected and mapped. Your AI knows what connects to what.
Confidence-scored connections
Not all cross-repo connections are certain. ArcLume scores each one — confirmed, likely, or uncertain — so your AI tells you where to investigate further.
Example: querying from Cursor
"What services consume the order.completed Kafka topic?"
3 consumers found:
confirmed — billing-service/src/consumers/orderCompleted.ts
confirmed — analytics-service/src/handlers/events.ts
likely — notification-service/src/workers/email.ts
"What would break if I add a field to the order.completed payload?"
Adding a field is non-breaking for the 2 confirmed consumers (they destructure specific fields). The likely consumer in notification-service spreads the full payload — verify it handles unknown fields.
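In an MCP-aware editor, a question like this ultimately becomes a tool call over the MCP wire protocol (JSON-RPC). A minimal sketch of what that request could look like — the tool name `find_consumers` and its argument shape are illustrative assumptions, not ArcLume's documented API:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "find_consumers",
    "arguments": { "interface": "kafka-topic:order.completed" }
  }
}
```

The `tools/call` envelope is standard MCP; your assistant issues calls like this behind the scenes, so you only ever type the plain-English question.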
Works with your existing tools
ArcLume exposes a standard MCP server. Any MCP-compatible AI assistant gets full access to your structural model.
Cursor
Native MCP support
Claude Code
Native MCP support
VS Code
Via MCP extension
Set up in two minutes
Connect your repos via GitHub App
One-click installation. ArcLume indexes every repository and builds a structural model — symbols, relationships, cross-service dependencies. Automatic re-indexing on every push.
Add ArcLume as an MCP server
Authenticate via OAuth in your IDE settings. ArcLume registers as an MCP context provider — your AI assistant now has access to your full architecture.
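In Cursor, for example, registering a remote MCP server is a short entry in `mcp.json`. The server name and URL below are placeholders for illustration, not ArcLume's real endpoint:

```json
{
  "mcpServers": {
    "arclume": {
      "url": "https://mcp.arclume.example/mcp"
    }
  }
}
```

Claude Code can register the same kind of remote server via its `claude mcp add` command; check your client's MCP documentation for the exact transport options it supports.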
Ask questions you couldn't before
Blast radius, downstream consumers, shared interfaces, dependency chains — your AI answers structurally across every connected repo. No more manual archaeology.
Your AI deserves the full picture.
Connect your repos, add the MCP server, and give your AI assistant the cross-repo context it's been missing.
Connect your IDE