Your codebase, in 1,500 tokens.
One command generates a committed JSON index that any AI agent can read. No server, no setup.
```sh
npx stacklit init
```

Without stacklit: an agent reads 8-12 files to build context. ~400,000 tokens. 45 seconds before writing a line.

With stacklit: the agent reads stacklit.json. ~1,500 tokens. It knows every module, dependency, and convention instantly.
| File | What it does | Committed? |
|---|---|---|
| `stacklit.json` | Machine-readable codebase index | Yes |
| `DEPENDENCIES.md` | Mermaid dependency diagram (renders on GitHub) | Yes |
| `stacklit.html` | Interactive visual map with 4 views | No (gitignored) |
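As an illustration of how an agent-side script might consume the committed index, here is a minimal sketch; the `modules` field and its shape are hypothetical, not stacklit's actual schema:

```python
import json

# Hypothetical index content; stacklit's real schema may differ.
index = json.loads("""
{
  "modules": {
    "api": {"path": "src/api", "deps": ["db"]},
    "db":  {"path": "src/db",  "deps": []}
  }
}
""")

# Flatten the index into a few prompt-sized lines for an agent.
for name, mod in index["modules"].items():
    deps = ", ".join(mod["deps"]) or "none"
    print(f"{name} ({mod['path']}) depends on: {deps}")
```

The point is the size: a handful of lines like these replaces reading the source tree file by file.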
```sh
stacklit serve
```

Add to your Claude Desktop or Cursor MCP config:

```json
{
  "mcpServers": {
    "stacklit": {
      "command": "stacklit",
      "args": ["serve"]
    }
  }
}
```

Seven tools: `get_overview`, `get_module`, `find_module`, `list_modules`, `get_dependencies`, `get_hot_files`, `get_hints`.
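For reference, an MCP client invokes one of these tools with a JSON-RPC 2.0 `tools/call` request over the server's stdio transport. This is the standard MCP wire format, not stacklit-specific; stacklit's response payload is not shown here:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "get_overview",
    "arguments": {}
  }
}
```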
```sh
stacklit init      # scan, generate all outputs, open HTML
stacklit generate  # regenerate from current source
stacklit view      # regenerate HTML and open in browser
stacklit diff      # check if index is stale
stacklit serve     # start MCP server
```
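`stacklit diff` lends itself to a CI freshness check. A sketch as a GitHub Actions step, assuming `diff` exits non-zero when stacklit.json no longer matches the source:

```yaml
# Hypothetical CI step: fail the build if the committed index is stale.
- name: Check stacklit index freshness
  run: npx stacklit diff
```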
Go (AST-based), TypeScript, JavaScript, Python, Rust, and Java (regex-based), plus a generic fallback for any other language.
| | Stacklit | Repomix | Aider repo-map | Codebase Memory MCP |
|---|---|---|---|---|
| Output | ~1,500-token JSON | 500k+ token dump | Ephemeral text | SQLite DB |
| Committed to repo | Yes | Too large | No | No |
| Dependency graph | Yes | No | Yes | Yes |
| Visual output | HTML (4 views) | No | No | No |
| MCP server | Yes | No | No | Yes |
| Runtime needed | No | No | Yes (Python) | Yes (C server) |
MIT

