An MCP server that indexes your Claude Code and Codex conversations for semantic search. Find solutions, decisions, and context from previous coding sessions.
## Features
- **Semantic Search** - Find conversations by meaning, not just keywords
- **Auto-Indexing** - Automatically indexes new conversations as they happen
- **Local & Private** - Everything runs locally using LanceDB and a Hugging Face model
- **Resumable** - Indexing saves progress incrementally and picks up where it left off
- **Fast** - Uses `all-MiniLM-L6-v2` for quick, lightweight embeddings
## Installation
Use the setup for your client:
### Claude Code

Add to your Claude Code MCP settings (`~/.claude/settings.json`):

```json
{
  "mcpServers": {
    "clancey": {
      "command": "npx",
      "args": ["-y", "clancey"]
    }
  }
}
```

Restart Claude Code.
### Codex

Add to your Codex config (`~/.codex/config.toml`):

```toml
[mcp_servers.clancey]
command = "npx"
args = ["-y", "clancey"]
```

Restart Codex.
## MCP Tools

### search_conversations

Search through your indexed conversations semantically.

```yaml
query: "how did I fix the auth bug"
limit: 5
project: "my-app" # optional filter
```
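Semantic search works by comparing the query's embedding against the stored chunk embeddings, typically by cosine similarity. A minimal sketch of that ranking step (the `Chunk` shape and `rankChunks` helper are illustrative names, not clancey's actual code):

```typescript
// Rank stored chunks by cosine similarity to a query embedding.
// Hypothetical sketch: the real server stores vectors in LanceDB and
// lets the database perform this search.
interface Chunk {
  text: string;
  vector: number[]; // e.g. a 384-dim all-MiniLM-L6-v2 embedding
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

function rankChunks(query: number[], chunks: Chunk[], limit: number): Chunk[] {
  return [...chunks]
    .sort((x, y) =>
      cosineSimilarity(query, y.vector) - cosineSimilarity(query, x.vector))
    .slice(0, limit);
}
```

Because the embeddings capture meaning, a chunk about "token refresh failures" can rank highly for the query "auth bug" even with no keyword overlap.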
### index_conversations

Manually trigger conversation indexing.

```yaml
force: true # reindex everything
```
### index_status

Get statistics about indexed conversations.
## How It Works

- Scans conversation files from `~/.claude/projects/` and `~/.codex/sessions/`
- Parses JSONL events into user/assistant messages
- Chunks long conversations into searchable segments
- Generates embeddings with `all-MiniLM-L6-v2`
- Stores vectors in LanceDB at `~/.clancey/conversations.lance`
- Watches both source directories and incrementally re-indexes changed `.jsonl` files
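The parsing and chunking steps above can be sketched as two pure functions. This is a simplified sketch: the event field names (`role`, `content`) and the character-budget chunking rule are assumptions for illustration, not clancey's actual parser:

```typescript
// Parse JSONL lines into messages, then group them into chunks
// bounded by a character budget so long conversations stay searchable.
interface Message {
  role: "user" | "assistant";
  content: string;
}

function parseJsonl(jsonl: string): Message[] {
  const messages: Message[] = [];
  for (const line of jsonl.split("\n")) {
    if (!line.trim()) continue;
    try {
      const event = JSON.parse(line);
      if ((event.role === "user" || event.role === "assistant") &&
          typeof event.content === "string") {
        messages.push({ role: event.role, content: event.content });
      }
    } catch {
      // Skip malformed lines rather than failing the whole file.
    }
  }
  return messages;
}

function chunkMessages(messages: Message[], maxChars = 2000): string[] {
  const chunks: string[] = [];
  let current = "";
  for (const m of messages) {
    const piece = `${m.role}: ${m.content}\n`;
    if (current && current.length + piece.length > maxChars) {
      chunks.push(current);
      current = "";
    }
    current += piece;
  }
  if (current) chunks.push(current);
  return chunks;
}
```

Each chunk is then embedded and stored as one vector, so a hit points back to a specific segment of a conversation rather than the whole transcript.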
Data is stored in `~/.clancey/`. Logs are at `~/.clancey/clancey.log`.
## Development

```shell
git clone https://github.com/divmgl/clancey.git
cd clancey
bun install
bun run build
```

## Tech Stack
- **LanceDB** - Vector database
- **Hugging Face Transformers.js** - Embedding model
- **MCP SDK** - Claude integration
- **Chokidar** - File watching
## License
MIT