clancey

An MCP server that indexes your Claude Code and Codex conversations for semantic search. Find solutions, decisions, and context from previous coding sessions.

Features

  • Semantic Search - Find conversations by meaning, not just keywords
  • Auto-Indexing - Automatically indexes new conversations as they happen
  • Local & Private - Everything runs locally using LanceDB and a Hugging Face model
  • Resumable - Indexing saves progress incrementally, picks up where it left off
  • Fast - Uses all-MiniLM-L6-v2 for quick, lightweight embeddings

Installation

Follow the setup instructions for your client:

Claude Code

Add to your Claude Code MCP settings (~/.claude/settings.json):

{
  "mcpServers": {
    "clancey": {
      "command": "npx",
      "args": ["-y", "clancey"]
    }
  }
}

Restart Claude Code.

Codex

Add to your Codex config (~/.codex/config.toml):

[mcp_servers.clancey]
command = "npx"
args = ["-y", "clancey"]

Restart Codex.

MCP Tools

search_conversations

Search through your indexed conversations semantically.

query: "how did I fix the auth bug"
limit: 5
project: "my-app"  # optional filter
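Semantic search ranks stored conversation chunks by embedding similarity rather than keyword overlap. A minimal sketch of the idea, using cosine similarity over hand-made 3-dimensional vectors (stand-ins for the 384-dimensional all-MiniLM-L6-v2 embeddings; the real query path goes through LanceDB's vector index):

```typescript
// Toy illustration of semantic ranking: score stored chunks against a
// query vector by cosine similarity and keep the top `limit` results.
// The vectors below are illustrative stand-ins, not real embeddings.
type Chunk = { text: string; vector: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function search(chunks: Chunk[], query: number[], limit: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosine(y.vector, query) - cosine(x.vector, query))
    .slice(0, limit);
}

const chunks: Chunk[] = [
  { text: "fixed the auth bug by refreshing tokens", vector: [0.9, 0.1, 0.0] },
  { text: "set up the CI pipeline", vector: [0.1, 0.9, 0.1] },
  { text: "debugged login failures", vector: [0.8, 0.2, 0.1] },
];

// A query embedding close to the "auth" chunks surfaces them first,
// even though none of the text matching is keyword-based.
console.log(search(chunks, [1, 0, 0], 2).map((c) => c.text));
```

This is why a query like "how did I fix the auth bug" can surface a conversation that never used those exact words.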

index_conversations

Manually trigger conversation indexing.

force: true  # reindex everything

index_status

Get statistics about indexed conversations.

How It Works

  1. Scans conversation files from ~/.claude/projects/ and ~/.codex/sessions/
  2. Parses JSONL events into user/assistant messages
  3. Chunks long conversations into searchable segments
  4. Generates embeddings with all-MiniLM-L6-v2
  5. Stores vectors in LanceDB at ~/.clancey/conversations.lance
  6. Watches both source directories and incrementally re-indexes changed .jsonl files
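Steps 2–3 above can be sketched roughly as follows. The `role`/`content` event shape and the 500-character chunk size here are illustrative assumptions, not the server's actual JSONL schema or chunking parameters:

```typescript
// Sketch of steps 2-3: parse JSONL lines into user/assistant messages,
// then group the conversation into bounded-size text chunks for embedding.
type Message = { role: "user" | "assistant"; content: string };

function parseJsonl(jsonl: string): Message[] {
  const messages: Message[] = [];
  for (const line of jsonl.split("\n")) {
    if (!line.trim()) continue;
    const event = JSON.parse(line);
    // Non-message events (tool calls, metadata, ...) are skipped.
    if (event.role === "user" || event.role === "assistant") {
      messages.push({ role: event.role, content: event.content });
    }
  }
  return messages;
}

function chunk(messages: Message[], maxChars = 500): string[] {
  const chunks: string[] = [];
  let current = "";
  for (const m of messages) {
    const piece = `${m.role}: ${m.content}\n`;
    if (current && current.length + piece.length > maxChars) {
      chunks.push(current);
      current = "";
    }
    current += piece;
  }
  if (current) chunks.push(current);
  return chunks;
}

const jsonl = [
  JSON.stringify({ role: "user", content: "how do I fix the auth bug?" }),
  JSON.stringify({ role: "assistant", content: "refresh the token on 401" }),
  JSON.stringify({ type: "tool_use", name: "bash" }), // skipped: not a message
].join("\n");

console.log(chunk(parseJsonl(jsonl)));
```

Each resulting chunk is what gets embedded and stored as one row in the LanceDB table.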

Data is stored in ~/.clancey/. Logs are at ~/.clancey/clancey.log.

Development

git clone https://github.com/divmgl/clancey.git
cd clancey
bun install
bun run build

Tech Stack

  • Bun - runtime and build tooling
  • LanceDB - local vector storage
  • all-MiniLM-L6-v2 - Hugging Face embedding model

License

MIT