Team knowledge evaporates daily — pairing sessions, debugging context, architectural rationale lost to Slack. Distillery captures it at the point of creation, connects it into a living graph, and surfaces it conversationally. It monitors feeds, tracks what matters to your projects, and alerts you before you know to ask. A team brain that learns.


Distillery

Team Knowledge, Distilled
Capture, classify, connect, and surface team knowledge through conversational commands.

Documentation · Skills · Quick Start · Roadmap · Slides



What is Distillery?

Distillery is a team knowledge base accessed through Claude Code skills. It refines raw information from working sessions, meetings, bookmarks, and conversations into concentrated, searchable knowledge — stored as vector embeddings in DuckDB and retrieved through natural language. It runs locally over stdio or as a hosted HTTP service with GitHub OAuth for team access.

Distillery captures the highest-value transformation — from noise to signal — and makes it a tool the whole team can use.

Full documentation: norrietaylor.github.io/distillery

Distillery demo — /distill captures a decision, /pour synthesizes it

Skills

Distillery provides 14 Claude Code slash commands:

| Skill | Purpose | Example |
|-------|---------|---------|
| /distill | Capture session knowledge with dedup detection | /distill "We decided to use DuckDB for local storage" |
| /recall | Semantic search with provenance | /recall distributed caching strategies |
| /pour | Multi-entry synthesis with citations | /pour how does our auth system work? |
| /bookmark | Store URLs with auto-generated summaries | /bookmark https://example.com/article #caching |
| /minutes | Meeting notes with append updates | /minutes --update standup-2026-03-22 |
| /classify | Classify entries and triage review queue | /classify --inbox |
| /watch | Manage monitored feed sources | /watch add github:duckdb/duckdb |
| /radar | Ambient feed digest with source suggestions | /radar --days 7 |
| /tune | Adjust feed relevance thresholds | /tune relevance 0.4 |
| /digest | Team activity summary from internal entries | /digest --days 7 --project myapp |
| /gh-sync | Sync GitHub issues/PRs into the knowledge base | /gh-sync owner/repo --issues |
| /investigate | Deep context builder with relationship traversal | /investigate distributed caching |
| /briefing | Team knowledge dashboard with metrics | /briefing --days 7 |
| /setup | Onboarding wizard for MCP connectivity and config | /setup |

Quick Start

Step 1: Install the Plugin

claude plugin marketplace add norrietaylor/distillery
claude plugin install distillery

This installs all 14 skills and configures the MCP server to run locally via uvx distillery-mcp — a private, self-contained knowledge base on your machine. Requires Python 3.11+ and uv (install: curl -LsSf https://astral.sh/uv/install.sh | sh).
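For reference, a stdio MCP server entry in a Claude Code configuration has roughly this shape. This is an illustration of the standard MCP client config format, not a verified dump of what the plugin writes — the install step above configures it for you:

```json
{
  "mcpServers": {
    "distillery": {
      "command": "uvx",
      "args": ["distillery-mcp"]
    }
  }
}
```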

Step 2: Set Your Embedding API Key (Optional but Recommended)

# Get a free API key from jina.ai
export JINA_API_KEY=jina_...

uvx inherits this from your shell environment. Without a key, Distillery falls back to a stub embedding provider (search quality degraded).
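To see why the fallback degrades search quality: a stub provider must produce vectors without any semantic model. One hypothetical scheme (illustrative only, not Distillery's actual stub) hashes word tokens into a fixed-width vector — identical wording still matches, but paraphrases score near zero where a real embedding model would score them high:

```python
import hashlib
import math

DIM = 64  # real embedding models use hundreds of dimensions

def stub_embed(text: str, dim: int = DIM) -> list[float]:
    """Hash word tokens into buckets -- deterministic, but semantics-free."""
    vec = [0.0] * dim
    for tok in text.lower().split():
        bucket = int(hashlib.md5(tok.encode()).hexdigest(), 16) % dim
        vec[bucket] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a: list[float], b: list[float]) -> float:
    # Vectors are already unit-normalized, so the dot product suffices.
    return sum(x * y for x, y in zip(a, b))

shared_wording = cosine(stub_embed("duckdb local storage"),
                        stub_embed("local duckdb files"))
paraphrase = cosine(stub_embed("caching strategy"),
                    stub_embed("how we memoize results"))
print(f"shared wording:  {shared_wording:.2f}")  # high: tokens overlap
print(f"pure paraphrase: {paraphrase:.2f}")      # typically near zero: no shared tokens
```

A real embedding provider maps "caching strategy" and "how we memoize results" close together in vector space; the stub cannot, which is why setting `JINA_API_KEY` is recommended.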

Restart Claude Code and run the onboarding wizard:

/setup

Try the Hosted Demo (Opt-In)

Want to evaluate without installing anything locally? Override the plugin default with the hosted demo at distillery-mcp.fly.dev:

claude mcp add distillery --scope user --transport http --url https://distillery-mcp.fly.dev/mcp

Demo Server: distillery-mcp.fly.dev is for evaluation only. Do not store sensitive or confidential data.

See the Local Setup Guide for full configuration options, or deploy your own instance for team use.

Development

uv pip install -e ".[dev]"
# or
pip install -e ".[dev]"
pytest                              # run tests
mypy --strict src/distillery/       # type check
ruff check src/ tests/              # lint

See Contributing for the full guide.

License

Apache 2.0 — see LICENSE for details.