About Cognee
Cognee is an open-source tool and platform that transforms your raw data into persistent, dynamic AI memory for agents. It combines vector search with graph databases so your documents are both searchable by meaning and connected by relationships. Cognee offers default memory creation and search, which we describe below, but you can also build your own!
⭐ Help us reach more developers and grow the Cognee community. Star this repo!
Cognee Open Source:
- Interconnects any type of data — including past conversations, files, images, and audio transcriptions
- Replaces traditional RAG systems with a unified memory layer built on graphs and vectors
- Reduces developer effort and infrastructure cost while improving quality and precision
- Provides Pythonic data pipelines for ingestion from 30+ data sources
- Offers high customizability through user-defined tasks, modular pipelines, and built-in search endpoints
Basic Usage & Feature Guide
To learn more, check out this short, end-to-end Colab walkthrough of Cognee's core features.
Quickstart
Let’s try Cognee in just a few lines of code. For detailed setup and configuration, see the Cognee Docs.
Prerequisites
- Python 3.10 to 3.13
Step 1: Install Cognee
You can install Cognee with pip, poetry, uv, or your preferred Python package manager.
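For example, installing from PyPI with pip (the package is published under the name `cognee`):

```shell
pip install cognee
```

With poetry or uv, substitute `poetry add cognee` or `uv add cognee` respectively.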
Step 2: Configure the LLM
```python
import os

os.environ["LLM_API_KEY"] = "YOUR_OPENAI_API_KEY"
```
Alternatively, create a .env file using our template.
To integrate other LLM providers, see our LLM Provider Documentation.
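As a minimal sketch, a `.env` file mirroring the environment variable set above could look like this (the full set of supported variables is listed in the template and the docs):

```env
LLM_API_KEY="YOUR_OPENAI_API_KEY"
```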
Step 3: Run the Pipeline
Cognee will take your documents, generate a knowledge graph from them, and then answer your queries using the combined relationships in that graph.
Now, run a minimal pipeline:
```python
import asyncio
from pprint import pprint

import cognee


async def main():
    # Add text to cognee
    await cognee.add("Cognee turns documents into AI memory.")

    # Generate the knowledge graph
    await cognee.cognify()

    # Add memory algorithms to the graph
    await cognee.memify()

    # Query the knowledge graph
    results = await cognee.search("What does Cognee do?")

    # Display the results
    for result in results:
        pprint(result)


if __name__ == "__main__":
    asyncio.run(main())
```
As you can see, the answer is grounded in the document we previously stored in Cognee:
Cognee turns documents into AI memory.
Use the Cognee CLI
As an alternative, you can get started with these essential commands:
```shell
cognee-cli add "Cognee turns documents into AI memory."
cognee-cli cognify
cognee-cli search "What does Cognee do?"
cognee-cli delete --all
```
To open the local UI, run:
Demos & Examples
See Cognee in action:
Persistent Agent Memory
cognee_langgraph.mp4
Simple GraphRAG
simple_graphrag_demo.mp4
Cognee with Ollama
cognee-with-ollama.mp4
Community & Support
Contributing
We welcome contributions from the community! Your input helps make Cognee better for everyone. See CONTRIBUTING.md to get started.
Code of Conduct
We're committed to fostering an inclusive and respectful community. Read our Code of Conduct for guidelines.
Research & Citation
We recently published a research paper on optimizing knowledge graphs for LLM reasoning:
```bibtex
@misc{markovic2025optimizinginterfaceknowledgegraphs,
  title={Optimizing the Interface Between Knowledge Graphs and LLMs for Complex Reasoning},
  author={Vasilije Markovic and Lazar Obradovic and Laszlo Hajdu and Jovan Pavlovic},
  year={2025},
  eprint={2505.24478},
  archivePrefix={arXiv},
  primaryClass={cs.AI},
  url={https://arxiv.org/abs/2505.24478},
}
```