Comparing how 3 AI assistants implement memory

maximem.ai

1 point by gdad a month ago · 1 comment

gdad (OP) a month ago

Author here. I have been learning about how ChatGPT, Claude, and OpenClaw handle persistent memory: what actually gets stored, how it is retrieved, and so on.

Quick summary of the findings:

- ChatGPT pre-computes lightweight summaries of your ~15 most recent conversations and injects them into every prompt. No vector search, no RAG. Simpler than I expected.
- Claude takes an on-demand approach; it has search tools it can invoke to query your past conversations, but it only fires them when it judges the context is relevant. More flexible, less consistent.
- OpenClaw stores memory as plain Markdown on your local machine with hybrid search (semantic + BM25, 70/30 weighting). Fully transparent, but single-platform.

Full disclosure: I'm building in this space (Maximem Vity — a private, secure vault for cross-LLM memory). The comparison stands on its own, but that context motivated the research.

Happy to discuss the architectural differences or answer questions.
