Ask HN: End-to-end encrypted LLM chat (open- and closed-model)

3 points by 5F7bGnd6fWJ66xN 2 months ago · 2 comments


I’m exploring a software “cipher” layer, analogous to public/private-key crypto, so a user can converse with an LLM while prompts and responses remain unreadable to all intermediaries, including the model host. (I mean “cipher” in the cryptographic sense.)

Two cases:

- Open-weights model: ensure the operator still can’t read prompts or responses.
- Closed, hosted model: true E2EE, so even the provider can’t inspect content.

Topics we can discuss:

- Best near-term path: TEEs with attestation, FHE/HE, MPC/split inference, PIR for retrieval, differential privacy, or hybrids?
- How to handle key exchange/rotation for forward secrecy? (A rough sketch follows this list.)
- Practical performance/accuracy limits (e.g., non-linearities, KV-cache, streaming)?
- Minimal viable architecture and realistic threat model?
- Any prior art or teams you’d point me to?
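
On the key exchange/rotation point, here is a minimal sketch of the client-side primitive, assuming Python and the `cryptography` package: an ephemeral X25519 agreement with whatever endpoint terminates plaintext (an attested TEE, a self-hosted open-weights server, etc.), then a simple symmetric hash ratchet so each prompt is sealed under a one-time key and spent keys can be deleted for forward secrecy. The endpoint role, the labels, and the bare ratchet are illustrative assumptions, not a recommended design; a real system would want a full double ratchet plus attestation binding of the endpoint key, and none of this addresses the harder FHE/MPC questions.

    # Sketch: X25519 agreement + per-message hash ratchet (illustrative only).
    import hashlib
    import os

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # 1. Both sides generate ephemeral X25519 key pairs and swap public keys.
    #    "endpoint" stands for whatever terminates plaintext (an assumption,
    #    e.g. an attested enclave or the user's own open-weights host).
    client_priv = X25519PrivateKey.generate()
    endpoint_priv = X25519PrivateKey.generate()

    # 2. Derive a shared secret, then an initial chain key via HKDF.
    shared = client_priv.exchange(endpoint_priv.public_key())
    chain_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                     info=b"llm-chat-chain-v0").derive(shared)

    def ratchet(ck: bytes) -> tuple[bytes, bytes]:
        """Derive a one-time message key and advance the chain.
        Deleting the old chain key after use is what buys forward secrecy."""
        message_key = hashlib.sha256(ck + b"\x01").digest()
        next_chain = hashlib.sha256(ck + b"\x02").digest()
        return message_key, next_chain

    def seal(ck: bytes, prompt: bytes) -> tuple[bytes, bytes, bytes]:
        """Encrypt one prompt under a fresh message key; return (nonce, ct, next chain)."""
        message_key, next_chain = ratchet(ck)
        nonce = os.urandom(12)
        ct = ChaCha20Poly1305(message_key).encrypt(nonce, prompt, b"prompt")
        return nonce, ct, next_chain

    nonce, ct, chain_key = seal(chain_key, b"example prompt")
    # Only the endpoint holding the matching chain state can decrypt `ct`;
    # intermediaries (and the host, if the endpoint is an enclave whose
    # attestation the client verified out of band) see only ciphertext.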

Please DM if you are interested in working with me.

KashyapArjun 2 months ago

I was thinking about this a while ago… the loophole I see is that the provider is the one who sets the LLM’s public/private key pair. And because the LLM is not a person who remembers its own password (unless we come up with a way for it to), we can’t guarantee that the provider doesn’t use those keys to decrypt and read the messages.

kiririn7 2 months ago

sounds interesting. use case though? i can’t imagine there is a large demand for this
