Mojo-V: Secret Computation for RISC-V

github.com

66 points by fork-bomber a month ago · 30 comments

tromp a month ago

This should not (so much) be compared with Fully Homomorphic Encryption (FHE) but with a Trusted Execution Environment (TEE) [1]. It is a very elegant and minimal way to implement TEEs, but it suffers from the same drawbacks: a data owner has to trust the service provider to publish the public keys of actual, properly constructed Mojo-V hardware rather than arbitrary public keys or the public keys of maliciously constructed Mojo-V hardware.

[1] https://en.wikipedia.org/wiki/Trusted_execution_environment

  • api a month ago

    You could have the keys signed by a chip maker, which cuts the hosting provider out and reduces the trust surface to the manufacturer only. Unless your adversary is someone sophisticated enough to do surgery on chips.

    It’s still not FHE but it’s about as good as you can get otherwise.

    • goku12 a month ago

      > Unless your adversary is someone sophisticated enough to do surgery on chips.

      Since the threat assessment is important for deciding the strength of countermeasures, let me just add that this isn't as uncommon as you may believe. A company that I worked for had a decent capability to do this, and they were using it just to investigate the failures of electronic subsystems in their projects. Imagine what a more dedicated entity could achieve. This is why standards like FIPS 140-2/3 level-3/4 are very relevant in a significant number of corporate cases.

      Speaking of chip surgery, I wish our distinguished expert Ken Shirriff could shed some light on the process. His work on legacy chips is among the most noteworthy in the field.

      • todd_austin a month ago

        I created Mojo-V.

        I agree that side-channel and physical attacks are crucial to stop. The predecessor to Mojo-V (Agita Labs TrustForge) was red-teamed for three months, including differential physical measurement attacks, and the system was never penetrated. So where there is a will, there is a way!

        Mojo-V stops software, instruction-timing, microarchitectural, and ciphertext side channels. Vendors can stop analog attacks if they choose to, but the reference design, which I am building, is meant to be really simple to integrate into an existing RISC-V core. Adding Mojo-V only requires changes to the Instruction Decoder and the Load-Store Queue, regardless of the complexity of the microarchitecture.

    • todd_austin a month ago

      I created Mojo-V.

      Yes, exactly. Because it is a privacy technology, the key/control channel tunnels through all software into the Mojo-V trusted hardware.

      In the spec, I've been working on new Appendices comparing Mojo-V to TEEs, FHE, CHERI, and other high-security tech. Mojo-V is a new thing, so absorbing it will take a while! :-)

      I see it as a new design point between TEEs and FHE, but much closer to FHE. TEEs are fast, but they are not good at establishing trust with untrustworthy service providers; FHE is the ultimate in zero trust, as all trust is in the math. Mojo-V eliminates trust in all software, programmers, IT staff, attackers, and malware by placing it in trusted hardware, and it runs at near-native speed.

      And yeah, my mission is to snuggle as close to FHE as hardware can get!

    • childintime a month ago

      Couldn't the keys be loaded once, in private write-only flash memory, by the user of the chip?

      • todd_austin a month ago

        I created Mojo-V.

        IMHO, the service provider is the last one that should ever be able to see the keys :-). It's them we want to keep sensitive data away from.

        Keys are injected into the HW with public-key encryption. This requires that the HW have keys that only the HW knows (its secret key). This key is made by a weak PUF circuit, which is basically a circuit that measures silicon process variation. So the keys are born in the silicon fab, through the natural variability of the silicon fabrication process. I didn't invent this; it is an old idea. Intel SGX uses the same approach.
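
        Roughly, that injection flow looks like the sketch below. This is a minimal illustration, not the Mojo-V spec: the software-generated RSA keypair stands in for the PUF-derived hardware secret, and the library and key sizes are my own choices.

          import os
          from cryptography.hazmat.primitives import hashes
          from cryptography.hazmat.primitives.asymmetric import rsa, padding

          # Inside the chip: the PUF-derived secret (modeled here by a software-
          # generated keypair); only the public half ever leaves the hardware.
          device_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
          device_public = device_private.public_key()

          # Data owner, remotely: wrap a fresh data key for this specific device.
          data_key = os.urandom(32)  # symmetric key that will protect the owner's variables
          oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                              algorithm=hashes.SHA256(), label=None)
          wrapped = device_public.encrypt(data_key, oaep)

          # Back inside the chip: only the hardware can unwrap the data key.
          assert device_private.decrypt(wrapped, oaep) == data_key
          # The software stack in between only ever handles `wrapped`.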

      • tromp a month ago

        The intended use case is for remote execution where the user (data owner) pays a service provider to run services on their hardware. It could still work if the user somehow prepares the chip herself and ships it to the service provider to be used on their future data, but most users would not want to bother with that first step.

  • todd_austin a month ago

    I created Mojo-V.

    For me, I see Mojo-V more like FHE than a TEE, for three primary reasons: 1) Like FHE, the tech is applied to variables and computation that doesn't touch protected variables is not affected. TEEs protect processes. 2) Like FHE, Mojo-V lacks software, timing, and microarchitectural side channels. TEEs are riddled with side channels. 3) Like FHE, no trust is extended to software because it cannot see the data it is processing. TEEs require that clients trust that the attested software has their best interests in mind.

    Public-key signing works like SGX: the vendor signs the public key to certify that it is from real Mojo-V hardware.
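
    In other words (a hedged sketch; the Ed25519 endorsement format here is illustrative, not from the spec), the data owner's tooling would refuse to wrap keys for any device public key the vendor hasn't signed:

      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives import serialization
      from cryptography.hazmat.primitives.asymmetric import ed25519

      # Vendor root key; its public half ships with the data owner's tooling.
      vendor_root = ed25519.Ed25519PrivateKey.generate()
      vendor_root_pub = vendor_root.public_key()

      # At manufacturing: the vendor signs the device's exported public key.
      device_key = ed25519.Ed25519PrivateKey.generate()  # stands in for the PUF-derived key
      device_pub_bytes = device_key.public_key().public_bytes(
          serialization.Encoding.Raw, serialization.PublicFormat.Raw)
      endorsement = vendor_root.sign(device_pub_bytes)

      # At the data owner: verify the endorsement before sending any secrets.
      try:
          vendor_root_pub.verify(endorsement, device_pub_bytes)
          print("device key endorsed by vendor; safe to wrap data keys for it")
      except InvalidSignature:
          raise SystemExit("unendorsed device key; do not send secrets")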

  • technocrat8080 a month ago

    To be clear, it's not a TEE replacement, but it does address one of the most common use cases of TEEs.

Manfred a month ago

After skimming through the documentation, this seems like a nice solution, but I'm not sure if this is a problem we want to solve.

Consumers are discovering the downside of cloud computing when their heating system can't turn on because Cloudflare is down. A cheaper and more reliable solution is still on-premises computing.

Large social network and content platforms don't have any incentive to keep your data safe because they want to monitor and own everything.

Maybe this is for something like a government running a public service?

  • nl a month ago

    > I'm not sure if this is a problem we want to solve

    Who is this we you speak of?

    I for one much prefer my cloud services and would love a TEE I can control.

    > A cheaper and more reliable solution is still on-premises computing.

    I assure you that my use of Cloudflare services ($0 in nearly 10 years) is much more reliable and much cheaper than hardware I run.

    • Manfred a month ago

      I was genuinely asking: what cloud service do you use where trusted computing is essential for its core functionality? What elements of the computational process do you not trust those services to perform for you?

      My point about Cloudflare was more about them taking down essential services that could run just as well on-premises, like a heating controller.

      • nl a month ago

        For a while I was running LLMs in secure enclaves on AWS so I could do E2E encryption. Privacy without having to run a local LLM.

  • throawayonthe a month ago

    I want good confidential compute for cases where E2EE is impractical, like an email server, or Immich with server-side ML/processing, etc.

    • Manfred a month ago

      Who are you protecting data access from in those cases? My suggestion was that it's probably more practical to run those kinds of solutions on a hardware stack you trust: in your basement or in a small box on the wall in your living room.

      Besides, the specific extension we're talking about protects registers and computation, not shared memory.

      • tonetegeatinst a month ago

        The issue is, unless you can be 100% sure your hardware has not been built with a vulnerability or backdoor, or subjected to an evil maid attack... then you can't be sure it's trustworthy.

LarsDu88 a month ago

Was it really wise to name this Mojo when Chris Lattner, former Head of Engineering at SiFive, also called his well-funded programming language Mojo?

shakna a month ago

This could be:

Great for security - Being able to safely compute secrets is a very difficult problem.

Fucking awful for security - More OEM secret controls and "analytics" that devolve into backdoors after someone yet again posts keys online.

  • todd_austin a month ago

    I created Mojo-V.

    There are no backdoors, but there's no integrity checking either, so a Mojo-V voting machine could take an encrypted vote, throw it away, and add +1 to the attacker's favorite candidate.

    A computational integrity-checking mechanism will appear soon that attaches a concise proof to every encrypted Mojo-V value, proving to the data owner that their requested computation was faithfully performed. The mechanism also supports safe disclosures.

    This should give data owners strong controls over what can be done with their data.
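
    As a toy analogy only (this is not the mechanism from the spec), one way to detect that the performed computation diverged from the requested one is a keyed running tag over the operations applied to a protected value, which the data owner can recompute:

      import hashlib
      import hmac

      def transcript_tag(key: bytes, operations: list) -> bytes:
          # Chain an HMAC over the sequence of operations applied to a value.
          tag = b"\x00" * 32
          for op in operations:
              tag = hmac.new(key, tag + op.encode(), hashlib.sha256).digest()
          return tag

      owner_key = b"k" * 32  # known only to the data owner and the trusted HW
      requested = ["load ballot", "add 1 to tally[chosen_candidate]"]
      performed = ["load ballot", "add 1 to tally[attacker_candidate]"]  # tampered run

      # The tags diverge, so the owner can tell that the requested computation
      # was not the one actually carried out.
      assert transcript_tag(owner_key, requested) != transcript_tag(owner_key, performed)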

    • shakna a month ago

      By backdoor, I meant the capability to implement one.

      The OEM looking for greater control over their platform would be considered the "data owner" here.

      There's no real way to prevent that.

  • Manfred a month ago

    The platform owner can manage keys and data contracts in the processor, which should enable them to rotate secrets constantly.

    In other hardware there is an OEM secret because the manufacturer is trying to keep users out of "their hardware"; in this case we're trying to keep everyone except the data owner out.

pjmlp a month ago

And the relationship to Mojo programming language is?

snvzz a month ago

RISC-V is inevitable.
