Ask HN: How to run analytics on data without access to the data?

42 points by michealr 5 years ago · 45 comments · 2 min read

I have a little service for personal use, and I was considering opening it up to a general audience. Right now it's processing some of my personal data for a fun little personal report, in particular chat data. Since it's information I have access to already, I don't mind running the program locally. What I would like to be able to do is run the analysis for anyone and return the little report that I get for myself, without having access to their data or storing it in the first place. I know that with, for example, OAuth scopes, you can grant access, which sort of fits the criteria. But I'm thinking more of exported data from an application that doesn't have delegated access functionality.

How I envisioned a solution: some trusted third party takes my analysis script, returns the report, and that is it. I never see the underlying data and receive only a one-time token to access it.

I know it will never be a hundred percent leak-proof, and there is still a level of user trust required, I realise that. But just thinking conceptually, is there any existing service out there that does such a thing, or attempts to offer something similar? Or what would an alternative approach look like?

BelenusMordred 5 years ago

> I know it will never be hundred percent leak proof

A slow leaking ship will still sink. Attempts so far to anonymise public datasets have been terrible and turned into a garbage fire by attackers every time with minimal effort. Don't hand out false promises.

Guess you are looking for fully homomorphic encryption. A long-outstanding problem with lots of smart people working on it, some are doing ok at getting there.

https://github.com/ibm/fhe-toolkit-linux

  • dataewan 5 years ago

    Differential privacy is an area that makes some guarantees about not letting personal information about individuals escape. Might be a useful technique as well.

    https://en.wikipedia.org/wiki/Differential_privacy

    Agree that strong guarantees about privacy aren't achievable.

  • michealrOP 5 years ago

That's true, a poorly chosen description on my part.

Very cool, I had read about homomorphic systems. For fully homomorphic systems, has there been a successful SaaS-like offering allowing use of such systems? Or do you think it's still in the research-oriented phase?

    • maest 5 years ago

      Not really SaaS, and debatable whether it's successful, but I hear numerai uses homomorphic encryption when providing data for quants to backtest.

      https://numer.ai/ https://en.wikipedia.org/wiki/Numerai

EDIT: added qualifier, since I do not know for sure if numerai is using homomorphic encryption.

    • kasperni 5 years ago

Definitely still in the research phase for what you are looking for. Performance is anywhere from 5-15 orders of magnitude slower than ordinary computations. Yes, orders of magnitude...

  • cipherboy 5 years ago

    I'm a little rusty but I swear I saw a partial homomorphic encryption scheme for aggregates and analytics. I want to say Enigma conference, '16 or '17? Maybe by Boston University.

The benefit being that while you can run any computation with FHE, PHEs are generally faster.

    IIRC Microsoft was also doing research on PHEs.
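      For a concrete feel of the partially homomorphic approach, here is a toy Paillier sketch: multiplying ciphertexts adds the underlying plaintexts, which is exactly the sum/aggregate shape described above. The tiny primes are illustrative only; a real deployment needs ~2048-bit moduli and a vetted library.

```python
import math
import random

# Toy Paillier keypair with tiny primes -- for illustration only.
p, q = 61, 53
n = p * q
n2 = n * n
g = n + 1                      # standard simplified choice of g
lam = math.lcm(p - 1, q - 1)   # Carmichael function of n
mu = pow(lam, -1, n)           # valid because g = n + 1

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(c: int) -> int:
    x = pow(c, lam, n2)
    return (x - 1) // n * mu % n

# Additively homomorphic: multiplying ciphertexts adds plaintexts,
# so a server can sum values it never sees in the clear.
c_sum = encrypt(3) * encrypt(4) % n2
print(decrypt(c_sum))  # 7
```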

meowface 5 years ago

Your best bet is probably to just do all the processing locally in the browser. The issue is 1) from most end users' perspectives, they have no idea if it's actually running locally or talking to a server, or how to verify it, or probably what that difference even means in the first place, so a skeptical user won't necessarily gain that much additional peace of mind, and 2) hypothetically a compromise could still result in the local data being siphoned off by an attacker. The latter's still a risk for regular desktop applications, but a bit less so (since you can get a signed binary).

The homomorphic encryption approach probably isn't worth the effort. There's always going to be a trade-off between doing something useful and sufficiently/securely obfuscating/anonymizing the data. So I'd recommend the local approach, with a prominent explanation of how you don't and can't see any of the data.

  • hunter2_ 5 years ago

    This could work if the analytics engine is free and (ported to) JavaScript, but not if it's closed source. In the latter case, a trusted third party (escrow, one might call it) as OP described does seem like the way to go.

    The problem is, why would end users trust the third party more than the analytics developer? Are there companies that specialize in being this third party and have amassed mutual trust of the general public (akin to a notary public) for handling data and code without leaking either?

    • michealrOP 5 years ago

Analytics-wise, I'm OK with being restricted; other commenters have mentioned looking at WASM as a possible workaround. So local does seem to make the most sense, practicality-wise.

A thought: the data notary or data escrow side of things does seem like an underexplored product category.

      • meowface 5 years ago

        Any such data notary/escrow company has a pretty good shot of eventually getting breached (they'd naturally be a prime target, since the attackers could get tons of data from tons of people on behalf of tons of different companies), and that'll possibly destroy that company and maybe also your app. There's also the risk they may eventually have rogue employees, etc.

        • hunter2_ 5 years ago

          Regular notaries could be as crooked as rogue employees, yet we still use them because imperfect barriers are still barriers (as with security).

          But yeah, when computer-related vulnerabilities are thrown into the mix, it could get ugly.

          • meowface 5 years ago

            Sure, there's often going to be some centralized source one needs to trust. The issue with a digital escrow vendor is kind of like the issue with cryptocurrency exchanges - one single breach and you immediately walk out with an unfathomably huge treasure trove.

            A rogue notary employee can do some damage and notarize things in exchange for bribes, and a rogue bank employee could help siphon some money away, but a rogue digital escrow employee could be bribed to hand over terabytes of extremely sensitive data on lots of big customers, and a rogue cryptocurrency exchange employee could possibly help someone steal hundreds of millions of dollars pretty easily. It's a huge house of cards.

    • nindalf 5 years ago

      It doesn't need to be in JavaScript. Any language that can compile to WebAssembly would work too. But I agree with the broader point - the code needs to execute on the client, not the server.

  • satyrnein 5 years ago

    Maybe a browser extension with limited permissions? Say the tool looks at Slack and counts how often you use the ROFL emoji. The extension could be granted access to *.slack.com but no other domains.

    • meowface 5 years ago

      That could work, but then you have the additional barrier of having to convince people to install your browser extension, and for people who are already worried about privacy, that comes with its own can of worms. Especially if they don't necessarily understand or trust the permission model.

franky47 5 years ago

I asked myself a similar question for web analytics a year ago [1]: how to provide a service without having access to the underlying data. It requires shifting the processing onto the client side, so it limits what you can do, but it's best for privacy, and security (since the data never leaves the native app or browser).

[1] https://chiffre.io

  • dumbfounder 5 years ago

    Client side is the first answer, but is there a second? Is there a way to peer review a piece of code that can run in a 3rd party container (peer review and cryptographically signed), such that the actual container running the code is encrypted itself and can run anywhere?

    I am imagining you download the "container", put the data in, encrypt the container with the data inside, and have that run anywhere.

    But I have no idea if that is possible.

    • michealrOP 5 years ago

I wonder the same thing myself.

Thinking through issues, the external script could still repeatedly run on the hidden data, slowly building an idea of the information. There are techniques like homomorphic encryption that go in the direction of allowing analysis on encrypted data.

Musing on possible other solutions, I wonder if simply ratcheting up the cost of repeated access and limiting data output would discourage this profile building.

Another possibility: is it possible to conceive of a service that takes in a script, runs it, and then tests the returned data for its level of information entropy, blocking anything above a certain threshold? FYI, not sure if that is complete nonsense, but conceptually, with much hand waving, maybe it works.

Going local does help too, though.
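      The entropy-gate idea could be sketched like this (the budget value and the byte-frequency estimate are both made up for illustration; a real system would need a far more careful information measure, since repeated low-entropy outputs still add up):

```python
import math
from collections import Counter

def entropy_bits(data: bytes) -> float:
    """Rough estimate of total Shannon information in an output, in bits."""
    if not data:
        return 0.0
    counts = Counter(data)
    per_byte = -sum(c / len(data) * math.log2(c / len(data))
                    for c in counts.values())
    return per_byte * len(data)

BUDGET_BITS = 1024  # hypothetical per-run output budget

def release(output: bytes) -> bytes:
    """Release the script's output only if it stays under the budget."""
    if entropy_bits(output) > BUDGET_BITS:
        raise ValueError("output exceeds information budget, blocked")
    return output

print(release(b"total_messages=1234\ntop_emoji=ROFL\n"))  # small summary passes
```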

    • noisenotsignal 5 years ago

      It's not really "run anywhere", but you can write apps for a trusted execution environment like Intel SGX enclaves; not even the OS can look at what's running. Enclave code is cryptographically signed so that you can both validate the identity of the signer as well as the code contents. In the latter, you'd have to compare the MRENCLAVE value to a published value, which you could reproduce by building from source if it's open.

      Microsoft calls this "confidential computing" and has some related Azure products, including providing VMs standalone and in Kubernetes.

    • franky47 5 years ago

      That would be feasible with homomorphic encryption, however current implementations are very far from practical applications (extreme resource consumption, terrible performance).

      • dumbfounder 5 years ago

        I am not talking about just encrypting the data and performing computation on the encrypted data, but encrypting the entire container with data inside and running that to produce a result with no way to view what’s going on inside. You can get around the limitations with how to run an algorithm on encrypted data because the data is not encrypted with respect to the program itself.

        Theoretically it would work like this: you download a docker image, you load your data into it, you encrypt the entire image with data inside, you send that whole package to the cloud where it is run and it produces an output.

        • franky47 5 years ago

Yes, I thought of that for my analytics SaaS (to calculate weekly reports). The issue is that the image has to be decrypted on the Docker host before execution, which requires it having access to the key somehow, breaking the end-to-end encryption promise (i.e. "we have no way to access your data").

          This could be mitigated by having that worker host self-hosted by your clients, it depends how practical that might be.

  • jhoechtl 5 years ago

Doable if you think of the client side of Ethereum or IPFS.

stelfer 5 years ago

Take a look at Google Private Join and Compute[1]. But be aware that the problem you frame is an unsolved research problem with an active global community. The topics you are looking for are applications of secure multiparty computation and homomorphic encryption. Also, be ready for something as simple as a column join to take 24 hours per query.

[1] https://github.com/Google/private-join-and-compute

rjmunro 5 years ago

This reminds me of https://opensafely.org/, which analyses NHS medical data for research purposes by asking Doctors to run queries on their patient databases and send back only summaries, e.g. "How many of your patients with HIV also had Covid19" https://github.com/opensafely/hiv-research
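The OpenSAFELY pattern (ship the query to where the data lives, get back only an aggregate) can be sketched as follows; the record layout and predicate are hypothetical:

```python
# Each site runs the query against its own records and returns a single
# count; raw rows never cross the boundary to the analyst.

def run_query_at_site(patients, predicate):
    """Executed on the data owner's side."""
    return sum(1 for p in patients if predicate(p))

# Two simulated sites with records the analyst never sees.
site_a = [{"hiv": True, "covid": True}, {"hiv": True, "covid": False}]
site_b = [{"hiv": True, "covid": True}, {"hiv": False, "covid": True}]

predicate = lambda p: p["hiv"] and p["covid"]
total = sum(run_query_at_site(s, predicate) for s in (site_a, site_b))
print(total)  # 2 -- only the aggregate is returned
```

In practice systems like this also suppress small counts, since an answer of "1" can identify an individual.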

gopty 5 years ago

https://www.darpa.mil/program/programming-computation-on-enc...

Syzygies 5 years ago

https://mathscinet.ams.org/mathscinet/help/about.html "MathSciNet® is an electronic publication offering access to a carefully maintained and easily searchable database of reviews, abstracts and bibliographic information for much of the mathematical sciences literature. Over 125,000 new items are added each year..."

The stakes are lower when money, not privacy, is at risk. I have attempted to argue for years that the MathSciNet catalog of the mathematical literature should be open to all forms of machine learning and mind mapping software experiments. It remains a cash cow for the American Mathematical Society, and they're fiercely proud of its human curation by 19th century methods. Meanwhile, mathematicians continue to believe that math remains separated into tribes, with number theorists lobbying to hire their own at departmental meetings. The true connections between ideas defy these ancient categories. I see a generation of potential advances squandered by not letting third-party tools in to study MathSciNet.

The right ideas could help here. One isn't protecting individual privacy, just a cash cow. The bar is lower.

syats 5 years ago

I'll tell you about the International Data Spaces Association, just for the sake of completeness, and because others have mentioned some sort of certification of apps, etc. Finding a general solution to the problem posed by OP is quite difficult, as it requires a lot of extra infrastructure, technical and non-technical.

One idea would be:

1. distribute to the data owners a base system (something that can "run" stuff on their premises). People here have mentioned browsers, but for more intensive processing this might not be enough, so think of a docker daemon, keys for some docker registries, etc.

2. have a trusted "app store" (e.g. a docker registry where images are built in a reproducible manner from code which is inspected and certified, and then are cryptographically signed)

3. make a well-described interface for the apps to consume the data (thinking of the general use case here; if you just want to analyze FB info then you can make an ad-hoc parser...)

4. Have the data owner download, check the signature of, configure and run the app on their premises.

Things get even more interesting when the analytics need data from different non-trusting partners, so that Homomorphic Encryption becomes necessary.

There is at least one specification that aims at supporting all of this: https://www.internationaldataspaces.org/wp-content/uploads/2... although implementation is, so far, lagging behind.
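Step 4's signature check can be sketched with simple content addressing: the data owner recomputes the image digest locally and compares it against the value published by the trusted registry. Real registries sign digests with asymmetric keys; this only shows the shape of the verification step, and the image bytes are a placeholder:

```python
import hashlib

def digest(image_bytes: bytes) -> str:
    """Content-address an image the way registries do."""
    return "sha256:" + hashlib.sha256(image_bytes).hexdigest()

def verify(image_bytes: bytes, published_digest: str) -> bool:
    """Data owner's check before running the app on their premises."""
    return digest(image_bytes) == published_digest

image = b"...image tarball contents..."   # placeholder payload
published = digest(image)                 # what the trusted registry advertises

print(verify(image, published))           # True: untampered image checks out
print(verify(image + b"!", published))    # False: any modification fails
```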

alfl 5 years ago

We [0] are getting quite far decomposing algorithms symbolically and then doing some fancy footwork with private set intersection. It ends up being better/faster/cheaper than homomorphic in a lot of use cases.

Shoot us a note -- would love to hear more details.

[0]: https://proofzero.io

amai 5 years ago

It sounds like Federated Learning might be of interest for you:

https://federated.withgoogle.com/ https://en.wikipedia.org/wiki/Federated_learning https://github.com/poga/awesome-federated-learning

cedricd 5 years ago

There's another approach you can do -- make the analysis portable instead.

Assuming data is in a standard format then you can share your script for people to run themselves. Obviously this is fairly difficult in practice unless you can bundle everything into a client-side script on a website.

For reference Narrator [1] does this -- it puts data into a standard format so that analyses written for one company can be run for another. I'm not suggesting you build your stuff on that platform, but it's an interesting approach that does exist.

[1] https://www.narrator.ai

jedimastert 5 years ago

Either the first party (i.e. the client) runs the data on their own turf, or they hand the data to someone else (you or whatever third party you use) and trust that the other end is going to treat their data right.

I'm sure there's some sort of homomorphic encryption[0] magic scheme that might let you process the data on other servers or something, but I could not even begin to tell you how. Really, it's just trust.

brian_spiering 5 years ago

Differential privacy is the field of study for sharing sensitive data in a way that allows analysis while retaining some guarantees of privacy.

  • lmkg 5 years ago

    Agreed, Differential Privacy is the name for this problem.

    Quick summary of important results: You will always leak a small amount of information. But it is possible to bound this leak to whatever level you consider "acceptable." The trade-off is statistical validity of the results (the usual approach adds "noise" to the data and/or analysis).
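    A minimal sketch of that trade-off is the classic Laplace mechanism: add noise scaled to sensitivity/epsilon to a counting query. The epsilon value and chat data below are illustrative:

```python
import random

def laplace_noise(scale: float) -> float:
    # Difference of two exponentials is a Laplace(0, scale) variate.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def dp_count(records, predicate, epsilon=1.0):
    """Counting query (sensitivity 1) made epsilon-differentially private."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

chats = [{"sender": "alice"}] * 40 + [{"sender": "bob"}] * 60
noisy = dp_count(chats, lambda r: r["sender"] == "bob", epsilon=1.0)
print(noisy)  # close to 60, but never exactly reproducible
```

    Smaller epsilon means stronger privacy and noisier answers; that is the statistical-validity trade-off mentioned above.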

JosephRedfern 5 years ago

How is the service written? I'd look to compile it down to WASM or otherwise run it in the browser, if possible.

  • michealrOP 5 years ago

Client side does make sense. I guess the user could upload their chat data zip file to the client-side app, which would then do the processing locally. The report itself could be saved, but not the data.
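    A rough sketch of that flow (in Python for brevity; in practice this would be the in-browser/WASM code, and the export file name and message format are assumptions):

```python
import io
import json
import zipfile
from collections import Counter

def analyze_export(zip_bytes: bytes) -> dict:
    """Build the report entirely in memory; raw messages are never stored."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        messages = json.loads(zf.read("chat.json"))  # hypothetical export layout
    per_sender = Counter(m["sender"] for m in messages)
    return {"total": len(messages), "per_sender": dict(per_sender)}

# Simulate a user's export without touching disk.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("chat.json", json.dumps([
        {"sender": "alice", "text": "hi"},
        {"sender": "bob", "text": "hey"},
        {"sender": "alice", "text": "lunch?"},
    ]))
report = analyze_export(buf.getvalue())
print(report)  # {'total': 3, 'per_sender': {'alice': 2, 'bob': 1}}
```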

gostsamo 5 years ago

Adding the third party only complicates the issue because the user will have to trust you and the proxy, and the proxy will have to trust your code. Best case, let the user download your code as a mobile or desktop app and run the analysis themselves.

tjanez 5 years ago

You might want to check out Oasis' Parcel SDK: https://www.oasislabs.com/parcelsdk.

jhoechtl 5 years ago

What about Fully Homomorphic Encryption? Would an FHE scheme enable discovering patterns without seeing the data?

  • michealrOP 5 years ago

In theory I assume it would; my bottleneck would just be knowledge. I just don't know enough about FHE to comfortably work with it. FHE as a service would be my little mini dream.

    • jhoechtl 5 years ago

There used to be an MIT CSAIL research project (I can find no more traces of it) which had as a project goal to establish FHE as a service.

      Apparently it failed.

sgt101 5 years ago

Could you send your code to their execution environment for a one time run (unlocked with a code?)
