23andMe told victims of data breach that suing is futile

arstechnica.com

52 points by leemailll 2 years ago · 33 comments

noduerme 2 years ago

To be clear, the victims of this breach may be targeted for much more than just identity theft or monetary losses, given the way the stolen data has been deployed. This was the first ever industrial-scale attempt to target people of a specific ethnicity using genetic data to identify and "doxx" them for potential hate crimes.

https://www.nbcnews.com/news/us-news/23andme-user-data-targe...

One could easily see, e.g., a citizen of a Middle Eastern country who had some surprising Ashkenazi background being targeted for death as a result of this.

  • jacquesm 2 years ago

    Exactly. The number of people making light of this in this thread is unsettling, to put it mildly. If that's the 'tech savvy' crowd then you have to really worry about everybody else. The parallels with the Dutch citizen registry story from WWII are just too much to ignore.

    • noduerme 2 years ago

      Jacques, I respect you, but I'm already apparently living in a world where my genetic markers make me feel like a rabbit-on-the-run, or at least they force me to pre-emptively apologize for other people with similar genes when I want to travel in most countries. I'm not going to be overly concerned about the constituents of this thread; their views are incoherent, but if they had much power to influence anything, they wouldn't be wasting their time writing here, and neither would we.

      Perhaps a better way to say this is that 'tech savvy' posters are the least likely to have a good handle on mainstream feelings, and I definitely feel safer among normal people than those who consider themselves arguers for technological progress without a supporting philosophy.

      [edit] I had a dream last night that the front of my house was all glass, including the ceiling. And I was on the couch with my girlfriend reading books when a Hamas protest came down the street, waving green flags. They took positions on the roof of the school across from me, and they were pumping the air with AK47s and shouting slogans, looking down into our glass house and waiting for us to respond. We looked at each other and tried to make ourselves small and we did not want to respond to their slogans. They became more and more agitated that we were refusing to agree with them. We knew that we were marked for death. Then the police showed up and they moved down the street, chanting the same things.

      This is a small story. My father, stupidly, put his genes on this website without asking his children; thus I was a victim of this data breach (we're not surprisingly 90% Ukrainian/Belarusian/Ashkenazi Jewish and oddly 10% Irish). But what does it mean to dream people shouting a slogan that you'll either shout with them or die, into the glass living room of your house? It feels like a perfect metaphor for the time we're living in. I wish everyone in the world had that dream so they could understand what it feels like to be a true rebel who is alone against a mob.

      • jacquesm 2 years ago

        > My father, stupidly, put his genes on this website without asking his children

        Oh shit. That really sucks. My mother was about to do it but I talked her out of it.

jacquesm 2 years ago

We'll see how the EU data protection offices feel about that. Just imagine having something like this happen and then giving your customers the finger. The lack of ethics is impressive. I sincerely hope they get fined into oblivion as a nice example to the next medical company that doesn't understand their responsibilities towards their users.

  • ticulatedspline 2 years ago

    And what exactly were their responsibilities that they failed to understand?

    • jacquesm 2 years ago

      That they should have offered (and enforced) 2FA from day one, because users re-use passwords, being utterly unaware of the implications of doing so. A company the size of 23andme, in charge of a very large amount of medical data and PII, should be aware of those implications. To blame the users here is beyond stupid and irresponsible.

      You don't engineer a service like 23andme without doing some risk assessment and one of the risks they should have identified and mitigated is password re-use by Joe Average because Joe Average (and his mom) were exactly the demographic that they targeted. Anybody that was somewhat sensitive to the privacy risks wouldn't have used the service in the first place.

      • ticulatedspline 2 years ago

        They do offer 2FA. Personally, I do blame the users. It's as if I robbed your house and you then sued the city because there wasn't a law requiring you to put steel bars on your windows and have three locks, and your argument is "I moved into an area where crime could occur; the city should have known I was too stupid to secure my stuff. We want a nanny state!"

        As long as they weren't actively inhibiting security, by withholding two-factor or disallowing strong passwords, I don't think it's legally a company's responsibility to make their users eat their vegetables. Good idea? Maybe, but not required.

        • jacquesm 2 years ago

          Offering != mandating. You don't offer a service like this to the general public without ensuring that their data is protected from the most obvious attacks, and password reuse is probably the #1 candidate for that unless using very weak passwords is a better #1. Either of these should be very explicitly guarded against. If you can't do that you shouldn't be operating a service like this.

          What they are doing with this response is letting their legal department drive their car away from the scene of the hit-and-run. At least, that's what they hope.

        • synicalx 2 years ago

          While it might not be "legally" required (or maybe it is; the courts haven't decided yet), it's in 23andme's own best interest to at least take some steps to ensure technically illiterate users aren't leaving the front door wide open, because if they don't, they end up in situations like this.

          They can blame anyone they want but at the end of the day it's their brand that's getting dragged through the mud right now and after this NO ONE will trust them ever again.

          • ticulatedspline 2 years ago

            Oh, absolutely they look bad, and they could certainly have chosen a more tactful response. Most people won't even understand the nature of the data loss, and it's likely to affect their bottom line. And honestly, IMHO that's more than enough lesson to start forcing security down their customers' throats.

            But as I see it right now they have no legal culpability and calling for them to be drawn and quartered over it isn't exactly productive. Honestly I'd worry more about an industry knee-jerk reaction slapping crappy but CYA security on all kinds of sites if they lose the legal battle over this.

            • jacquesm 2 years ago

              Negligence is a perfectly valid reason for culpability and I see the fact that they offered a service with this kind of data to the general public without mandatory 2FA as negligent. If only because their users are more than likely to be unaware of the kinds of risks they are taking whereas 23andme knows exactly what kind of risk those users are taking: that's why they wanted their data in the first place.

              In my opinion the real reason why they didn't mandate 2FA is very simple: it would have alerted users to the fact that what they were doing was significant and it would have been a point of friction in setting up the account. But all they wanted is the data, the rest was infotainment and a sideshow from the POV of 23andme. The words 'duty of care' probably mean absolutely nothing to them.

              • michaeljx 2 years ago

                They could have mandated 2fa only at the point where they present the results.

                • jacquesm 2 years ago

                  No, they should have done it right from day one: (1) so users have confidence they are treating this with the seriousness it requires, (2) to minimize the 'surprise' factor, and (3) to ensure that the users' other data is also properly protected. They also should have ensured HIPAA compliance for their US-based customers, compliance with whatever local legislation was applicable for their customers elsewhere, and tracking of any changes in that legislation. This includes full consent management, the option to withdraw consent at any point in time, and the ability to deal with requests for removal of data, which is especially relevant given that the suppliers of the DNA material may later have second thoughts about all this. Note that you don't just give DNA to a service like this on your own behalf but also on behalf of all of your siblings, descendants and ancestors.

                  Recognize the potential for actual damage before you decide to blame the victims here, and then wonder why 23andme apparently did not recognize that potential. Also recognize that you can't exactly change your DNA; it is your identity.

        • philistine 2 years ago

          Blame is irrelevant. Do you honestly believe that 23andMe is reacting appropriately to this massive problem?

          Why they are reacting this way isn't a question of belief: it's their legal defence doubling as PR.

      • synicalx 2 years ago

        As another commenter pointed out elsewhere, they do offer MFA. However, from what I can gather, it looks like they, like most other companies, don't mandate its usage. Like you've said, though, given the kind of data they have, they 100% needed to do better here, and their response is bonkers.

chii 2 years ago

Even if there's no financial compensation for the victims, it makes sense to make an example of a company that doesn't actually take data privacy and security seriously.

  • ticulatedspline 2 years ago

    It would set a dangerous precedent, though. Assuming they have a reasonable password policy, it seems the breach was in no way related to a failure by 23andMe.

    They even offer two-factor: https://customercare.23andme.com/hc/en-us/articles/360034119...

    Sure, they could do better, but are they legally required to be better? They could force 2FA, or 3FA, or 4FA, and disable accounts that go inactive for more than a week and require a validating DNA sample in the mail to reactivate.

    If they're "made an example of", what exactly does that mean? At what point is an entity legally responsible for the irresponsibility of its users?

    • noduerme 2 years ago

      I think it's more a question of encrypting data on the backend. The data wasn't stolen by phishing 16 million individual users' passwords. Companies that deal with sensitive genetic data should be subject to the same level of HIPAA compliance as those that deal with medical data, for instance.

      • beej71 2 years ago

        Weren't users willingly sharing that data with each other? Encrypting it wouldn't make sense in that use case.

        • noduerme 2 years ago

          I don't actually know. But if a user wanted to share personal data with another user, I'd make a one-time key. I'm relatively certain that they took no precautions against someone with access to their database. In some scenarios for tiny companies that might be okay, if you don't store sensitive data; but not when it might get whole groups of people slaughtered based on their genetic profile.
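          The "one-time key" idea can be sketched concretely. The design below is hypothetical, not anything 23andMe actually built: the server HMAC-signs a short-lived share token bound to one record, and the token is invalidated on first use.

```python
import hashlib
import hmac
import secrets
import time

# Hypothetical sketch of a single-use, expiring share grant. In a real
# deployment the secret would come from a KMS and the used-token set
# would live in a persistent store, not process memory.
SERVER_SECRET = secrets.token_bytes(32)
_used_tokens = set()

def make_share_token(record_id: str, ttl_seconds: int = 3600) -> str:
    """Sign a token granting access to one record for a limited time."""
    expires = int(time.time()) + ttl_seconds
    nonce = secrets.token_hex(8)
    payload = f"{record_id}:{expires}:{nonce}"
    sig = hmac.new(SERVER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{sig}"

def redeem_share_token(token: str, record_id: str) -> bool:
    """Accept a token once: valid signature, right record, not expired, unused."""
    try:
        rid, expires, nonce, sig = token.rsplit(":", 3)
    except ValueError:
        return False
    payload = f"{rid}:{expires}:{nonce}"
    expected = hmac.new(SERVER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # forged or tampered token
    if rid != record_id or int(expires) < time.time():
        return False  # wrong record, or expired
    if token in _used_tokens:
        return False  # one-time: already redeemed
    _used_tokens.add(token)
    return True
```

          The point is only that sharing can be scoped, expiring, and single-use, rather than "anyone with database access can read everything".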

          • beej71 2 years ago

            Might be a hard sell, though. People on Facebook share personal data with their friends in their profiles all the time.

            • noduerme 2 years ago

              Ugh. I'm so divorced from social media, I didn't even consider the marketing use case for "share your genetic data with your friends"... I wonder if this hack was just someone scraping an API for that (?!!)

              It's gross. On a side note, when I asked my father (an educated man in his 80s with a law degree) why he put our genetic information online without asking us, his response was that he didn't put it online, he mailed it, and it was just his own. I only say this to illustrate that the entire setup here resembled a con game to collect genetic data from unwitting people, which, if they represented only 25% of the population, would be enough to let you deduce the rest. The abhorrent fact that the data was handled so flippantly is just icing on the cake.

              • beej71 2 years ago

                In this case, to be fair, it's not "share your genetic data with your friends", exactly.

  • tjpnz 2 years ago

    The allegation is that they weren't taking reasonable steps to safeguard customer data under California law; the problem is that it's not stated what "reasonable" is. What's needed here are clearer regulations.

    • jacquesm 2 years ago

      Common sense tells you that if you set up a service for the gullible to send you their DNA that none of your customers are going to be security and privacy conscious. You need to engineer your service accordingly.

anotherhue 2 years ago

If some authority doesn't roll out the guillotine for this one then we should all just give up believing citizens are important in the eyes of the state.

I think we all know the answer already.

synicalx 2 years ago

From what I understand, the hack was due to a large number of people re-using passwords and the company doing nothing to prevent or detect this.

Security practices and their ludicrously bad response aside, I cannot fathom why someone would send their literal DNA to a company and then take no steps to secure that information. Is technical literacy really this poor amongst the general population? Even my retiree dad who can't reliably turn his TV on knows about MFA.

  • cassianoleal 2 years ago

    > the company doing nothing to prevent or detect this.

    How would they do that?

    I'm not defending 23andMe but I really don't see how a service can detect that the password I chose on their website is the same I chose on a different one. Not without: a) them knowing what my chosen password is; and b) them knowing my passwords on other websites.

    • ticulatedspline 2 years ago

      I have been defending them, but there are things they could do, though I don't think they should be legally required to do so.

      Where I work, the security team monitors password leaks and runs them against our userbase; if we find matches we lock the accounts and force a reset. That password also goes into a file and becomes perma-banned from being chosen.

      We also force multi-factor, which isn't bulletproof (heck, if you used the same TOTP secret on two sites, your hex key could get stolen), but it does go a long way. Two-factor is super annoying though, and lots of places only offer crap methods like SMS (I'm loath to give out my phone number). Personally I'd rather use just a strong site-specific password than be forced to provide my phone number.
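      The "hex key" referred to above is the shared TOTP secret. A minimal RFC 6238 sketch (Python stdlib only, 8-digit codes as in the RFC's test vectors) shows why everything hinges on that one secret staying private: anyone holding it can compute every future code.

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, timestamp: int, step: int = 30, digits: int = 8) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    counter = timestamp // step
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                # dynamic truncation offset
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

      With the RFC 6238 Appendix B secret (the ASCII string "12345678901234567890"), `totp` at timestamp 59 reproduces the published vector 94287082. Note there is no asymmetry anywhere: the server stores the same secret the client does, which is exactly why a server-side leak of TOTP secrets is as bad as a password leak.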

    • devinegan 2 years ago

      Use a previously breached password database like the one haveibeenpwned offers. https://haveibeenpwned.com/Passwords
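      For reference, the Pwned Passwords API uses a k-anonymity range query, so neither the password nor even its full hash ever leaves the machine doing the check. A minimal sketch (the function names here are mine, not from any SDK):

```python
import hashlib
from urllib.request import urlopen

def sha1_prefix_suffix(password: str) -> tuple:
    """Split the uppercase hex SHA-1 of the password into a 5-char prefix
    (the only part sent to the API) and the 35-char remainder."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def times_pwned(password: str) -> int:
    """Return how many times the password appears in known breach corpora."""
    prefix, suffix = sha1_prefix_suffix(password)
    # The API returns every "SUFFIX:COUNT" line sharing the 5-char prefix;
    # the match against our suffix happens locally.
    with urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        for line in resp.read().decode().splitlines():
            candidate, _, count = line.partition(":")
            if candidate == suffix:
                return int(count)
    return 0
```

      A signup flow can reject any password with a nonzero count; this is the kind of breached-password screening NIST SP 800-63B recommends.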

  • jacquesm 2 years ago

    Because users are idiots. Just like the people that build services. We all get it wrong and we all underestimate the risks. Professionals get phished, and people will re-use passwords because it's easy to do and they simply don't understand or perceive the risk involved. They are unaware of how many breaches have already happened, and that the password they think is secure and known only to them is also known to hackers the world over due to previous dumps. Companies pretending their breaches never happened was such common practice that it had to be outlawed in the EU.

ChrisArchitect 2 years ago

[dupe]

Lots more discussion earlier: https://news.ycombinator.com/item?id=38856412
