Facebook says new bug allowed apps access to private photos of up to 6.8M users

washingtonpost.com

512 points by chrisseldo 7 years ago · 279 comments

makecheck 7 years ago

I never assume that “settings” guarantee what they claim. It’s just not practical, even with good intentions, even for a single non-public code base.

As a developer, I know it is hard to implement something once, harder to implement consistently across multiple interfaces, and damn near impossible to keep correct years later after employee turnover and other twists.
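To make that concrete, here's a minimal sketch (all names hypothetical) of how a setting that's correctly enforced in one interface can be silently missing from another:

```python
# Minimal sketch of a privacy setting enforced in one interface but
# forgotten in another. All names here are hypothetical.

PHOTOS = [
    {"id": 1, "owner": "alice", "shared": True},
    {"id": 2, "owner": "alice", "shared": False},  # marked private
]

def photos_for_timeline(owner):
    # Original interface: honors the "shared" setting.
    return [p for p in PHOTOS if p["owner"] == owner and p["shared"]]

def photos_for_partner_api(owner):
    # Interface added years later, by a different team: the check is
    # missing, so "private" photos leak through this code path.
    return [p for p in PHOTOS if p["owner"] == owner]
```

Nothing forces the second function to reuse the first one's permission logic; unless the check lives in exactly one place that every interface must go through, drift like this is almost inevitable.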

The sad thing is that it costs a ton more money to do things really well, and companies can basically take advantage of the low price of doing things poorly until finally forced. And by then, they have tons of money so they can comply but any startup is screwed because now it costs more for everyone, even those entering the game.

  • jdc0589 7 years ago

    even more so when you remember that SO MANY COMPANIES enforce most of their authn/authz at the edge, and are a lot looser between internal services
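    A minimal sketch of that pattern (hypothetical names): the edge gateway authenticates and authorizes, while the internal service behind it trusts any caller:

```python
# Hypothetical sketch of edge-only auth: the gateway checks the token,
# but the internal service trusts whatever reaches it.

PHOTO_STORE = {"alice": ["beach.jpg", "private.jpg"]}

def edge_gateway(request):
    # Authn/authz happens only here, at the edge.
    if request.get("token") != "valid-user-token":
        return {"error": "unauthorized"}
    return internal_photo_service(request["user"])

def internal_photo_service(user):
    # No permission check: any internal caller, or any new code path
    # that reaches this service, gets everything for the user.
    return PHOTO_STORE.get(user, [])
```

    The moment a second route to the internal service appears (a new API scope, a batch job, a partner integration), the edge check no longer stands between the user's data and the caller.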

ucarion 7 years ago

Facebook is a global database of political dissidents, queer persons, apostates, and other categories of people whose physical safety is put in peril when their personal lives are leaked.

Facebook surely must be heavily fined and regulated for this misbehavior, because failing to keep Facebook data safe puts lives at risk.

  • bad_user 7 years ago

    Going to play the devil’s advocate. If you fine Facebook, you have to fine the small companies too, and even individual developers developing OSS, since the law should apply to everyone equally. Of course the fines have to be proportional to the number of affected users.

    So would you like a fine for your bugs? And note that contrary to other professions, software development doesn’t have generally agreed recipes for building bug-free software, so was that really negligence? Was it malpractice?

    Being fined for a contribution to an OSS project would be terrible, wouldn’t it? And no, the size of the company doesn’t and shouldn’t matter in the eyes of the law, only the impact.

    Also, people uploading stuff on the Internet should really expect best-effort privacy at most. If you expect secrecy, then uploading shit on a platform meant for sharing is pretty dumb.

    Note that I will blame Facebook for willful privacy violations. And I hope to see them suffer under GDPR. But a bug doesn’t fall in the same category.

    • fhrow4484 7 years ago

      > If you fine Facebook, you have to fine the small companies too, and even individual developers developing OSS, since the law should apply to everyone equally.

      I would agree regarding small companies, but I wouldn't put OSS developers in the same boat; fining the entity that provides a service makes more sense. It doesn't matter whether that service relies on OSS or not.

      It's the company providing the service to the consumer that is responsible for vetting the final product.

      An OSS developer has no idea whether her/his code is going to be used by a gaming app or by NASA for mission-critical stuff, and shouldn't be held responsible if a bug in the OSS project caused a rocket failure.

      Similarly with a company providing wood (one that isn't making any false claims about the level of quality): it should not be the company's fault if someone decides to use that wood for a bridge where concrete is needed. The bridge builder is responsible for picking a good material.

      • bacon_waffle 7 years ago

        > It's the company providing the service to the consumer who is responsible to vet the final product.

        I agree with your post, but I tend to think of facebook's users as providing the product (their attention). If the consumer is a company buying advertising, then where's facebook's motivation to be careful with a user's "private" data?

        • WA 7 years ago

          Sorry, but you’re overthinking this. Facebook’s product is not advertising. It’s a platform that brings users and advertisers together. Just because it’s free for some or most users of the platform doesn’t mean that only paying people (advertisers) need to be protected.

          Under GDPR, it actually doesn’t matter if you charge money for your product or not. If you process personal data, you’re responsible for it. This also applies to private people with no commercial interest who start to gather data from strangers (in exchange for some service or whatever).

          Edit: The motivation should’ve been there from the beginning, if only for ethical reasons. Now the motivation is probably enforced by hefty fines.

          • bacon_waffle 7 years ago

            Overthinking or not, it seems like you're agreeing with what I was getting at...

            The post I responded to, I think, made a very good point about responsibility being on service providers rather than OSS contributors.

            However, the wording about "providing the service to the consumer" seems a bit problematic; it leaves the door open to discussions about who the consumer is, and thereby who is accountable. I'm glad you brought up GDPR - it seems to take the right approach, with regard to protecting personal data no matter who's holding it.

          • jachee 7 years ago

            Answer this: If facebook's product is not advertising, then how do they make money?

            • WA 7 years ago

              Answer this: if Facebook was only a product for advertisers, whom would they show the ads?

              Facebook‘s product is a platform. It can’t exist without users, and it can’t exist without advertisers (presumably).

              Since both end users and advertisers are part of the product, the data of all of them needs to be protected. It doesn’t matter who the paying party is.

              • mcny 7 years ago

                If we fine Facebook.com for bugs, do we also fine Mastodon.social? Gitlab.com? Movim.SE? I have a test instance of hasura running on heroku so... if there are bugs in hasura, will you fine me?

                If you say I can avoid penalties by declaring my services "as is", what stops Facebook from doing the same thing?

                • WA 7 years ago

                  Where do you draw the line between bug, neglect and outright malpractice?

                  Apparently, it’s still not clear to many people: all services that process personal data became more regulated under GDPR.

                  And yes, if any service loses its customers’ data, there will be a fine. The fine depends on many factors. And yes, even Mastodon.social or Gitlab.com (the service, not the OSS). The advantage of these platforms is that they actually don’t process that much personal data.

                  Behind any service is a legal entity that asks people for their data, to provide a service. These legal entities are subject to the same laws.

                  However, since the GDPR apparently determines fines on a case-by-case basis, they might give a low or no fine at all, if the service is non-commercial and had no intention to collect user data for commercial purposes. But the law still applies.

                  If you put a web service online that handles personal data, you must make sure to keep that data safe. It doesn’t matter if your service is free or not.

                  Turn this around: just because you as a user signed up for a non-commercial free service like Mastodon.social (the service, not the OSS you can host yourself), you wouldn’t want the admins of Mastodon.social to mess around with your data, no?

      • saagarjha 7 years ago

        Plus, OSS software is often provided with no liability or warranty, which should protect developers from litigation.

    • 794CD01 7 years ago

      Absolutely. Fine everyone into the ground. Doesn't look like there is any other way to make people take security seriously.

      I'm not a fan of the overregulation of industries like aviation, but consumer software has gone too far in the other direction and is long overdue for an adjustment.

      • kaybe 7 years ago

        Maybe what is said about the aviation industry regulation - it is written in blood [0] - will be true about the software industry regulation as well.

        [0] As in: Every rule was included because someone (nearly) died because of it not being there before.

      • UncleMeat 7 years ago

        The end result of this is that the number of software companies drops by 99.99%. Does your company run anything on Linux? Too bad, there are vulns in the kernel and now you are fined into the ground.

        • rapind 7 years ago

          Let's assume you're not being dramatic. It would be a pretty lucrative market if 99.99% of current software companies went under. New businesses would show up with a much greater focus on security and quality. That's a bad thing?

          The real problem is the inevitable regulatory capture that occurs in every market with even an ounce of complexity.

          Given the number of high profile breaches we see every month, I definitely think we're due for some consequences.

        • 794CD01 7 years ago

          I would hope for 100.

          We deserve it. Though of course others deserve it more.

      • Retra 7 years ago

        Fine everyone? Oh look, data breaches stop being reported. I guess we succeeded in reducing them?

        • pixl97 7 years ago

          Um, yeah, that's not the way it works. First, external services, such as those run by Krebs, would find your data on the darknet. Second, offer employees a cut of the fines.

          • Retra 7 years ago

            Pay employees when their company is fined for security breaches? Damn, good thing I'm a software developer.

      • bad_user 7 years ago

        Really? How has "consumer software gone too far in the other direction"?

        Does it ... kill people? Does it enforce bad policies like the healthcare industry did for the past couple of decades, causing an epidemic of obesity, diabetes and heart disease, which are the top causes of death?

        Yeah, regulation there definitely helped /s

        • bosie 7 years ago

          > Does it ... kill people?

          Facebook asked users to upload nude photos. What if those got leaked and users committed suicide because of it? Would you (partially) blame Facebook for their deaths?

          > Does it enforce bad policies like the healthcare industry did for the past couple of decades, causing an epidemic of obesity, diabetes and heart disease, which are the top causes of death?

          Genuine question, but which policies are responsible for the epidemic of the three causes of death you just mentioned?

          • tomp 7 years ago

            They asked you to upload nude photos to help them identify revenge porn... so presumably this only makes sense for people whose nude photos are already online. Can’t really blame Facebook for that...

          • TallGuyShort 7 years ago

            >> Facebook asked users to upload nude photos

            I missed that one. By now even lay people should know that's a recipe for disaster.

          • bad_user 7 years ago

            > "Would you (partially) blame facebook for their death?"

            No, because taking nude pictures of yourself and then distributing them, no matter where, is just stupid. Parents should educate their kids to know better, or seek counseling if that mistake was made.

            You're also talking of a hypothetical situation. When planes crash, people die, guaranteed. And yearly there are more than 100 plane crashes.

            > "Genuine question but what policies are the reasons for the epidemic of the three death causes you just mentioned?"

            The recommendation for a diet high in sugar, high in wheat and other grains, high in vegetable oils / polyunsaturated fats (e.g. Omega-6), low in saturated fat, low in dietary cholesterol, low in salt.

            Children were fed in schools, diets were set in hospitals, and foods were favored in supermarkets according to these guidelines. That's not a debate I want to get into, though.

            • mynameisvlad 7 years ago

              > You're also talking of a hypothetical situation

              Considering this article is about Facebook leaking 6+ million photos to third parties, including photos that were uploaded but never shared, it's well within the realm of possibility that at least one of those millions of photos was a nude. In fact, I'd bet there were quite a few nudes in the leaked set. It only takes one more step to turn that hypothetical of yours into a reality.

        • sifoobar 7 years ago

          There have been several cases of bullying people into suicide where it's not obvious how the same thing could have been accomplished without the leverage Facebook provides.

          I'm grateful I didn't have to live through this as a teenager, it's a shark pool.

          So how long do we keep pretending that allowing this to go on is a viable way forward?

        • 794CD01 7 years ago

          Others sufficiently cover the actual killing. I would only add that wasting people's time and/or money at scale is just as bad. Waste 30 seconds for each of 100 million people and that's an above-average human lifetime.

          BTW, how do you think anti-vaccination, healthy at any size, and minor attracted people ideas became popular? I specify those only because they are particularly heinous, but if you want official policy, just look at literally any election, though the 2016 US presidential election and the brexit referendum are the standouts in terms of memes.

          • bad_user 7 years ago

            > "Others sufficiently cover the actual killing"

            I haven't seen any response yet. Does Facebook kill people, yes or no? It's a simple question.

            > "wasting peoples' time and/or money at scale is just as bad"

            What?

            > "how do you think anti-vaccination, healthy at any size, and minor attracted people ideas became popular?"

            In that regard, all Facebook does is give people the tools to exercise their freedom of speech, possibly with a feed algorithm whose effects they couldn't predict, because it was built to maximize profits, not sanity ... and that will never be illegal ;-)

            I understand some of the arguments that Facebook encouraged fake news. However, speaking as somebody who was born under communism, I can tell you that fake news isn't new: it happened before WWI, it happened before WWII, it happened east of the Iron Curtain (at least) during the Cold War, and it happened just as well afterwards.

            In my country distributing news via Facebook isn't even that popular, yet fake news is flourishing ... on TV. People are always looking for a scapegoat, for an easy answer, for an easy fix. It's only natural, but it doesn't make it right.

            No, I don't think Facebook is to blame for fake news, even if it might have contributed. Facebook can't be responsible for the poor education that people are given.

    • JustSomeNobody 7 years ago

      > If you fine Facebook, you have to fine the small companies too...

      Absolutely nothing wrong with that. If a small trucking company has a driver that speeds, that driver gets fined the same way a driver for a large trucking company does.

      > Of course the fines have to be proportional to the number of affected users.

      Of course.

      • bad_user 7 years ago

        Personally I hate analogies.

        The recipe for how a driver should stay under the speed limit is well known. Nowadays you even have GPS apps alerting you, and many trucks are monitored in real time from the dispatch center, with drivers risking being fired if not exactly on schedule.

        Most software projects are greenfield ... people reuse previous work when available and for a good price, but all custom changes are greenfield.

        Do you really think that the guy responsible for Heartbleed [1] was aware when he introduced that bug, just like a truck driver going over the speed limit?

        It's really not the same thing; let's not pretend that it is. Regulation in this field would have a chilling effect on open source and startups, because only big companies like Facebook would still be willing to develop critical software, which is definitely not what we want.

        [1] http://heartbleed.com/

        • Spooky23 7 years ago

          No, you’re wrong. Speeding is one minor factor in driver or truck safety. Motor Carrier regulation is about much more than speeding. It’s a difficult problem that is addressed via a federal/state/local framework of regulation and enforcement.

          I’ve worked in engineering roles where law made me potentially criminally liable for negligent handling of certain data. We took things more seriously than Facebook.

        • JustSomeNobody 7 years ago

          > It's really not the same thing, let's not pretend that it is...

          You're right. Look, software is complicated, and there's no way, yet, to make it bug free for any meaningful system. I get that. But at the same time, let's stop calling ourselves engineers if we keep hiding behind 'bugs happen'. We need to be a LOT more responsible than that. Does that mean regulation? If we keep going down the road we're on, yes. Because I gotta be honest, I'm tired of hearing, 'bugs happen' and I, the consumer, am the one who suffers.

    • spiderPig 7 years ago

      Hoarding data is different from OSS. We’re talking about infra providers. IMO, they should be regulated like public utility companies.

      • bad_user 7 years ago

        Really, so does Mastodon qualify?

        https://joinmastodon.org/

        How about a blog commenting system that leaks emails due to a bug, something like Isso:

        https://posativ.org/isso/

        Basically I don't like these arguments, because they come down to the company's size. Facebook should be punished because they are big, have a lot of data, and we don't like them, right? No matter how you look at it, it's a Pandora's box.

        • Apocryphon 7 years ago

          Maybe there should be a general regulatory framework to which all data-storing entities are subject, with stiff penalties for the largest violators, as they can shoulder the biggest burdens.

          Is this not how it works for every other industry? Up until the 2008 bank bailouts, that is.

          • bad_user 7 years ago

            That does not answer my questions.

            So what should the penalty be for a 14 year old that contributes a bug into a project like Mastodon or OpenSSH or whatever, which then leaks the data of tens of millions of people?

            All this would do is have a chilling effect on the industry, such that only big companies like Facebook would be able to develop critical software, due to being able to afford it. And yes, this happens in all the industries you're talking about. And it did not stop the market from crashing; it did not stop malpractice.

            Also this regulation will probably not stop Facebook from lawfully violating privacy.

            • Apocryphon 7 years ago

              Maybe there should be a chilling effect on the industry; the drive to consume ever more personal data is harmful to society and wrong.

              • bad_user 7 years ago

                Oh, but that's the thing, there's no regulation that can stop the consumption of personal data. Let's be clear, we are talking about bugs. The consumption of personal data will continue, because:

                1. consumers want it

                2. governments want it

                The only thing regulation will accomplish is that only companies like Facebook will be able to do it. Yeah, big win.

    • drb91 7 years ago

      > Being fined for a contribution to an OSS project would be terrible, wouldn’t it?

      Not if your contribution causes harm. A fine would be a more than welcome addition to consumer protections.

      • bad_user 7 years ago

        In other words, you'd rather have companies like Facebook develop critical software? Because that's exactly what would happen: only companies with deep pockets would be able to afford it. Is that right?

        Funny, because community-driven open source is the only hope for replacing Facebook with something that is privacy oriented.

        • drb91 7 years ago

          I think you have responsibility for your creations. I do not believe that people build things because it is risk free.

          However, there needs to be SOME distinction beyond intent, which is often impossible to discern.

  • TaylorAlexander 7 years ago

    I don’t get it. If we agree it sucks, why not just stop using it? Why do you want to use something made by people who don’t care about you?

    • EForEndeavour 7 years ago

      The criticism of OP implies nothing about whether they personally use Facebook. The point is that a large fraction of people do use it, and leaking "private" photos of some people can endanger them.

      That said, you and I can agree that FB sucks, and delete our accounts. It is up to other people whether they follow suit.

  • mav3rick 7 years ago

    These people have an option to not use Facebook.

    • bluetidepro 7 years ago

      You have the option not to use the products of a ton of industries that are still heavily fined and regulated for their misbehavior. That logic doesn't matter in this context.

    • denlekke 7 years ago

      they track and keep data on non-users as well though

    • abrahamepton 7 years ago

      You never know when, one day, you'll find yourself in a similar group of targeted people based on some otherwise-mundane characteristic. And by then it'll be too late for you to "not use Facebook".

bluetidepro 7 years ago

> "We're sorry this happened."

That about sums it up for all these privacy breaches these days. It's getting to the same level as "thoughts and prayers" for tragedies: no actual change or consequences for the problems happening, just empty "sorries" and "promises" that it won't happen again or that they'll get it fixed. I don't know if this is a GDPR violation or not (as someone else asked), but if it is, I hope we start actually seeing action on these sorts of things.

  • canttestthis 7 years ago

    > I don't know if this is a GDPR violation or not (as someone else asked), but if it is, I hope we start actually seeing action on these sorts of things.

    Sounds like you're suggesting that we criminalize software bugs.

    • bluetidepro 7 years ago

      Yes, I am suggesting that. I don't necessarily think jail time is the right thing, but I do think something like meaningful fines are more than reasonable for major software bugs that cause these kinds of breaches of privacy. It will make larger companies like this be much more careful when money is on the table for them to lose.

      To me, if we can criminalize something like a major oil spill such as BP/Deepwater Horizon, how is this much different? It's not like they caused the oil spill on purpose, but they still faced consequences for the risks they were taking. Software companies, especially larger ones like Facebook, should face the same kind of consequences for the risks of software bugs that cause these kinds of privacy breaches.

      Also, as someone below pointed out in response to phrasing similar to your "criminalize software bugs": "intentionally obscuring the debate. Gross negligence is an entirely different standard than just software bugs."

      • jetrink 7 years ago

        It's not unprecedented either. Under HIPAA, the Department of Health and Human Services has fined organizations millions of dollars for data breaches resulting from unpatched software and inadequate security practices.

        • creaghpatr 7 years ago

          And on that note, you see a lot fewer (though not zero) breaches of healthcare data, and most HIPAA violations are due to analog errors rather than digital exposure.

          The government does a good job in this area of forgiving innocuous violations, as long as all parties disclose them immediately and follow procedure.

          • repolfx 7 years ago

            Do we see fewer? Or are the serious ones just never reported? Breaches of health data are not exactly traceable back to the source, assuming they're even abused at all.

      • elliekelly 7 years ago

        It doesn't necessarily need to be criminalized but when a company's software results in damages they should be responsible for compensating people for those damages. It's no different than consumer product liability.

        The problem is that we're all giving our data away to these "free" platforms. That makes it difficult for a user to argue that they've "lost" something of value when there's a breach. But of course the user has lost something of value. Facebook has built their entire company around the value of our information but we let them have it both ways. It's valuable when they're selling it but worthless when they fail to protect it. Statutory damages for data breaches would deter negligence and (partially) compensate users who have been victims of data breaches.

      • jammygit 7 years ago

        Canada recently passed a law that adds fines to data breach incidents, iirc. A professor mentioned it, and it's why I'm researching auth on my winter break.

        Come to think of it, does anyone know of good auth resources for a MEAN stack that aren't copy-paste blog posts? I'm trying the Udacity auth course as a starting point (uses OAuth2)

      • stef25 7 years ago

        Jail would be full of WordPress devs

      • newsopt 7 years ago

        Just a quick question: do you write software? Do you have a legal or economic background? It seems pretty clear to me that anyone suggesting criminal liability for software bugs in applications that pose no risk of causing physical harm has no idea what they are talking about, or what damage such a law would cause.

        Case in point: look at the quality of medical software today. Hospitals still use Windows XP and other completely insecure, outdated software, because absolutely nobody wants to deal with the nightmare that is HIPAA.

        • eropple 7 years ago

          Hi. I've worked in medical software repeatedly. I totally want to deal with HIPAA. It's a good idea for clients (the people who actually matter) and it's not nearly as difficult to work with as people say. The set of demands it makes upon you is small and reasonably constrained, and nearly all of them are process-based rather than technical. Where it is technical, plenty of folks will sign a BAA for you and take big chunks of the technical stack off your hands, too.

          "But HIPAA" has never, in my experience, been employed except by people who find the idea of doing the right thing inconvenient or inconveniently expensive. (It is virtually never that hard and its benefits are clear.)

          There are reasons for not modernizing tech stacks in the medical space. HIPAA is, in every case I've ever observed, not a meaningful one.

          • newsopt 7 years ago

            >"But HIPAA" has never, in my experience, been employed except by people who find the idea of doing the right thing inconvenient or inconveniently expensive. (It is virtually never that hard and its benefits are clear.)

            Thank you for directly attacking my character without even addressing my actual argument.

            I'm not arguing against HIPAA; I'm arguing against such regulations in spaces that don't require that kind of sensitivity. I think that medical data absolutely requires the protections it has. But it absolutely has had the unintended consequence of making current medical data more insecure and stifling innovation in the space. Most doctors don't even follow HIPAA, sending patient medical records over email.

            I would estimate that 40% of doctors today are not compliant with HIPAA, sending X-rays and other similar patient information over email with providers that they haven't signed BAAs with.

            >There are reasons for not modernizing tech stacks in the medical space. HIPAA is, in every case I've ever observed, not a meaningful one.

            Then please enlighten us. Up until a few years ago (maybe even just a year) you couldn't use AWS to host medical data. Today you can't use Google Cloud to host medical data unless you are a large enough business to be able to get into contact with one of their sales reps. Can you even sign a business associate agreement with digital ocean? So up until a year ago you could not even have a small healthcare startup hosted on the cloud. Please explain to me how this hasn't stifled medical software innovation.

            If it isn't HIPAA it's some other outdated regulation.

            https://slate.com/technology/2018/06/why-doctors-offices-sti...

            • eropple 7 years ago

              > Thank you for directly attacking my character without even addressing my actual argument.

              "But it's hard, for no actual reason I will define" is not a meaningful argument. So: when one hears hoofbeats, think horses, not zebras.

              > I would estimate that 40% of doctors today are not compliant with HIPAA, sending X-rays and other similar patient information over email with providers that they haven't signed BAAs with.

              Probably true! But that's their own damned fault. Medical has Artesys and similar; dental has Apteryx and similar. This problem is largely solved, but for practitioners' unwillingness to use those tools.

              Those providers should be nailed to the wall, the wall should not be torn down for them.

              > Up until a few years ago (maybe even just a year) you couldn't use AWS to host medical data.

              AWS has been signing BAAs since at least... 2013? I believe the first time I looked into it was 2014. But, regardless: if your innovation was so tremendously stifled by this, I'm not particularly sympathetic. I've been running my own services, and writing them too, for at least a decade, and you can do likewise, I promise. I am, however, saying that today it's very easy to do so, 'cause Amazon is all-too-happy to sign one.

              Also, I haven't had to use GCP for HIPAA-covered entities (found their BAA pretty easily, though!), but even assuming you're correct, the idea that you have to, hiss, talk to somebody before getting them to take some legal responsibility for the PHI you hold... I don't find that to be a particularly nasty requirement. I still find it odd that AWS will just let you sign right through with AWS Artifact.

              Azure's all-too-happy to sign one, too. Not that I'd recommend it.

        • EpicEng 7 years ago

          > It seems pretty clear to me that anyone suggesting that software bugs in applications that have no risk of causing physical harm should have criminal liability has no idea what they are talking about and what damage such a law would cause.

          So you're fine with financial losses, loss of privacy, and the material harm that goes along with both? Disregarding the impact that data breaches imply is just naive.

          > Case in point look at the quality of medical software today. Hospitals still use windows xp and other completely insecure and outdated software. Because absolutely nobody wants to deal with the nightmare that is HIPAA.

          I wrote medical device software for more than a decade. HIPAA has nothing to do with it. Many systems run on outdated platforms because the cost of replacing them is deemed to outweigh the benefits. That determination is debatable on a case by case basis, but in practice we see a hell of a lot more damage being caused by breaches of companies running on modern technology than we do e.g. hospital systems or LIMS.

          • newsopt 7 years ago

            > and the material harm

            Please, if there is provable material harm, they can take it to civil court.

            • EpicEng 7 years ago

              Uh huh, I'm sure it's just that easy, right? I mean, I'm certain it's an even playing field, even for someone like me who has the money to hire an attorney. Hell, why regulate these industries at all? We can just file civil suits, right? Even if you win, it costs less for them to settle than it does to change the way they do business/security.

              We regulate the finance industry not because of a risk of physical harm, but because financial harm can be equally serious and civil suits do not act as a sufficient deterrent to bad behavior by the powerful. Why do you feel this sort of thing is different? I believe the only real difference is that this sort of thing is new, not well understood by most, and we just haven't caught up.

            • nradov 7 years ago

              It's not practical for individual users to litigate, but hopefully someone will bring a civil class action suit.

            • EpicEng 7 years ago

              Additionally, you can't always show a direct link between a data breach and e.g. subsequent bank fraud.

        • elliekelly 7 years ago

          HIPAA only carries criminal penalties when someone knowingly discloses covered information - not for a software bug (at least until the bug is identified). For the most part, HIPAA is enforced with civil penalties.

          And your "nightmare" scenario of (civil) liability flowing from programming bugs already exists in the investment world, and it hasn't come apart at the seams. Google "Axa Rosenberg": a coding error in their trading algorithm went undiscovered for two years. Negligent for sure, but that's not why the SEC went after them. The problem was that they didn't promptly disclose the error to investors and didn't promptly correct it. Algorithmic trading firms should have mechanisms to catch errors, correct errors, and disclose those errors to investors. And after seeing Axa Rosenberg's $250 million fine and Rosenberg's lifetime ban from the industry, guess what they all implemented?

          • reaperducer 7 years ago

            HIPAA only carries criminal penalties when someone knowingly discloses covered information

            This is false.

            Source: Works for a company that has mandatory HIPAA training for every employee every six months.

            • jahlove 7 years ago

              > This is false.

              Citation, please. Here's mine:

              > Criminal penalties

              >

              > Covered entities and specified individuals, as explained below, who "knowingly" obtain or disclose individually identifiable health information, in violation of the Administrative Simplification Regulations, face a fine of up to $50,000, as well as imprisonment up to 1 year.

              >

              > Offenses committed under false pretenses allow penalties to be increased to a $100,000 fine, with up to 5 years in prison.

              >

              > Finally, offenses committed with the intent to sell, transfer or use individually identifiable health information for commercial advantage, personal gain or malicious harm permit fines of $250,000 and imprisonment up to 10 years.

              Source: American Medical Association

              https://www.ama-assn.org/practice-management/hipaa/hipaa-vio...

              • reaperducer 7 years ago

                My company's lawyers disagree. I'll go with my company's lawyers' judgement over a group that exists solely to protect the interests of its member doctors.

                • eropple 7 years ago

                  Are these lawyers you have talked to and gotten meaningful, nuanced advice from, or are they lawyers your bosses have talked to and derived maximally avoidant policies from? I'm not saying that you shouldn't have policies that fit your risk profile. But I ask because I have been in those former conversations (and I have done a nontrivial amount of auditing and compliance work in this space) and have never come away with such an impression, while the level of perceived risk your bosses derive from those conversations can be entirely untethered from the level of risk that actually exists. (This space is full of people saying "oh, HIPAA means we can't do that" as shorthand for "I don't want to do that," after all.)

                  • reaperducer 7 years ago

                    They are lawyers who personally do our training and put together testing material based on that training.

                    To me that trumps a non-lawyer’s interpretation of a non-legal web site.

                    • eropple 7 years ago

                      If you read the sibling comment where Spooky23 cites the HHS page on HIPAA, it might be worth ruminating on that versus your interpretation of why your company's lawyers lay out the training in the way that they do.

                      That they have a different company risk profile doesn't necessarily change the facts at hand. And, TBH, they don't have to tell you the truth if it helps achieve their immediate goals. (They can tell you you'd be personally and criminally liable. It might make you do what they want better. It might also not be true.) Or it may all be in good faith. But what you describe doesn't square with anything I've ever worked with, at multiple clients and employers.

                • Spooky23 7 years ago

                  They are, generally speaking, wrong. Willful conduct is the standard for criminal liability. A developer who introduces a bug in good faith, or inherits one from a third party, is not in that situation.

                  My guess is that the draconian position is more about internal process. You have to identify and disclose breaches in a timely way; if you don't, the company is at risk.

                  From HHS summary of the rules:

                  (See: https://www.hhs.gov/hipaa/for-professionals/privacy/laws-reg... ) (it’s also laid out in the regulation which I don’t have time to find.)

                  “Criminal Penalties. A person who knowingly obtains or discloses individually identifiable health information in violation of the Privacy Rule may face a criminal penalty of up to $50,000 and up to one-year imprisonment. The criminal penalties increase to $100,000 and up to five years imprisonment if the wrongful conduct involves false pretenses, and to $250,000 and up to 10 years imprisonment if the wrongful conduct involves the intent to sell, transfer, or use identifiable health information for commercial advantage, personal gain or malicious harm.”

          • selimthegrim 7 years ago

            Static analysis?

    • pjmorris 7 years ago

      Hammurabi's code (~1700 BC) includes this about building:

      Building Code

      229. If a builder builds a house for a man and does not make its construction sound, and the house which he has built collapses and causes the death of the owner of the house, the builder shall be put to death.

      233. If a builder builds a house for a man and does not make its construction sound, and a wall cracks, that builder shall strengthen that wall at his own expense.

      Bugs in houses have been criminalized for a very long time. Online data may be less fundamental than safe housing, but housing our data safely becomes proportionally more important as more of modern life depends on it.

      [0] http://www.wright.edu/~christopher.oldstone-moore/Hamm.htm

      • zxcvvcxz 7 years ago

        "Skin in the Game":

        If a social media company leaks private photos of its users, the company's executives and senior staff shall have their photos leaked.

        I would love something like that. Nobody protects anyone else's interests in this modern world unless there's skin in the game. I'd highly recommend reading Nassim Taleb's book of the same name; he popularized the term and explored its implications for society.

        • vbuwivbiu 7 years ago

          the same "eye for an eye" goes for "if you have nothing to hide" - well then, you first

          • reaperducer 7 years ago

            the same "eye for an eye" goes for "if you have nothing to hide" - well then, you first

            You misunderstand what "eye for an eye" means.

            "Eye for an eye" means let the punishment fit the crime.

            Before "eye for an eye" was established by religious texts, the common retaliation for poking someone's eye out was death.

            "Eye for an eye" was a step towards a more civilized justice system.

            • vbuwivbiu 7 years ago

              I use it in the context of Taleb's book, mentioned in the parent, wherein he discusses this - symmetrical risk

          • fhbdukfrh 7 years ago

            Yes, and this approach assumes universal parity in the result of the act, which is far from true. For example, the leaked health records of a healthy individual are not as damaging as those of someone with a lifelong condition, yet both cases are very, very bad.

      • dkrich 7 years ago

        But they aren’t. Most home sales in the US follow caveat emptor. If you buy a house from a private seller and then later discover mold in the walls or a crack in the foundation I wish you luck in getting the seller to pay for the repair.

        • biztos 7 years ago

          As far as builders are concerned, isn't that precisely why they are usually bonded[0] in the US?

          [0]: https://www.angieslist.com/articles/hiring-contractor-whats-...

        • fhbdukfrh 7 years ago

          They do not carry the as-is, where-is clause you describe unless it is explicit. If you can prove prior knowledge, they are still liable. Proving prior knowledge is the hard part, but private sales most certainly enforce a good-faith clause.

        • josho 7 years ago

          New home sales include a warranty, which is the modern equivalent of the parent's story. When buying used, if you can prove that the seller knew of a mold problem and didn't disclose it, then you have a legal case.

        • perfmode 7 years ago

          The parent was establishing precedent.

          • dkrich 7 years ago

            And I'm establishing that the precedent cited doesn't exist, not in modern times anyway. If the argument is that home sellers are punished if they don't protect the buyer and so data sellers should be punished if they don't protect the data, that doesn't really hold up because home sellers aren't typically punished.

            • pjmorris 7 years ago

              Note that I (OP) am referring to builders, not sellers. With respect to the systems that hold FB's data, I'd argue they are more like builders than sellers.

        • Spooky23 7 years ago

          Builder is the key. If you have a house built and the foundation is cracked or mold is growing, you’ll be able to successfully sue.

      • sonnyblarney 7 years ago

        When houses fall, they kill people.

        And houses are not free.

        • Barrin92 7 years ago

          Need I remind you of the genocide in Myanmar organised to a large degree on social media, or the threat to political dissidents all over the world when authorities or malicious actors can get their hands on leaked, private data of individuals?

          Are you seriously suggesting that information in this day and age does not have the power to directly cause harm, including lethal harm, to people?

          • sonnyblarney 7 years ago

            I am suggesting that Facebook is free, fleeting software, and that people shouldn't put a huge amount of stock in it. Even legally.

            Also - that it can be used for organizing bad behaviours is an entirely different subject that has little to do with quality or security.

            • fiblye 7 years ago

              If Facebook were free and its company had a completely hands-off approach with your data, much like a company that makes paper notepads never looks at what you write, I'd agree.

              But Facebook is a company that actively snoops on and uses the data of its resource: the end user. It's like a security guard who's paid to prevent shoplifting actively ignoring violent crimes because they're not related to stealing, or a baby food company ignoring that some metal got into their food because the food is still OK if you pick out the metal bits.

              • sonnyblarney 7 years ago

                I think you make a good point. At the same time, it's nigh impossible to decide right from wrong on the internet; there are many sides to every story.

                If the government wants to hire people to decide what counts as what - they should do that.

    • lalaithion 7 years ago

      If a plane crashed, and the company that manufactured the plane was fined because they had an engineering bug, no one would blink an eye.

    • ProAm 7 years ago

      > Sounds like you're suggesting that we criminalize software bugs.

      When there is irreparable damage I believe it should be criminalized. You cannot regain privacy after an incident such as this, it is irrevocably taken from you against your will.

      • UncleMeat 7 years ago

        Suppose there is a bug in the Linux kernel. Some business runs their webservers on Linux. They have user email addresses (PII). Is Linus responsible for breaches? If so, then OSS dies. If not, then how do you intend to prove that there are no vulns in any of your dependencies for the rest of time?

        • AlexandrB 7 years ago

          This is silly. If I build my bridge with equations I find on mathoverflow, the forum is not responsible for my bridge collapsing.

          If you’re using OSS for mission-critical software, you must either ensure that it’s fit for purpose or pay someone to do it for you. Nothing in the Linux kernel documentation suggests that it can or should be used for flying airplanes or securing PII without additional due diligence.

        • pixl97 7 years ago

          The person storing the data is the one responsible for securing it. Everyone keeps trying to push data security up the stack, but the company or individual collecting the data is the responsible party.

    • ColinDabritz 7 years ago

      Software bugs cause mass harm. We can't ask people to never make mistakes, but we can ask that they have appropriate practices, standards, quality controls, and care. Not taking appropriate measures to mitigate risks that can significantly affect millions of users is willful negligence, and should be a crime.

    • inetknght 7 years ago

      We criminalize other actions which result in harm to people. Why not software bugs too?

      • sdenton4 7 years ago

        Well, there are two big differences... Planes have far fewer unknown unknowns than software: the Spectre bug in Intel chips is a great example of a place where the standard operating procedure was wrong, but no one ever knew it. It wasn't a case of negligence, though it had real-world impact.

        The other big difference is that (for the most part) keeping a passenger plane in the air isn't an adversarial task. Actual breaches are the result of active bad actors, which is completely different from the problems you encounter in designing a plane.

        So criminal action seems crazy to me, though I can definitely see a good case for changing the incentives around storing user data: fines, or even an ongoing per-user tax on storing PII, to make it an up-front cost.

        • orhmeh09 7 years ago

          I think it’s disputable that “no one ever knew” (or could have guessed) regarding Spectre and speculative execution generally. Intel took a risk for the sake of performance. This did not take long to find:

          Wang & Lee, 2006: “Information leakage through covert channels and side channels is becoming a serious problem, especially when these are enhanced by modern processor architecture features. We show how processor architecture features such as simultaneous multithreading, control speculation and shared caches can inadvertently accelerate such covert channels or enable new covert channels and side channels. We first illustrate the reality and severity of this problem by describing concrete attacks. We identify two new covert channels. We show orders of magnitude increases in covert channel capacities. We then present two solutions, Selective Partitioning and the novel Random Permutation Cache (RPCache). The RPCache can thwart most cache-based software side channel attacks, with minimal hardware costs and negligible performance impact.”

          http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.190...
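As a toy illustration of the kind of secret-dependent behavior side-channel attacks exploit (this is not Spectre itself, which leaks through speculative execution and caches; the function names here are hypothetical), consider how an early-exit comparison leaks information through timing:

```python
import hmac

def naive_check(secret: bytes, guess: bytes) -> bool:
    """Early-exit comparison: execution time grows with the number of
    matching leading bytes, leaking the secret's prefix via timing."""
    if len(secret) != len(guess):
        return False
    for a, b in zip(secret, guess):
        if a != b:
            return False  # returns sooner the earlier the bytes differ
    return True

def constant_time_check(secret: bytes, guess: bytes) -> bool:
    """hmac.compare_digest examines every byte regardless of where a
    mismatch occurs, so timing does not reveal how close a guess is."""
    return hmac.compare_digest(secret, guess)
```

Both functions return the same answers; the difference is only in how long they take on a wrong guess, which is precisely the kind of "orders of magnitude" channel the paper quantifies for caches and speculation.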

    • _dw7s 7 years ago

      > Sounds like you're suggesting that we criminalize software bugs.

      When my dad went to college, a very old and bitter professor (this was Civil Engineering, communist Eastern Europe) told the students on the first day in class something along the lines of: "If you know you're stupid or don't give a shit about your work, just go home and save everyone the trouble of dealing with your future fuckups. Mistakes here can cause deaths or losses of huge amounts of money".

      I believe we've reached the point in which negligence in the software world can cause loss of lives, even when the software is not operating a crane or an airplane (think Grindr leaking account data over http in Saudi Arabia).

      So you're minimizing the issue by asking if we should criminalize software bugs. We should, and currently do, criminalize negligence. If bugs are a result of negligence (you know, 'move fast and break things', 'better to ask for forgiveness than for permission'), then fines, jail time, and criminal records should be coming. This is no longer child's play; this is the new world, which runs on software.

    • joering2 7 years ago

      Why shouldn't we? If it is proven in a court of law that you knew about certain bugs, or that you underspent on engineering while your public reach grew exponentially, how come you shouldn't be liable?

      Anywhere else it's a plain case of malpractice, whether it's law or medicine, etc.

    • JKCalhoun 7 years ago

      Do we have anything more than the company's word for it that it was a bug?

    • village-idiot 7 years ago

      The alternative is to allow the least competent actors to continue losing user data without any consequences.

    • pasxizeis 7 years ago

      Sounds like you're suggesting we take no action when law violations happen.

    • fiblye 7 years ago

      Facebook sure does have a lot of "bugs" that grant them access to things they shouldn't have. Things that allow them to profit immensely.

      It happens with such regularity that I'm amazed anyone here would be kind enough to accept their fake apologies for clearly malicious actions.

    • rhizome 7 years ago

      I think criminalizing commercial data leaks is a more workable idea.

    • dudul 7 years ago

      We're entering an era where everything will involve software, so yes, bad software and careless bugs should be severely punished.

    • alxlaz 7 years ago

      But it's not the mere existence of software bugs that is at issue here. Everyone's first attempt at solving a problem in software is going to have bugs. Everyone's last attempt at solving a problem in software is likely to have bugs -- that's why we design systems with safeguards in place.

      There is risk in any human endeavour that touches upon someone else's life, in every domain. But, for example, only some of the deaths that occur in a hospital are the result of malpractice. That is the type of mistake for which we hold others accountable: not the mere act of providing insufficient care, but the act of providing insufficient care as a result of a dereliction of professional duty or a failure to exercise an ordinary degree of professional skill or learning.

      IMHO:

      1. If this was a novel, or very complicated breach, that Facebook did everything possible to avoid, but avoiding it was beyond the knowledge and skills of their security, engineering and QA teams, who otherwise did their absolute best, then it's at the very least defensible. One could argue that you shouldn't handle private data if you can't do it securely, but risk is inherent to anything, and perhaps worth it under the right circumstances.

      2. If this was just "move fast and break things" policy, then a big fine is in order, and if no insurance is in place, whoever approved it should get to pay it out of their own pocket. This is the equivalent of a civil engineering company designing a collapsing bridge because everyone showed up at work hungover, or skipped safety calculations because they just take too damn long and time to market is critical.

      If you think "gee, this was just a bunch of photos, man, it's not like a bridge collapsed", how certain are you they didn't end up traded on the black market, or used for blackmail? Bet-your-company's-profits certain they weren't?

      3. If this was deliberate policy -- not just accident, but a conscious business decision that was then reverted and declared a breach -- then whoever came up with it and/or approved it should be facing jail time.

      Edit: also, it pisses me off that people are trying to decide how responsible we should be about what we do based on other fields. "They don't fine companies that write crashing firmware for planes or cars, or they fine them X amount; clearly we're only doing computer stuff, so we should be fined less, no?"

      What the hell? First, they are fined (see, for instance, Toyota, fined $1.2B over their infamous acceleration firmware bug). And second, even if they weren't, we shouldn't be aspiring to do the worst thing that's still acceptable! We should be striving to be better than anything else, not for "well, at least we're not worse than civil engineers"...

  • izzydata 7 years ago

    The punishment should be mass loss of users due to loss of trust, but for some reason people still use it.

  • LinuxBender 7 years ago

      This may change when lives are actually lost. Self-crashing cars have the potential, through intentional or accidental software bugs and security vulnerabilities, to destroy the lives that self-driving could save. A congressman I was speaking with rephrased my statement and said that I was suggesting that self-driving cars be treated as medical devices. I wholeheartedly agreed with his wording.

      That said, the same changes won't likely occur with sites like FB unless it can be proven that the leaked data led to loss of life or physical harm. They create incentives for people to happily be the product. How do we prove that damage has occurred to the product? Have any forums popped up where people share stories of harm to their family as a result of data leaked from FB?

    I can imagine GDPR being useful in the EU for corporate FB accounts. Wasn't FB working on a work-specific version of their site? If so, corporate legal teams would get involved in leaks, I would imagine.

    • r00fus 7 years ago

      It will happen if FB has to suffer financial consequences. GDPR will help, but we need companies to understand that personal data is not just an asset; it is also a liability.

      • LinuxBender 7 years ago

        You are probably right. I would be curious to see a review from a lawyer on the AUP that users agree to. Some things can not be waived in certain jurisdictions. Otherwise people may have given permission and ownership of some of their data to FB to do with as they please.

    • eitland 7 years ago

      > I can imagine GDPR being useful in the EU for corporate FB accounts. Wasn't FB working on a work-specific version of their site? If so, corporate legal teams would get involved in leaks, I would imagine.

      I can imagine GDPR working very well for consumers as well, and it seems we are up for some real legal entertainment in the next few months/years :-)

      Edit: It also wouldn't surprise me if it gets worse before it gets better. If I was a publisher right now I'd seriously consider blocking access from EU countries. (But that would of course be an invitation for a small, agile publisher who'd succeed either with a micropayments based approach or a context based ads approach.)

  • fullshark 7 years ago

    Until consumers show that they care, this is how it will be, unless governments regulate/punish.

    • neurobashing 7 years ago

      my response to this is always in the vein of, "how exactly should customers show they care?"

      "Well, leave!" isn't an option. They can't leave. Quitting Facebook when you're an active user means you lose a huge amount of social contact. I can think of a dozen people I know who are there because it's how they send baby pics and the like to family. They're non-technical and don't care about federated mastodons, they just want to see their niece and go to their high school reunion.

      So yeah they get really mad at this stuff but the network effect is so strong, you can't simultaneously convince the entire graduating class of whatever to plan reunions via some new thing when 1)everyone's already on facebook and 2)they've been using it for so long it's part of their workflow.

      • kodablah 7 years ago

        > my response to this is always in the vein of, "how exactly should customers show they care?"

        The answer is the same with any other foul business practice you oppose. The problem is not unique to digital businesses and I really hope those demanding justice don't request something more brash than they otherwise would in a non-digital situation.

        And yes, they can leave. There are real things you can't leave, like your only ISP (internet is essential in modern society and there are no alternatives), and then there are websites you can choose to leave, like FB (not essential in modern society). Your misuse of the word "can't" instead of "won't" just discourages any level of consumer responsibility.

        • neurobashing 7 years ago

          can't and won't are effectively the same thing in this context; you're picking nits about free will.

          Yes, of course everyone can always leave. The only price - for more than a few of those 2 billion - is the complete disruption of their social lives. Telling someone they can never see cute pics of their lil cousin again is punishment, not a serious decision about consumer responsibility.

          You can tell because, despite a history of malfeasance, there are still 2 billion active users.

      • ams6110 7 years ago

        Most people I know are getting off of Facebook, or were never on it. The only people I know who are really still active are people using it to market themselves/their business, and are not there because they care about Facebook, but because they want to be findable there (and everywhere).

        I guess I'm old, but I find that email is great for sending baby pics to friends and family, and for planning things.

        • scarface74 7 years ago

          Well anecdotally in your small social group that may be true. But Facebook has 2.27 Billion active users...

          • hazeii 7 years ago

            ...for a certain value of 'active' (do they say how it's defined?) My experience of internet companies has generally been that user figures are somewhat exaggerated (to put it politely).

            • scarface74 7 years ago

              No matter how you define “active”, there is no indication that in the aggregate people are fleeing Facebook - no matter if a few anecdotes are posted on HN.

              • Spooky23 7 years ago

                By Facebook’s generous standards, active users have been flat for a year or more.

        • ucosty 7 years ago

          I had this problem recently, I wanted to get in contact with an old friend I hadn't seen in years. Because it had been so long I no longer had a current phone number or email address. At this stage we didn't even live in the same country as each other. Solving this problem, or problems like it, might go a long way to reducing the appeal of Facebook.

        • nradov 7 years ago

          Email is such a failure for sending family pictures. My relatives keep changing their addresses without telling anyone, and many email providers have small message size limits so if you attach several pictures then the message may bounce or just not get delivered. For all its faults, Facebook is a much more reliable and usable delivery channel.

        • neurobashing 7 years ago

          I feel you but a quick search finds "As of the third quarter of 2018, Facebook had 2.27 billion monthly active users"

          so solve for that, not your dozen or so social group.

      • zentiggr 7 years ago

        Let's make the next mandated change to Facebook's operations a red-bordered modal dialog covering 90% of the screen:

        "We are required to notify you that we have leaked information from your account. Please be advised that we have no idea who has your profile information, pictures, post history or any other information contained in your posts. Please consider resetting your entire online persona to avoid financial and/or social consequences."

        With two buttons: "Erase me from Facebook" or "I get it, I don't care."

        Just planting seeds, Bill Hicks style...

      • PhDuck 7 years ago

        Leaving is of course an option.

        The fact that they don't leave merely shows that they value the gained social contact more than the cost of data breaches.

        • neurobashing 7 years ago

          That's my point: normal people value social interaction over an absolutist position of "well, at least my data is secure!". Normal people view never seeing their lil' cousin again as punishment, not the correct position to adopt because it prevents being caught in a data breach.

  • enjo 7 years ago

    Genuinely curious on your view: what is an appropriate response?

    • ColinDabritz 7 years ago

      Not the OP, but: meaningful fines; executive jail time for gross negligence, and especially for intentionally taking inappropriate risks; breaking up or closing companies that are shown over time to be unable to safely handle sensitive information. Proper regulation. Consequences that can't be cynically taken as the cost of doing business.

      • UncleMeat 7 years ago

        Jail time for bugs? Have people here ever worked on products?

        Bugs and security vulns are literally inevitable. Security is important, but if this were the standard, I'm not sure that any company would still exist.

        • ColinDabritz 7 years ago

          Jail time for bugs that should have been preventable and caused harm to users. Mistakes and bugs happen, but we also have methods of mitigating them: standards, quality controls, tests, analysis, and other care. I specifically said jail time for gross negligence, because that means not taking care and allowing harm to users.

          If you had an error that leaked private information, it's worth an investigation. If it made it through despite controls, that's understandable. If they find you failed to analyze the risks to users' privacy, failed to have controls in place, or didn't code review or test the code, then you have made specific choices that harmed users. That should be criminal.

          We need to take software engineering seriously as a discipline. We have the potential to do more wide scale aggregate harm than any structural engineering collapse. We need to start acting like it.

          • UncleMeat 7 years ago

            What is "should have been preventable"? Mandatory continuous fuzzing of all APIs? Interprocedural static analysis to detect all of the OWASP top ten? Manual audits of all dependencies and transitive dependencies on every update? Hiring world-class auditors to manually inspect code?

            I'm a huge security person. It's my job. But its unbelievably difficult to secure programs even if there are clear steps in hindsight that could have prevented a bug.

            • AlexandrB 7 years ago

              > What is "should have been preventable"? Mandatory continuous fuzzing of all apis? Interprocedural static analysis to detect all of the owasp top ten? ...

              All of the above, possibly. Other engineering disciplines seem to have defined what constitutes due diligence just fine. This isn’t a novel problem.

              It’s obviously not possible to make anything perfectly safe or perfectly secure. But it’s certainly possible to define a minimum amount of effort that must be put towards these goals in the form of best practices, required oversight, and paper trails.

              Edit: Even “fuzzy” disciplines like law have standards for what constitutes malpractice or negligence when representing a client.

        • mirashii 7 years ago

          Nobody said jail time for bugs, and phrasing that way is intentionally obscuring the debate. Gross negligence is an entirely different standard than just software bugs.

          • UncleMeat 7 years ago

            Lots of people in this thread are explicitly saying jail time for bugs.

            What evidence is there that this was gross negligence?

        • vwcx 7 years ago

          But why do financial services bugs garner a higher penalty than one that exposes private photos? This is an argument for regulation.

        • Veen 7 years ago

          > Security is important but it this was the standard I'm not sure that any company would still exist.

          This is true, and it's also the reason there are more software vulnerabilities than necessary. Software could be a lot more secure. There will always be bugs, but it is possible to build software and platforms with far fewer vulnerabilities. It's expensive, though, so we don't, and users suffer the consequences while the companies shrug their shoulders and count their money.

    • 794CD01 7 years ago

      Execution. Facebook is big enough that it would serve as a useful warning.

    • onetimemanytime 7 years ago

      >>Genuinely curious on your view: what is an appropriate response?

      CRIPPLING fines to start. Or shut down the freaking company if you can't secure it. Not everything is forgiven with a "we're sorry" for the 27845th time. Private pictures can and do ruin lives. (We can question the wisdom of posting private pics on FB, but after all, it's a huge company and they said they were private.)

  • bitxbitxbitcoin 7 years ago

    Useless, repeatedly broken promises. It is time to see how much teeth the GDPR really has.

    • kodablah 7 years ago

      Can we look to any other similar regulation to estimate how much teeth it will have when actually enforced?

  • buboard 7 years ago

    Considering that the GDPR doesn't contain a single technical requirement (in contrast to all food/safety/medical regulations), practically anyone and anything, or nothing, can be a violation. Your guess is as good as mine.

  • gerdesj 7 years ago

    Whoops soz .... lol, l8erz.

    Kerching

  • jeklj 7 years ago

    Nothing bad ever comes to companies as a consequence of these leaks, so what is their incentive to stop them? It happens so often that it goes down the memory hole after maybe a week or two, so even that isn't much of an incentive. We shouldn't be surprised about this.

    • ritchiea 7 years ago

      There's a crowd here on HN that hates regulation but this is exactly why regulation exists. Massive, wealthy, powerful industries just aren't held accountable by average consumers or markets. There's no serious competitor that benefits if your data isn't safe at Facebook. And average people not only aren't powerful but have their own lives to look after.

      Without regulation massive companies are entirely unchecked, there is virtually never market pressure to fix problems like this.

      • TangoTrotFox 7 years ago

        As one aside on this, the main issue people have with regulations is not regulations in and of themselves, but the negative effect they have on small businesses and on competition/entrepreneurship more generally. I think you'd find extremely few genuine voices against regulations that only start to apply once a company (and all associated entities) grosses in excess of e.g. $100 million in annual revenue. By that point, the cost of ensuring compliance, and of complying itself, is only going to be a minuscule fraction of revenue. By contrast, these same costs can, and do, destroy or simply prevent the formation of smaller companies that could greatly expand the market to the benefit of all.

        • bredren 7 years ago

          Seems straightforward to write that compliance with certain measures is based on GAAP annual revenue.

          Though in practice, enforcement of regulations in general is already handled this way.

          For example a new startup in the valley doesn’t get a note from the gov because they launched without a privacy policy.

          Same in the App Store, a minor app breaking App Store rules doesn’t get knocked out because the downloads are too small to bother.

        • fhbdukfrh 7 years ago

          So basically regulation that impacts them directly or materially?

          The idea of an arbitrary cut-off point has been tried in lots of scenarios, and it is gamed by all parties.

          Example: Copyright will protect new works for x years, at which point Disney lobbies for the arbitrary goal posts to be moved.

          • TangoTrotFox 7 years ago

            Exactly. Large companies love regulations when they affect everybody since they can easily abide the regulations while they help to destroy potential competitors.

            Keeping regulations focused on big players serves the dual purpose of concentrating regulation where its effect will be most significant, while also ensuring it doesn't negatively affect the market. But yeah, as you mention, the big problem is that once companies reach a certain size they begin to develop the political connections necessary to simply kill, or at least castrate, any potential regulation that might genuinely require them to behave in a way that is inconvenient, even if it's better for society.

        • Apocryphon 7 years ago

          Libertarian hackers always think of regulations as pesky pinpricks from the nanny state, but in this case, in their own domain of software, regulations would function more like industry standards, ensuring that software is built up to code. Hackers want good code, don't they?

          • TangoTrotFox 7 years ago

            I don't understand why people have to otherize each other. No, again opposition to regulations has nothing to do with some ambiguous opposition to nanny stating in and of itself. That is tangential to the real issue. Before getting to that, let's take a really quick digression.

            Taxpayer-funded systems are one of the most controversial things we have. You'll find sharp disagreement on topics like public vs. private funding for everything from education to medical care and a wide array of other issues. Yet you'll find almost nobody who wants to privatize e.g. the fire department. That's because nearly everybody agrees that the fire department does a good job, does it efficiently, and does it cheaply.

            The point of this is that if there were a regulatory framework that was unambiguously and intrinsically superior to any alternative, you'd find next to no opposition to it. Everybody wants the same thing in the end -- we just disagree on what's most likely to get you there. In many ways, lemonade stands are a timeless and perfect example. In many US states today it is literally illegal, or at least unlawful, for a kid to sell lemonade in their front yard. Kids can [and have] faced ticketing, confiscation, and so on. This is clearly idiotic by any standard, yet the very rules and regulations that produced this were all at some point created with good intentions: ensuring food safety, perhaps, or avoiding money laundering, or whatever other rule happens to be broken by selling a cup of lemonade for a quarter.

            A rule that would generally stand to impose substantial penalties for writing bad code is something that would have unimaginably vast consequences at the lower level. And I think you're looking more at destroying small business in the tech industry than in suddenly having a world where all code is "good". By contrast the companies at the top can afford to greatly expand their staff and create factory lines of code review, extensive internal penetration testing, general audits, and so on. And perhaps most importantly, when they do end up violating the rule they have the resources to manage this just fine. And so it's very possible that the regulation could have an overall net positive effect there. But if it were applied to society as a whole (instead of just large companies), I think you'd be effectively killing off tech industry competition.

    • clubm8 7 years ago

      >Nothing bad ever comes to companies as a consequence of these leaks, so what is their incentive to stop them?

      I could probably get away with murder, but for some reason I'm not out on the town strangling prostitutes.

      Why do companies always need an "incentive" to not be anti-social? Why can't CEOs simply derive pleasure from delivering a quality service in exchange for some advertising eyeballs?

jasonkester 7 years ago

“Private” photos that people uploaded to Facebook.

Sounds like a good time to reiterate the advice: Don’t upload things to the internet that you don’t want to be on the internet. That way there won’t be any of your things on the internet that you didn’t want to be there.

  • spinach 7 years ago

    Except that your friends, family, and others can upload private photos with you in them.

    • InclinedPlane 7 years ago

      Just stop having a face. And friends, or family. Become an unperson. The solution is so obvious.

      It's always hilarious how people try to pretend that it's easy to just drop out of society and the systems that people use to keep in touch. Sure, you can live like the unabomber in a shed in Montana with no phone service, but having that be the only option to keep your personal data safe from leaks is a bit much of an ask. People should be able to live their lives and take reasonable but not extraordinary precautions to safeguard their privacy and be able to have some expectation of privacy as a result. Unfortunately, there is so much data being collected on everyone, so many intrusions to our private lives, and so little care being taken by the stewards of that private data that it is not, it turns out, a reasonable expectation. And the onus for solving that problem shouldn't be on individuals. We shouldn't be forced to live our lives in fear of digital representations of our appearance being leaked onto the internet as someone might have once feared an ordinary photograph could steal a soul. Rather, those who are going to great efforts to destroy the boundaries of personal privacy should be heavily regulated to prevent them from doing so and heavily incentivized to safeguard private data whenever they are in possession of it.

    • jakear 7 years ago

      I left FB when they reverted a policy that let you opt to confirm all tags before they showed up in searches for you.

      This means anyone in the world can upload an image, tag you in it, and it will show up in searches for you. It still won’t show up on your profile if you have confirmations for that enabled, but still.

      • isostatic 7 years ago

        No need to tag, just facial recognition will get you from previous tags and other metadata

        • jakear 7 years ago

          Will it show up in searches from just facial recognition? That would be very bad for anyone trying to live in a way incongruent with their culture’s standards. (I personally left in solidarity with a Muslim friend who no longer wears Hijab, but would prefer her family didn’t know that; pictures of her without Hijab started showing up in searches without her approval suddenly and without warning when they removed the old “confirm before search results” option)

    • jpwgarrison 7 years ago

      Sadly, this is the moment that those photos stop being private. I get that this is hard for the general public to understand - but at this point, uploading anything to Facebook = obfuscated, maybe, but not private.

  • Raphmedia 7 years ago

    > including images that people began uploading to the site but didn’t post publicly.

    This means that if you started uploading a photo in the upload wizard and then thought better of it, the photo is still out there.

  • adtac 7 years ago

    Unfortunately, that advice is nearly useless on HN as the people who need it aren't on this platform.

  • AlexandrB 7 years ago

    This ship hasn’t just sailed, but its masts are no longer even visible over the horizon. Both major phone operating systems actively encourage synchronizing all photos with a server on the internet.

  • tracker1 7 years ago

    Also, never take nude photos with your face/head in the photo. Never let someone take photos/video of you drunk. Unless you want kids with this person, always use a condom.

synthmeat 7 years ago

People here are calling for draconian measures without considering the low-hanging fruit first - why not just require the platform to disclose this within its primary medium?

A big, bright popup right over the main facebook.com site (and peripheral sites/apps), dismissable only after you've scrolled it all the way down and confirmed you've read it, saying "private photos of millions of users were leaked" in big bold letters, would go a long way.

newscracker 7 years ago

If there’s one thing that Facebook has been highly successful at, it’s making people numb and uncaring about any of these “bugs”.

Like the saying goes, “One death is a tragedy; one million is a statistic” — Facebook has made all its privacy blunders and issues over many years a statistic...something people may nod their head at, feel bad for a moment and go back happily to the same company’s platforms.

Unless lawmakers around the world do something, nothing will materially affect Facebook (the company). Even if they do, I personally have no faith that the company is capable of changing unless people at the top, like Mark Zuckerberg and Sheryl Sandberg, are out.

imgabe 7 years ago

I think it should be clear to everyone at this point that nothing on Facebook is private. Don't put anything there you wouldn't post publicly.

  • SamWhited 7 years ago

    Any company that talks about their old "move fast and break things" motto as if it's a good idea should be treated this way (or not used at all).

    In industry perspective: We call ourselves "engineers", but real engineers are held accountable when they sign off on using an untested metal alloy in bridge joists and then people die when the bridge collapses. Facebook's constant bad engineering may not kill people directly, but it does lead to a lot of really important information stolen, peoples financial future being ruined, and who knows what other consequences for their users. If you still work for Facebook in this day and age you should be ashamed of yourself; I know people can justify just about anything while claiming that they'll "make it better from the inside" or because they just need to collect a fat paycheck and are comfortable and don't want to look for something new, but we need to fight these impulses anywhere we work.

  • ams6110 7 years ago

    Beyond that, nothing online is private. And generally, nothing can be removed. There will always be bugs, mistakes, new vulnerabilities. Eventually it will get out.

    • lucb1e 7 years ago

      Two can keep a secret if one of them is dead, sure. But that doesn't mean you have to assume that anything on the internet is going to leak all by itself. The advice we should be giving is to not put all of our eggs in centralized baskets, especially when we know the baskets have goals not aligned with our own, however convenient they may be.

rhegart 7 years ago

Remember when Facebook wanted you to upload nudes so they could help keep them off of Facebook and the internet... yeah, hopefully no one trusted them with that. Also, are there even any safeguards preventing private photos like these, or even nudes, from being viewed by any admin? I hope there are...

inetknght 7 years ago

> The bug also impacted photos that people uploaded to Facebook but chose not to post.

What about, for example, pictures sent in a private message?

I'm so very glad I deleted my account months ago.

  • perfmode 7 years ago

    This was for pictures present on your phone which Facebook uploaded just in case you'd want to post them later, essentially to hide latency.

  • nscalf 7 years ago

    Insofar as they actually let you delete your account.

  • elliekelly 7 years ago

    >I'm so very glad I deleted my account months ago.

    I did as well. One thing that stood out to me in the article was that users who were impacted by the breach would be notified via a Facebook message. What about people who were impacted by the breach who no longer have an account?

tareqak 7 years ago

The Irish Data Protection Commission says that it opened a broad investigation into Facebook's GDPR compliance in light of numerous data breaches - https://www.ft.com/content/d796b5a8-ffc1-11e8-ac00-57a2a8264...

Rjevski 7 years ago

As usual, I'd like to point out how scummy this site really is.

The paywall advertises a "Premium EU Ad-Free Subscription" which is more expensive than the standard subscription and explicitly states "No on-site advertising or third-party ad tracking" as one of the perks.

Trying to buy it has the following:

> By subscribing, you agree to the above terms, the Terms of Service, Digital Products Terms of Sale & Privacy Policy.

On the privacy policy, we have this:

> When you use our Services, third parties may collect or receive certain information about you and/or your use of the Services (e.g., hashed data, click stream information, browser type, time and date, information about your interactions with advertisements and other content), including through the use of cookies, beacons, mobile ad identifiers, and similar technologies, in order to provide content, advertising, or functionality or to measure and analyze ad performance, on our Services or other websites or platforms. This information may be combined with information collected across different websites, online services, and other linked or associated devices. These third parties may use your information to improve their own services and consistent with their own privacy policies.

There is absolutely no mention of the "Premium" ad-free subscription in the privacy policy at all, so they are still granting themselves the right to stalk you all over the place even with the premium, more expensive subscription.

Not to mention, the privacy policy page itself loads a handful of different trackers before any kind of consent was even granted. I can see Google Analytics, something from "c.go-mpulse.net", something else from "bam.nr-data.net" explicitly sending my user-agent in the URL (why? They'd get it in the headers anyway), Google News JS, Google Pay and the New Relic JS agent.

My only response to this is a big "fuck you" and this link: https://outline.com/zd5du7 so you can read the content without any of that garbage and without paying them since they don't even deserve a single penny.

cmurf 7 years ago

I needed to change the phone number on an online account with a major, well-known transportation company. The app offers a way to do this by receiving a text message containing a verification code. Upon receipt, the code was auto-entered into the app, but I immediately got an error saying I had to open a support ticket, which can only be done in a web browser, not in the app.

Customer support by email said I had to provide a copy of my driver's license or passport to "secure the account". I said that's not reasonable; companies leak too much personal data, so you can't have any more of mine, and I'll just open a new account. They replied that they'd just change the phone number (no longer requiring the supposedly required photo ID). They did, and that was the end of it.

- No explanation why the verification code process would not work.

- None of my IDs has my email address, account number, or phone number on it, and the account doesn't even have my name on it. Giving them photo ID does jack shit for the claimed purpose.

- If the account security is questionable, they should not only require text verification of the new phone number, but should also have removed my stored payment accounts and required me to reenter them. AFAIK, credit card verification requires the CVV and a phone number matching the credit card account. That seems like the right way to secure the account, rather than bullshit photo IDs.

cody3222 7 years ago

Didn't they just launch a feature earlier this year telling people to upload their nudes so they could better detect when an ex misused them?

https://www.theguardian.com/technology/2017/nov/07/facebook-...

graeme 7 years ago

On this topic, does anyone know if photo access granted to facebook apps on ios means facebook will upload all photos in the background?

Have never seen an analysis of it.

saulrh 7 years ago

It's too much to hope that Facebook takes a hint from Google and shuts down its social network to preserve user privacy, right?

  • aylmao 7 years ago

    Google didn't shut down Google+ to preserve user privacy. Not sure if that's what you're implying with your comment-- I hope it's not.

    • dragonwriter 7 years ago

      > Google didn't shut down Google+ to preserve user privacy

      They accelerated the planned shutdown for exactly that reason.

      • rohan1024 7 years ago

        They did that because the cost of maintaining the platform was higher than its ROI. If Google+ had 300M-400M monthly active users, I don't think they would have shut it down.

      • lucb1e 7 years ago

        They claim that, though, and that's the joke GGP was making.

bob_theslob646 7 years ago

How come Google has never had a breach? Do they do a better job with security? Is Facebook more of a target than Google?

  • tqi 7 years ago
    • Ivoirians 7 years ago

      Technically, these aren't breaches or leaks, they're vulnerabilities.

      Whether you believe data was exfiltrated is essentially a reflection of how much you trust or distrust Google.

  • libdjml 7 years ago

    Good question, to which I don't know the full answer. But if you look at Facebook's motto ("move fast and break things"), its insistence on pushing new features as fast as possible, and the recent clash with and resignation of its CSO, I'd say Google is just more mature about security and understands that its products are entirely reliant on the trust of their users.

  • sifoobar 7 years ago

    Does it matter? I have a hard time thinking of a worse company to entrust with personal information. The reach of Google makes Facebook look like a joke. At least when it's leaking they're not making more money.

  • QML 7 years ago

    Probably the latter: the larger the value of a network, the more likely it is to attract attackers.

chrisseldoOP 7 years ago

Facebook's release: https://developers.facebook.com/blog/post/2018/12/14/notifyi...

foobaw 7 years ago

Where are the technical details on what the bug was and how it was possible? Shouldn't this be disclosed?

addicted 7 years ago

I’ll be interested in seeing what the number of affected users actually ends up being. As John Gruber at Daring Fireball has pointed out, Facebook has a rich history of giving initial numbers which tend to grow by orders of magnitudes over the coming weeks.

connorgutman 7 years ago

Someone needs to go Mr. Robot and 5/9 Facebook's servers. This is getting ridiculous.

Jaruzel 7 years ago

As IT people, we owe it to our families to offer to self-host their social data on one of the many open-source platforms that are available.

Maybe spend some time over the Xmas period having 'The Conversation' with our loved ones about their data safety?

  • treve 7 years ago

    Why do you think that self-hosting is safer?

  • joering2 7 years ago

    You gotta be kidding me. How would such a conversation go, and what would you expect to get out of it?

polskibus 7 years ago

Does it fall under GDPR violation?

  • megous 7 years ago

    No. Unless they didn't report it to the regulators.

    • nneonneo 7 years ago

      Article 34 clearly states that the breached organization must inform the data subject "without undue delay". Given that the event occurred in September, and it is now December, I would characterize that as an undue delay.

      There should be GDPR consequences of this - it's time that law got properly put to the test.

      • jsnell 7 years ago

        I'd imagine what matters is the delay from when you learn about the issue, not the delay from when it happened. This blog post looks a lot more like something they discovered now than something they discovered in September. (E.g. the way they'll have "tools for figuring out who was affected next week").

    • WA 7 years ago

      What? You get fined under GDPR for a breach. If you don’t report it, the fine will be a lot higher if they find out.

      We will see how this plays out, but there should be a fine nevertheless (because others have been fined and they reported it).

      • megous 7 years ago

        Based on what article?

        • WA 7 years ago

          If you process personal data, you must make sure to protect it. If you fail to protect the data, you can get a fine; how high it is depends on various factors. Reporting the breach to the authorities in a timely manner can help secure a lower fine.

          But reporting it doesn’t make the fine go away. After all, you started to process personal data and are responsible for it. Alternatively, you could’ve opted for not processing personal data if you think you can’t protect the data adequately.

          You can read all this here: https://gdpr-info.eu/issues/fines-penalties/

spiderPig 7 years ago

Turns out solving 3000 Leetcode questions doesn't teach you how to do security right.

ben174 7 years ago

Unrelated, but I'd love to know how that article managed to get a picture of that Facebook sign without people standing in front of it. I drive by it daily and I've never seen it without people posing in front of it :)

  • techsin101 7 years ago

    If you take multiple pictures of the same scene and run an algorithm that keeps only the mode (most frequently occurring) pixel values, stationary objects stay while moving people or objects disappear. Photoshop has this function; there are tutorials on YouTube.
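    In practice, a per-pixel median across the stack is a common, simpler stand-in for the mode described above: the static background dominates most frames, so it survives, while passers-by are discarded as outliers. A minimal NumPy sketch of the idea (the function name `remove_transients` is just for illustration, and it assumes the photos are already aligned):

```python
import numpy as np

def remove_transients(frames):
    """Combine aligned exposures of the same scene into one photo
    with transient objects (e.g. people walking through) removed.

    frames: list of H x W x 3 uint8 arrays of the same shape.
    """
    # Shape (N, H, W, 3): one slot per exposure.
    stack = np.stack(frames, axis=0)
    # The static background appears in most frames, so the per-pixel
    # median keeps it; a person occupies any given pixel in only a
    # minority of frames and is rejected as an outlier.
    return np.median(stack, axis=0).astype(np.uint8)
```

    With enough exposures, each pixel only needs the background visible in a majority of frames for the moving subjects to vanish.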

sakisv 7 years ago

At which point should we stop treating these things as bugs and start treating them like features instead?

Not this particular thing per se, but, you know, it's Facebook. As recent history has proven, these things kind of come with the package.

Oras 7 years ago

Since Facebook walks away without consequences every time, this will happen again and again.

The long-term solution to this mess should come from users abandoning it which is happening gradually based on recent reports.

  • jamesrcole 7 years ago

    > The long-term solution to this mess should come from users abandoning it

    Where will the people go? If it's other software, it might end up being as bad or worse.

    • Oras 7 years ago

      What's the value of Facebook? Serious question, since you think people should have an alternative.

      • jamesrcole 7 years ago

        I didn't say an alternative that is like Facebook.

        People want to communicate with others. If they use software for that then.... my original question applies.

        And you've avoided answering that question.

snovv_crash 7 years ago

The more leaks there are, the more I feel that the mindset will shift from user data being an asset to a liability.

sammycdubs 7 years ago

That privacy popup in NY really worked!

annadane 7 years ago

I mean, it's a bug. Happens to everyone. Criticize them for the things they deserve criticism for, but don't make a case out of everything.

Mc_Big_G 7 years ago

Why is anyone still using FB/Whatsapp/Instagram? It seems the vast majority just don't care at all about privacy.

  • dqhAR 7 years ago

    Too big to avoid.

    Data leaks happen to every tech company. As users/customers, we won't have knowledge of the leaks unless they are publicly reported.

    How can you "socialize" these days without using at least one of these internet social/media platforms?

    The ways to avoid giving them your data are either to be totally reclusive or to be a tech geek who relies only on niche tech products that aren't mainstream.

    What if they are used as highly valuable networking platforms for your job? Some people live off some kind of business model taking advantage of the sites. Also they work hard at maintaining their audience captivated and engaged.

  • WA 7 years ago

    WhatsApp: because its market share outside the US is insane. Germany has 70% Android users, and they don't use iMessage, nor any of the other alternatives, unfortunately.

keyboardmowing 7 years ago

Wasn't there a point in time when FB wanted users to submit their nude photos so that they could better detect fake profiles? Lol

jhowell 7 years ago

Not very good at this data security thing. In other industries, such as health care, there are tables defining fines and penalties. Maybe the same is needed here.

yumraj 7 years ago

Most of the comments below are echoing the statement "jail time for bugs!!!!!" and similar sentiments, and therein lies the problem.

"Bugs" is a catch-all word; it covers everything from a pesky typo in the UI to bugs like this one, severe security issues, Meltdown/Spectre, the VW emissions "bug", and so on and so forth.

Of course no jail time for a typo, but why not jail time or severe financial and career consequences for severe bugs, especially when it can be shown that a bug was caused by intentional decisions, malicious intent, sloppy testing, a rushed product, etc., and not by genuine mistakes, similar to medical malpractice?

Of course lawyers will love it, but it can improve the overall situation.

And yes, I'm a software engineer and do know what I'm talking about.

  • walrus1066 7 years ago

    Who would be held responsible? Coder? QA? Code reviewer? PM? person putting pressure on PM?

    • yumraj 7 years ago

      Depends...

      If malicious intent, most likely the business owners, PM or engineering management, but in some cases software engineers.

      If due to rushed product, certainly the management and not software engineers or QA.

      and so on..

      It depends..
