Apple's Tim Cook defends encryption. When will other tech CEOs do so? (theguardian.com)
You mean like how Google got ECC forward-secure TLS deployed across the whole Internet?
I have nothing but respect for Apple's stance with regard to cryptography, but Google has been more instrumental in getting strong crypto deployed on the Internet, and, just as importantly, in sweeping the minefield of crappy 90s crypto that defined most Internet crypto until recently.
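(If you want to see what that deployment looks like in practice: forward secrecy shows up as an ECDHE/DHE key exchange in the negotiated cipher suite. A quick sketch with Python's standard library - www.google.com is just an example host, and note that TLS 1.3 suites are forward secret by design:)

    import socket, ssl

    def negotiated_cipher(host, port=443):
        # Returns (cipher_name, tls_version, secret_bits) for the handshake.
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.cipher()

    name, version, bits = negotiated_cipher("www.google.com")
    print(name, version, bits)
    # "ECDHE" (or "DHE") in the suite name means an ephemeral key exchange,
    # i.e. forward secrecy; under TLS 1.3 the exchange is always ephemeral.
    print("forward secret" if "DHE" in name or version == "TLSv1.3"
          else "not forward secret")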
Google's also had a positive impact on TLS usage in email: https://www.google.com/transparencyreport/saferemail/
TLS for email is still in pretty bad shape, but it's getting better. (Funny, I just noticed that Google promises "Safe Browsing" but only "Safer Email".) I know you're not a fan of DNSSEC, but something like secure SMTP via DANE is probably needed for meaningful improvement: https://tools.ietf.org/html/draft-ietf-dane-smtp-01 (though it won't help with the chicken-and-egg problem of validating domain ownership by email)
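(For the curious, the DANE mechanism is just TLSA records published at _25._tcp.<mailhost> and validated via DNSSEC. A minimal sketch of the lookup half using the dnspython package; mail.example.com is a placeholder, and a real implementation must also require DNSSEC-authenticated answers, which this skips:)

    import dns.resolver  # pip install dnspython

    def tlsa_records(mail_host, port=25):
        # DANE publishes certificate/key bindings at _<port>._tcp.<host>.
        qname = "_%d._tcp.%s" % (port, mail_host)
        try:
            return [r.to_text() for r in dns.resolver.resolve(qname, "TLSA")]
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
            return []  # no DANE policy published

    print(tlsa_records("mail.example.com"))  # placeholder domain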
Agreed, but the article is focused on why these companies, which have done significant things to protect users with encryption technologies, haven't been more vocal the way Tim Cook has. This issue is so important to them and to everyone that they could spare a little time to speak their minds. Otherwise it just looks like "kooky Apple" going against the grain. Who cares, they're going out of business soon, right?
Why are we supposed to play dumb about the subtext behind Cook's comment?
I'm sure Cook believes what he's saying, but the real marketing strategy here isn't "crypto versus plaintext"; it's "consumer product company" versus "online service provider".
Seen through this lens, there's an argument that what Cook is doing is counterproductive. He's making an argument that Google can't sign on to, and using crypto as a wedge to drive the argument home. "Be a consumer product company, because then you can protect users with crypto".
Also: the kind of encryption that Apple is really making a stand for? They do a better job of it than Android, but Android provides the same encryption: what scares the USG about Apple is that iPhones are locked by default, and when they're locked, they can't be imaged easily. That's true of Google's phones as well.
Meanwhile, Google is doing a much better job of securing browser crypto than Apple is; Apple is almost an obstacle to better browser crypto.
I disagree that what Cook is doing is counterproductive. I think Google could take a stronger line to secure user data if they wanted to. They don't have to become a consumer product company to run a messaging system which they cannot read. If Google can't sign on to that, maybe they should change something so that they can.
iMessage is better than Google's chat offerings in this regard, but not that much better.
If you want secure messaging, you need to be using OTR or Signal. Apple isn't really helping you here.
I'd just like to add: NOT Telegram.
Because some people need it explicitly stated.
I don't claim that Apple is as good as OTR or Signal, only that Google could do more, Google should do more, and Google should be out there helping Tim Cook make a case that back doors are a terrible idea.
edit: Microsoft, Apple, Google, they all need to step up their game and make their case in public. Apple's not perfect but they're slightly ahead of the other two major OS vendors here.
But everyone can do more, including Apple. Meanwhile, I think if you build a scoreboard for this, it's not at all clear that Apple is ahead of Google.
Google's not making a public case. And Google, as far as I know, can read messages you send on Google services. Those are both Big Deals. I don't disagree (I don't have the expertise to!) with what you said on Apple and browser security. There are many parts.
Everybody can step up their game, I absolutely agree there.
Ugh, scoreboards. Historically, they've only caused confusion and muddied the waters. :(
Erm, Google's business model is to perform MITM for economic advantage. The popularity of this business model is what rekindled this "debate".
Their use of TLS is like an amateur's use of XOR - a secure primitive in a very narrow context that ignores the big picture. Google is the type of backdoor the skinjobs are grooming us to accept.
I think it's only a matter of time until Apple Inc lands on "Game Over", but Google is playing an entirely different game.
I think the claim is just that it makes perfect sense for Google to not want to take that stronger line, and perfect sense for Apple to want to do so because of the differences in their businesses.
Google can't show relevant ads for content they cannot read. Nor can they index it.
> He's making an argument that Google can't sign on to
That's only true if Google is unwilling to trade ad revenue for subscription revenue. Google Apps for Work is a product area where Google is making that trade-off, and is presumably not cannibalizing their ad business. There is no reason Google can't offer a consumer-friendly subscription service that would be unbreakably private at similar pricing.
Google can't sign onto a no back door policy?
> You mean like how Google got ECC forward-secure TLS deployed across the whole Internet?
Did you finish reading the article?
> Facebook’s WhatsApp has brought end-to-end encryption to more people – over 800 million – than any other service; and Google’s engineering team has been a leader in securing much of the web in the post-Snowden era.
And then they go on saying:
> But this is much more than an engineering fight – it’s a political one where public opinion is crucial.
Do you think the average voter knows what ECC forward-secure TLS is? Heck, I'd like to think I kind of know a little about the subject but I know _nothing_ compared to you and a bunch of other HNers.
But unfortunately, we live in a society where people who can vote are really scared of terrorism and lack an understanding of how technology works. If a politician tells them we need to decrypt "all the things" for their safety they'll happily vote for them[0].
We need the celebrities of the tech world to reach out and explain why we need crypto in a way they can understand.
[0]: No link really, just watch any of Donald Trump's rallies and tell me if you think those people care about encryption.
The article is really about CEOs publicly arguing that security back doors are a bad idea. It's unfortunate that mainstream press seems to be conflating "encryption" and "security back doors", but here we are.
As to the argument in the article, are there other examples of non-Tim Cook CEOs of big tech companies saying anything like this?
"But the reality is if you put a back door in, that back door's for everybody, for good guys and bad guys"
The closest I've found was a letter from many companies [1] which says "introducing intentional vulnerabilities into secure products for the government’s use will make those products less secure against other attackers." Google, Apple, Microsoft, Facebook and many others were signatories to that letter. So it certainly sounds like the companies might feel that way.
[1] https://static.newamerica.org/attachments/3138--113/Encrypti...
Eric Schmidt needs to be explaining this & why it's important to his politician friends and sphere of influence. Perhaps he already is, but it would be meaningful politically if he (and others) said something publicly. Cook is kind of the lone ranger on the matter as far as public discourse is concerned, post Paris & San Bernardino.
Exactly. It's incredibly valuable to have the CEO of the biggest tech company in the world (the beloved Apple, no less) making the counter-argument in a debate that has quickly regressed to early-90s Crypto Wars levels.
This has nothing to do with being vocal about security. Yeah, they're helping the technical cause, but if you don't want backdoors in everything, the CEOs need to talk to the public so people are aware, and to send letters and make phone calls.
I'm sure what Google did helps curtail mass surveillance, but they still hold all the data unencrypted, no?
> but they still hold all the data unencrypted, no?
I think no
Even if the user-data is encrypted, they have the keys to decrypt it.
And? So do all the usual suspects, like Microsoft and Apple.
I was not implying that anyone was doing it right (end-to-end encryption)
You just described what Google has done in slightly more technical detail than the article. I don't know why. It is definitely not what was meant. Supporting encrypted communications is a prerequisite to publicly advocating for it, but it is not publicly advocating for it.
Bruce Schneier has one of the best posts I've ever read on why Encryption is important here: https://www.schneier.com/blog/archives/2015/06/why_we_encryp... -- great resource to share to people who don't understand it.
And Martin Fowler has one of the best posts I've ever read on why privacy is important for a democracy: http://martinfowler.com/articles/bothersome-privacy.html
The issue I have with Cook proclaiming support for strong encryption is that Apple still has control over what can and can't be installed on the user's device. So imagine if some state agency came to a company and said: you can't allow certain apps to be installed, and you can't tell your customers we told you this. "You can allow these apps that claim to encrypt users' messages [list here], but not these [list here]." Some state could still strong-arm Apple into compromising privacy, and Apple would have their hands clean.
It seems that if you really want to guarantee privacy, you have to give the individual control over what they can install. Telling people to just "trust us" is not really good enough. And Cook is saying they are giving the user ultimate control by not having keys to their encryption but in reality that's nonsense... they are still requiring people to trust them.
I use a lot of web apps on my iPhone. They don't have access to all the phone's APIs, but they do everything I need, without any hindrance from Apple oversight.
This is probably the most native looking one of the bunch: https://forecast.io/
From an encryption point of view though, they're relatively useless. Said three-letter agency now doesn't need to block the app, they can instead MITM the traffic to it or compel the organization to inject additional client-side or server-side code to complete the backdoor.
Certificate pinning helps against the MITM problem, but code integrity for downloaded client-side code is pretty tricky. Browsers could add some form of signed code pinning for power users, but it'd be tricky to be able to distinguish between legitimate updates and nefarious activity.
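(Roughly what pinning boils down to, as a Python sketch - PINNED_SHA256 is a hypothetical known-good value, and real clients usually pin the public key or a backup set rather than a single leaf certificate:)

    import hashlib, socket, ssl

    PINNED_SHA256 = "0123...deadbeef"  # hypothetical known-good fingerprint

    def connection_matches_pin(host, port=443):
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                leaf = tls.getpeercert(binary_form=True)  # DER-encoded leaf
        # Reject the connection if the certificate isn't the pinned one,
        # even when it chains to a trusted CA - that's the MITM case.
        return hashlib.sha256(leaf).hexdigest() == PINNED_SHA256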
I recently had this conversation:
Me, to CEO: Hey, think we should ever build a backdoor into any of our products that employ encryption, to help the US government and law enforcement?
CEO, to me: No, that's a terrible idea.
Me, to CEO: Okay good, just making sure we're on the same page.
I don't think there are many honest and competent technology CEOs who would rally against encryption. That scenario becomes a lot more dire when the CEO says:
CEO, to me: Yes, because we are compelled by law, backed by jail time or hefty fines.
I already discussed that scenario. We'd shut our doors, release all of our code as CC0+WTFPL, and start a new company fresh.
At the end of the day, we care about our integrity more than we do dollars.
I'd commend you for doing that but it's very easy to say early in the process when you're not faced with the situation.
Once three years of your life have been invested in the product and you have tens or hundreds of thousands of users, will your answer be the same?
I appreciate the vote of confidence on our eventual success.
I can't predict the future. I'd like to say our answer won't change. If it does, we deserve to fail.
I get the feeling you are underestimating how difficult of a decision this is.
If you really care about doing the right thing down the line, I recommend thinking very long and very hard about it, because if you do succeed, the moment of decision will come and you won't be expecting it by then. Underestimating the situation is a sure-fire way to fall short of your own expectations.
Governments don't care about small startups with little reach. They will come asking once you matter, when saying no is hard.
Well, it's not my call. I'm not a decisionmaker ;P
If you have shareholders, you can't legally make that call.
We don't. :D
Easy to say now. Much harder to follow through when you're relying on your paycheck to pay your rent or buy you food.
One should always ensure one's financial situation does not preclude exercising one's ethics.
Should, but unfortunately one can't always foresee things that come up. If your spouse comes down with cancer, for instance, and you're reliant on company insurance to pay for treatment, you might be tempted to put your spouse's treatment ahead of those ethics.
It's hard to not parse your response as putting profits before people.
I'm not even really talking about profits, though. Just the basic amount needed to get by.
I can't be the only one who thinks it's pessimistic to say "if you put a back door in, that back door's for everybody, for good guys and bad guys." Very few people even seem to recognize this as a problem, let alone work to solve it. Maybe we should stop laughing at Clinton and her "Manhattan Project" comment; that might be the only way to get enough tech people on the problem to actually solve it.
What you think of as a problem is what experts call broken cryptography.
There is no shortage of minds working to create backdoors, or to develop cryptographic methods that have backdoors; just look at Dual_EC_DRBG. It was a backdoor for the "good guys", but now it's a backdoor for everyone - eventually people will study the code and see that the backdoor exists.
The crux of the issue is mathematics has no concept of good guys or bad guys, so as far as mathematics is concerned a back door for anyone is a backdoor for everyone.
If we can make encryption that is nearly foolproof, why can't we make a backdoor that is nearly foolproof? Why is a Manhattan Project of backdoors not a possible solution?
Also can't the role of the good guy be split up among a group? Similar to the two man rule to prevent rogue agents from launching missiles, can't we have some sort of process that requires agreement among a majority of a few parties including the end user, the company who owns the software, law enforcement, and the (public) judicial system. If all it takes to break down the door to my home are a judge and law enforcement to agree, why can't we accept similar when it comes to data?
You seem to be under the impression that if you just work hard enough you can violate the fundamental constraints of reality.
I can't comment on the mathematics involved, but let's assume it's mathematically possible. You engineer this mythical, nearly foolproof backdoor: you can decrypt the text with either of two keys. (It's my understanding that such algorithms actually exist already.) Congratulations, you have achieved your goal. You have a working algorithm.
Now let's examine the results of actually using this algorithm:
You now have twice the opsec problem you had before. You have to transmit this second key to a government agency securely. You have to trust that government agency to securely store, use, and eventually dispose of this key.
And what is the number one threat to secure systems? Operational Security. In fact many security professionals will tell you that the hardest part of security isn't the math behind the encryption. It's the opsec. In one fell swoop you double the threat in the most fragile part of your security.
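(For what it's worth, the "decrypt with either of two keys" part really does exist already - it's ordinary hybrid encryption with the session key wrapped under two public keys. A sketch with the Python cryptography package, where the "escrow" key stands in for the government's copy; the sketch is the easy half, and everything above about opsec is the hard half:)

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    user_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Encrypt the message once under a random session key...
    session_key = AESGCM.generate_key(bit_length=128)
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, b"hello", None)

    # ...then wrap that session key under BOTH public keys.
    for_user = user_key.public_key().encrypt(session_key, oaep)
    for_escrow = escrow_key.public_key().encrypt(session_key, oaep)

    # Either private key alone recovers the plaintext - which is exactly
    # why the escrow key becomes a second, equally critical opsec problem.
    recovered = AESGCM(escrow_key.decrypt(for_escrow, oaep))
    assert recovered.decrypt(nonce, ciphertext, None) == b"hello"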
You are correct that the more keys exist, the harder it will be to secure all of them. However, the more keys that are required, the less valuable any one key becomes. Multiple keys mean there is no longer a single point of failure. If you need 3 keys to get the data, you can have an entire database of keys leak and the information is still safe. (Toy sketch of the splitting idea below.)
I would also love a more detailed explanation than the "it is impossible because math" that everyone seems to be giving.
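(Toy version of the split mentioned above, assuming an all-or-nothing 3-of-3 XOR scheme - real constructions like Shamir's secret sharing allow k-of-n, but the leak-resistance property is the same: any incomplete subset of shares is statistically pure noise.)

    import functools, os

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    def split_key(secret, n=3):
        # n-1 shares are pure randomness; the last one makes the XOR of
        # all n shares equal the secret. Fewer than n reveal nothing.
        shares = [os.urandom(len(secret)) for _ in range(n - 1)]
        shares.append(functools.reduce(xor_bytes, shares, secret))
        return shares

    def combine(shares):
        return functools.reduce(xor_bytes, shares)

    key = os.urandom(16)
    shares = split_key(key, 3)
    assert combine(shares) == key  # all three together recover the key
    # Any two shares alone are uniformly random - a leaked "database" of
    # incomplete shares tells an attacker nothing about the key.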
If you want a more detailed description, go to Wikipedia and read up on the difference between public and private key cryptography. What politicians are arguing for isn't just adding another private key to private key cryptosystems; a backdoor eliminates the biggest advantage of public key systems by adding a private key that could crack any of them. Once you add that, it's just a matter of time before someone cracks it.
Really, it's inevitable. Someone doesn't even need to crack it, you just need a single careless or corrupt government employee to compromise the whole system for everyone for all time. People are proposing adding a single point of failure to systems whose usefulness is currently defined by their lack of such a single point of failure. Put that in there and we may as well all go back to using DES for everything.
But you are simply pointing out problems with our current techniques, not why we can't come up with new and better approaches. That is the problem we should be working on. Politicians don't understand it, but that is why we need people from our community to work with them. Our response shouldn't be "no, you are an idiot, that is impossible, you are a fascist for even suggesting it." It should be "I know what you are looking to do, here is why it is not currently possible, let's see if we can work together on a solution."
Nothing anyone has posted here says why there can't be a multikey solution that allows access to data in a reliable way, without being susceptible to a single point of failure or abuse. That sounds like a very hard problem, but I'm not convinced it is an impossible problem.
Again, no one is arguing that it's impossible. Like I said in my original post, look at Dual_EC_DRBG: it was a cryptographic primitive pushed by the NSA that purposely had a backdoor. It was discovered by outside parties and now it's worthless. (Look at Juniper, used by the USG, for a very recent example of how this backdoor has failed.)
I'm not a cryptographer, but let's assume a multikey solution is 100% possible.
The very notion that you can trust the government with a global key to all encryption is the crux of the issue. How do you know that Donald Trump won't wake up tomorrow and sell that key to China? What do you do if Germany then demands that key? What if Congress decides that giving Israel the private key is important to stability in the Middle East? And what do you do if some nation-state sells this key to a blackhat organization? Welp, all of Google's encryption is now worthless because this "multikey" that was supposed to be for the USG ended up in the hands of a blackhat - and now we have another Fappening 3.0 on our hands.
Great, now the whole world has this "multikey", making it virtually worthless, because the entire world can decrypt it. If you as an end user cannot control who can and cannot decrypt your messages, it's worthless as an encryption scheme.
It's not a technical issue, and the solution isn't out of reach because we aren't smart enough. The fundamental problem is that you cannot trust any third party with such a multikey.
We aren't talking about a system that requires 3 keys to get the data though. In order to be useful to the government they need a system whereby they can decrypt without my key. That means conceptually they need a second key that works all by itself.
You could split the second key so that no single party has the whole key, which would mitigate the risk, but you still have the same problem: you've effectively doubled your opsec burden.
Additionally, if half the key is compromised, the work required to decrypt the text is still greatly reduced.
The answer is in the complexity of creating secret technology that is also foolproof.
To make a car analogy: we can make submarines that are waterproof, and we can make cars that look like cars and can be driven on the road. But making a car that is also a submarine is quite hard, and close to impossible if it also has to look like a normal car. It would be harder still if it needed perfect obscurity, so that you couldn't tell even after opening the hood or disassembling the car.
So the answer is "we should give up because it is hard"?
You seem to be of the belief that engineering is constrained not by reality, but by imagination. Are you a product manager, by chance?
No, we should give up because the goal is bad.
Well, more that we should give up because it's hard, and the benefits are almost non-existent.
It's not impossible, just as how requiring registration of all typewriters is not impossible. It's just that the mechanics of doing either are so invasive that we characterize governments attempting them as totalitarian, and they tend to end poorly.
I don't think people are any more pessimistic about back doors as they are about perpetual motion machines.
Let's pause to consider this. Math works; that's why even the NSA can't break encryption. I wouldn't want to mislead you and say that it's impossible - it's not - but it would take something like ten billion years to crack. Needless to say, there's a reason why they need a backdoor, and that's because math works.
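(Back-of-the-envelope, assuming a 128-bit key and a very generous trillion guesses per second - if anything, "ten billion years" undersells it:)

    SECONDS_PER_YEAR = 365.25 * 24 * 3600
    keyspace = 2 ** 128            # e.g. AES-128
    guesses_per_second = 1e12      # generous: a trillion guesses/sec
    years = keyspace / guesses_per_second / SECONDS_PER_YEAR
    print("%.1e years" % years)    # ~1.1e19 years to sweep the keyspace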
However, if a backdoor were put into all electronic products, the strength of the encryption is now meaningless as any would-be attacker (government or otherwise) would just target the backdoor instead of trying to break the encryption. Why wait ten billion years for a computer to brute force the message when you could just find a flaw in something designed by the government?
I suggest reading this piece on the recent Juniper vulnerability: http://blog.cryptographyengineering.com/2015/12/on-juniper-b...
"The problem with cryptographic backdoors isn't that they're the only way that an attacker can break into our cryptographic systems. It's merely that they're one of the best. They take care of the hard work, the laying of plumbing and electrical wiring, so attackers can simply walk in and change the drapes."
That's not correct.
It's perfectly possible and trivial to put in a backdoor that only works for people who have access to a specific private key.
Obviously if that private key gets stolen anyone can then access the backdoor, but that's true for anything, and you can mitigate it by storing the key in self-destructing immovable hardware with access limitations, as well as periodically changing the keypair (with signed updates).
The real problem is that there is no single "good guy" to entrust with that private key: in particular, humans are inherently not fully trustworthy or good, and both individual consumers and other governments have no interest in using or allowing backdoored products.
You might expect Amazon to take a stance to reassure AWS customers. Their AWS salespeople like to tout AWS's encryption capabilities and the fact that they weren't part of Snowden's leaks.
It's ironic that Tim Cook is defending encryption when Apple leaves backdoors in iMessage - http://www.digitaltrends.com/mobile/fbi-imessage-encryption/
That's not what that document says. iMessage has a design flaw (which is pretty obvious if you think about how it works) that allows it to theoretically be backdoored. In other words, they made a trade-off between usability and security, and (in my opinion, and clearly yours) fucked up. That's very different from saying that they deliberately built a backdoor into the system. And some of the things they've done (like explicitly noting when someone else has been added to your iCloud account and will be able to decrypt upcoming iMessage messages) go some way toward mitigating those issues, and make clear that the existing design is more incompetence than malice.
That said, all Apple would have to do to fix this is to allow advanced users to see all keys listed as authorized for their account. I'm getting increasingly annoyed Apple hasn't done that.
They voluntarily signed up for PRISM. That's all you need to know.
Thanks for the correction. I doubt they'd want to go the Signal route for millions of users. It wouldn't sit well with their NSA friendship.
Nowhere in the article does it say that Apple actually compromised the end to end nature of an iMessage conversation. All I see is this:
> Apple could collaborate with law enforcement to provide a false key, thereby intercepting a specific user’s messages, and the user would be none the wiser.
Key word is "could". Apple "could" also use its signing keys to install any kind of software on your phone to do whatever it wants. For example, to read your keychain and pull your private keys.
And due to the design of CALEA (it dates back to 1994) they can't force Apple to do this easier, which is one of the things that started the entire encryption backdoor debate in the first place.
> they can't force Apple to do this easier
*either
In the link I provided Nicholas Weaver explains how iMessage's encryption is compromised.
It's compromise-able, meaning Apple could decide to MITM a conversation going forward.
But for conversations that have already occurred – Apple does not have the private keys.
> …against the constant threat of criminal hackers and foreign governments.
Foreign govts? Rather, "against the constant threat of criminal governments and hackers."
The only reason why Apple is defending encryption is because they're afraid Android (which is open source and thus can be inspected/hardened) could take away iPhone sales from security minded folks.
I doubt security minded folks would be choosing Android, because of the lack of updates.
Apple has a very weak service portfolio (edit: for a company of their stature; compared to e.g. Yahoo they are doing great!). Their strength is in client UX. Of course they will defend encryption; it's in their financial interest to do so.
Weak Service Portfolio?
Off the top of my head:
SSO ID Service / Cloud Photo Storage / Cloud Document Sync / Cloud Backup / Email / Instant Messaging / Music Store / Music Streaming Service / Cloud Music Service / Movie/TV Store / App Store / Push Notifications / Payments / Video Conferencing / Game Centre / eBook Store / Shared Calendaring / Notes / Large File Sharing / Personal Assistant / Maps
Weak?
For many people, everything that doesn't have a flashy Javascript-frontend does not count as a service.
Yes, compared to e.g. Google and Microsoft.
It's in the financial interest of much of the technology industry to vociferously campaign for encryption, but they don't. If Google stopped using any encryption, don't you think their business would suffer?
It may be in Apple's financial interest to do so, but it's also the right thing to do. As you (partially) say, they care about the user experience. That experience includes taking steps to protect user information, and they've had a long track record of doing just that. They did this long before it probably had any noticeable effect on the bottom line.