Apple Platform Security, February 2021 (support.apple.com)

Still waiting for Apple to provide end-to-end encryption for iCloud Backup. Their documentation on this has always seemed intentionally vague.
https://support.apple.com/en-us/HT202303
End-to-end encrypted data:

- Apple Card transactions (requires iOS 12.4 or later)
- Home data
- Health data (requires iOS 12 or later)
- iCloud Keychain (includes all of your saved accounts and passwords)
- Maps Favorites, Collections and search history (requires iOS 13 or later)
- Memoji (requires iOS 12.1 or later)
- Payment information
- QuickType Keyboard learned vocabulary (requires iOS 11 or later)
- Safari History and iCloud Tabs (requires iOS 13 or later)
- Screen Time
- Siri information
- Wi-Fi passwords
- W1 and H1 Bluetooth keys (requires iOS 13 or later)
They won't do this. It's their workaround for giving law enforcement access to device data without backdooring the devices themselves.
They can claim that the device is secure and always encrypted, that all the messaging is encrypted, and that they don't collect user data. This is all true (I assume, but did not audit).
If you care about security, all you have to do is turn off iCloud backup, and everything is secure. If you don't care, well then you have a great feature.
They upload copies of messages, etc. to iCloud under keys Apple controls, so if law enforcement asks, they can still comply and hand over the juicy data. They don't need to backdoor the iPhone for the government, which was a major PR issue a few years ago.
> If you care about security, all you have to do is turn off iCloud backup, and everything is secure.
No, each conversation has at least two endpoints, and it's unlikely that the people you iMessage with have disabled iCloud Backup.
It's sort of like switching from gmail to avoid Google having access to your correspondence: they'll get it from the mailbox of the people still using gmail (so, everyone) that you correspond with.
OK, yeah, I should have been clearer here. I just meant that your data can't be snooped from the cloud, due to encryption, if backup is turned off.
Of course, this also assumes you trust Apple and the encryption implementations, blah blah blah, the typical security-depends-on-trusting-someone-somewhere warnings.
Very good point. In addition to iCloud Backup for messages, people could also have Messages in iCloud turned on.
Messages in iCloud is end-to-end encrypted, but note that if iCloud Backup is turned on, the backup includes a copy of the key protecting your messages.
It's intentionally vague because they want people to read that page and think "oh, it's all encrypted, it's safe", and not realize that they intentionally preserve this backdoor so that they can provide data to the FBI at any time, with or without a warrant, at the FBI's explicit request:
https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...
Apple provided user data on over 30,000 users in 2019 to the US federal government without a warrant or probable cause, per Apple's own transparency report (see FISA orders). All the feds have to do is order the data from Apple, and they get all of it, on anyone they like.
You're going to be waiting a long time; it's a design goal for Apple (and by extension the feds) to be able to read your every stored text, iMessage, and iMessage attachment out of your device backup without your consent/knowledge.
It's not really that different from the situation in China, where Apple provides the same sort of backdoors to the CCP in order to be able to sell devices there. (There, the CCP additionally requires that iCloud data be physically stored on state-owned and state-operated hardware, as I understand it.)
> "the US federal government without a warrant or probable cause, per Apple's own transparency report (see FISA orders)."
Do you not know that a FISA order is a court order?
https://en.wikipedia.org/wiki/United_States_Foreign_Intellig...
I said without a warrant or probable cause, which is accurate.
The FISA court is a bullshit, rubberstamp farce, to allow the state to pretend that they give a shit about the rule of law. The fact that they surveil everyone, inside and outside of the country, without warrants or probable cause, is evidence that they do not.
The FISA court issues orders without a requirement of probable cause, and its decisions and targets are classified. They are not warrants, and there is no due process. Calling it a "court" at all is a stretch.
Here's the FISA "court order" demanding 100% of all call records, every day, from Verizon, even local calls that start and end wholly within the USA:
https://epic.org/privacy/nsa/Section-215-Order-to-Verizon.pd...
This kind of overbroad stuff is precisely why we have the fourth amendment:
> The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
That's exactly the opposite of what the FISA "court" does.
EVERY US company is legally required to comply with a FISA warrant. Stop acting like Apple has a choice; they don't. And they are legally considered warrants. Did you read your link?
Apple has a choice about whether or not backups are end-to-end encrypted, using keys unknown to Apple.
Apple, at the request of the FBI, chose to preserve this surveillance backdoor by not deploying their end-to-end encryption system for iCloud Backup, thus making everyone's data available to Apple and potentially responsive to FISA orders. Seriously, read the link:
https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...
They absolutely had a choice.
If that backup data (which includes all your iMessages and attachments thereto) were end-to-end encrypted, which was Apple's original plan, then FISA orders, real warrants, and all the rest would be fruitless as Apple could not decrypt the data. They'd be turning over opaque encrypted data in response to FISA orders and real warrants.
You can use clouds like these with your own cryptography software: encrypt with something standard and never give the cloud provider your keys. As long as they allow you to specify the backup location (which I don't know if they do), this should be doable. If they don't allow this, that is a more severe issue.
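A minimal sketch of the idea, assuming the provider lets you upload an arbitrary blob and that `openssl` is installed (the file names and key handling here are placeholders, not any real backup tool's interface): encrypt client-side, so the provider only ever stores ciphertext.

```shell
# Client-side backup encryption sketch: the key file never leaves your
# machine, so the cloud provider only ever sees ciphertext.
umask 077
openssl rand -out backup.key 32                  # generate a local random key
echo "example data to back up" > backup.tar      # stand-in for a real archive
openssl enc -aes-256-cbc -pbkdf2 -iter 200000 -salt \
  -in backup.tar -out backup.tar.enc -pass file:backup.key
# backup.tar.enc is what gets uploaded. To restore:
openssl enc -d -aes-256-cbc -pbkdf2 -iter 200000 \
  -in backup.tar.enc -out restored.tar -pass file:backup.key
```

Of course, this only works if the provider exposes a hook for backing up a blob you prepared yourself; iCloud Backup offers no such hook.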
It’s well known that iCloud backups aren’t end-to-end encrypted; Apple holds the keys. That’s how they’re able to restore access in case you lose yours.
You're being downvoted, presumably because of the parallel discussion about the FBI. But I think this is most likely a combination of both:
1) The vast majority of Apple's users care more about getting their data back than they do about E2E encryption. Encrypting backups introduces failure modes that put more burden on the user (keeping an emergency key, etc.). Apple also cares deeply about things "just working", so this is a space that was always going to be incredibly difficult to balance.
2) The FBI thing is also true. Given that Apple's former plans for true E2E encryption somewhat gave way to what they have now, with little explanation, it's hard not to speculate that they backed away from the original initiative after some...involvement...from the feds.
Lots of interesting stuff this time. Short list that I’ll update as I go:
Some sort of “checked C” in iBoot: https://support.apple.com/guide/security/memory-safe-iboot-i...
Data encryption keys are sealed against your security policy, so if the policy changes (e.g. you disable SIP), previously protected data isn’t exposed: https://support.apple.com/guide/security/sealed-key-protecti...
Details on what the SRD is and how it works: https://support.apple.com/guide/security/apple-security-rese...
"For certain sensitive information, Apple uses end-to-end encryption" - there's a lot of important user generated data from Apple apps that is not end-to-end encrypted.
Frankly, I'd like to see them go even further and put in place a policy that all user-created-and-consumable content can only leave the device in end-to-end encrypted format and have those keys managed by my AppleID so not even Apple can decrypt.
They can introduce it at an API level without having to dictate storage providers. If a web version of an app needs to show my photos, they can let the end user's browser decrypt them. This works for private data, 1:1, and 1:many shared data.
I should have a choice with who hosts my encrypted data, who manages my keys/identity and who provides a service that uses that data. Let's get back to providing value through services and away from leaching value through hoarding data and controlling protocols.
Yes, this will force companies to change their business models if they rely on access to my data. Will it make for better software? Yes, hands down. More companies can compete, and we'll start to see more creative solutions.
I don't see anything about the "Unlock your iPhone with your Watch" feature that 14.5 is going to have[0]. I'd be interested in reading the in-depth security considerations they had. It's also currently a mystery whether this feature does a partial Face ID scan in addition to requiring an unlocked Watch.
0: https://www.macrumors.com/2021/02/01/iphone-apple-watch-unlo...
To enable the Unlock with Apple Watch feature, open the Settings app on your iPhone, then look under the “Face ID & Passcode” settings. Once you flip the toggle, your Apple Watch will be able to authenticate your iPhone as long as the following conditions are met:
- Face ID detects a mask
- Your Apple Watch is nearby
- Your Apple Watch is on your wrist
- Your Apple Watch is unlocked
- Your Apple Watch has a passcode enabled
https://9to5mac.com/2021/02/04/iphone-face-id-unlock-apple-w...
I think the unknown is whether it still uses any Face ID data as part of the unlock, like a partial scan of the top of your face.
In my anecdotal experience, it does not. The logic goes:
- Assert that a face is present.
- Is it wearing a mask?
  - If yes, proceed with Watch unlock (irrespective of the top of the face).
  - If no, attempt a Face ID scan.
It does not do a partial Face ID scan. I had a friend unlock it for me, and she's a different gender, 15 years younger, and Asian. If it does, it's completely ineffectual.
It's nice to see that the Apple Security Research Device (i.e. the iPhone with root access) hasn't been forgotten[0]. They even describe the additional protections they added to ensure an attacker couldn't give this device to someone who thought it was a regular iPhone (for example, the phone won't cold boot unless it's plugged into a charger, and when you plug it in, it shows the words "Security Research Device" before booting XNU in verbose mode).
0: https://support.apple.com/guide/security/apple-security-rese...
As an admin, I’m bummed that the new M1s remove a remote-management function I always loved.
From what a sales/dev person for a SaaS MDM app for macOS told me, the M1s do not have a lock-device feature. You can only wipe the device.
If an employee was terminated, we could remotely send a lock command with a numeric code. The only way to remove the lock is to get the code from us or have Apple reset it in person. For the in-person visit, you have to prove you’re the owner or have authorization from the company for Apple to unlock it.
My only option now is to wipe it. So now I have to find a cloud backup provider to back these devices up in case I need an important file from an employee who decides to go rogue.
I’d like to know how I’m still logged in to Twitch even after deleting and reinstalling the app. Or how Spotify offered to link to an Alexa device I was setting up after I installed the Alexa app.
Twitch must have saved your login details/tokens in the Keychain. Unfortunately, unless the app deletes these entries itself, iOS does not automatically remove them when the app is uninstalled. That is one way for apps to check whether a user is installing the app for the first time.
As for Alexa, it might be a totally different approach: discovering the devices on your network, maybe combined with Bluetooth beacons.
Fortunately, you need to install the full app to read this information, unlike a Facebook, Twitter, or Google Analytics library (framework), which can track you across other apps that embed the same library.
For the second one: with iOS 14, Apple shows a privacy prompt before an app can connect to other devices on your network, and you can simply deny it.
Detecting the Alexa app on the device used to be possible, but these days it doesn't go unnoticed by Apple without some coordination between Amazon and Spotify.
For the Twitch issue, it's likely that Twitch stored a secret in your Keychain that persists. If you have a Mac, you can enable iCloud Keychain on your devices to sync and explore the contents. Search for Twitch and delete the entry(ies).
Keychain items persist even after you delete the app, and probably app URLs?
There’s also the iCloud key-value store they can use.
Currently I have no non-Apple kexts running; I'm not sure this is a big problem any more outside of old legacy hardware or mostly esoteric usage.
The big one for me is ZFS. macOS has a fantastic ZFS port, and you're never going to run that in user space outside of some terribly crippled implementation.
Is there a separate Law enforcement guide?
Any news about the T2 chip ending up being a way to silently implant malware in all Intel-based Macs that have it? Refunds? Replacements? Anything? Bueller? https://arstechnica.com/information-technology/2020/10/apple...
I don't really know why anyone would take Apple's hardware security claims at face value after this.
edit: more links, though they're all pretty similar.
https://www.wired.com/story/apple-t2-chip-unfixable-flaw-jai...
https://appleinsider.com/articles/20/10/05/apples-mac-t2-chi...
https://www.zdnet.com/article/hackers-claim-they-can-now-jai...
https://www.theregister.com/2020/10/08/apple_t2_security_chi...
edit 2:
If this is wrong, I'd like to know the truth! Really! Was it a hoax? Is there a patch? What happened?
What is really egregious is that Apple still touts the T2's security benefits on their site and completely ignores the fact that it can be compromised. This does make it harder to take Apple's hardware security claims at face value, knowing what they know about the T2 versus what they put in their resources.
Apple silicon Macs are not vulnerable.
OK. But what about the Intel Macs they sold to millions of people, with the claim that they had hardware security that instead turned out to be a liability? Why should anyone believe the M1 Macs won't end up the same way? That seems pretty relevant to me. Do they take this seriously, or are they just posturing?
Consumer protection doesn't apply to broad statements like "secure". Just because a Kwikset advertises "For use on exterior doors where keyed entry and security is needed" doesn't mean you're entitled to a refund if someone picks it, even though Kwiksets are usually seen as low-security locks.
https://www.kwikset.com/products/detail/780-deadbolt-keyed-o...
Yes, that's why I'm saying the people need to pay attention to the track record of the organization and their past credibility.
> Refunds? Replacements? Anything? Bueller?
Their track record includes their responses to issues like these. If they ignore it, that's worse than trying to rectify it or address or mitigate its severity.
"Why should anyone believe..." Vulnerabilities are found constantly; that's a feature, not a bug. Apple has earned a decent amount of respect in this area. They have also earned healthy scrutiny of whatever security claims they make, due to some pretty high-profile bugs.
If you're talking about the iPhone, yes. But they lost a bunch with their desktop computers with this unaddressed, apparently very real problem. Unless there is some news and it turned out to be a hoax? But it seems real.
It does not reduce the security level to at or below that of a regular PC.
The bootrom bug requires DFU and physical access to be triggered, which is already game over on most systems. Apple also doesn’t solely rely on measured boot for the encryption keys (unlike default BitLocker configuration with TPM).
It was specifically a selling point of these computers. One of the headlining features. (As I said in my earlier comment, "with the claim that they had hardware security")