Part 1 of 2. Part of the ongoing Big Tech's War on Users series.
Correction: A reader notes that "Offload Unused Apps" is not strictly on by default — Apple's support documentation describes it as something you turn on. In practice the setting gets enabled for many users through setup prompts or low storage warnings, which is why it often feels like a default. The core point stands — the notification that surfaces when you're low on storage routes you to iCloud+ rather than surfacing this free option — but the framing was imprecise. Thanks for the catch.
Let me be upfront about something before we start.
I've been using iPhones since the 4S. I'm writing this on a Mac I bought specifically to develop an iOS app. I have Apple Music. I have iCloud storage. I'm not here to tell you to switch to Android — Android has its own war on users and Google is arguably worse on most of the axes we're about to discuss. Microsoft is in a category all its own right now, as I covered recently.
I'm here because a decade-plus of living in this ecosystem, and now building for the platform, gives you a particular view of the gap between what Apple says and what Apple does. And once you see the pattern, you can't really unsee it.
Apple isn't evil. They're sneaky. That's actually harder to call out, because every hostile move comes wrapped in a genuine user benefit, every money grab has a plausible technical justification, and every wall has a security rationale attached. The opt-out always exists. It's just always buried, always friction-heavy, always slightly broken by design.
This post is about the privacy story Apple tells — and what's actually behind it. Part 2 covers the walled garden, the lock-in, and the developer tax. But the privacy brand is where it starts, because it's the foundation everything else is built on.
None of what follows required Apple to lie. It's all disclosed — in documents designed to be clicked through without reading. That's the whole trick.
"Privacy. That's iPhone."
Apple has spent years positioning themselves as the privacy-respecting alternative to Google and Meta. And to their credit, they're genuinely better than the alternatives on several meaningful axes. App Tracking Transparency (ATT) meaningfully hurt Meta's cross-app tracking. On-device processing for Siri and some AI features is real. Third-party data brokers largely can't hoover your data the way they can on Android or the open web. These aren't nothing. Apple has taken real positions, absorbed real political heat, and built real technical architecture around privacy.
But here's what the campaign doesn't mention.
Apple's default search engine deal with Google is worth an estimated $20 billion a year. The entire "privacy vs. surveillance capitalism" brand pitch is financially underwritten by the world's largest surveillance capitalist. Every Safari search that doesn't go through a privacy-respecting alternative — and the default is Google, and most people never change defaults — routes through the company Apple's marketing implicitly positions itself against. The deal was central to the DOJ's antitrust case against Google: a federal judge found that the roughly $20 billion a year Google pays to be Apple's default search engine helped maintain an illegal monopoly. Apple's defense was essentially "users can change the default." Which is true. It's also exactly the kind of answer Apple mocks when Facebook says it about data collection.
Then there's Apple Search Ads — Apple's own advertising business, which is conveniently exempt from the ATT restrictions Apple imposed on competitors. When ATT launched, Meta lost billions in revenue as their cross-app tracking collapsed. In the same period, Apple's ad business grew significantly. They didn't end surveillance capitalism on iOS. They nationalized it. Your data isn't being sold to the highest bidder — it's being kept in house, where it funds Apple Search Ads, informs App Store curation, and generates transaction data from every purchase made on the platform. The difference between Apple and Google isn't that one profits from your data and one doesn't. It's that Apple cut out the middlemen and kept the margin for themselves.
That's genuinely better for you in some ways. It's just not what "Privacy. That's iPhone." implies.
The Google Deal Has a New Chapter
Apple Intelligence can route Siri queries it can't handle to ChatGPT — opt-in, with a toggle, and Apple says queries go through a privacy-preserving architecture that limits data retention. Google Gemini integration is announced and coming, presumably under similar terms. The toggle is real. The opt-in requirement is real. Apple's negotiated restrictions on how these services handle your data are a genuine attempt at mitigation. That's meaningfully better than quietly routing everything to OpenAI by default.
But "we've designed it so they're not supposed to vacuum your data" and "we've designed it so they can't" are two very different statements. Apple can negotiate contract terms. They can't audit OpenAI's servers. You definitely can't. The privacy guarantee on any third-party AI integration is only as strong as your ability to verify what's actually happening server-side — and in the age of the Facebook Pixel, "trust but can't verify" has a well-documented track record.
The Pixel is worth dwelling on for a moment. Meta embedded tracking code into external websites viewed through Meta's in-app browser — continuing to track users who had explicitly opted out of ATT, the very feature Apple marketed as giving users control over their data. A whistleblower complaint alleged this was deliberate, systematic, and known internally. Multiple class-action lawsuits followed. The apps remain on the App Store. Which raises the obvious question about what "privacy-preserving architecture" actually guarantees when the other party has both the incentive and the demonstrated willingness to find workarounds.
There are conversations I've had that I can't cite — with people who'd have reason to know — suggesting Meta's reach into iOS data goes considerably broader than any single permission. I'm not going to name sources or platforms. What I will do is point to what's actually documented, because the documented scope is already extensive enough to make the point without the uncitable parts.
A UK Employment Tribunal filing by former Meta product manager Samujjal Purkayastha alleges Meta used "deterministic matching" — linking identifiable user data across platforms — to reconstruct tracking profiles even after users opted out via ATT. The security researchers behind Mysk documented Meta's apps using push notifications to send device fingerprinting data — battery level, timezone, CPU — back to Meta's servers regardless of what tracking permissions you'd granted. Camera roll scanning without explicit consent has been separately alleged. A former WhatsApp head of security alleged approximately 1,500 engineers had unrestricted access to user data and could move or copy it without detection or audit trail.
The microphone specifically — deny permissions, talk about something you've never searched, watch the recommendations roll in — remains in the uncitable category in terms of hard evidence. My wife and I used to do this as a game with Instagram and YouTube. But if you're running deterministic matching, push notification fingerprinting, and camera roll access simultaneously, you arguably don't need the mic. You're already reassembling the tracking picture from enough other angles that the audio is redundant. "Confirmation bias" starts requiring more faith than the alternative at some point.
Apply the Beeper test. In December 2023, Apple identified, responded to, and technically neutralized a small startup called Beeper within days of their launch. Beeper's product let Android users send blue bubble iMessages. Apple cited security and ecosystem integrity, played Whac-a-Mole with every workaround Beeper found, and eventually even started banning users' Mac computers from iMessage if they were used to register Beeper. Beeper eventually shut down. The whole thing took weeks from launch to defeat.
Beeper's crime: trying to give users more choice in how they communicate, using their own Apple hardware.
Now: if Meta is bypassing iOS permissions systematically enough that employees admit it internally and users reproduce it as a party trick — ask yourself where the patch is. Where's the cease and desist. Where's the App Store removal. Apple acts fast and hard when something threatens Apple's ecosystem control. The silence here isn't absence of evidence. It's evidence of something else entirely. Maybe technical inability to detect it cleanly. Maybe genuine indifference. Or maybe 30% of Meta's iOS revenue forwarded to Cupertino every month buys a particular quality of attention.
iCloud: The Storage Bait
Apple spent years running "Shot on iPhone" campaigns, pushed ProRes video recording, and now sells you a phone capable of shooting footage at roughly 6GB per minute. A few minutes of a kid's recital, a vacation clip, a family gathering — and you've burned through the entire free iCloud tier on a single recording session. Heaven forbid you shoot a full event. The free tier hasn't moved in over a decade while the camera Apple is marketing to you has gotten dramatically more capable.
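The arithmetic here is worth making concrete. A rough sketch, using the ~6GB-per-minute ProRes figure cited above and the 5GB free iCloud tier — both hard-coded assumptions, not values pulled from anywhere live:

```python
# Back-of-the-envelope iCloud math. Both constants are assumptions taken
# from the text above: ~6 GB/min is the approximate 4K ProRes recording
# rate, and 5 GB is the iCloud free tier.

PRORES_GB_PER_MIN = 6.0   # approximate 4K ProRes footprint per minute
FREE_TIER_GB = 5.0        # free iCloud storage, unchanged for over a decade

def minutes_until_full(free_gb=FREE_TIER_GB, rate=PRORES_GB_PER_MIN):
    """Minutes of ProRes footage that exhaust the free iCloud tier."""
    return free_gb / rate

def overflow_gb(minutes, free_gb=FREE_TIER_GB, rate=PRORES_GB_PER_MIN):
    """How many GB past the free tier a recording of `minutes` runs."""
    return max(0.0, minutes * rate - free_gb)

print(f"{minutes_until_full():.1f} minutes of ProRes fills the free tier")
print(f"A 20-minute recital overflows it by {overflow_gb(20):.0f} GB")
```

The free tier doesn't even survive the first minute of footage; a single school recital overshoots it by more than twenty times its total capacity.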
The storage notifications that surface when you're running low route to iCloud+ first. There's an "Offload Unused Apps" option that costs you nothing — in Settings > Apps > App Store, enabled or not depending on whether you said yes to the setup prompt. It is not surfaced in the low storage notification itself. The paid option is one tap from the alert. The free option requires knowing it exists and going to find it. That's a UI decision, not an oversight.
Want to use Ente, Google Photos, or Dropbox instead? Technically permitted. Practically: iOS background suspension rules mean you need to keep the app open and the screen unlocked while it syncs, or iOS suspends it mid-backup. We're not talking a quick sync. For any meaningful amount of footage you're talking hours — hours of keeping the screen on and unlocked, which means either disabling screen timeout entirely or physically sitting with the device. Which means the battery is draining. Which means you plug it in. Which means if your phone gets warm — and it will, screen on, syncing, charging simultaneously — iOS may thermally throttle or temporarily pause functions to cool down, potentially interrupting the backup you're running. Meanwhile iCloud backs up silently, in the background, screen off, because Apple wrote the rules and also wrote an exception for their own service.
The "privacy protection" that hamstrings third-party backup solutions isn't protecting you from anything. It's protecting Apple's storage revenue from competition.
"We Keep Your Data Safe"
Let's go through what that actually means across different threat models, because the brand implies all of them simultaneously and delivers on them very unevenly.
From third-party advertisers and data brokers? Mostly true. This is the real, earned part of the brand. Apple has genuinely made it harder for the surveillance capitalism ecosystem to operate on their platform. ATT is real. On-device processing is real. This part of the claim holds up reasonably well.
From hackers and malware? Better than average — but the "virus-free" mythology was always mostly market share math. Malware writers go where the users are. When Mac market share was in the single digits through most of the 2000s, it simply wasn't worth writing malware for. That's not superior security architecture. That's obscurity. When Apple became valuable enough to target — state actors, sophisticated criminal operations, security researchers — the vulnerabilities were there.
Apple patches them. Quietly. With less disclosure than Microsoft or Google typically provide. Security update notes that say "addresses vulnerabilities" without detail. The Pegasus/NSO situation was revealing — zero-click iMessage exploits, patched quietly, with Apple only acknowledging the severity after Citizen Lab went public with their research. Apple didn't find it. Researchers did.
And now something more systemic. Security researchers at Google, iVerify, and Lookout recently documented broad-scale hacking campaigns using tools called Coruna and DarkSword, targeting iPhones not running the latest software. Some of those tools have since leaked publicly — available to anyone who wants to launch their own attacks against Apple users on older iOS versions. There are now effectively two security classes of iPhone: iPhone 17 running iOS 26 with Memory Integrity Enforcement, which is designed to stop the memory corruption bugs these tools rely on — and everyone else. Apple security expert Patrick Wardle put the "rare and sophisticated" framing plainly: calling iPhone attacks highly advanced is "a bit like calling tanks or missiles advanced. It's true, but it misses the point. That's simply the baseline capability at that level."
The full security guarantee now requires current hardware and current software. Which loops directly into the upgrade pressure Apple applies — that pressure is now genuinely warranted in a way it wasn't before. They've made the threat real while making the solution require a purchase.
From governments and law enforcement? Mostly no — and that's in the ToS. The meaningful exception is data encrypted solely on-device, where Apple publicly maintains they hold no key and therefore have nothing to hand over. They've gone to court over this. The San Bernardino case in 2016 — Apple refusing to build a backdoor for the FBI — was real resistance that cost them real political capital. That commitment deserves genuine credit.
The catch is that "on-device encrypted" covers a narrower slice of your data than most people assume. Standard iCloud backups — unless you've specifically enabled Advanced Data Protection, opted in deliberately, and aren't in the UK — are not in that category. Your messages, photos, and app data sitting in a standard iCloud backup are accessible under a warrant. Advanced Data Protection, the feature that would extend encryption to those backups, was just removed for UK users when the government asked. Apple cited legal requirements. The same company that cited global privacy principles to India.
Hide My Email has now been used by federal agents in at least two documented cases — one involving a threatening message, one an identity fraud investigation where an HSI agent noted the alleged fraudster had created several anonymized addresses across multiple Apple accounts. In both cases Apple complied with warrants, exactly as their terms of service said they would. This is correct behavior. It's also precisely the opposite of what "hide my email" implies to a normal person who isn't reading the fine print. The feature protects you from spam and corporate data harvesting. It was never designed to be anonymous from Apple, because Apple needs to know the forwarding address to deliver your mail.
And then there's RCS. Apple promised end-to-end encrypted messaging between iPhones and Android devices over a year ago. It appeared in the iOS 26.4 beta, was stripped before the public release shipped, and is now back in the 26.5 beta. A privacy feature, promised publicly, that keeps getting pulled before it reaches users. Meanwhile Google has had encrypted RCS between Android devices for years. The protocol exists. The standard is ready. The finish line keeps moving.
From Apple themselves? Not entirely. Apple knows what you buy, what apps you use, how you use them, your location patterns, and enough about your behavior to run a multi-billion dollar ad targeting business. The data isn't being sold. It's being kept and used. That's different from Google's model in degree, not in kind.
The Pattern
The privacy protections are real. The privacy brand implies more comprehensive protection than any of them actually deliver — and stops exactly where Apple's business interests begin. That's not a lie. It's a carefully curated truth that benefits enormously from most people not reading past it.
"We keep your data safe" is the load-bearing wall of Apple's brand. Safe from advertisers — mostly. Safe from hackers — better than average but hardware and software dependent, and patched more quietly than the mythology implies. Safe from governments — no, and that's in the ToS. Safe from Apple — not entirely, and that funds a multi-billion dollar ad business.
None of it required Apple to lie. The terms of service disclose cooperation with law enforcement. The privacy nutrition labels are technically accurate. The iCloud limitations are documented. Everything is above board, disclosed, and buried in documents designed to be agreed to without reading — somewhere past the part where you stopped reading, let's call it page forty-seven. Apple's genius isn't deception. It's knowing exactly how wide the gap is between what people read and what people remember, and building a premium brand in that gap.
Privacy. That's iPhone.
Just read the asterisk first.
Part 2 — The Walled Garden Is a Business Model — covers the upgrade pressure machine, the App Store tollbooth, the developer tax, and the lock-in architecture Apple has spent twenty years building. Coming soon.
Thoughts on Part 1? Find me on Mastodon at @ppb1701@ppb.social.