WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan

theverge.com

489 points by timurtime 4 years ago · 371 comments

zpeti 4 years ago

I really hope there is enough momentum to stop this. It's definitely the last straw for me in terms of apple products. I haven't bought a new iphone for 4 years, I'm slowly trying to switch to a lightphone (non smartphone). My mac needs a change but I will probably switch to linux.

It's absolutely ridiculous what Apple has become: the exact opposite of what they used to represent, back when I loved them. God rest Steve Jobs' soul; his 1984 ad is exactly what Apple is now. Screwing devs on the App Store, strongarming them into compliance, cooperating with China, worse and worse UX on phones, and now this...

Really disappointing.

  • hughrr 4 years ago

    It's too late either way. The fact that it even got this far through implementation says the vendor is not on my side. Not only that, once a capability is revealed, it can be required by governments and manipulated under warrant. Particularly in some regimes, mine included, the vendor can be compelled to do something and say nothing about it. My comment doesn't even cover how monumentally flawed the entire thing is.

    So I’ve been on the verge of doing this for years so this was the final push and motivation.

    Yesterday I sold my iPad and Apple Watch. They are being shipped today. I’m just waiting on refunds on my AppleCare for my MacBook and iPhone now and I will sell them.

    Yesterday I had a Nokia 215 arrive as a replacement phone. Also a monster pile of PC bits arrived which have been assembled into a Ubuntu running desktop. I am spending today migrating my data over carefully. When the MacBook sells I will buy a Nikon DSLR.

    At the end of this I lose perhaps 20% convenience for an immeasurable privacy gain, lose a big chunk of the distractions from my life and end up with some cash left over which I will use to go on holiday.

    The only thing I will miss is Apple Music but it’ll give me a chance to curate my music collection without distraction again.

    • f3d46600-b66e 4 years ago

      How does a Nokia 215 solve the problem? All messages, including any images, will now go through the carrier's network (and the recipient's carrier) and be subject to interception, analysis, and sale by those two entities.

      I'm not sure how that is better.

      Wouldn't AOSP/Lineage with Signal installed be better?

      • ursugardaddy 4 years ago

        I don't know if the 215 supports tethering, but there are feature phones that do. Paired with a laptop, or even something like a Steam Deck (running Linux), that's the way to go if you really need mobile internet.

        You get privacy and freedom from the smartphone-service-based-everything-forever lifestyle. It's nice.

      • hughrr 4 years ago

        I only need to make calls on it and send SMS. I never assume those are private.

      • strzibny 4 years ago

        Carriers, at least in the Czech Republic, are very restricted in what they can do. Regular phone calls and SMS messages are much more private than anything else out there.

        • ryan_lane 4 years ago

          It's not the carriers you need to worry about. Regular calls and SMS are absolutely not private.

    • vel0city 4 years ago

      Before buying the DSLR, I'd recommend at least checking out the 1" sensor size market. These cameras still take excellent quality photos in a wide range of lighting conditions but are so much more compact than having an interchangeable lens system.

      I sold my DSLR a couple of years after getting my G9x Mark II. The DSLR was always gathering dust compared to the G9x which with a small belt case could easily be taken anywhere.

      That said, these cameras are definitely not as flexible as a full SLR, nor will you get the same performance. It's a large sensor compared to a phone camera or other point-and-shoots, but it's still nothing compared to APS-C.

      • dylan604 4 years ago

        There are plenty of Micro 4/3 sensors that have interchangeable lenses too though, so you don't have to be locked into a compact camera with built-in lenses. You can have the best of both worlds.

        • noahjk 4 years ago

          I've heard good things about Panasonic's LUMIX G7, and it looks to be at a very reasonable price point. I personally use a Fuji X-T3, which is also more reasonably priced than full-frame, sitting between that and Micro 4/3 price-wise. The main differentiators for a smaller sensor are low-light performance and dynamic range, but 99% of the time these won't matter. Megapixels shouldn't really even be considered: any of these cameras can take a photo to put on a billboard, since DPI scales in relation to optimal viewing distance.

        • vel0city 4 years ago

          Agreed, those are also an excellent middle ground from a full-sized DSLR to a compact shooter but still give most of the performance and about all the flexibility of a DSLR.

    • blacktriangle 4 years ago

      Even Apple Music has gone to shit. iTunes keeps managing to get worse and worse, and they're pushing hard to pretend you don't have your own collection to get you to buy into the subscription model.

      At this point I'm going back to owning a separate dedicated music device that is totally divorced from the computer. There's just something intentional about walking over to a CD player or record player, picking out an album, and putting it on, compared to mindlessly browsing Spotify playlists.

      • Cwizard 4 years ago

        I recently started looking into buying my music again since 90% of my Apple Music listen time goes to the same generic piano playlist. I figure I could have bought the playlist 10 times over by now... My question is where do you buy music these days? I would like it digitally but I would like to own it (no DRM stuff). Any suggestions on a good store to buy music online? Last time I bought music I did it on itunes...

        • ryantgtg 4 years ago

          Bandcamp! No DRM, and also unlimited streaming of your purchases.

          If most of your music is on major labels, then bandcamp may not be great for you.

          I love it. I think I have around 300 purchases. It’s also great for discovering music. Much better than spotify in that regard.

      • altantiprocrast 4 years ago

        https://www.navidrome.org/ might help

        (I'm unaffiliated, just sharing)

        • ryantgtg 4 years ago

          Not sure if the parent (grandparent?) would find this helpful, but here's the setup I use, which provides a cool amount of options.

          I use a Yamaha receiver (R-N803) that has their MusicCast software on it. And I use these various inputs:

          - CD

          - Phono

          - USB. A teensy little USB stick loaded with music I’ve collected over the last 25 years - however, I’ve cleaned it up so it’s not filled with random things that make my wife go “what is all this stuff! I just want to see MY music!”

          - I have a Navidrome server running on a Pi, with a hard drive connected to it. It basically contains the USB’s contents plus all the other random stuff. This is played via the Bluetooth input and the Play:Sub app on my phone.

          - Likewise I play the bandcamp app via Bluetooth through the receiver, and spotify as well (I mostly use spotify for listening to the back catalogs of established artists).

          - Net Radio. Access thousands of radio stations, worldwide, that stream their service. It’s pretty cool!

          There’s more. But, point is the setup is cool and diverse and it’s pretty easy to use.

    • zpeti 4 years ago

      Great to hear someone else is doing the same. For me the biggest issue is that I have young kids, and the convenience of quick photo taking on a smartphone is hard to give up. Might look into the Pinephone or something similar.

      • hutzlibu 4 years ago

        "Might look into pinephone or something similar."

        Pinephone is my hope, too, but do not expect anything stable soon. That will take some time, probably years. It does not happen on its own, though, they need support now to make it a real alternative and not just a tinker toy.

        • Fnoord 4 years ago

          There's like three major options if you want to run a FOSS smartphone, and I'll sum them up.

          Pinephone: Cheap. The device isn't very powerful. With people coming from an Apple device, that's a problem.

          Fairphone 3: Fair. The hardware isn't very powerful either, and the device is more expensive, but the product is better for the people who assembled it and the environment.

          Librem 5: Open. Even more expensive than Fairphone, but the hardware features killswitches, and there's no binary blobs. Lacks the fair advantages Fairphone has.

          Each of these can run a myriad of FOSS OSes: a deGoogled Android (AOSP-based fork) such as /e/, mobile versions of Debian/Arch/Ubuntu, or the SFOS (Sailfish) community version (without the Android emulation layer!). Each hardware and software option has its pros and cons. I use a Fairphone 3 with stock firmware, with a Pinephone as a back-up phone (and I have to use a Samsung flagship device for work). Previously I used a Fairphone 2 with LineageOS + microG (kind of a predecessor of /e/ before that took off).

          PS: On the gaming side, I'm getting a Steam Deck. It's a great bang for the buck compared to the Aya Neo, Nintendo Switch, or gaming smartphones. No, it isn't open hardware, but the device runs Linux and you get root on it, plus all the reviews (including Linus Tech Tips) are positive.

          • hutzlibu 4 years ago

            "Fairphone"

            It is nice that the Fairphone tries to be nice and fair, but I would rather see a focus on an actual open phone under my control, and they do not deliver this (not to blame them; the issue is hard). Fixing the global exploitative economy is a different issue, and trying to solve everything at once usually does not work.

            "Librem 5"

            How useful is a microphone killswitch if there is no killswitch for the speakers, which can be used as a microphone too? And it would be news to me that it is now completely free of binary blobs; their claims have always felt a little dishonest to me. I recently read an interview with the former CTO that confirms it:

            https://www.phoronix.com/scan.php?page=news_item&px=Zlatan-T...

            I would go with the Pinephone. For now I have a stupid samsung phone with facebook app preinstalled and unremovable, but have not yet found the time to try lineage with it.

            • dahfizz 4 years ago

              > if there is no killswitch for the speakers, that can be used as a microphone, too?

              I know that physically / electronically, a speaker is a microphone, but is there any way for someone to actually record sound through the speakers on the librem? There is a lot more to a microphone than just the diaphragm...

              • hutzlibu 4 years ago

                Probably not easy, but likely doable when someone thinks it is worth the effort. When the goal is security because you feel (rightfully or not) targeted by state-level intelligence, a false sense of security can be dangerous:

                https://www.hackread.com/hackers-steal-data-air-gapped-pcs-m...

                • detaro 4 years ago

                  The work in that link rests on some pretty far-reaching requirements to make that claim, and it does not generalize to random phone hardware.

                  • hutzlibu 4 years ago

                    It was just a random link. There are plenty of other articles in that area that I have read, because I do care about privacy. I do not have them at hand, but the point stands: it is possible. So if there are speakers connected, I assume someone could listen.

                    Maybe it is not at all likely (in my case), but when we talk about real security - and for some people this is indeed a question of life and death - I don't want to promote half solutions.

                    edit: to clarify, yes, a microphone killswitch is probably useful in that it eliminates the most common attack vectors for silently listening to people, but it is potentially harmful if people rely on it 100% - and then get listened to anyway and sent to a gulag because the local KGB did in fact take the effort to implement such spyware.

            • Fnoord 4 years ago

              Pinephone is certainly a good bang for the buck, but the hardware is nothing special, and the killswitches are DIP switches (better than nothing, like Fairphone's current iterations). If you want a cheap solution, this is the one to opt for. It's an especially good option for people who live in (relatively) poorer countries/regions than the US or North/West Europe.

              A lot of people in our world simply cannot afford a Fairphone. I can, and I applaud the project, so I went for it. I also applaud the other projects, and remember that perfect is the enemy of good. That a Librem 5 isn't going to be perfect in terms of security is OK. It's their first iteration (and they went through various revisions of it, which led to considerable delays).

              There are also some keyboard smartphones such as the Planet Cosmo Communicator and Planet Astro Slide, and others such as the F(x)tec (which is a good successor to the Nokia N900). These are also niche and specific, with their hardware keyboards (which include custom layouts such as Dvorak), but they can run alternative OSes by default. I believe that, for me, a hardware-keyboard smartphone is going to be the ultimate usability dream, if the keys are large enough. I previously owned a Nokia E71 and a Nokia N900, before touchscreen typing became the status quo.

              > [...] I recently read a interview by the former CTO that confirms it [...]

              I also backed the Astro Slide (and own a Cosmo Communicator), and am disappointed with their hardware downgrade from the Dimensity 1000 to the 800. I hate it when promises are not kept. But it happens. As mentioned, I owned a Nokia N900 previously, but I wasn't fond of the keyboard, so I hope the Astro Slide's is going to be better. And, given it's like the Cosmo Communicator (which I am used to), I am confident it will be. The big disadvantage of Planet devices is their slow updates, and being reliant on MediaTek (MTK), which means early EOL.

              With regards to hardware keyboards, I read the Pinephone is planning one as well, which is great news because it's otherwise such an affordable smartphone. Pine64 also sells a lot of other cool FOSS stuff such as the Pinecil and the Pine Camera.

            • fsflover 4 years ago

              > And it would be news to me, that it is now completely free of binary blobs and their claims always felt a little bit dishonest to me.

              It's the only phone running FSF-endorsed OS without binary blobs, PureOS. It's recommended by the FSF [0]. More details here [1].

              [0] https://www.fsf.org/givingguide/v11/

              [1] https://source.puri.sm/Librem5/community-wiki/-/wikis/Freque...

              • detaro 4 years ago

                Note that the FSF takes the position that binary blobs that are in non-writable memory and executed by secondary processors are part of the hardware and thus not relevant for judging the openness under RYF criteria. Which is how the Librem5 achieves that status, by deliberately picking components that do not use firmware upload from the host CPU but rather ship with the firmware in non-writable memory, and by adding read-only memory that is only used in the pre-boot environment. The OS is blob-free because the thing is engineered to make the blobs inaccessible to the OS. Which is a valid choice, since entirely blob-free would be impossible to make and ship, but I also see why people disagree that "blob-free" is a good description for the device.

            • marcosdumay 4 years ago

              > if there is no killswitch for the speakers, that can be used as a microphone, too?

              Speakers can be wired to do that, but this is not something you can change with software.

            • dylan604 4 years ago

              >phone with facebook app preinstalled and unremovable

              I've never used a phone like this, but are you also forced to provide FB credentials during initial setup? If not, then is the FB app just being installed a privacy threat if it is never used? Is it still accessing information on the phone without being tied directly to you?

              • npongratz 4 years ago

                I have had a phone that had a Facebook application preinstalled, and prevented removal of said application. Setting up the phone did not require FB credentials.

                I would be worried that even without logging into Facebook or giving it my credentials, my FB-ized phone would help FB's efforts in creating and maintaining shadow profiles. As far as I'm concerned, since the FB app is tied into the OS so tightly that it cannot be removed, it poisons the phone and makes it an adversarial surveillance device.

                This type of poison, of course, is not limited to Facebook.

                • jraph 4 years ago

                  Deactivating the app is usually possible and is equivalent to uninstalling it on Android. It still takes some room in the system partition but you cannot use it anyway. Unless you root your phone, in which case you can also remove the app entirely.

                  These preinstalled apps are still crap though, I'd rather have a smaller system partition and a bigger user data partition, should I own such a phone.
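                  For reference, a sketch of how deactivating such an app is usually done over adb, without root. The package name com.facebook.katana is the Facebook app's usual identifier, but that's an assumption; verify it on your own device first:

```shell
# Confirm the exact package name before touching anything.
adb shell pm list packages | grep -i facebook

# Disable the app for the current user (no root required).
# It still occupies space on the system partition, but can no longer run.
adb shell pm disable-user --user 0 com.facebook.katana

# Undo later if needed:
adb shell pm enable com.facebook.katana
```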

                  • npongratz 4 years ago

                    This phone was paid for and controlled by the company for whom I worked at the time, so I didn't want to cross the IT overlords and their policies by rooting the phone.

                    Good to know about deactivating apps on Android, though; thank you! I do not remember if I had that option.

              • hutzlibu 4 years ago

                " Is it still accessing information on the phone without being tied directly to you? "

                I don't know. I do not have FB. But the fact that I still have to have the app no matter what illustrates my point: I really do not own or control this phone. But it works reliably, was affordable - AND I can remove the battery.

                And I do that regularly, because then I can be sure it is really turned off.

                Otherwise I kind of assume everything I do with it or around it, is potentially recorded.

                So yes, I really, really want a phone that I can trust, even if it is turned on.

                • dylan604 4 years ago

                  I can assure you I'm not asking as an advocate for accepting FB to be pre-installed. I'm asking to know truly how vile it really is. At this point, I assume FB knows enough about everyone to be able to ID them without confirmation via logged in FB app.

          • m4rtink 4 years ago

            A slight correction about Sailfish OS - the official/commercial Android emulation layer is only available for officially supported Sony Xperia devices, as can be seen in the table on Jolla Shop:

            https://shop.jolla.com/

            I do have an Xperia X & Xperia 10 II and can confirm the Android emulation layer works very well.

            You can run Sailfish OS on many other devices thanks to community porting work, but without support for the Jolla-provided Android emulation layer. The devices will still run all the many native Sailfish OS apps + ARM-compiled flatpaks just fine, & there is community work on getting Anbox to run to provide Android emulation on the community ports as well. :)

          • shadowoflight 4 years ago

            If one is okay with less-open hardware, the F(x)tec Pro 1 X seems to be a good higher-end smartphone that has a decent camera sensor, slide-out keyboard, AMOLED display, and can be ordered with either Ubuntu Touch or Lineage preinstalled.

            It's at the top of my if-I-ever-jump-ship-from-Apple list of phones.

            • Fnoord 4 years ago

              Yeah, I'm going for the Astro Slide instead. I have used a keyboard similar to the F(x)tec's in the past (the Pro 1 X is just a rebrand, btw) with the Nokia N900 and Nokia E71. Typing is simply not comfortable on such a keyboard; the keys are too small (and there's always a learning curve with the layout, as it is never 100% standard qwerty). If the Astro weren't available, I'd be better off touch typing with a second screen as a keyboard. Though, do see the Pinephone hardware keyboard link posted elsewhere in this thread. It seems to be akin to the Astro Slide, at least in spirit.

              • shadowoflight 4 years ago

                > Astro Slide

                Oh wow, that's a very interesting option - I like the way that hinge works, but I fear that a mechanical part that complex in a smartphone is likely to become worn out quickly (source: I had both a Moto RAZR and a T-Mobile Sidekick back in the day and both would barely stay closed by the time I upgraded).

                > the Pro 1 X is just a rebrand

                This is very interesting to me, would you mind filling me in on what it's a rebrand of?

            • hutzlibu 4 years ago

              That sounds quite good too. I like pragmatic approaches.

        • squarefoot 4 years ago

          They have other interesting products on their site. I'm still deciding between getting this Pinephone or waiting for a beefier one with more battery life (1), but I can wait since I'm not into smartphones. In the meantime I supported them by purchasing their soldering iron, plus tips and other add-ons, which works surprisingly well for the price, and I'll probably buy one of their SBCs. So if anyone wants to support them, there are other ways to do it.

          (1) To Pinephone designers: I would absolutely love a 2cm thick Pinephone if that allowed some more speed and serious battery life. I'm serious about that; my current phone is a Nokia 8110 4G (the new "banana") which is 1.5 cm thick, and although the OS is a joke and I use it only for calls and as 4G access point for my laptop, wrt usability it's the best thing I've bought in years.

          • fsflover 4 years ago

            > I would absolutely love a 2cm thick Pinephone if that allowed some more speed and serious battery life.

            This is exactly what the keyboard mod is for. The keyboard has a 6,000 mAh battery (although it does not make the Pinephone run faster).

      • hughrr 4 years ago

        Fortunately mine are teenagers now so that bit of my life is over mostly. I only get photos of them lurking, hiding and giving me the middle finger and that’s about it :)

        • asddubs 4 years ago

          well make sure to take plenty of pictures, you will cherish the memories of them lurking, hiding and giving you the middle finger for the rest of your life

    • kowlo 4 years ago

      You may struggle to get the same functionality from a Nikon DSLR, but at least it has a better camera.

      Are you switching from a laptop to a desktop machine? Do you have no use for the portability anymore?

      • hughrr 4 years ago

        I still have a company issued laptop (Windows based). I don’t need a laptop for personal stuff.

        • beardedwizard 4 years ago

          If you value privacy you won't use a company laptop for personal business.

          • hughrr 4 years ago

            Yes the two worlds are and always will be kept separate. I could have worded that better before.

            I have a desktop (not laptop) for my own stuff and a laptop for company stuff and a dock and KVM setup for it.

            I am not using the company laptop for personal stuff.

          • thunfischbrot 4 years ago

            Doesn't that really depend on how you use it? You might have a separate user account, or even a whole separate OS installed like I do, encrypted, and use only online services.

    • amelius 4 years ago

      > It’s too late either way. The fact that it even got this far through implementation says the vendor is not on my side.

      Can Apple force people to install this even on devices they already sold?

      • tchalla 4 years ago

        If you don't use iCloud Photo Library or sync photos to the iCloud, none of this will apply to you.

        https://daringfireball.net/2021/08/apple_child_safety_initia...

        • mandeepj 4 years ago

          > If you don't use iCloud Photo Library or sync photos to the iCloud, none of this will apply to you.

          Not true. They will be scanning your messages also for inappropriate content - https://www.apple.com/child-safety/

          • kemayo 4 years ago

            That's also opt-in, though it's opted in by the parent of the child you're communicating with.

            If you're talking to adults, or children whose parents don't want to use the service, you're not getting your photos scanned.

            • MomoXenosaga 4 years ago

              Interesting - time for American teens to switch to WhatsApp, Telegram or Signal. Kids will have a sex life; life finds a way.

              Has Apple thought this through?

              • ratww 4 years ago

                Interesting observation. This might be more damaging to Apple than anything else.

                Without iMessage/Facetime, a large part of the peer pressure teens get for having an iPhone is gone. Now they might start asking for a Galaxy or something like that.

                • WrtCdEvrydy 4 years ago

                  "The green bubbles are people who can sext, blue bubbles are spies for Apple"

                • thw0rted 4 years ago

                  Tweens, yes. Teens can be opted into the feature, but it only offers the (teen) user a warning before viewing the image; it never notifies the parent.

              • tpush 4 years ago

                It only applies to children up to age 12 (inclusive).

        • read_if_gay_ 4 years ago

          If this technology is that easily circumvented then why is there an expectation that it will be effective at all?

          • unstatusthequo 4 years ago

            The CSAM scanning happens on device, right? At least so we’ve heard.

            My sense is Apple is trying to keep CSAM off their servers. Scanning phones before it gets there was their solution to what I assume is a government demand/ultimatum. “Do this or we repatriate your foreign entity taxes” or some other shit.

            I too feel that Apple just caved and eroded trust that took decades to build up. The only way this gets sorted is if the “screeching minority” continues to screech and brings others in. Notify state attorneys general, the FTC, etc. Will that do anything? Who knows? My bet is that the DOJ is behind all of this.

            Hopefully the plaintiffs’ bar, which is already preparing class action lawsuits, will find a way to get documents in discovery that allude to government coercion. But then again I’m sure there would be a clever way those are not produced, under some “national security” reason.

            All we can do is try, and keep the pressure on.

            • FabHK 4 years ago

              > My sense is Apple is trying to keep CSAM off their servers

              It could (maybe) also be a prelude to enabling E2E encryption for everything in iCloud.

              • jeromegv 4 years ago

                That's my theory as well. I see all those people selling their iPhones who will instead use Android, upload all their photos to Google Photos, and Google will happily share all those same photos. Yes, yes, yes, I know, one is done on device, the other is done in the cloud; for me those are pretty much two sides of the same coin.

            • rahoulb 4 years ago

              As I understand it (and I've not spent too long on this, just picking at various articles) - there are two separate things at play here.

              Firstly - CSAM scanning is done via fingerprinting. The image is fingerprinted on device, and when it is uploaded to iCloud that fingerprint is compared with the "dodgy images" fingerprints; an alert is raised if a threshold of matches is reached (what's the threshold, and with whom?).

              Secondly - there is on-device AI image recognition - when you send an image to someone else (via iMessage or the share sheet) it is checked for nudity and if the iCloud account in question is registered to a 13-year old or younger, their parents are alerted.

              In both cases the fingerprinting/scanning is on-device and is triggered by the images leaving the device.
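              The threshold scheme described above can be sketched in a few lines. This is a toy model only: a plain SHA-256 stands in for Apple's perceptual NeuralHash, and the fingerprint database and threshold here are invented.

```python
import hashlib

# Hypothetical database of fingerprints of known images.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}
THRESHOLD = 2  # no alert until this many matches accumulate


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an image."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_alert(uploaded_images: list[bytes]) -> bool:
    """Count fingerprint matches across uploads; alert only at the threshold."""
    matches = sum(
        1 for img in uploaded_images
        if fingerprint(img) in KNOWN_FINGERPRINTS
    )
    return matches >= THRESHOLD
```

              A single match never triggers an alert on its own. The real system also uses threshold secret sharing so that partial results stay hidden until the threshold is crossed, which this sketch does not model.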

              • monocularvision 4 years ago

                > the image is fingerprinted on device and when uploaded to iCloud that fingerprint is compared with the "dodgy images" fingerprints and an alert raised if a threshold of matches is reached (what's the threshold and with whom?)

                Nope. The comparison is done on the device and the threshold is set there as well.

                I am not sure how alarmed I am yet at this whole affair, but I do know that maybe 50% of posts I read about this have glaringly incorrect information, which definitely dampens my alarm.

                • kemayo 4 years ago

                  > Nope. The comparison is done on the device and the threshold is set there as well.

                  As I understand it the fingerprinting and comparison is done on device, but it only happens as part of the upload-to-iCloud process. So the grandparent's phrasing isn't unreasonable.

          • dwighttk 4 years ago

            It isn’t being circumvented. It is intended to work on photos that are uploaded to iCloud. If you don’t use that (via turning it off or via selling everything Apple and switching to Linux) then you aren’t using it.

            • read_if_gay_ 4 years ago

              No. The intent is not that it works on iCloud. The intent is catching pedos.

              This is obviously not effective given that you can get around it that easily if you want to. Coincidentally though, it will be totally effective at surveilling the 99.999% that are normal users and won’t go out of their way to disable iCloud. The whole CP thing is such an obvious farce.

          • tinus_hn 4 years ago

            The theory would be that many people are stupid. Of course we only know about criminals who get caught and that tends to be because they made a mistake, so it looks like most criminals are stupid.

        • JeremyNT 4 years ago

          So, you trust Apple to install this spyware and only use it in the way they currently describe. Great!

          But what happens the second they get an order from $GOVERNMENT that tells them to use the spyware to also look at other documents on the device?

          I think it's pretty obvious what Apple will say. They'll say "OK." They have no plausible deniability to tell $GOVERNMENT to go pound sand - they have demonstrated the capability already! Telling the spyware to scan different files is a trivial change from a technical perspective.

          • caymanjim 4 years ago

            They could have done what you describe at any time in history. This doesn't change anything in that regard. Either you trust Apple enough to use their products or you don't.

            • adventured 4 years ago

              > They could have done what you describe at any time in history.

              That doesn't make sense. The issue is that Apple is very publicly signaling they are changing their approach to privacy now. Companies change approaches to any number of things all the time, they're not static entities. As such you have to evaluate their nature as a consumer on an ongoing basis, not one time forever. It's true of food, it's true of consumer electronics, it's true of general product or service quality, it's true of privacy issues or censorship, and so on. Apple even knew the consequences ahead of time - per the insider notes - and don't care, they charged ahead regardless.

            • m4rtink 4 years ago

              They could have done that any time because their code is proprietary, their hardware closed & won't boot code not signed by apple + they gate keep all third party apps from their walled garden.

              It would be much harder for them to pull off if the system were open, with the user actually in control.

            • amanaplanacanal 4 years ago

              So I guess the answer is “don’t”.

      • hughrr 4 years ago

        No but they just refuse to service your updates without this enabled.

        • amelius 4 years ago

          Even security updates?

          • dspillett 4 years ago

            Probably, at least until someone successfully gets a court to say otherwise, by which time it'll be irrelevant because everyone will either have installed & enabled it to get the updates or (less likely given how entrenched many iDevice users have become) moved to other products.

            And after the case to stop them refusing security updates for those without it installed+enabled, there will need to be another one to force them to allow it to be disabled, then a few circuits around the court of public opinion to make it actually stay disabled and not magically re-enable itself at random intervals.

          • easton 4 years ago

            Interestingly, iOS 15 is the first version of iOS in history to be optional if you want security updates. You will be able to choose whether to go to iOS 15 and get the new features (including the CSAM prevention stuff), or stay on a security-update-only channel for iOS 14 (for an unknown period, but I'd guess until WWDC 2022? N-1 seems reasonable).

            https://www.apple.com/ios/ios-15-preview/features/ (under settings)

    • cultofmetatron 4 years ago

      Get the Z5. I got it a month ago; it's smaller and lighter than a DSLR and the image quality has been excellent.

      Here's my photo gallery, all shot with the Z5:

      https://www.flickr.com/photos/193526747@N04/

      • hughrr 4 years ago

        That’s actually what I’m looking at so far. Thanks for the gallery link - some nice shots in there

    • f3d46600-b66e 4 years ago

      On the DSLR front: are there now DSLRs/cameras that do the type of "computational photography" that Pixel or iPhone cameras are doing?

      Not having to edit the pictures is a huge plus, and JPEG files from Nikon, even with dynamic range on, are pretty mediocre compared to a Pixel phone.

      • cultofmetatron 4 years ago

        Generally, if you are shelling out the kind of money for a dslr or mirrorless, you want control over the final image. I shoot in raw and tweak the images I like by hand in darkroom. Lightroom is another option if you want to support adobe.

        It takes longer, but the end result looks MUCH better than anything your phone can produce. That said, sometimes I just want to take a selfie and not fiddle too much. That's when I use my Google Pixel.

        • ValentineC 4 years ago

          What might be useful for the next generation of prosumer cameras is being able to capture depth data (which is probably the main differentiator allowing computational photography to work on smartphones), with editing tools like Photoshop eventually supporting it.

          • cultofmetatron 4 years ago

            Lidar built into a mirrorless that captures depth would be amazing, not just for computational photography but also to make focusing way more accurate!

      • hughrr 4 years ago

        You can post process that from RAW if you want to. I generally want people to not fuck around with my images before I get to do it though if I'm honest.

    • Raed667 4 years ago

      I'm not really sure about feature phones either. Modern Nokia ones come preloaded with Facebook and Whatsapp.

      • hef19898 4 years ago

        As soon as I have time it will be CalyxOS on my current Pixel 2. Once I get to upgrade, no idea when, to a Pixel 5, I'm going to try GrapheneOS. Oh, and Linux on my private laptop. I don't think you can do anything else to avoid FAANG's eyes. And MS's. Which sucks; I still remember when the only thing to worry about was malware, not being spied upon by OEMs. Oh, and being spied upon by three-letter agencies and their counterparts, but there is not much you can do about that either way.

      • hughrr 4 years ago

        You can remove them from the screen and they are not used unless activated.

        • Raed667 4 years ago

          Again why are they there in the first place? Also how many eyes are on KaiOS anymore? How sure are we of its sandboxing and all other security/privacy aspects?

    • ayush--s 4 years ago

      why would you miss apple music? spotify wipes the floor with apple music!?

    • darthrupert 4 years ago

      Sounds to me like you just wanted to buy a lot of new stuff.

  • nyuszika7h 4 years ago

    I find it laughable whenever someone says "this is the last straw" because it just shows how incredibly misinformed they are.

    Yes, backdooring E2E encryption in general is a bad idea. However, consider two things:

    * iCloud Photos was never E2E encrypted in the first place. They already can scan your photos all they want server-side, and they have been scanning for CSAM since 2019, while Google has been scanning for it since 2009. Yes, if iCloud Photos were to become E2E encrypted leaving in a backdoor like this could be bad, but it's still the lesser of two evils. Would you rather they keep photos non-E2E forever and have even more unfettered access to them than a "backdoor" allows? It does NOT scan photos that are not uploaded to the cloud, despite being on-device. And it's important to note the threshold and manual human review system put in place before the authorities receive any notification at all.

    * For iMessage, all this entails is warning children under 18 about explicit content, and optionally notifying parents if the child is under 13 and the parent opted in. (I don't think it even sends the photo itself to the parents, but that's not explicitly clarified anywhere.) At no point do Apple or the authorities learn the contents of E2E encrypted iMessages. (Also worth noting: if you use iCloud Backup, your messages are no longer E2E encrypted in the backup, as Apple holds the keys to that. This was true even before the new system was introduced.)

    • RHSeeger 4 years ago

      > It does NOT scan photos that are not uploaded to the cloud, despite being on-device.

      Yet. Once it's on the device, it's a MUCH smaller step to use it in other ways. It's certainly easier for governments to argue that they should be able to force it to be used arbitrarily... you know, for the children/terrorists/etc.

      > And it's important to note the threshold and manual human review system put in place before the authorities receive any notification at all.

      Until it's not. Once again, once it's in place, it's a lot easier for malevolent actors (governments) to force it to be used other ways.

      This is a back door. Plain and simple. The fact that it's not _currently_ going to be used for evil (depending on your definition of evil) does not mean it won't be in the near future. Back doors are bad. How many times does this need to be said?

      • avianlyric 4 years ago

        > Yet. Once it's on the device, it's a MUCH smaller step to use it in other ways

        We crossed this bridge a long time ago. Apple already has on-device neural nets processing every one of your on-device photos. That's what powers Spotlight search and "photo memories".

        Simple fact of the matter is that this isn't the top of some slippery slope; it's halfway down one. A slope we started down when we figured out how to put powerful neural nets on mobile devices in people's pockets.

        > Until it's not. Once again, once it's in place, it's a lot easier for malevolent actors (governments) to force it to be used other ways.

        Which is why Apple's current solution makes it cryptographically impossible to decrypt photos until a large enough number of suspect photos have been uploaded.
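The "cryptographically impossible until a threshold is reached" property described above is typically built from threshold secret sharing. A minimal sketch of the idea in Python, assuming Shamir's scheme over a prime field; Apple's actual construction (threshold PSI with "safety vouchers") is far more involved:

```python
# Toy threshold decryption: a key is split into shares so that it can only
# be reconstructed once `threshold` shares are available. Illustrative only.
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a 16-byte secret

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x=0 recovers the original secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789
shares = make_shares(key, threshold=3, count=10)
assert recover(shares[:3]) == key   # 3 matching "vouchers": key recovered
assert recover(shares[:2]) != key   # below threshold: still locked out
```

Below the threshold, the shares reveal essentially nothing about the key; at or above it, recovery is exact. That is the mathematical shape of the claim, independent of whether one trusts the surrounding system.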

        • trangus_1985 4 years ago

          The key difference, of course, is that when the neural network classifies certain types of content, it doesn't forward it to a centralized server "for review"

          • sharken 4 years ago

            And depending on that review you could find yourself on the other end of some "questioning" from law enforcement.

            Yes, you might laugh and say that won't happen, but on-device scanning is the first step.

            In less trustworthy countries it's not that farfetched to imagine what this can be used for.

            So Apple must back down now or face the consequences in the form of loss of reputation and eventually loss of sales.

      • treesknees 4 years ago

        > Yet

        I keep seeing this jump. There's no evidence this will happen. Apple can already technically do anything they want to compromise the security of your device in the next software update, so could Google or Samsung or any other company. But when in Apple's history have they done this? There is zero reason to believe this is the next step other than speculation and fear mongering.

        • RHSeeger 4 years ago

          > Apple can already technically do anything they want to compromise the security of your device in the next software update

          But they're making it easier for governments to come along and force them to do more. Or even for themselves, but I tend to think they're less of an issue.

          I know "it's a slippery slope" gets overused... but if you keep taking baby slips down that slope, it only gets slipperier. You should avoid taking as many of those steps as possible.

          • ryandrake 4 years ago

            Anyone can imagine a hypothetical future feature and oppose it. What if Apple one day replaces all my music with Best of ABBA? That would be terrible, but they haven't done or proposed it, so why argue about it?

            • emptysongglass 4 years ago

              Because that's not what's being argued here. Nobody in power cares enough to mass load ABBA onto your phone. But there's very powerful nation states who care, more than they care about anything else, to maintain power at any cost.

            • sharken 4 years ago

              Could anyone have imagined law enforcement using Corona contact-tracing data for other purposes?

              Because that actually happened, and in a democratic country even.

              So it's not hard to imagine what less democratic countries could demand of Apple.

              https://www.abc.net.au/news/2021-06-29/queensland-coronaviru...

          • nicce 4 years ago

            > But they're making it easier for governments to come along and force them to do more. Or even for themselves, but I tend to think they're less of an issue.

            It is as easy as it has always been. The only problem is that this might give them new ideas. As most politicians are probably non-tech people, they don't know what is possible.

            For a tech person, functionality like this (on-device scanning and flagging) is super trivial to add. Antivirus engines have existed for decades.

    • new299 4 years ago

      > Would you rather they keep photos non-E2E forever and have even more unfettered access to them than a "backdoor" allows? It does NOT scan photos that are not uploaded to the cloud, despite being on-device.

      Yes I'd rather they do this. The fact that they're implementing on device checks doesn't suggest to me that they will be deploying E2E encryption. It suggests to me that they will be expanding on device scanning to all content in the future.

      If they were going to make iCloud E2E encrypted, it would be a clear win to announce this at the same time as deploying on device scanning.

      • nicce 4 years ago

        Their PR did not handle this well. If you look at the spec, a new encryption level has been added, which allows access by Apple only if the CSAM hash threshold is reached. It is E2EE with a backdoor now.

        • new299 4 years ago

          Unless you have a public reference, I really doubt this is the case.

          Because they’d also need to be announcing that you can no longer reset your iCloud password and recover to a new device. And I’ve not seen anything that suggests this.

          So I suspect it is encrypted at rest, with a key known to Apple as before as well as this CSAM approach.

          • nicce 4 years ago

            There is public reference on Apple site[1].

            Citing final phrase on the paper to TLDR their system:

            > Apple is able to learn the relevant image information only once the account has more than a threshold number of CSAM matches, and even then, only for the matching images.

            This applies only to images, so you can still reset your password. Technically, there are two layers of encryption on images: regular server-side encryption and this "E2EE-like" encryption, which allows access to matching images once the CSAM match threshold is reached.

            [1]: https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...

            • new299 4 years ago

              This document contains the following:

              > As part of setup, the device generates an encryption key for the user account, unknown to Apple.

              The question is, how is this generated. Can it be re-derived from information Apple has? If not, how will Apple handle cases where the user loses or breaks their device?

              Is it derived from the iCloud password? Currently Apple can reset your iCloud password and restore access to your images. Will Apple no longer be able to do this in the future?

              It’s really unclear to me, and I’d want explicit answers to these questions personally.

              • nicce 4 years ago

                This seems to be explained in the white paper of the PSI system[1]. A lot of math is involved, but on page 30 there is a mention that different devices can be used. I am not the one who can explain that well.

                [1]: https://www.apple.com/child-safety/pdf/Apple_PSI_System_Secu...

                • new299 4 years ago

                  Sure, different devices can be used if they share the same key, as stated in the document.

                  But it's still not clear how that key is derived. It's not clear, as implemented, that Apple does not hold a master key to decrypt all data (as they do currently).

                  In fact, if the key is randomly generated and you have only one device (as many users do) and you lose that device, do you lose all your data? Even if you have your iCloud password?

                  It doesn’t make sense. It would be a massive change to how iCloud currently operates and is used. And I find this extremely unlikely.

                  Right now, you can browse your photos online. That functionality is going away?

                  There are seemingly many open questions. But given that there’s no clear statement from Apple, I’m inclined to believe that they retain the ability to decrypt all data.

                  • nicce 4 years ago

                    Most likely you can’t browse your photos online anymore, unless they add some kind of method to export keys from the device(s). I speculate that it is possible to lose all of your data if you lose all of your devices. There might be an option to create a local backup of the device keys, so it would not be a dead end.

                    • new299 4 years ago

                      Given the lack of an explicit announcement this seems very unlikely.

                      I don’t think Apple are stupid, it would have been a clear PR win if they said “we’re adding E2EE”.

                      Given no explicit statement, and how drastically it changes the nature of their service, I don’t think your speculation is justified.

                      • nicce 4 years ago

                        The problem is that this was not supposed to be released properly yet. A misleading leak caused them to hurry. As for E2EE, it is not speculation, because it is literally in the papers I linked.

        • charcircuit 4 years ago

          It's not a backdoor if it's a public part of the system / protocol.

    • Notanothertoo 4 years ago

      Exactly. This has all been done for 10 years in various forms.

  • neilv 4 years ago

    I blew my entire weekend trying to move off iPhone, to something more trustworthy (a problem that's getting harder, because of some other things going on): https://news.ycombinator.com/item?id=28111995

    • cheald 4 years ago

      I run CalyxOS on a refurbished Pixel device. Works great, very privacy-respecting out of the box. Good at being a device that you actually own.

      • neilv 4 years ago

        Thanks, I was looking at CalyxOS, and plan to try that "next weekend".

        • neilv 4 years ago

          In case anyone reads this thread later.. So far, CalyxOS is working out well. The hardest parts were buying a used device for it (uncertainty of getting a version with an unlockable bootloader, and having to avoid all the OLED display problems).

          • cheald 4 years ago

            Chiming in with my own experience, as well. The two wrinkles I've encountered:

            * SafetyNet doesn't work, as expected, which means no Google Pay. Nothing else I use has been impeded, though.

            * Chromecasting doesn't seem to be implemented in MicroG, so no casting content.

            Other than that, it's been solid.

    • stemc43 4 years ago

      LineageOS is what I use. OnePlus devices are simply superb.

      But be forewarned: you can blow through way more than a weekend de-oppressing your digital life.

      • neilv 4 years ago

        Exactly. I've invested an hour or two :) towards trying to have the tech I use be less-creepy.

        It turns out that compromises have to be made. And it's also a moving target.

  • GeekyBear 4 years ago

    >It's definitely the last straw for me in terms of apple products.

    Google, Microsoft, Facebook, Twitter, etc. have all been scanning content for those same child porn images for darn near a decade now.

    >The system that scans cloud drives for illegal images was created by Microsoft and Dartmouth College and donated to NCMEC. The organization creates signatures of the worst known images of child pornography, approximately 16,000 files at present. These file signatures are given to service providers who then try to match them to user files in order to prevent further distribution of the images themselves, a Microsoft spokesperson told NBC News. (Microsoft implemented image-matching technology in its own services, such as Bing and SkyDrive.)

    "There are two opportunities to look at content," when it's going into a cloud-storage account and when it's leaving, she said. "There is technology to do this," Grant added, pointing out that file signatures — unique hashes or fingerprints — could be used to confirm the nature of the files.

    https://www.nbcnews.com/technolog/your-cloud-drive-really-pr...
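    The "file signature" matching the quoted article describes boils down to set membership against a list of known hashes. A toy sketch, assuming plain SHA-256 for illustration; real systems like Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, which exact hashes do not:

    ```python
    # Server-side signature matching, sketched: providers keep a set of
    # hashes of known illegal images and compare uploads against it.
    # SHA-256 here only catches byte-identical copies; stand-in data only.
    import hashlib

    known_signatures = {
        # In production this set comes from NCMEC; these are stand-ins.
        hashlib.sha256(b"known-bad-image-1").hexdigest(),
        hashlib.sha256(b"known-bad-image-2").hexdigest(),
    }

    def matches_known(file_bytes: bytes) -> bool:
        """True if the uploaded file's hash is in the signature database."""
        return hashlib.sha256(file_bytes).hexdigest() in known_signatures

    assert matches_known(b"known-bad-image-1") is True
    assert matches_known(b"holiday-photo") is False
    ```

    Note that the service never needs to "look at" non-matching files in any meaningful sense; it only learns whether a hash is in the set.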

    • hef19898 4 years ago

      Scanning on their cloud storage is different from scanning on users devices. The latter is opening a door to mass surveillance that wasn't even there before. That is the problem, not the scanning itself.

      • GeekyBear 4 years ago

        The thing that is changing here is that the images are scanned on the device before being uploaded to the cloud, instead of being scanned on the server after they are uploaded to the cloud.

        If it hasn't been a problem that Google has been scanning your cloud data for the last decade, it didn't suddenly become a problem now.

        • hef19898 4 years ago

          The scanning was already a problem. A lesser one, as not using Google or iCloud avoided it. Now I can't avoid it, because technically Apple can now look directly at the phone. No idea why it is so hard to get that difference.

          • nicce 4 years ago

            You should note that everything on closed systems is based on trust. Apple has always had the opportunity to look directly at your phone. That does not simply change overnight when they add some feature. And currently they scan on-device only those images which would end up in the cloud. You can avoid this scanning by not using iCloud.

            The feature everyone is afraid of (scan everything on my device) is super trivial to add in general. A company like Apple could push it to the public in less than a week, regardless of whether this Child Safety feature came first.

            This new feature is actually really hard to develop, because they are trying to create an E2EE system with a backdoor, and they want to lock themselves out of that backdoor to prevent misuse.

          • GeekyBear 4 years ago

            Technically, Google can look directly into Android phones.

            There is no difference.

  • JK_2234 4 years ago

    Comments from a different post show this is "Presumably to implement E2E encryption, while at the same time helping the NCMEC to push for legislation to make it illegal to offer E2E encryption without this backdoor."

    If this is the case, then it is coming to every device (not just Apple), or E2E will be made illegal (or required to have a backdoor).

    • 14 4 years ago

      End-to-end encryption (E2EE) is a system of communication where only the communicating users can read the messages. In principle, it prevents potential eavesdroppers – including telecom providers, Internet providers, and even the provider of the communication service – from being able to access the cryptographic keys needed to decrypt the conversation.[1]

      End-to-end encryption is intended to prevent data being read or secretly modified, other than by the true sender and recipient(s). The messages are encrypted by the sender but the third party does not have a means to decrypt them, and stores them encrypted. The recipients retrieve the encrypted data and decrypt it themselves.

      Because no third parties can decipher the data being communicated or stored, for example, companies that provide end-to-end encryption are unable to hand over texts of their customers' messages to the authorities.

      Would it even be considered end-to-end encryption based on this Wikipedia definition? I don’t think it meets the definition if Apple can determine that certain files exist in a conversation.
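      The Wikipedia definition quoted above can be illustrated with a toy model: the relay server stores and forwards only ciphertext and holds no key, so it cannot read the message. This sketch uses a one-time pad purely for demonstration; real E2EE systems (Signal, iMessage) use authenticated public-key cryptography:

      ```python
      # Toy E2EE property: only sender and recipient hold the key; the
      # server sees ciphertext it cannot decrypt. Illustration only.
      import os

      def otp(key: bytes, data: bytes) -> bytes:
          """XOR one-time pad; encryption and decryption are the same op."""
          return bytes(k ^ d for k, d in zip(key, data))

      msg = b"meet at noon"
      key = os.urandom(len(msg))        # known only to sender and recipient

      ciphertext = otp(key, msg)        # what the server stores and relays
      assert ciphertext != msg          # server cannot read the plaintext
      assert otp(key, ciphertext) == msg  # recipient, holding the key, can
      ```

      By this standard, a system where the provider can learn that specific files are present in a conversation does leak information that a strict E2EE design would not.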

  • Razengan 4 years ago

    This reads like a pre-prepared rant.

    > It's definitely the last straw for me in terms of apple products.

    Uhh and where else will you go where the grass is so much rosier privacy-wise?

    • misnome 4 years ago

      > I'm slowly trying to switch to a lightphone (non smartphone)

    • nix23 4 years ago

      Pixel 4 with LineageOS (no Google apps), F-Droid, and Nextcloud for backup and sync... grass is rosy for me ;)

siscia 4 years ago

So I was ignorant on the issue and completely against the approach of Apple.

Then HN taught me that any company storing images on their infrastructure in the US must report pedophilic images to the US government.

At this point, the approach taken by Apple seems like the best one to me, if you don't want to store pictures in the clear on your servers.

What other technical approach are people advocating for?

Another option is to try to change the law, but that is beyond the scope of this conversation.

  • viktorcode 4 years ago

    The problem I have with this approach is that it introduces an on-device scan for images. All that is needed to adapt it to scan for a different kind of image is to connect it to a different database, say, Winnie the Pooh memes featuring the CCP chairman, and boom, jailed dissenters. And the ability to scan all images is but a minor firmware update away.

    Server scanning makes it clear that the company running the servers has access to your photos. So you can either find a form of encrypted storage, or be okay with that, depending on your privacy stance. Having a device with the ability to scan your photos removes that choice. It is a privacy invasion.
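    The "just connect it to a different database" worry can be made concrete: a hash-based scanner is agnostic to what its database targets. A hypothetical sketch (all names and hashes are illustrative stand-ins):

    ```python
    # The scanning code never changes; only the opaque hash list does.
    # Swap the database and the same matcher flags different content.
    import hashlib

    def scan_library(photos, database):
        """Return indices of photos whose hash appears in `database`."""
        return [i for i, p in enumerate(photos)
                if hashlib.sha256(p).hexdigest() in database]

    photos = [b"cat", b"meme", b"sunset"]

    original_db = {hashlib.sha256(b"something-else").hexdigest()}
    swapped_db = {hashlib.sha256(b"meme").hexdigest()}

    assert scan_library(photos, original_db) == []   # no hits
    assert scan_library(photos, swapped_db) == [1]   # same scanner, new "hits"
    ```

    Since the database entries are opaque hashes, the device owner cannot audit what the scanner is actually looking for.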

    • hansel_der 4 years ago

      > ability to scan all images is but a minor firmware update away

      iOS has already been doing on-device ML-based photo categorisation for some time; afaik there is no way to turn it off.

      • romwell 4 years ago

        And now it's pretty much the same thing, but with a SWAT team knocking down your door when the ML messes up.

        Yay progress.

        • davidcbc 4 years ago

          The SWAT team is knocking on your door after you've uploaded multiple instances of child porn to iCloud and those instances have been verified to actually be child porn by a human. That sounds fine to me.

          • romwell 4 years ago

            >The SWAT team is knocking on your door after you've uploaded multiple instances of child porn

            ...or whatever gets sneaked into a database that nobody can take a look at, and whose maintainers have zero obligations to you.

            >and those instances have been verified to actually be child porn by a human.

            Yeah, SWAT teams doing their homework before shooting people up is precisely why SWATting is a completely innocent thing to do and never put anyone in danger.

            And that also does nothing in the case of a "neural" (aka black-box) hash collision, where the algorithm mistakes a normal picture for CP. The "human" you have in your dreams doesn't have access to the actual file you have on your device, right? (At least, that's the sales pitch for on-device privacy.) They won't know until they get you.

            Personally, I would hope that HN people know better than to blindly trust an opaque algorithm running off an opaque database to never make a mistake in where it sends SWAT teams.. but here we are.

            • davidcbc 4 years ago

              The algorithm doesn't report to the SWAT team. It reports to Apple who verifies it.

    • EveYoung 4 years ago

      But Apple only plans to scan photos that are synced with iCloud, don't they? So you could just switch to an E2E encrypted alternative and drop iCloud completely.

      • unionpivo 4 years ago

        Yes, but it probably took additional code to scan only those pictures that are synced to iCloud.

        It's probably not a monumental task to change it to scan every picture.

        • nicce 4 years ago

          I have to remind everyone again that iOS is a black box, a closed-source system. All this speculation applied just as well before they added anything. They might have had this code ready for years already. All we have is what they say. It is already very trivial to scan everything on your device and send that metadata: a few lines of code. The moment they publicly announce scanning everything on the phone without opt-out, then we should be worried. Once again, there is no way of telling what they are doing already.

          • unionpivo 4 years ago

            The difference is now every government knows too.

            They can't pretend they don't have the capability.

            And if they can scan for CP, why can't they scan for "whatever" else instead.

            • m4rtink 4 years ago

              This is not the first time they have run into this. Due to the App Store being a walled garden, they are the sole gatekeeper who decides what goes in and what doesn't. Makes sure the users are safe and everything. Perfect, right?

              Well, until protesters want to use an app in the store to coordinate their protests yet the government wants you to reject it, so the protesters can't use it:

              https://www.applefritter.com/content/teargas-walled-garden-i...

              With users not being able to install the app themselves, Apple is the single point of failure, with no plausible deniability like Android (and any sane OS in general) has. And they did reject the app.

              And just a few months before this happened I attended a talk about free software from the FSF, and they mentioned just the same thing about iOS: the gatekeeper is a single point of failure a repressive regime can apply pressure on. Turned out not to be far-fetched at all...

            • avianlyric 4 years ago

              iOS has been running complex Neural Nets on all your images for years now. It powers all their social features and search.

              Apple has always had the capability, and has been advertising it as a central selling point of new versions of iOS for years. That ship sailed a long time ago.

              • nicce 4 years ago

                A neural net might be overkill as an example. Antivirus software has existed since the 1980s[1].

                [1]: https://en.m.wikipedia.org/wiki/Antivirus_software

              • adventured 4 years ago

                What changed is Apple just signaled to the governments of the world what it's willing to do toward abusing user privacy and exactly how it can work. And hey, Apple, if you're willing to do that, why not just go a bit further and do this, because we're asking you to or else (and now we know you're obviously even more morally flexible than what you used to present yourself as).

                Before that, Apple put up a front that they would fight for user privacy at every turn. They pitched that over and over and over again as a corporate ethos, a selling point. That was the facade at least, even if one is cynical and wants to pretend it was a lie. Now they're not even presenting the facade, which will open the flood gates dramatically. They went from a supposedly resisting agent, to a morally gray and willing agent at a minimum. Apple dumped an enormous vat of blood into the shark infested waters.

                • nicce 4 years ago

                  I think I disagree. The current move was an improvement for user privacy, compared to what it used to be. Abuse is only speculation, not something that has actually been done.

      • croutonwagon 4 years ago

        Well.....

        I think it's more than that. Images sent with iMessage are stored in iCloud, even if the device is not necessarily uploading.

        How else would they have such warnings as they claim in their announcement? [1]

        And we have seen these systems have their scope/use case changed in the past [2]

        To the point in the other discussion [3]: OP stated that Apple's plan to scan and then upload suspected images is illegal. But I would think that they are only scanning images, client-side, that users themselves are attempting to upload (either through attachments or automatic iCloud backups etc.), which would put Apple in the clear. In this case that would be iCloud images, or those that piggyback on iCloud services like iMessage etc.

        [1] https://www.apple.com/child-safety/ [2] https://www.eff.org/deeplinks/2020/08/one-database-rule-them... [3] https://news.ycombinator.com/item?id=28110159

        • jeromegv 4 years ago

          Stop repeating this lie. iMessage photos are not part of this. This is written in the technical document. This only covers photos from iCloud Photos. It's been debunked; just read this article: https://daringfireball.net/2021/08/apple_child_safety_initia...

          And of course the scope could change tomorrow. Just like the scope of Android could change tomorrow. They could even have changed the scope without doing an announcement!

          • croutonwagon 4 years ago

            So there is really no need to be this aggressive.

            In my comment history it clearly shows that there's an effort to parse through the information and seek clarity.

            And it's worth noting that iMessage data is and can be backed up to iCloud, and not just via backups. For many with multiple devices this is specifically useful.

            https://support.apple.com/en-us/HT208532

            Further, as to this

            >And of course the scope could change tomorrow. Just like the scope of Android could change tomorrow. They could even have changed the scope without doing an announcement!

            I am pointing out that there is a specific history of this already on record and documented. And their technical documents specifically state their intentions.

            Page 3 : https://www.apple.com/child-safety/pdf/Expanded_Protections_...

            "This program is ambitious, and protecting children is an important responsibility. Our efforts will evolve and expand over time"

            I don't understand why you find such an observation so offensive. It's pretty clear Apple sees this as a first step into what will eventually be a much larger program.

    • carom 4 years ago

      >The problem I have with this approach is that it introduces on-device scan for images.

      Windows already does this via Windows Defender. This is a basic AV functionality and much more privacy preserving.

      • DangerousPie 4 years ago

        But Windows Defender doesn't report you to law enforcement when it believes it found a virus.

        • nicce 4 years ago

          How do you know that? It's a black box, and all we have is what they say. They might report CSAM hashes to law enforcement. Any file can be a threat, hence images are included in scans. Defender also uploads whole files unencrypted if you don't opt out.

        • vmladenov 4 years ago

          Neither does this.

          https://www.howtogeek.com/719825/how-to-stop-windows-10s-ant...

          If Microsoft receives an illegal file through this channel, they are legally obligated to report it in the US.

          • thw0rted 4 years ago

            ...if a human actually gets the file, figures out what type it is, and examines it for themselves, they'd be obligated to report it. With the number of Win10 devices in the world, how big would their security team have to be to hand-groom every automatically submitted "suspicious" sample? (For that matter, why would a vanilla JPG get flagged as "suspicious" in the first place?)

    • avianlyric 4 years ago

      > All what is needed to adopt it to scan for different kind of images is to connect it to different database, say, Winnie the Pooh memes featuring CCP chairman, and boom, jailed dissenters.

      The CCP have already thoroughly demonstrated that they don't need manufacturers' consent to build these systems.

      Look at the Uyghur population in China. They already have their phones scanned on device for dissident material, not by coercing manufacturers, but by forcing the population to install a surveillance app. Then making it illegal to use a phone without it.

      Being caught at checkpoint without the app installed and working is grounds for immediate arrest and re-education.

      • adventured 4 years ago

        > The CCP have already thoroughly demonstrated that they don't need manufacturers' consent to build these systems.

        It was obviously merely an example for illustration purposes by the parent. To get a point across it's often very helpful to use a stark, clear example.

        Few governments will ever have the extraordinary capabilities and resources of the CCP in China.

        For the other ~190 governments that will never reach that level of capability, what they might have now is a globe-spanning billion-device corporation like Apple more willing to assist them.

  • 542458 4 years ago

    Do mandatory reporter laws work like that? I was under the impression that you had to report something if you saw it, but you had no obligation to be actively scanning or to compromise encryption to do so. For example, I don’t think S3 does any active scanning and you can definitely shove any encrypted blob you want onto their servers with no obligation to give them a decryption key.

    IMO this appears to be Apple either a) trying to preempt future criticism or regulation or b) responding to some behind-closed-doors pressure/bargaining with US authorities.

    • nicce 4 years ago

      I think you have to be aware of what is happening before you can say that nothing criminal is happening. This is where scanning steps in. You can't turn a blind eye.

      • amanaplanacanal 4 years ago

        There is a big jump from reporting criminal activity if you happen to see it, to actively searching it out. It is the jump from police arresting you if they see you smoking a joint to police searching your rooms to make sure you don’t have any cannabis in there.

        • nicce 4 years ago

          I read the law and you are correct: it explicitly says a provider is not required to actively seek out CSAM evidence. However, they might be required to comply with NCMEC's demands if NCMEC asks them to stop redistribution of certain visual depictions by providing hashes. This is where scanning steps in.

  • GeekyBear 4 years ago

    > Then HN taught me that any company storing images on their infrastructure in the US must report pedophilic images to the US government.

    It's certainly been going on for the past decade.

    For example:

    >a man [was] arrested on child pornography charges, after Google tipped off authorities about illegal images found in the Houston suspect's Gmail account

    https://techcrunch.com/2014/08/06/why-the-gmail-scan-that-le...

  • matheusmoreira 4 years ago

    Simple.

    1. Encrypt everything.

    2. Don't store images on your servers at all.

    There's nothing to report if all you have is some encrypted blob. Alternatively, just don't consume any user data at all. Data is and should be a massive liability.

    • skinkestek 4 years ago

      > Data is and should be a massive liability.

      My thoughts as well.

      If you don't want the very dangerous weapon you've dreamed up to be abused, don't physically build it, and don't tell anyone who has a habit of abusing powerful weapons.

  • ianmiers 4 years ago

    Apple's banned-image reporting won't stay iCloud-only. iMessage is next. Maybe all data on your phone. 1) Phone-side scanning is overkill for pics already on their servers; you don't build this and take the PR flak for something you can already do server side. 2) Even if it's somehow not Apple's plan, they will be forced to use it on iMessage. Congress has been trying to for years. See the EARN IT Act [0].

    Apple just erroneously said "it's safe" despite the fact that it clearly can be abused.

    [0] https://blog.cryptographyengineering.com/2020/03/06/earn-it-...

    • zepto 4 years ago

      > You don't build this and take the PR flack for something you can already do server side

      That’s exactly what you do if you plan to enable E2E.

      • ianmiers 4 years ago

        Yep. That certainly is the next step. And then, once you are scanning encrypted data, iMessage is next whether you want it or not.

        • nicce 4 years ago

          It is not the next step; it is already there, if you read the technical papers. An additional encryption layer comes to iCloud images with this change, and Apple can't see your photos anymore unless the CSAM threshold is reached.

        • zepto 4 years ago

          > And then, once you are scanning encrypted data,

          They aren’t.

          > iMessage is next whether you want it or not.

          Is there some evidence you have of this plan? Sounds like this is just a fear you have.

          • ianmiers 4 years ago

            >Is there some evidence you have of this plan? Sounds like this is just a fear you have.

            The EARN IT Act. It may not be Apple's plan; Apple's plan, as you suggest, might only be scanning encrypted iCloud and excluding encrypted iMessage. But what Apple will be pushed to do after that is pretty clear.

            • zepto 4 years ago

              If the government passes a law mandating that encrypted messages be scanned, it won’t be done using this CSAM mechanism, and it won’t only be Apple doing it.

              In short, you might be right to be afraid of this outcome, but it has nothing whatsoever to do with CSAM countermeasures.

              • ianmiers 4 years ago

                Read the article and discussion here https://news.ycombinator.com/item?id=28118350. It makes the point pretty well.

                >That, of course, is the rub: Apple controls the algorithm, both in terms of what it looks for, what bugs it may or may not have, and also the inputs, which in the case of CSAM scanning is the database from NCMEC. Apple has certainly worked hard to be a company that users trust, but we already know that that trust doesn’t extend everywhere: Apple has, under Chinese government pressure, put Chinese user iCloud data on state-owned enterprise servers, along with the encryption keys necessary to access it. What happens when China announces its version of the NCMEC, which not only includes the horrific imagery Apple’s system is meant to capture, but also images and memes the government deems illegal?

                >The fundamental issue — and the first reason why I think Apple made a mistake here — is that there is a meaningful difference between capability and policy. One of the most powerful arguments in Apple’s favor in the 2016 San Bernardino case is that the company didn’t even have the means to break into the iPhone in question, and that to build the capability would open the company up to a multitude of requests that were far less pressing in nature, and weaken the company’s ability to stand up to foreign governments. In this case, though, Apple is building the capability, and the only thing holding the company back is policy.

                • zepto 4 years ago

                  I’ve read the article. It changes nothing.

                  I agree that it could be used to detect image collections (and only image collections) that are not porn, that users upload to iCloud Photo Library.

                  That is the only established abuse case. Apple has categorically denied that they will comply with it, just as they refused to help the FBI in the San Bernardino case.

                  Even if they do end up complying in China because China passed a law, authoritarianism in China is a red herring. This mechanism is of no consequence to the Chinese government.

                  All of this has absolutely nothing to do with your claim that 'iMessage is next', and the article doesn't support your claim.

    • thw0rted 4 years ago

      From everything that I've read, iCloud Photo Library is currently encrypted on the server, with a key that Apple only uses when presented with a warrant. If I ran the company (disclaimer: I do not) I'd implement this with an airgapped system in a vault somewhere, where a very small number of people have access to bring encrypted images in on a CD-R under two-person control.

      That being said, one of two things is true. Either Apple does exactly what they say, in which case they are not able to perform server-side content / fingerprint scanning, or Apple is outright lying about only using their key on behalf of law enforcement. This latter case would open them to all sorts of legal liabilities, like a suit from shareholders for false reports. It would also require the silence of every Apple engineer who has ever been involved in at least their iCloud Photo program, and probably a bunch of server infrastructure as well. Additionally, they'd be legally obligated to report their scan results to the NCMEC but would have to do so in a way that doesn't give away that they're lying about how their systems work.

  • hef19898 4 years ago

    Because once that functionality is there, it affects everyone, not just people in the US. And it basically means we sell out our democratic principles, or rather allow our tech giants to sell them out, or force them to do it, as our elected governments are doing. Either way, I don't like the outcome.

    • zepto 4 years ago

      > Because once that functionality is there it affects everyone, not just in the US.

      The functionality to detect CSAM uploaded to Apple’s servers or sent to pre-teens?

      > And it basically means we sell out our democratic principles

      What democratic principle is being sold out?

      • hef19898 4 years ago

        The right to secret communication. The right not to be under surveillance. The government cannot open letters without a warrant, but somehow Apple, Google, MS and co. can sniff through electronic communication as they see fit because of a clause in an EULA. No idea how we got here, but maybe the days when Stasi surveillance was the poster child of government intrusion into private life are too long gone to be remembered. Or they aren't, and certain people choose to make a shitload of money from the thing.

        • jdavis703 4 years ago

          > The government cannot open letters without a warrant, but somehow Apple, Google, MS and co can sniff through electronic communication

          This is no different than a private doctor testing for illicit drugs and reporting results to the DEA (they literally do this for ADHD patients.)

          • pseudalopex 4 years ago

            I know American ADHD patients. None of them take drug tests.

            • jdavis703 4 years ago

              I’m an American ADHD patient. My doctor made me (and his other patients) come in on random weekends for drug tests. He said the DEA made him report his records.

              • pseudalopex 4 years ago

                It's easy to find other people saying they don't have to take drug tests. It seems more likely your doctor is mistaken or lying than many other doctors just ignore a legal requirement.

                • jdavis703 4 years ago

                  I don’t know if it’s a legal requirement, I certainly couldn’t find any information on it. My assumption is some lawyer told him it’s a “best practice” to have this information on record in case the DEA audits him or something.

        • amanaplanacanal 4 years ago

          Might as well make it legal for police to search our houses at will, as long as they are looking for child abuse images. Doesn’t sound much like the US any more at that point.

        • zepto 4 years ago

          > Apple, … can sniff through electronic communication as they see fit

          Except that they can’t and don’t.

          • hef19898 4 years ago

            Google, apparently, checks Gmail for CP. Apple will soon do it on your phone. Checking your physical mail for analog CP requires a warrant and can only be done by police. See the difference?

            • zepto 4 years ago

              > Apple is doing it, soon, on your phone.

              Apple is only checking images you choose to upload to iCloud photos to see if you are uploading a collection of CSAM. This is entirely optional, and they have publicly explained what they are doing.

              They are not sniffing through your communications as they see fit.

              • hef19898 4 years ago

                One last try, after that I'll stop since you are all over these submissions defending Apple here.

                Take traditional mail. It is not opened; it is usually not read, nor is its content checked. It can be, and is, opened under warrant (let's ignore totalitarian regimes here). What Google is doing with photos, as Apple did before, is opening every envelope containing photos to check whether or not it's CP. Already bad enough, because they still opened your mail. You could avoid that by just using another mail carrier, though.

                What Apple is doing now is checking your photos before you put them in the envelope. If they find too much stuff they don't like, they open all your other photo albums. And they tell the authorities. Without any means for you to prevent it. It's like the postal service looking at your mail before they even pick it up.

                All that without oversight by courts, without proper legal and investigative proceedings. Heck, currently without even any law forcing them to do it.

                The more recent incidents where that or similar things happened were:

                - the USSR

                - the DDR with the Stasi

                - Nazi Germany

                - Western allies during WW2 through dedicated censorship bureaus

                All of those were historically deemed unacceptable, maybe necessary for the greater good. Now a private entity with global reach does the same thing in principle, and with the technical capability to do it on a much larger scale and more thoroughly. And because Apple is private, that is, for some reason, OK with you.

                Not sure further discussion with you has a point; I'll just leave it at that.

                • zepto 4 years ago

                  > One last try, after that I'll stop since you are all over these submissions defending Apple here.

                  Ad hominem is bad faith. It’s usually a sign that you know your arguments don’t hold up.

                  > What Apple is doing now is checking you photos before you put them in the envelope.

                  No, an ‘envelope’ is a totally misleading analogy. This has nothing to do with sending messages.

                  If you want an analogy try this one: Apple provides a warehouse for people who want to store copies of their precious photos. They give you a copier to make copies of your photos, you give them the copies, and they file them.

                  Because they don’t want a vault full of child porn, then equip the copier with a scanner to detect known child porn while it makes the copy.

                  That is all that is happening here. No sniffing through communications as they see fit, only a way to prevent you from uploading child porn to their service.

                  Anyone saying otherwise simply isn’t being truthful.

                • lttlrck 4 years ago

                  Traditional mail is under federal jurisdiction from mailbox to mailbox, for obvious reasons. Storing files on a private company's servers is nothing like that whatsoever.

  • sebzim4500 4 years ago

    They could still do the scanning, but if the photo fails it would just refuse to upload it and display an error to inform the user that the photo will not synchronize. There is no reason that the results of the scans need to be sent to Apple servers.

    • nicce 4 years ago

      If you read the technical details, the result of the scan is packed with the photo. So if the upload of a photo fails, the result of the scan is not uploaded either.

  • tyingq 4 years ago

    >if you don't want to store pictures in clear on your servers

    Reading https://support.apple.com/en-us/HT202303 , it seems that Apple may encrypt pictures on their servers, but they have the key. The list of what's actually end-to-end encrypted doesn't include photos. So, they may be scanning on your phone, but they can scan on their servers if they wanted to.

    • thw0rted 4 years ago

      I posted more detail upthread but what I've found suggests that Apple does have a key to decrypt pictures but they claim to use it only to respond to a warrant. (They could of course be lying about that, but I don't believe they are.)

    • siscia 4 years ago

      I believe they want this update exactly to enable E2E encryption.

      This way they can get rid of the keys on their servers and still find pedo pictures.

  • nojito 4 years ago

    Apple doesn’t scan iCloud for CSAM and refuses to do it, which is why they have researched differential privacy so intensively.

    • sneak 4 years ago

      But they could, as iCloud Photos is not E2E (Apple can read all of it), and they turn over data on more than 30,000 users per year to the USG without even a warrant.

      This is just farce.

      • zepto 4 years ago

        > But they could, as iCloud Photos is not e2e

        Client-side scanning is a prerequisite to making it E2E if you also want countermeasures against CSAM.

        • amanaplanacanal 4 years ago

          Has Apple said this is what they are going to do, or are people just guessing?

          • zepto 4 years ago

            They haven’t announced this, but they invest a lot in encryption and privacy, and have stated that user privacy is a value of theirs. They have also expressed that they don’t want access to be able to be forced by law enforcement.

            • sneak 4 years ago

              Their actions speak louder than their words.

              • zepto 4 years ago

                And their actions show that they aren’t likely to do any of the scary things they are being accused of.

                That’s the point - people keep claiming some nefarious slippery slope, which is of course in the realm of possibility, but is not actually happening.

                • dont__panic 4 years ago

                  Apple stores user iCloud backups and their encryption keys on Chinese government-controlled servers in China, and gives the Chinese government full access to those servers. And routinely grants the US government warrantless access to those same backups in the US.

                  So what actions are you referring to that show they won't do any of those scary things?

                  • zepto 4 years ago

                    Right, so presumably you’d agree that the people who are saying that CSAM detection is a problem because China might abuse it are just being silly, right?

                    As for the US government having access to the backups, that’s required by law.

                    You can always make the paranoid case that Apple wants to do this because they are somehow lying about their values, or you can make the case that their hand has been forced.

                    You could also note that they promised to implement e2e backups but haven’t yet, and this is rumored to be because the FBI asked them not to.

                    If you assume that Apple is doing this stuff because they want to, then of course you’ll see this next move as just another bad thing they are doing.

                    If on the other hand you consider that they don’t want to do these things but are being forced to until they have a better option, then you can look at this move as a way to get out of a double bind.

                    Now they can turn on e2e without being accused of creating a safe haven for pedophiles.

                    Both pathways are plausible, but given the investment in privacy Apple has been making and the consistency with which they state their values and boundaries, I don’t think they want to be creating backdoors.

                    • sneak 4 years ago

                      > As for the US government having access to the backups, that’s required by law.

                      This is a false statement. Google's android backups are end to end encrypted.

                      • zepto 4 years ago

                        It’s not a false statement.

                        Encrypted or not, Google will give the backups to the government, along with any keys they have.

                        I agree that there would be more protection against the government if the backups were encrypted, and I hope this is still Apple’s plan.

                        Google on the other hand, has been scanning photos for CSAM all along, and collects a massive trove of behavioral data from android and every one of their other properties including search history, all of which are also available to the government.

      • nojito 4 years ago

        No they refuse to indiscriminately scan iCloud.

      • thw0rted 4 years ago

        Does that 30k number include iCloud Photo data? Do you have a citation for this?

        • sneak 4 years ago

          Apple's own transparency report, under FISA orders. Presumably it includes all subscriber data they can access for the specified accounts, so likely contacts, photos, and device backups (full iMessage chat history, or sync keys to decrypt same).

          FISA orders are not warrants and do not require probable cause; the FISA Amendments Act Section 702 spying that goes on (aka PRISM internally to the IC) pulls data directly from cloud provider systems without a search warrant and was cited by Ed Snowden as one of the main reasons he came forward.

  • ryanlol 4 years ago

    > any company storing images on their infrastructure in the US must report pedophilic images to the US government

    Ones they know of…

    > What other technical approach are people advocating for?

    Apple already has a technical solution, encryption.

    • zepto 4 years ago

      > Apple already has a technical solution, encryption.

      How does encryption help prevent porn being sent to pre-teens?

      • ryanlol 4 years ago

        That’s a completely different feature than the one we’re discussing. These things were announced together, but they are not the same.

        Nobody is objecting to opt-in clientside content filtering.

        • zepto 4 years ago

          Both of the features involve opt-in client side content filtering.

          The only objections are to that.

          • ryanlol 4 years ago

            The CSAM scanning is not opt-in.

            Sure, you could stop using iCloud. That’s opt-out.

            • zepto 4 years ago

              That’s not correct. This applies only to iCloud Photo Library, not to iCloud as a whole.

              iCloud Photo Library is an optional feature, and there are numerous alternatives.

              • ryanlol 4 years ago

                Doesn’t matter, it’s still opt-out if you were using iCloud Photo Library before these features were announced.

                It’s really ridiculous to try to call this “opt-in”.

                • zepto 4 years ago

                  I was using iCloud Photo Library before this was announced.

                  None of my photos have been scanned, nor ever will be unless I choose for them to be. I don’t have to do anything to achieve this. They won’t scan anything unless I decide to go ahead.

                  That is the very meaning of opt-in.

                  Opt-out typically means that someone will go ahead with something unless you decline. This is not that.

                  I do agree that if I don’t want on-device scanning in future, I will need to choose another cloud photo service, but in the meantime, nothing will be scanned without me taking positive action to initiate it.

                  • ryanlol 4 years ago

                    > Opt-out typically means that someone will go ahead with something unless you decline. This is not that

                    That’s exactly what this is. If you use iCloud Photos your pictures will be scanned unless you explicitly disable iCloud Photos.

                    How is that not opt-out? You never get asked if you’d like to opt-in to have your images scanned for CSAM.

    • ashildr 4 years ago

      Encryption does not help; Apple is still responsible. If Apple intends to let the user store photos in iCloud (or send them by iMessage) encrypted, they either have to keep the keys so they can decrypt and scan the photos, or keep the user from uploading incriminating content. Apple found a third way: they only get to reconstruct the keys if the user uploads too many pictures that trigger alarms.
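      The threshold mechanism described above can be illustrated with Shamir secret sharing. To be clear, this is a toy sketch of the general idea only, not Apple's actual construction (their technical summary describes a more involved scheme built on safety vouchers); `split`, `reconstruct`, and the field parameters below are illustrative names chosen for this sketch.

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime as the field modulus (illustrative only)

def split(secret, threshold, n_shares):
    """Split `secret` into n_shares points on a random polynomial whose
    constant term is the secret; any `threshold` points reconstruct it,
    fewer reveal nothing."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]

    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod PRIME
            acc = (acc * x + c) % PRIME
        return acc

    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange-interpolate the polynomial at x = 0 to recover the secret."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

      In a scheme of this shape, each flagged upload would carry one share of the account's decryption key, so the server learns nothing until the threshold number of matches has accumulated.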

      • adambatkin 4 years ago

        Source? I am not aware of a law in the US that requires Apple to actively scan images, or to store them unencrypted (or keep copies of the keys).

        • hef19898 4 years ago

          The US isn't the only government with a stake in this. And countries like China, Saudi Arabia and the Emirates have a lot of leverage, financially and diplomatically. Heck, Facebook bowed to Myanmar just to get the users there.

        • nicce 4 years ago

          Every cloud infrastructure provider is required to do that. Closing an eye does not take the duty away; you must actively pursue it. Encryption would trigger a flood of new laws.

          https://www.govinfo.gov/app/details/USCODE-2011-title18/USCO...

          • skinkestek 4 years ago

            Tarsnap exists, so either it is legal when done right or Tarsnap is a dead man walking, and I haven't heard anything to that effect from any credible source.

            • nicce 4 years ago

              I guess that service falls slightly outside the scope of active scanning, because it is for general backup, not a cloud specifically for photo sharing and storage.

              • skinkestek 4 years ago

                And that is my point: by tying oneself to the mast, denying oneself the ability to navigate toward the sweet, sweet sound of user data, it becomes possible to sail straight past the sirens.

                Today this is less about physically tying up management and putting wax in the crew's ears, and more about technically and legally making oneself unable to touch the juicy customer data.

          • ryanlol 4 years ago

            Those laws do not exist (yet?). You can’t justify this as a compliance measure for legislation that does not exist.

            • nicce 4 years ago

              Yes, but current laws also restrict storing images E2E encrypted, so there is a dilemma?

              • ryanlol 4 years ago

                Where are you getting this from? That’s simply not true. It’s perfectly legal to “store images as E2E encrypted”

                • nicce 4 years ago

                  Encryption itself is not illegal, but it might make it harder to comply with other legal requirements. I had just heard this many times, and now I've read the whole law (curse me). Section 2258A(f) specifically says that there is no blanket requirement to search for evidence of CSAM material. However, if NCMEC shares information about specific visual depictions and asks to stop their redistribution, then the provider is required to comply in some cases, for example if NCMEC shares hashes that should be blocked. To be able to block that data, a search is required, and complying with this under E2E encryption is not possible.

                  • pseudalopex 4 years ago

                    Please show where those legal requirements have been applied to E2E encrypted files.

  • dhfjskh 4 years ago

    >Then HN taught me that any company storing images on their infrastructure in the US must report pedophilic images to the US government.

    That's not true though.

  • PolCPP 4 years ago

    You could generate the hash on device and send it to the server alongside the file; once validated, delete the hash or create a decoding voucher.

    • nicce 4 years ago

      This is exactly what they are doing right now. The decoding voucher comes into play when their system thinks too many hashes fall into the CSAM category.
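      For a sense of what "generate a hash on device" means here: these systems use perceptual hashes, which survive small edits, rather than cryptographic hashes. The toy difference hash below is illustrative only, not Apple's NeuralHash; `dhash`, `hamming`, and `matches` are made-up names for the sketch.

```python
def dhash(pixels):
    """Toy perceptual hash: one bit per horizontally adjacent pixel pair,
    set when the left pixel is brighter than its right neighbour."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(image_hash, blocklist, max_distance=2):
    """Flag the image if its hash is near any hash on the blocklist."""
    return any(hamming(image_hash, h) <= max_distance for h in blocklist)

# Small brightness changes leave the hash untouched, unlike a
# cryptographic hash, which would change completely.
original = [[10, 20, 30], [30, 20, 10]]
brightened = [[11, 21, 33], [31, 22, 10]]
```

      The server-side check then reduces to comparing the uploaded hash against the distributed blocklist, which is why the debate centers on who controls the list rather than on the matching itself.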

  • starfallg 4 years ago

    >What other technical approach are people advocating for?

    Reduce user data stored in cloud data centres as much as possible. This is the approach taken by WhatsApp, so I'm not surprised they are the ones most vocal against it.

    And at the risk of appearing to be supportive of a Facebook product, I think this is the right way to take computing. We don't need a central place to put stuff or to do compute when we can do it on our own devices. We just need orchestration.

    • nicce 4 years ago

      It is a bit ironic that WhatsApp is worried about privacy. All message metadata is unencrypted and part of their business model. They don't know message contents, but they know everything about your social network (whom you message and when, who is in your groups, etc.). Add cross-app tracking with Facebook APIs and soon they can categorize your message contents too.

    • asutekku 4 years ago

      How would you do it? With WhatsApp you have the file on your messaging partner’s phone. People do not want to share their images over a peer network with random people.

  • LatteLazy 4 years ago

    Apple aren't required to decrypt anything. This is why every other server/storage provider isn't also demanding access to everything client side (or keys to decrypt server side). It's a red herring for Apple to pretend they're "required" to do this; they're no more required to do so than the post office is required to open your mail on the off chance it might be handling CP...

ashildr 4 years ago

“Other tech experts” want to keep Apple from using end-to-end encryption so that they have a reason not to provide it to their own customers. With Apple’s approach, Apple can only decrypt data hosted with them if the (false) positives exceed a threshold. It’s a pretty neat way to implement a use case that is so deeply wrong. All the other companies protesting now want to keep access to user data in the clear.

Apple’s approach is the only way to simultaneously act lawfully with regard to the EARN IT Act in the US and provide E2E encryption in iCloud.

I really hate these laws, but Apple is not the problem here. Read up on EARN IT and the EU laws currently in the works. All communication WILL HAVE TO BE SCANNED by the provider. Beating the drum against Apple will just lead to E2E encryption being forbidden. What needs to be forbidden instead is any access to communication.

https://www.apple.com/child-safety/ https://en.m.wikipedia.org/wiki/EARN_IT_Act_of_2020 https://ec.europa.eu/info/law/better-regulation/have-your-sa...

traceroute66 4 years ago

I'm in two minds about this Apple news.

In one part, the pro-privacy part of me is of course aghast at the whole idea.

However...

If you "read the room", there have been increasing noises from the global political world in recent years, and perhaps especially in the US.

So if you think about it that way, it might be a case of Apple jumping before they were pushed.

I mean, let's face it, if you wait for the politicos to come up with a solution and force it through with legislation, they really would put in actual backdoors and encryption bans given half the chance.

I suspect others, such as WhatsApp, might begrudgingly follow in due course.

There's always GPG and a whole litany of other tools and apps for those who know what they are doing in terms of privacy.

  • ndr 4 years ago

    > There's always GPG and a whole litany of other tools and apps for those who know what they are doing in terms of privacy.

    And it's always there also for whoever they claim to want to catch, so this measure is useless.

    This is not protecting anyone, Apple might very well be anticipating the regulation, but that does not automatically deserve our praise. We should fight against this implementation and any regulation requiring similar measures.

    • comeonseriously 4 years ago

      > And it's always there also for whoever they claim to want to catch, so this measure is useless.

      Right. This will a) catch the low-hanging-fruit type of criminal, and b) keep honest people honest while forcing them to give up something for nothing.

      • ndr 4 years ago

        c) non-trivial chance it will be used for something much worse than what (a) & (b) fix, without solving the main issue

        That risk is not tiny: can you imagine any authoritarian government asking a compliant Apple to remove inconvenient pictures?

  • studentrob 4 years ago

    > let's face it, if you wait for the politicos to come up with a solution and force it through with legislation, they really would put in actual backdoors and encryption bans given half the chance.

    They've tried to do this for decades and have failed. If they're going to do it then let it be on record. Let's see how voters like it.

    • altantiprocrast 4 years ago

      90% of the voters are too stupid or probably don't even know what encryption is. Many are also going to stick to party lines. It just matters whether the 10% swing voters will care about this issue *more than* the other awful things the leadership is doing.

    • dnh44 4 years ago

      Well, they could always try putting pressure on the tech giants in other places via other means in order to get them to acquiesce to these sorts of anti-privacy measures. Maybe via things like anti-trust measures which seem to be popular.

    • thw0rted 4 years ago

      Bad news, buddy: https://www.patrick-breyer.de/en/posts/message-screening/?la...

      ETA: in short, about a month ago they did get the votes, at least in the EU, and it's now "allowed" for providers to scan all content. In a little while, they're going to have a vote to change "allowed" to "required", and we have no reason to think it'll go differently.

      • studentrob 4 years ago

        Bad news for the EU maybe, if that were to pass. That sort of thing never passed muster in the US. There is always a huge backlash because it infringes on the speech rights of both companies and individuals. You'd essentially be forcing banks to build in backdoors that criminals could use, and also making it so that only criminals can use true E2E encryption.

        There is no sense making laws you can't enforce. It erodes trust and credibility.

        • thw0rted 4 years ago

          If you think EU policy only impacts the EU, you didn't pay attention to what happened with GDPR. Some companies might scan only EU-to-EU communications, some might scan communications where only one end is in the EU, and some might just scan everything because why build two completely separate systems rather than just doing whatever is compliant everywhere you operate?

          • studentrob 4 years ago

            Maybe, what's your point? You want this legislation to pass? You seem intent on delivering news of a future dystopia.

            I don't think any of these scanned systems or policies will survive in the long run. They're inherently insecure and won't lead to growth.

            • thw0rted 4 years ago

              Your post I originally replied to said

              > They've tried to do this for decades and have failed.... Let's see how voters like it.

              My "point" is that I thought the same way you did -- look what a mess Clipper Chip was, they always want backdoors but surely a voice of reason will show up, etc -- but something has changed. Couple the vote in the EU with the way the major tech companies reacted to GDPR (you'd be surprised how many sites simply block all of Europe rather than comply) and it's a wakeup call. There is a real chance of the bad guys winning here.

              • studentrob 4 years ago

                My opinion here is that such policies are unenforceable and will therefore blow up in the faces of whoever implements them. Whoever does not will have the people's backing and will pave the way to the future. Of course, none of us can see the future, so we'll just have to wait and see. If I lived in the EU I would make my voice heard about that legislation.

                • thw0rted 4 years ago

                  Maybe I'm just too jaded but I don't think "making voices heard" matters -- in the link I posted upthread, the overwhelming majority of voters did not want the Chat Control measure to pass, but it did anyway, "for the children". (I can't even do that -- I'm an American living over here, I have no say in politics but am subject to a lot of their rules.)

                  Maybe we'll get lucky and the next vote will fail, or maybe if it passes there will be providers that refuse to comply. I think if it happens, it's far more likely that most will cave, and a few will just pull the plug and stop offering service.

                  • studentrob 4 years ago

                    > I don't think "making voices heard" matters

                    Of course it matters. Politicians must listen to voters on topics of import or they're out. If you're arguing against democracy and for some imagined alternative, then I can't help you because that's a worse outcome.

                    It's true some policies do pass that a lot of people don't want. It's up to the voters to make an issue of that in the next election cycle. As an American living in the EU you can certainly use your voice. That may be as consequential as your vote if you are convincing. Since I do not live there I don't engage in those politics, despite the connectedness of the world. There's enough to deal with on our home turf.

                    • thw0rted 4 years ago

                      I'm not arguing against democracy: it's the worst system except all the other ones. I didn't think much of Brexit, I think they really shafted themselves, but at the same time I get the strong impression that EU governance is hopelessly broken.

                      I don't have to bring a solution to notice that the system we have is not working. (That's not to say I don't wish I had one, I just don't.)

  • unionpivo 4 years ago

    This is worse.

    If they ban encryption, the tech sector will kick up enough noise that even non-tech people will at least notice.

    This way, it essentially opens a backdoor. Going from scanning hashes of pictures stored locally to scanning for arbitrary things stored locally is probably not a monumental task (next in line: probably hate speech).

    And once you have that capability, it's hard to argue to governments that you can't let them scan the content of either a particular phone, or all of the phones, for whatever they want, which could be: .*

    This way, the message to non-techies will be: we are protecting children, but a bunch of online weirdos and maybe pedophiles don't want us to.

    • hef19898 4 years ago

      I'd say next in line would be stuff oppressive governments don't like. Winnie the Pooh might be the front runner, along with men standing shirtless in front of tanks.

      • unionpivo 4 years ago

        Everything sold in China already has that.

        Remember, iCloud in China is operated by a Chinese company.

        • hef19898 4 years ago

          And we all know that these governments stop caring about their citizens at the border. Not.

          • zepto 4 years ago

            Are you claiming that China will force Apple to scan the phones of US citizens now?

            • hef19898 4 years ago

              Not necessarily, but I have no reason to believe China, Saudi Arabia, Turkey, the UAE, or other countries with some kind of leverage won't try. Especially with their own citizens living abroad.

              • zepto 4 years ago

                I’m sure they will try, but I don’t see how this mechanism will help them.

                • hef19898 4 years ago

                  Really? Now they are technically capable of doing so. Before, they were not. Or they were and just didn't tell anyone. Still no idea how that mechanism might help?

                  • zepto 4 years ago

                    Capable to do what exactly? What technical capability are you referring to? This isn’t a general purpose file scanning mechanism.

DavideNL 4 years ago

WhatsApp/Facebook critizing Apple for privacy… that’s hilarious.

1024core 4 years ago

The road to hell is paved with good intentions.

Everything starts off with "won't anyone think of the children?". Next thing you know, Apple is scanning your photos for faces of known "terrorists", etc.

I have children, and hate CP with a passion, but know that this is not the answer.

  • vorpalhex 4 years ago

    Terrorists, illegal drug use, "dangerous" materials like pipes or fuses (god forbid if you like model rocketry). The list goes on.

Octabrain 4 years ago

With the noble excuse of "protecting the children" they are just opening the vector that will allow them to go for other stuff in the future. I think this is just the first step and the intention is to make it "cognitively acceptable" for the majority of us later on.

At this point, I have no doubt that in the future, a more ambiguous excuse such as "hate speech" will be used and under that umbrella, the elites will have a huge margin for pursuing any kind of "dissidence".

  • ashildr 4 years ago

    This vector has been opened for a while. Companies are not allowed to provide E2E encryption anymore; they have to, or will have to (see the EU), scan every picture they host or transport for you.

    • sebzim4500 4 years ago

      Then how come so many companies provide E2E encryption? Are the signal devs about to be sent to Gitmo?

  • HNisBaizuo 4 years ago

    Wait until you find out your noble support of using big tech to "fight nazis" over the last four years was a key part in normalizing moves like this.

    • kfprt 4 years ago

      The authoritarian impulse is disturbingly common. This is a product of the long decline in societal trust. also, baizuo

elzbardico 4 years ago

This applies only to photos that are uploaded to iCloud. Apple is going to do the processing on the device, but only for the photos that are going to be uploaded to iCloud, like other services do. WhatsApp is a Facebook property and decided to use this confusion to hit Apple politically with a low punch. People who should know better prove once again that most people read no more than the headlines and base their conclusions on the tastiest sound bites they hear.

  • bitcurious 4 years ago

    That’s like building a tank and saying it’ll only be used for deliveries. Once the infrastructure is in place on every device to detect arbitrary images/messages on the device the switch will be flipped.

    • elzbardico 4 years ago

      iOS is not open source, so Apple could just deliver a spy payload with another update and probably nobody would ever know, if that were the intention. The slippery slope argument simply doesn't make much sense here (as usual, btw).

      • pnt12 4 years ago

        You can keep that secret only until someone goes to court or has the police raid their house over photos that were supposedly end-to-end encrypted.

        It's like WhatsApp saying their chat is encrypted: is it really? Well, Facebook is trying really hard to loosen this up; why don't they just release an update sending them the keys?

    • charcircuit 4 years ago

      The infrastructure for detecting files does not need to be that complex. We are talking something a single engineer can do in less than an hour.

      • thw0rted 4 years ago

        If you believe this, then you don't understand what their system can do.
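
      For what it's worth, the trivial version described above (exact-hash matching over files) really is an hour's work; a hypothetical sketch:

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist; real deployments use perceptual hashes, not SHA-256,
# so that re-encoded or resized copies still match.
BLOCKLIST = {hashlib.sha256(b"example-known-file").hexdigest()}

def scan_directory(root: str) -> list:
    """Return paths of files whose exact hash appears in the blocklist."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in BLOCKLIST:
                hits.append(str(path))
    return hits
```

      The hard part, which the reply points to, is that Apple's actual system uses perceptual hashing, blinded set intersection, and thresholds rather than a plain blocklist like this.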

  • BatteryMountain 4 years ago

    Facebook still sour about Apple blocking tracking mechanisms, hence the low punch?

throwaway75 4 years ago

Sometimes there are more important things than maintaining individual privacy and this is clearly one of them.

Finding and protecting even a few children from becoming victims of pornography is clearly something that is well worth my not having 100% privacy.

Even knowing that there might be false alarms.

Despite what the strident discourse has been, individual privacy is not some sacrosanct idea that can never be tread upon. There are some things that are far, far more important than that.

  • okokwhatever 4 years ago

    No sir, not at all. The foundation of all individual rights is freedom. If the police can't do their job in a proper way, we shouldn't abandon our right not to be monitored in our private lives to make up for their ineffectiveness.

  • gigel82 4 years ago

    Politicians love this guy ^ ; that's the way to right-think citizen, carry on :)

    • throwaway75 4 years ago

      Child pornographers love this guy ^ ; that's the way to right-think consumer, carry on :)

retskrad 4 years ago

Tim Cook is one of the best managers in the world but this is one of very few instances where Apple completely forgot about their mission as a company under his leadership.

I'm surprised he went ahead with this considering how much privacy goodwill they have built up over the years.

  • pndy 4 years ago

    > Tim Cook is one of the best managers

    My SO says that he's just a sales manager interested in selling their products and nothing else; the "else" is done by others, with the marketing, PR, and actual work.

mkl95 4 years ago

I find this controversy kind of vexing. Apple crossed the line between being strongly opinionated and patronising years ago. There's nothing about this plan that is out of character; Apple has made and will continue to make unilateral decisions on behalf of their users because it's their shtick.

myspy 4 years ago

What is striking is how much misinformation is spreading around: how it works, what is behind it, what the alternatives could be, and especially what problem it tries to solve.

This tweet here gives interesting options to learn more about it https://twitter.com/yoyoel/status/1424154582372872192?s=20

I can't imagine the suffering of the kids behind it, and it's definitely good that we fight it. But why not for older kids?

Apparently this is already happening on all major platforms at the moment. Apple's implementation is the most privacy-friendly one, isn't it?

  • strzibny 4 years ago

    It isn't. It's the worst one.

    • EtienneK 4 years ago

      Please elaborate. If as others have stated that this will enable E2E encryption in the cloud, then it very well might be?

      • strzibny 4 years ago

        They don't provide E2E encryption in the cloud.

        And generally speaking I want my device to be mine. I don't even want debugging or app verification requests happening without my consent. It's my device and when I upload to the cloud, I'll use my own encryption.

TravisHusky 4 years ago

This whole thing was the final straw to get me to switch away from iPhone. I decided to get a PinePhone, even though they are not ready for primetime, I am an embedded OS developer by trade, so I figure I might as well give it a go and see what I can contribute. I only use my phone for web browsing and Telegram anyway which are apparently usable on PinePhone.

fjfaase 4 years ago

I understand that the EU is working on legislation that would make Apple's Child Safety plan mandatory on all private communication platforms within the EU.

  • viktorcode 4 years ago

    I believe it has a good chance of failing at the level of individual EU states. Just try telling the average German citizen that their private communications must be monitored.

  • rmvt 4 years ago

    that's awful. do you have a source on this?

    • modernerd 4 years ago

      See https://european-pirateparty.eu/child-protection-through-sca....

      > Not only searches for known pictures and videos are to be legalised, but also error-prone “artificial intelligence”, for example to automatically search text messages for “luring” of minors. If an algorithm reports a suspected message, message content and customer data could be automatically forwarded to law enforcement agencies and non-governmental organizations worldwide without human examination.

      • heavenlyblue 4 years ago

        It will only take a few false-positive images shared on WhatsApp by everyone to take down that initiative.

  • amelius 4 years ago

    How would that work on a Linux phone?

    • ip_addr 4 years ago

      If you upload a file to email or a messenger service, the service would then scan the image.

kfprt 4 years ago

'Child Safety' is extremely disingenuous. Their 'plan' provides no benefit to the safety of children and only strips users of their remaining shred of privacy. Authorities care more about stopping images than about actual children currently being abused (see the lack of investigation into the Sophie Long allegations). Of course this is because 'child safety' is merely a smokescreen for authoritarianism and tyranny. Just like China.

  • mikro2nd 4 years ago

    "The state must declare the child to be the most precious treasure of the people. As long as the government is perceived as working for the benefit of the children, the people will happily endure almost any curtailment of liberty and almost any deprivation." —Adolf Hitler

nyuszika7h 4 years ago

This is extremely hypocritical coming from Facebook, who is working on a system to scan E2E encrypted WhatsApp messages to use them for targeted advertising.

isodev 4 years ago

It’s funny that WhatsApp, who themselves do the very same thing (in a much less transparent and privacy minded way) would call out Apple.

system2 4 years ago

Isn't WhatsApp part of Facebook? Why is it being separated as WhatsApp?

  • deaddodo 4 years ago

    Because Will Cathcart is the lead of WhatsApp, not Facebook. As is mentioned clearly in the title. It's not like they are hiding the association as his quote is immediately followed up by:

    > WhatsApp’s owner, Facebook, has reasons to pounce on Apple for privacy concerns.

  • padolsey 4 years ago

    Likely relevant as WhatsApp does not always see eye-to-eye with mother FB, notably on privacy issues. Source: used to work there.

tablespoon 4 years ago

> Kendra Albert, an instructor at Harvard’s Cyberlaw Clinic, has a thread on the potential dangers to queer children and Apple’s initial lack of clarity around age ranges for the parental notifications feature.

>> The idea that parents are safe people for teens to have conversations about sex or sexting with is admirable, but in many cases, not true. (And as far as I can tell, this stuff doesn't just apply to kids under the age for 13.) — Kendra Albert (@KendraSerra) August 5, 2021

>> EFF reports that the iMessage nudity notifications will not go to parents if the kid is between 13-17 but that is not anywhere in the Apple documentation that I can find. https://t.co/Ma1BdyqZfW — Kendra Albert (@KendraSerra) August 6, 2021

I'm against Apple forcing a backdoor onto every device, but this argument falls totally flat to me. Yes, there are shitty parents out there, but despite that, parents still need the ability to parent. If Apple's "think of the children" arguments for their backdoor are wrong, then this "think of the children" argument against it is wrong too. There's nothing wrong with notifying parents that their pre-teen is doing something they shouldn't be doing with their phone.

IMHO, I'd support the existence of such a feature, but only as long as it's a user-installable option, not installed on every phone as part of the OS.

Componica 4 years ago

Imagine taking a photo, or having in your gallery a photo, that a dear leader doesn't want to spread. Ten minutes later there's a knock at your door.

  • throwaway75 4 years ago

    Imagine now a young child being sexually abused in front of a camera for all its years of existence, and that there's no knock on the door.

    • aeternum 4 years ago

      False dichotomy, how long will it take the predators to learn not to use apple devices as cameras?

      Yet the privacy implications will last forever. Once it's implemented, it only takes a rubber-stamp warrant to compel Apple to scan your device for anything the government deems concerning. In fact, no warrant needed in most countries.

      • throwaway75 4 years ago

        > how long will it take the predators to learn not to use apple devices as cameras?

        Which is why you make it mandatory on all devices sold, not just Apple's.

    • amanaplanacanal 4 years ago

      So if the camera goes away, the child abuse won’t happen? Somehow I doubt that.

raspyberr 4 years ago

I think it's sad that the "think of the children" argument has been (ab)used so much that people are becoming numb to it.

  • josefx 4 years ago

    The tool is just hilariously easy to abuse. In the West it "may" be limited to child porn; in some countries it might be images featuring gay sex, political dissent, parodies, or other kinds of crimes against the regime.

  • matheusmoreira 4 years ago

    Yeah. It's gotten to the point I automatically assume anyone who uses children as an argument is acting in bad faith.

darthrupert 4 years ago

Don't criticize this idea without informing us of a better way to protect against the worst offences of modern internet life.

Privacy advocates don't seem to get it. There cannot and will not be a future where these people can hide on the net. Either somebody figures out a way to catch them without hurting the privacy of the innocent, or we will use systems that hurt the privacy of the innocent.

There is no third option, so put your energy into finding a solution that satisfies the first option if you care about this so much.

endisneigh 4 years ago

What are people switching to that is not susceptible to the same thing?

  • bdibs 4 years ago

    I’m considering a switch to GrapheneOS on a Pixel. I’m not sure if it’s an overreaction but I feel like I’ve lost what little trust I had remaining in Apple. It’s really unfortunate as they make quality products, but I guess everything sours eventually.

    • PolCPP 4 years ago

      Why Graphene instead of Calyx? The latter seems to be more user friendly (asking because I'm looking into it).

      • bdibs 4 years ago

        My (limited) understanding so far is that Calyx is a "looser" Graphene, security-wise, to make it a bit more user friendly; I'm wanting to just jump in with both feet. I don't use a ton of apps anyway, so I don't think I'll be missing anything.

fattybob 4 years ago

Seems to me like a lot of noise about Apple joining all the other services in meeting US regulations. Aren't all major vendors equally compliant, maybe more invasively so?

pyaamb 4 years ago

Does anyone know if there have been studies showing a rise in child sexual exploitation that would warrant something so drastic? I would have imagined it would be lower as society progresses, but I do not know. I don't see enough people asking if giving up our personal privacy is really warranted or if we are being misled. People should be asking for hard proof.

  • jgaa 4 years ago

    Child porn was, as far as I remember, made illegal because 2 million children were used in its production. Now that number is 5 million.

    • amanaplanacanal 4 years ago

      Is that five million a year, or five million total, including the original two million? This isn’t saying anything about whether more children are being abused now.

      Given how ubiquitous cameras are, I can fully imagine more pictures being taken, though.

      • jgaa 4 years ago

        It's just a number taken out of thin air, just like the original 2 million.

nathanvanfleet 4 years ago

I had heard that WhatsApp and many other such products have been doing this, in a more unfettered way, for a decade?

hi41 4 years ago

A few years back, Google informed authorities about a person who stored vile images of child porn in Google Drive. How is Apple's implementation any different from Google's? Aren't both the same? However, I did not see a lot of protests against Google at that time.

kkarakk 4 years ago

And the journey to not being able to trust big-tech smartphones, just like you can't trust big-tech social media, has been completed.

Guess I'm gonna have to buy a semi terrible privacy focused smart device with phone capabilities in the near future.

46ve18v 4 years ago

It's not even Apple's role to fight child abuse. That's the government's role.

voidnullnil 4 years ago

The most annoying thing is that everyone denouncing Apple's action still agrees that "CSAM" is a problem that needs action by technology companies (and thus that decentralized stuff should be illegal). While "CSAM" is a problem, just like any crime, it's completely overblown and much rarer than they pretend it is. NO. The internet doesn't need regulation. Never.

- Most instances of "child abuse" involve something that matches the legal term, but involves teenagers and is almost certainly not abuse

- Lots of conservatives want to punish said teens and anyone involved for sexuality and go along with the sophistry of calling people abuse victims when they have consensual sex or post their nude photos online

- Naturally, there is no incentive to look at naked 5 year olds, because that's not how the human body works. This is an edge case and is what the media makes out to be the norm

Stop pretending to have a "mature perspective". Companies should literally never touch your data unless there is a search warrant. Now that I read this article I'm concerned about what WhatsApp is doing.

wydfre 4 years ago

What is going on with people? Why on earth are people overreacting to this? Apple is doing something unconditionally good, and people are posting about how it is bad? I mean, the people writing the opposition to this change just sound pants-on-head stupid, and obviously suffer from some sort of mental health problem (such as this[0]). Imagine the poor NSA having to know this evil and do nothing; Apple is doing its small bit to help. It sounds a lot better than the useless HUMINT divisions.

[0]: https://www.reddit.com/r/apple/comments/p0i9vb/bought_my_fir...
