iOS 17 Is a Prude

old.reddit.com

63 points by grupthink 2 years ago · 63 comments

december456 2 years ago

I'll break with the HN spirit, but I'm fucking horrified by this thread. _So_ many people being happy that a company is protecting them from themselves, or from their family members. Where are my controversial personal websites? Did I take a wrong left turn somewhere?

  • xenadu02 2 years ago

    This is an opt-in feature. You can enable it for yourself. It is also available as a parental control for your kid's devices. I think that is 100% appropriate.

    I don't want to see dick pic spam. Actually I don't want anyone to send me naked pictures. Previously it was up to every messaging app to figure this out themselves. Now they can use the Sensitive Content Analysis framework. It also means they don't need to give PTSD to humans building a model to classify crap like CSAM images.

    All it does is detection. It is up to the app whether to prohibit sending/receiving the message, how to notify the user, etc. The API documentation says this:

    > Apple provides the SensitiveContentAnalysis framework to prevent people from viewing unwanted content, not as a way for an app to report on someone’s behavior. To protect user privacy, don’t transmit any information off the user’s device about whether the SensitiveContentAnalysis framework has identified an image or video as containing nudity. For more information, see the Developer Program License Agreement.
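
    For what it's worth, the client side of the framework is only a few lines of Swift. Here's a minimal sketch from memory, so treat the exact names as approximate rather than authoritative:

      import Foundation
      import SensitiveContentAnalysis

      // Minimal sketch: decide whether an incoming image should be hidden
      // behind a warning. Nothing computed here leaves the device.
      func shouldWarn(about imageURL: URL) async throws -> Bool {
          let analyzer = SCSensitivityAnalyzer()

          // Reflects the user's Settings choice: off, Communication Safety
          // (for kids), or Sensitive Content Warnings (for adults).
          guard analyzer.analysisPolicy != .disabled else { return false }

          // On-device classification; the framework returns a verdict,
          // not a report to anyone.
          let analysis = try await analyzer.analyzeImage(at: imageURL)

          // What to do with a positive result (blur, warn, block the send)
          // is entirely up to the app.
          return analysis.isSensitive
      }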

    • attqqq 2 years ago

      This is how they sold it: of course nobody wants to see unsolicited dick pics. Good framing; they obviously pay their marketing department very well. Yet consider how many unsolicited dick pics you actually see in a day, week, or month, and whether sometimes blocking a percentage of them is worth permanently ceding more privacy to Apple.

      If you’re seeing so many dick pics that this is a positive trade-off for you, you need to reflect on how you’re interacting with the web.

      • alphabettsy 2 years ago

        Some of my female friends get them quite regularly. So idk.

      • turquoisevar 2 years ago

        I’m going to go out on a limb and guess you’re a dude.

        I suggest you expand your social circle and talk to some women that are moderately active on social media.

        Women nowadays are confronted with their faces grafted onto AI-generated fake porn; hell, some are confronted with generated sexual abuse imagery as a form of threat and harassment if they piss off the wrong person, and you’re out here talking about dick pics.

        What’s worse is that you just couldn’t resist adding a dash of misogynistic victim blaming as a cherry on top.

      • xenadu02 2 years ago

        > permanently ceding more privacy to Apple

        How is on-device image classification "ceding more privacy"? Because an OS vendor provided it?

        > If you’re seeing so many [...] you need to reflect on how you’re interacting with the web

        Are you seriously claiming not to understand the concept of SPAM and info leaks?

  • vore 2 years ago

    I mean, you can just click the button to send the nudes. It's a sign, not a cop.

    • kome 2 years ago

      it's an AI-powered digital cop.

      • vore 2 years ago

        Who is it stopping? You can literally click through. It LITERALLY says that it’s your choice.

  • sph 2 years ago

    Most people on HN are from the same cultural niche Apple is located in, in a country founded by actual, bona fide Puritans.

    Of course they are OK with a machine being an obnoxious prude.

    • vore 2 years ago

      It’s not even being prudish. It’s clearly informing you that hey, be careful about sending nudes because you might not realize the consequences; the messaging is about as nonjudgmental as it gets. As someone who has literally made amateur porn and is probably as far from a prude as it gets, this really doesn’t register on my prude detector.

brucethemoose2 2 years ago

If it's local scanning, this is fine. Dare I say, it's a pretty good use of machine vision.

  • madrox 2 years ago

    The caption underneath this setting when you turn it on makes clear that scanning is on-device.

  • diego_sandoval 2 years ago

    Even if done locally, it's still creepy.

    • brucethemoose2 2 years ago

      Why? It's a slab of silicon scanning a representation of a photo and spitting out a probability. It's no creepier than originally taking the photo, post-processing it, and categorizing it in your Photos app.

      If any kind of metrics leave the phone's chassis as a result, then it's quite creepy, but I was operating under the assumption that they do not.

    • roenxi 2 years ago

      There is a problem out there of unsolicited nudes; I can see this being a welcome capability for a lot of people [0]. It doesn't nudge up against any privacy issues. Seems like a good idea overall.

      Of course, there is also a real issue with the fact that, as closed-source software on a locked-down platform, we can't know what happens next. But that is just part of the deal with iPhones; there is already a lot of data like that (e.g., I'd expect the US uses iPhone GPS data from targets to hunt them down).

      [0] Not sure what the feature actually does because nobody has posted details here, so there is some guesswork here.

    • ashildr 2 years ago

      But photos.app scanning for kittens on device is not creepy somehow? Interesting.

      • lucb1e 2 years ago

        Computers do a lot of if-then.

        If it detects cat pictures, what evil thing is it going to do? Label it as a photo of a pet (I don't even know, I don't use a scanning phone)?

        If it detects nudity, what kind of unwanted behavior might it exhibit then? Report to legal guardians? Not the picture itself, but even just the fact that the device is being used for that.

        I can see how this scanning+warning is more creepy than scanning+labeling cat pictures, even if the information screen tells you it was just used for this warning screen.

        • Clent 2 years ago

          Your issue is one of trust.

          Do you use an Apple device? I can see why non-Apple users lack trust.

          Apple would be eviscerated if they were doing more than they claimed.

          Other device makers are excused for poor behavior, often with a tech-bro response that is some flavor of "install a custom OS, bro!"

          • lucb1e 2 years ago

            Not only trust; it's also about knowing up front what the system does. The documentation can say exactly what it does, but who reads that? At least not until they get a "hey, that's an interesting pic you took there" pop-up.

    • lucb1e 2 years ago

      I'd also be creeped out, but honestly it's not bad to be a bit paranoid about who can see what you're currently sending onto the internet, and to double-check that things are as they should be.

mensetmanusman 2 years ago

This is a feature for kids’ devices, not the default setting.

zeratax 2 years ago

Optional things like this are fine. Preventing me from joining e.g. NSFW Discord servers wholesale is not, IMO. As an adult I should be able to use my phone however I want.

  • mfer 2 years ago

    > as an adult

    You touch on an interesting element. A lot of iPhone users are kids. Even young kids. Logged in with their parents’ account.

    Not trying to justify what’s going on. Just adding context.

    • ben_w 2 years ago

      Mmm.

      Given the internet as it is, and as it has been even back when FOSS discussions included hating GIF because of patent enforcement, kids shouldn't be on the (general) internet at all.

      Smartphones are even worse, given the deliberate attempts to make content more addictive.

      I'm not sure how to square that particular circle with the likelihood of social exclusion from not being online — it's not like me putting (general) in brackets in the first paragraph will convince the right people that there's money to be made in a genuinely safe subset, despite the existence of YouTube Kids and whatever Netflix's thing is called.

    • TheHappyOddish 2 years ago

      And it's the parents' responsibility not to share passwords and to ensure device usage is appropriate.

      A lot of kitchen users are kids; it's the parents' responsibility to ensure they're supervised, or that the sharp knives are put away out of reach.

  • EA-3167 2 years ago

    100%. It's the difference between empowering users and patronizing them.

  • squeaky-clean 2 years ago

    > preventing me from joining e.g. NSFW discord servers wholesale

    Is this something that iOS (or some other client) does? Or just hypothetical. I don't keep up with these things aside from when they reach HN

ehPReth 2 years ago

So, I can’t make stickers with penises in them for whatever reason (you’d think they’d lump creating/receiving them into the same setting), but the ‘are you sure you want to send or receive something that looks naughty’ setting was, in fact, turned off by default for me. Anyone else?

  • majormajor 2 years ago

    This was an optional setting in the "welcome to the new version, let's get started" walkthrough for me. So the user here turned it on and now acts surprised that it does what it says?

  • jdlshore 2 years ago

    It’s meant for kids.

    • TheNewsIsHere 2 years ago

      The new major OS releases include a version of this feature that you can enable without the parental-control context/overhead. So, for example, if you don't want to see random penises that might get messaged to you, you can avoid seeing them.

      It is optional and disabled by default, just like the child-oriented Communication Safety feature set. They call the adult-oriented version "Sensitive Content Warnings".

    • ehPReth 2 years ago

      Stickers are made for kids? No, not at all. Perhaps it's just not your communication style.

  • brucethemoose2 2 years ago

    Perhaps the default is related to age?

ashildr 2 years ago

I’ve been to more sex parties than I can count and there’s definitely a significant amount of porn on my iOS devices, so I’d not consider myself a prude. Still, I think that giving people the option to hide unsolicited dick pics on their devices is a good idea and not prudish. I think the same about an option to hide pictures of spiders.

Extrapolating from that, we may one day get to hiding queer or Black people, or mixed couples, which will be an interesting situation.

estevaoam 2 years ago

Not hotdog

turquoisevar 2 years ago

Not sure what the point of this post is.

This feature is entirely opt-in for adults.

Are we going to dedicate a post to every optional feature that we enable and act shocked about?

sph 2 years ago

It's not the nanny state, it was the nanny megacorporation all along.

atarian 2 years ago

I think most people would prefer this over CSAM.

  • lucb1e 2 years ago

    I don't think this screen would prevent anyone doing CSA from sending that M, and a C independently wishing to send SM is not necessarily being A, so I'm not sure what you mean

  • Karellen 2 years ago

    Does that sort of content show up often for people who would not prefer it and don't go looking for it?

    As someone who's used earlier versions of iOS for some years now, and who knows a bunch more people who also have, that's not a problem I'm aware of any of us ever experiencing. I realise that anecdotes are not data, but it doesn't seem like it should be a common issue at all...

    • CharlesW 2 years ago

      > Does that sort of content show up often for people who would not prefer it and don't go looking for it?

      Oh yes.

      https://www.psypost.org/2020/08/new-research-uncovers-womens....

      • johnmaguire 2 years ago

        Unsolicited dick pics are not the same thing as CSAM...

        • CharlesW 2 years ago

          Because no underage person has ever sent or received a dick pic?

          • lucb1e 2 years ago

            That's not what your article is about, though, or even something it so much as mentions in passing, so I find the critique that the provided data is not about CSAM legitimate.

            • CharlesW 2 years ago

              > That's not what your article is about though…

              Teen Girls’ Experiences Negotiating the Ubiquitous Dick Pic: Sexual Double Standards and the Normalization of Image Based Sexual Harassment: https://link.springer.com/article/10.1007/s11199-021-01236-3

              "The YouGov poll also highlighted that, of all of the women questioned, 46% received an unsolicited dick pic before the age of 18…"

              Hope that helps.

              • vore 2 years ago

                This is not CSAM. CSAM is none of:

                - A child receiving an adult's naked pictures.

                - An adult sending a child an adult's naked pictures.

                - Children sending each other their own naked pictures.

                CSAM is specifically adults producing naked, sexual pictures of children. Your example is categorically not that.

      • paulmd 2 years ago

        > Only 26% of women reported having a positive reaction.

        I would consider that surprisingly high lol. 1 in 4 women is happy about receiving unsolicited dick pics, really?

        • TheHappyOddish 2 years ago

          What percentage of guys do you think would feel positively about getting boob pictures?

          I think straight people generally like the genitals of the opposite gender; that's not really surprising to me.

tiahura 2 years ago

I thought iOS 17 was going to feature sideloading? I thought the EU was requiring it?

asddubs 2 years ago

Whatever happened to sideloading in this update, anyway?
