Instagram Recommends Sexual Videos to Accounts for 13-Year-Olds, Tests Show

wsj.com

96 points by Umofomia 2 years ago · 108 comments

UmofomiaOP 2 years ago

https://archive.ph/TXPbL

Zelphyr 2 years ago

I know this sounds defeatist but, I don't know why anyone even bothers to report things like this anymore. Meta doing something they're not supposed to do, even after promising they'd never do it again? This isn't news. This is a Thursday.

Our governments aren't going to do anything about it. We all know it. In part because not enough constituents are complaining loudly enough (or, you know, at all) about these companies.

Speaking of which, there is a very quick solution to these companies misbehaving: us. We don't even need to cancel our accounts (wouldn't matter anyway, they won't really delete them). Just stop using the services and let the companies know why. Let them know that when they truly take responsibility for their actions and behave, then maybe we'll come back. Until then, we're refusing to be good products for them to sell.

But, let's face it, there are too many people out there who foolishly think they can't have a social life without social media, despite overwhelming evidence that that's not true. Too many people think they can't do without Amazon, Apple, Netflix, Google, or Microsoft, among many others. So nobody will do or say anything, and these companies will continue abusing us.

I mean, damn. If we can't get fired up enough to do something about one of them blatantly showing sexual material to minors then we haven't hit rock bottom from this particular drug.

  • ksaun 2 years ago

    Besides just not using these services, wouldn't it also be necessary to divest any investments in any of these companies (even if through mutual/index funds, etc.)?

    I agree it is hard to not feel defeatist.

    • Zelphyr 2 years ago

      Actually, if you're a direct shareholder then you have a direct voice that they will listen to if enough people do the same. So, rather than sell, use that voice to demand change.

  • crowcroft 2 years ago

    If this is the innovation we get from Section 230, then we need carve outs.

photochemsyn 2 years ago

The hijacking of basic human emotions and drives in the name of some ulterior goal - turning a profit, establishing authority, etc. - is a notion that encompasses and also predates social media, the internet, television and radio advertising, the invention of the printing press, and the tactics of priests and kings throughout recorded history. It's the most fundamental feature of all such propaganda.

If we wanted to really undermine such efforts, we'd be teaching children about the history of such propaganda tactics and methods from quite a young age. This has been tried before in the USA, but it got shut down quickly at the beginning of WWII:

https://en.wikipedia.org/wiki/Institute_for_Propaganda_Analy...

game_the0ry 2 years ago

I am disappointed that parents allow 13-year-olds to have iPhones + social media accounts.

The downsides have been well-documented. [1] Can we all collectively get together and say "no" to that stuff until like 18?

[1] https://www.youtube.com/watch?v=CI6rX96oYnY

  • jchung 2 years ago

    I have kids who will soon be teens. They don't have phones yet, but I can see the decision looming: my ability to communicate / coordinate with them plus the fact that a huge portion of their social life will migrate online on one hand vs. all of the dangers on the other.

    This is why a lot of parents start with smart watches or restricted phones. They try to get the communication / coordination benefits without the online social risks. But that only lasts so long.

    I'm not sure how I'll navigate it. Probably not by saying "no to that stuff until 18".

    • rsync 2 years ago

      "I'm not sure how I'll navigate it."

      I am not going to address the broad set of questions here but I want to point out that two items exist:

      1) No-data SIM cards ... they call and text only ...

      2) iMessage-only access point - fairly trivial to set up.

      So ... a child can be given a phone - even a smartphone - and it can't be used as a social media device when the family wifi turns off. Further, you can heavily restrict the family wifi without curtailing normal phone calls if you have an iMessage/FaceTime-only access point.

      Again, no magic bullets here but some tools that you might find useful.

    • Zelphyr 2 years ago

      I put off letting my kids have a phone until 14. They had tablets before that. If I could have done it all over again, they wouldn't have gotten any kind of device until at least 16. At best they would have gotten a flip phone.

  • stawik22 2 years ago

    I totally understand your point and others commenting here. IMHO the main problem is other parents giving their children their "old" phone because they want the latest one.

    There is a finite amount of control that you can have; there's always a friend with another phone on which they can watch everything.

    Communication is key. We talk a lot with our 11 yo about the dangers and pitfalls of social media, electronic games, etc. Yes, he has a simple smartwatch (bare minimum: calls, location), yet I think we've managed to develop a healthy digital hygiene. I wish you the best of luck!

  • nomel 2 years ago

    I'm a strong believer that smart watches cover plenty: music, phone calls, texts, schedule/reminders, and even light reading.

  • intrasight 2 years ago

    I'm even more disappointed that parents allow 11-15 year olds to be Instagram "influencers" where 85% of the people being influenced are creepy old men.

    https://www.nytimes.com/2024/02/22/us/instagram-child-influe...

1vuio0pswjnm7 2 years ago

"I started Facebook, I run it, and I'm responsible for what happens here."

https://en.wikisource.org/wiki/Zuckerberg_Senate_Transcript_...

2OEH8eoCRo0 2 years ago

Platforms should be liable for recommendations

When are we gonna hold these companies accountable and stop accepting excuses?

gumby 2 years ago

I am always surprised by articles like this. Instagram never recommends sexual videos to me.

  • tennisflyi 2 years ago

    It’s very easy to train it.

    1. Search “TikTok dance”

    2. Scroll until one is at the beach and thus in a bikini

    3. Let it loop a couple times

    4. Exit the app and come back - that’s your new feed

    • sandworm101 2 years ago

      As a former rock climber and sailor, I appreciate knots. I sometimes watch videos on YouTube about knot theory and rope-handling practices. One subset of this is "braiding" of rope, which is an important rope-handling technique. But I watched one video about braiding hair, and for MONTHS YouTube pushed me nothing but makeup tutorials and teenager fashion junk.

  • laweijfmvo 2 years ago

    The issue with even saying you don't like something seems to be that _everything_ contains some sort of sexualized being. Into fitness? Here's a mostly naked person doing fitness. Hiking? Same. Gardening? Yup! Etc... How can the algorithm possibly disambiguate?

  • mandibeet 2 years ago

    It's remarkable how different people's experiences with social media platforms can be. Instagram never recommends such content to me either, but YouTube Shorts sometimes does.

  • xyst 2 years ago

    search for "transparent clothing influencers" on a fresh account and you will get a flood of them

rdtsc 2 years ago

> In one clip that Instagram recommended to a test account identified as 13 years old, an adult performer promised to send a picture of her “chest bags” via direct message to anyone who commented on her video. Another flashed her genitalia at the camera.

When it comes to encryption and privacy, the legislators just can't wait to jump in and "save the children". Let's see how vigorous they are going to be in investigating and prosecuting Meta for showing inappropriate things to children.

> On TikTok [...] new teen test accounts that behaved identically virtually never saw such material—even when a test minor account actively searched for, followed and liked videos of adult sex-content creators.

Well, isn't that embarrassing? The evil TikTok they are trying hard to ban, and for good reasons I think, is doing a better job "protecting our children" than Meta.

  • fragmede 2 years ago

    This is the same "evil" TikTok that has a child predator problem, where older men DM sexually explicit content to accounts of 14-year-old children*, and a general problem with predators**?

    * https://www.forbes.com/sites/simonchandler/2020/11/02/tiktok...

    ** https://www.mmguardian.com/blog/tiktok-predators

    https://cseinstitute.org/tiktok-and-the-growing-media-exploi...

  • pizzathyme 2 years ago

    The reason for the US government banning TikTok is not that TikTok is a worse product (it is superior).

    The reason is that it is a massive geopolitical risk.

    People often conflate these two.

    • devsda 2 years ago

      Insta/FB/X/TikTok are all massive geopolitical risks for a large number of countries. They have been called out many times as tools for social engineering against governments.

      There is no good side in this debate.

      • pizzathyme 2 years ago

        Correct, this is why Insta/FB/X are banned in China.

        • ocodo 2 years ago

          It's also why a TikTok feed in China is vastly different from the sludge that other countries get from it.

    • elevation 2 years ago

      Correct.

      The attacker in the 2015 OPM hack acquired biometric data on every federal government employee. Cross-referencing against biometrics collected at ports of entry unmasked covert US operations, undermined US interests, and put the lives of Americans abroad at risk. This information is valuable to every adversary equipped to use it.

      The scandal of TikTok is that rather than breaching OPM, a foreign entity has simply requested this information directly from future federal employees. Beyond mere facial biometrics, TikTok knows the childhood street address of ~every future spy -- and everyone near and dear to them. TikTok can generate a psychological profile on ~every future diplomat and international businessman. And the biometrics they can access include things like sleeping habits, gait, vocal cord structure, vocabulary, and accent -- much more than the OPM hackers got.

    • somenameforme 2 years ago

      I find it bemusing that people continue to pretend that some TikTok videos can just completely brain-warp people, when domestic politicians paired alongside US intelligence agencies, countless astroturfed social media accounts, and a completely dysfunctional and collaborative media system can't make people think what they want them to.

      • Zak 2 years ago

        I'm not saying this is true, but as a thought experiment, imagine you are an agent of the Chinese government assigned the task of weakening the west relative to China over the course of a decade or two given TikTok as a tool. What would that look like?

        It can't look like obvious propaganda; people wouldn't use it. It doesn't have to convince people of any one thing to work anyway. Just shortening the average attention span would be enough to weaken a society a little. How about increasing polarization? People produce plenty of polarizing content on their own; just favor that a little more than an algorithm merely trying to be addictive does. Antisocial and self-destructive behaviors should also get a subtle algorithmic boost. Aggrandizing the platform itself is harmful as well - if someone chooses to be an influencer instead of a scientist, damage is done.

        If those all sound like things most social media does, that's the point. It's like popularizing a junk food that's just a little higher in sugar, fat, and salt than the rest. Harmful elements of social media are harder to measure than that though.

    • oh_nice_marmot 2 years ago

      Genuine question: why is TikTok viewed as a risk? How does it differ from Facebook or Twitter, with all the fake news, threats, etc.?

      Imo, the way that some have instrumentalized Western social media for disinformation is a bigger threat than allowing a competitor

      • recursive 2 years ago

        A Chinese entity has a controlling interest. We want all of our manipulation to be domestic, thankyouverymuch.

      • photochemsyn 2 years ago

        FB/IG will delete or downgrade content that's extremely embarrassing to the US government; TikTok will do the same for the Chinese government. It's amusing when a subject (e.g. US government funding for a Chinese virus lab doing bat coronavirus research right before the Covid pandemic) hits both triggers - but X does allow that content to some degree so perhaps X really is a bit more pro-free-speech than either FB/IG or TikTok.

        • kredd 2 years ago

          FB/IG/Threads openly downrank any political content for good reasons nowadays. There was a thread from one of the execs that I saw a few months ago. I actually like that. Not everything has to be "rage bait of the day sprinkled with geopolitical drama". And whenever I have that itch to scratch, I can just doom-scroll through Twitter.

          TikTok is a bit different though. I think they were able to capture the audience and put them in the correct bubbles, so if you just want to watch your funny memes, you won't see anything else. But looking at my friends' feeds, they see the opposite - extremely polarizing political content from all possible sides of the spectrum. It looks like IG Reels has started to eat their lunch, but time will tell.

    • fallingfrog 2 years ago

      How is it a risk?? Are the communists cackling with glee as they steal all those 30 second clips of teens doing the cinnamon challenge?

      • sandworm101 2 years ago

        If your algorithm can convince large numbers of kids to snort cinnamon and eat Tide Pods for dinner, bending their perceptions enough for them to vote or not vote for something would be a piece of cake.

        • fallingfrog 2 years ago

          That’s just the worst reasoning I’ve ever heard. Your if/then statement just doesn’t logically follow.

          Honestly this whole thing reeks of the satanic-panic-type moral scares of the ’80s, or the anti-communist rhetoric of the ’60s, repackaged in new form. Moral panic over something or other is an American tradition; right now it’s TikTok, but God knows why.

          Set yourself a reminder to come back to this comment in 10 years once the fervor has died down and you will read your own comments with bafflement.

          • sandworm101 2 years ago

            Lol. Such manipulation is happening now, has been for over a decade.

            https://en.wikipedia.org/wiki/Cambridge_Analytica

            "Using what it called "behavioral microtargeting" the company indicated that it could predict "needs" of subjects and how these needs may change over time. Services then could be individually targeted for the benefit of its clients from the political arena, governments and companies, providing "a better and more actionable view of their key audiences."

            • fallingfrog 2 years ago

              Boomer brain worms. If you really want to get rid of propaganda, you should take Fox News off the air immediately.

      • sva_ 2 years ago

        It is probably possible to manipulate people to further your political goals in an automated fashion using these technologies. In fact you can manipulate insane numbers of people in all kinds of ways with them.

        • somenameforme 2 years ago

          This runs into a simple logical problem. The powers that be would love nothing more than to be able to do exactly what you're describing. And they have effectively endless resources at their disposal alongside dysfunctional intelligence agencies, a completely dysfunctional and obsequious media, and social media sites that have no doubt all signed up for PRISM and its likely even more insidious successors. Yet society is absolutely, in no way whatsoever, becoming more compliant.

          • sva_ 2 years ago

            > Yet society is absolutely, in no way whatsoever, becoming more compliant.

            I'm not so sure. People are putting up with a lot of bullshit. Not saying that is due to social media though.

          • fallingfrog 2 years ago

            Exactly!

      • justajot 2 years ago

        Tim Ferriss has a great podcast episode with Matt Pottinger that discusses this topic. https://tim.blog/2024/05/08/matt-pottinger/

  • burningChrome 2 years ago

    I would be interested to know how many people come from other social media platforms TO TikTok for this kind of content. Back in the day, almost every subreddit that had porn or women posting pics would also link to their TikTok accounts, and many of the videos were posted on TikTok too, so a woman could post a few pics or videos, link to her TikTok account, and then get people to subscribe to her account there.

    There's also an assumption that users are 100% honest about their age. Simply confirming you are 18 gives you an easy end-run around the content filtering. Even at 8-10, I had friends who were quite ambitious about getting their hands on porn and other material we weren't supposed to have. If the bar is simply lying about your age, I would say that's not a very good way to filter content from underage users.

  • infecto 2 years ago

    Even more damning, imo, is that Instagram is rife with those “young model” accounts.

    Makes sense though. When you have a product produced in an authoritarian state, they probably spend a lot more time on censoring, for better and for worse.

  • atestu 2 years ago

    Same for Snap, which is surprising given its reputation:

    > Despite their systems’ similar mechanics, neither TikTok nor Snapchat recommended the sex-heavy video feeds to freshly created teen accounts that Meta did, tests by the Journal and Edelson found.

    This imo proves that Meta isn't even trying:

    > In some instances, Instagram recommended that teen accounts watch videos that the platform had already labeled as “disturbing.”

    This could be a very simple toggle, it's disingenuous to blame everything on the "black box" of the "algorithm."

    • kredd 2 years ago

      Mostly because Snapchat can't be easily used as a "funnel through discovery algorithms". On the other side of it, it's probably the app where everyone sends their explicit photos to each other, especially within the younger demographics.

  • strangemonad 2 years ago

    It’s almost as if it’s not a single factor issue and insta could stand to improve its under-age content filters AND TikTok can also be a threat because of its ties to China.

  • CommanderData 2 years ago

    The TikTok ban is political, not about saving the children; perhaps it's to stop them from seeing the horrors in the Middle East and tune them into $MindDestroying content instead.

cue_the_strings 2 years ago

I'm genuinely curious: how old were you when you were first exposed to sexual content? I think I was like 7 or 8.

Like, we had a channel (https://en.wikipedia.org/wiki/RTV_Palma) publicly broadcasting porn after midnight (I don't remember watching that) and all the older (~15YO) kids around the neighborhood were collecting porn magazines from who knows where and hiding them around in "caches".

I remember all the older kids being really excited about it, and us younglings being curious but grossed out. There was some pressure to pretend you were interested; kids love pretending they're more grown up than they are. Funnily enough, I remember having the same feeling about the football World Cup: not getting the whole fuss about it, but being expected to be interested.

Couple years later, I naturally found out what they were excited about... For football I never did, though.

  • arp242 2 years ago

    I don't think you can really compare staying up late to sneakily watch (typically soft-core) porn and things like that to always having this content pushed on you by a device that's in your pocket every day.

    It's really a matter of degree and scale.

    • Spivak 2 years ago

      I guess but I had unfiltered access to the internet starting around this age and it was fine.

      There were two stages of my life, too young to understand or have any desire for sexual content, and old enough to want and seek out sexual content. At 13 I was well into the second stage and it's super weird how we pretend that tweens and teens aren't horny as hell despite all of us having lived through the hormonal onslaught.

      Looking back, having access to stuff beyond extremely sanitized softcore porn was extremely healthy. I learned from a young age that sex was a thing people did for fun, it wasn't super serious, and desires I had and was ashamed of were completely normal and honestly kind of vanilla.

      • afro88 2 years ago

        Sure, but go to one of the main porn websites and you're shown extremely hardcore content on the front page, in video thumbnails. Often there's something "rough", probably something about step-siblings, and in most cases the actors are being exploited or quite possibly trafficked.

        I agree with your sentiment though. Removing access to pornographic material isn't the answer. But these days unfiltered access comes with a different set of issues that likely have negative consequences on development of a healthy attitude to sex.

        • kredd 2 years ago

          You have a point, but my guess is it's simply a numbers game. I remember everyone in middle school sharing explicit, gory, "rough" LiveLeak videos in our school-wide MSN group chats. Probably a percentage of people suffered because of it later in life, but at that point not everyone was connected to the Internet yet. Even if the percentage of people who get negatively affected stays the same nowadays, the absolute numbers are bigger, so it makes the headlines more often.

      • mock-possum 2 years ago

        That’s pretty much exactly my experience and my opinion as well -

        I knew what I wanted, I figured out how to get it, I never had a moment’s hesitance at the time, and I’ve never had a moment’s regret now as an adult. I sought out sexual material when I became interested in it, the same way I sought out sex when I became interested in it. I can’t even imagine what the adults in my life could have possibly done to stop me.

      • doublerabbit 2 years ago

        Meanwhile I encountered the wrong, extreme side of porn at 15, and it spiraled from there. It really caused me issues in my twenties, and it's only now, in my thirties, that I finally feel I can start to bury the ashes. I'm on track; I've eliminated what I was obsessed with and am still fighting the last of the demons.

        I'm not who I was and it's been an insane hard task to change myself. I understand why people don't. That's what I have to remind myself every morning.

        During my twenties I was easily wanking 5-8 times a day, I couldn't get enough. I existed within a fandom, the hint is in my username, which is now a thing of the past and I have nothing good to say about such fandom! But, still feels like my horny switch is glued to on. I've even been mulling over the idea of castration just to ensure I have no libido.

        God forbid, I was a creep, and my mind and psyche were a horrorshow. It was only when I accidentally took a heroic dose of a psychedelic that I saw who I was. During the psychosis I saw the devil on my shoulder and where I was to stay in the afterlife. I hit rock bottom, one step away from the point of no return.

        My parents tried, I beat their system. I moved out after college and then had my own place with no restrictions.

        Porn is no joke.

        In true crazy fashion, I even documented the episode somewhat. Old site, so no SSL, et cetera; it exists for legacy... /facepalm

        http://pixc.pl/argh/blogs/paws_palace.html

        • mock-possum 2 years ago

          I mean… no offense man but it sounds like your problem wasn’t porn, your problem was you.

          • doublerabbit 2 years ago

            Debatable.

            The problem did originate from porn. I didn't know what was right and what was wrong; I was naive. And with unfiltered internet access, I was a rogue. I could talk all day about warez, CC gens, botnets. I'm just lucky I've had a solid family environment. The internet was my social outlet then.

            My maturity was a lot lower than my actual age. And because the internet was not a commodity in 2000, and really the wild west, what could folk do? I was just a cowboy who took all the wrong paths, kept taking the left rather than the right, and hid it well. No one knew the dangers of the internet back then, nor how to navigate the dark side. Nor did I have the maturity required.

            Prior to this, I was kicked out of a school subject for "hacking" after discovering a Windows 98 DCOM exploit. I wasn't allowed to touch any computer apart from one. It may have been that I also got pissed off with another student and sabotaged his coursework under the school's newspaper account. A true uprising of a menace.

            I brought a BB gun to school and started showing it off. I had no intention of doing anything with it; this was in Europe, I just thought it was cool. Already having the rep of "hacker boy", I was reported by an older student and got in trouble, and then someone randomly pulled me aside and handed me a CD-R copy of Quake 3 Arena.

            It kept me occupied but sadly never saved my soul, as after college I went to university and had my own dorm room, and by the time I got my own place I was too hooked on porn, among other things. My twenties were all mistakes, but I just barely made it through.

            And here I am now, mid-thirties: job, mortgage, and someone who even surprises myself. Still working stuff out, and most at work don't understand me. I have near to zero friends, but regardless, I'm making great progress.

            I was dealt the wrong set of cards and took all the wrong paths. And I've flipped them, both physically and mentally.

      • avgDev 2 years ago

        What do you mean by "it wasn't super serious"? Sex is serious business.

        Pornography definitely caused issues for me. It isn't an accurate representation of a healthy sexual relationship.

  • dkarl 2 years ago

    I was exposed to porn before I was even interested. I think I was seven. My friend's parents had the complete cable package that carried all the channels the local provider offered (which at the time I think was around thirty channels) and one of them showed porn. I only saw it once when I slept over, so I don't know if it was a dedicated porn channel or if it was just late night. The few minutes I saw was soft core, but looking back in retrospect, I'm pretty sure it was leading up to a hard core scene. My friend was also seven, but he seemed pretty obsessed with it. I left him to it and played video games instead, and I never slept over again.

    I didn't get exposed to porn after that until a couple of high school trips where we got to stay in hotel rooms and of course figured out how to access the pay-per-view porn. But that was it. The "inspiration" for my teenaged solo sexual activities came from catalogs with models in lingerie, relatively tame TV, the Sports Illustrated swimsuit issues, tennis magazines, and my imagination. The closest thing I had to porn when I lived with my parents was pausing a rented VHS tape of an R-rated movie, on one of the rare occasions when I had the house to myself.

  • throwaway22032 2 years ago

    When I was younger porn was a properly seedy thing that people would consume but almost no-one knew anyone that 'did it'.

    Nowadays you have OnlyFans models or similar basically all over every social media platform, businesses trying to sell it as a legit lifestyle, etc. It's completely different, because it's becoming much more bidirectional and open now.

    • tennisflyi 2 years ago

      Porn and OF are not similar. Porn is more of a job and OF a lifestyle

      • probably_wrong 2 years ago

        OF is porn with a fresh coat of paint.

        Sure, it is advertised as something one does as "a lifestyle", but when the model works for a professional studio with metrics to reach and gig workers chatting with customers while pretending to be her [1], how is that not a job? And more important, how do you know that the model on screen is doing what (s)he's doing because of their "lifestyle" and not because they are behind on their weekly numbers and may otherwise lose their spot in the agency's rotation?

        [1] https://news.ycombinator.com/item?id=40391797

      • AlexandrB 2 years ago

        I don't follow. In my mind OF is just "democratized" porn that doesn't have to go through seedy producers/distributors. It's like Uber for porn.

        • tennisflyi 2 years ago

          The top of OF is usually a couple gallivanting, courtesy of simps dedicated to a model, or producing content from their modern apartment while fucking/taking nudes -- thus, lifestyle.

          Riley Reid/agency models go from job to job as ICs and get a check from the company via simps dedicated to a certain website.

          The former is a much easier sell/explain and a bit easier to justify to puritans. It’s all very samey but different.

      • throwaway22032 2 years ago

        I think that what you're saying is that oldschool pornography jobs are different to how pornography works on OnlyFans.

        They're both still porn and they're both definitely a lifestyle. One does not merely "do porn".

  • ajsnigrutin 2 years ago

    But it's different... you actually had to seek out porn to find it back then (like staying up past your bedtime, turning on a relatively bad TV station, or sneaking sticky magazines from somewhere).

    Now it's seeking you... open instagram, and you get porn ads, even if you're 13.

    On the other hand, boobs in shower gel ads were pretty normal back then, and no one really got excited by them. Also asses in thongs for suntan lotion ( https://svetkapitala.delo.si/media/images/20200217/322547.wi... )

    • usrnm 2 years ago

      > you actually had to seek porn to find it back then

      You clearly haven't seen the early Internet. Giant flashy banners with semi-porn content everywhere.

      • ajsnigrutin 2 years ago

        Not really... maybe on some seedy pirate-y ('warez' back then) websites.

        If you visited whatever site aggregator was popular locally and opened any of the sites in the first two pages of any category, you didn't see a single boob.

  • cmrdporcupine 2 years ago

    Culture around this stuff is completely different in the Anglo-American world, for one. Europeans are accustomed to nudity and the like everywhere, and it's not even particularly ... anything.

    But also, 1980s porn magazines or whatever are very different from today's porn content. A brief visit to any of the free "tube" sites exposes you immediately to a level of intensity that you wouldn't get from Playboy or Penthouse in the '80s. Right on the front page you'll find choking, incest ("stepdaughter"), BDSM, etc.

    I don't particularly want my teenage son traipsing into that before the context and patterns of healthy mature sexual relationships have been established.

    But more than anything, I don't want it pushed on him.

gigatexal 2 years ago

Maybe social media was a mistake. Oh well… Pandora’s box and all that

Congress won’t do anything; it’s too mired in infighting and lobbyists. And these companies’ better angels won’t do anything about this.

  • throwway120385 2 years ago

    Maybe using opaque algorithms to score and assign prominence to specific content based on ill-defined metrics like "engagement" was a mistake.

    • aydyn 2 years ago

      If an algorithm correctly determines content that will keep a person interested, and that content is, let's say, problematic, isn't that an issue with the person, not the algorithm?

      • AlexandrB 2 years ago

        Sure, in the same way that gambling addiction could be argued to be an issue with the person. But we still keep kids out of casinos because they don't have the judgement to make "good" choices when exposed to this kind of stimulation.

      • throwaway173738 2 years ago

        My mom is currently struggling to quit drinking and Facebook is constantly showing her ads for White Claw. Expecting her to navigate that during withdrawals and recovery is completely insane.

      • lovethevoid 2 years ago

        Yes, that's still a problem with the algorithm as it's prioritizing problematic content for increased ad revenue - something those buying ads on the platform should be extremely concerned about.

      • hn_throwaway_99 2 years ago

        If a dealer correctly determines that crack will keep a person interested, and that crack is, let's say problematic, isn't that an issue with the person not the crack?

        • degmee 2 years ago

          We appear to be arguing over semantics. The points being raised here are variations of the following questions: how much is a human life worth, and whose responsibility is it to preserve that life? If you want to poison yourself, is anyone under any obligation to stop you from finding or giving you that poison?

          • throwaway173738 2 years ago

            Let’s stop with the semantics and get down to specifics.

            • aydyn 2 years ago

              I mean, sure, the specifics are:

              It's not the Internet's or society's job to raise your child.

              If you're a parent and you don't want your teen looking at adult content on Instagram, it's your job to give them some rules and enforcement.

              • Jensson 2 years ago

                > It's not society's job to raise your child.

                It actually is. In general, children are raised more by society than by their parents; the environment has far more influence on them than parents do. It is society's job to make good environments for children, because that is far too much for parents to handle alone.

lalos 2 years ago

I wonder whether whoever ran this test prefers this type of content themselves, and whether the test account was tied to a shared wifi network, device, linked account, IP, or location. There's no detail about the test setup, so there are too many uncontrolled factors, and not enough transparency for reproducibility or for spotting flaws in the data collection, to draw a conclusion.

  • jtbayly 2 years ago

    It doesn’t matter. The account was marked with an age. IG pushed sexual content to it.

    You can add as much other information to the picture as you want, but that’s the black and white issue.

    Besides, how often do 13-year-olds have their own wifi and IP address?

  • crowcroft 2 years ago

    Ah yes, too complex to really understand what's going on so we will hand wave it away.

    It should be a hard rule that people under a certain age CAN NOT have this kind of content recommended, there should be precisely 0 ways for the algorithm to promote sexual material to children.

tennisflyi 2 years ago

Sex is everywhere. The internet has a lot of sex; that has been kind of the default since its inception. It's wild that people are just now learning of its presence.

  • octopoc 2 years ago

    Sex ed via profit-seeking corporations is going to be pretty different, and arguably much more unnatural, than sex ed via parents and other people who care about you.

  • iamacyborg 2 years ago

    There’s a pretty big difference between going out of your way to look for it vs it being served up to you in a feed.

  • EgregiousCube 2 years ago

    It's true, it just used to be behind the scenes available to those who seek it out. Like that skeezy room at the back of the video store that had curtains blocking it off.

    IMO the real "problem" here isn't technological, political, or corporate - it's just a slide of social norms toward hyper-permissibility of immodest or bad behavior. The resolution will likely be social as well, and people realizing that kids are getting exposed to smut will likely hasten that. Laws might get passed, corporations might change their policies, but only after the pendulum swings back socially.

  • recursive 2 years ago

    You might have misunderstood if you believed this article was about people just learning that sex is on the internet.
