Apps That Use AI to Undress Women in Photos Soaring in Use
bloomberg.com

A few days ago I saw an ad for a realistic-looking extra finger: you attach it to one of your other fingers so that you have an extra finger sticking out. Then when someone takes a real incriminating photo of you, you point out the extra finger and say it's AI-generated.
Someone is going to troll civilization with an AI service that produces incriminating deep-fake photos, security videos, event ticket stubs, receipts, chats, and suspicious cryptocurrency transfers, all telling compelling tales of motive, opportunity, and means.
Preemptively plant evidence to frame anyone you want for any crimes you plan to commit. I hear police departments love cases where they can efficiently dispense with looking for alternative suspects. But leave a dozen backup shaggy-dog crumb trails to other suspects, just in case. Judicial system, take that!
I have no idea where deep fakes are taking us, but if we don't soon get a working system of cryptographic proof of provenance, location (proof via network latency), and timestamps for all "evidence", journalistic photos, questionable selfies, and nights alone at home nowhere near any crime scene, things are going to get very crazy.
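Purely as a sketch of the provenance half of that wish list, here's roughly what signing a photo's hash plus a capture timestamp could look like in Python, assuming a device-held Ed25519 key and the cryptography library. The function names and record layout are made up for illustration, and a real scheme (something like C2PA) would also need trusted timestamping, key attestation, and the latency-based location proof, all of which this skips:

    # Sketch only: bind an image hash to a capture timestamp in a signed claim.
    # Assumes a device-held Ed25519 key; a real provenance system also needs
    # trusted timestamping and key attestation, which are omitted here.
    import hashlib
    import json
    import time

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def sign_capture(image_bytes, private_key):
        claim = {
            "sha256": hashlib.sha256(image_bytes).hexdigest(),
            "captured_at": int(time.time()),  # ideally from a trusted time source
        }
        payload = json.dumps(claim, sort_keys=True).encode()
        return {"claim": claim, "signature": private_key.sign(payload).hex()}

    def verify_capture(image_bytes, record, public_key):
        claim = record["claim"]
        if claim["sha256"] != hashlib.sha256(image_bytes).hexdigest():
            return False  # the image was altered after signing
        payload = json.dumps(claim, sort_keys=True).encode()
        try:
            public_key.verify(bytes.fromhex(record["signature"]), payload)
            return True
        except InvalidSignature:
            return False

    # Usage: sign at capture time, verify later against the published photo.
    key = Ed25519PrivateKey.generate()
    photo = b"...raw image bytes..."
    record = sign_capture(photo, key)
    assert verify_capture(photo, record, key.public_key())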
> A few days ago I saw an ad for a realistic-looking extra finger: you attach it to one of your other fingers so that you have an extra finger sticking out. Then when someone takes a real incriminating photo of you, you point out the extra finger and say it's AI-generated.
Link?
Sixth finger! Sixth finger! Man, alive! How did I ever get along with five?
(A song from a '60s toy ad. The sixth finger had invisible ink, a dart gun, the usual spy stuff.)
Nice, personally I like to wear glasses with glitchy edges.
That ad was a joke, as admitted by its author.
That said, this defense would not save you for a bunch of reasons.
If it's dumb but it works, it's not dumb.
That's not a very good rebuttal to the claim that it doesn't work.
> [...] a bunch of reasons
Not everyone would classify this as a valid or persuasive rebuttal.
I'm not ashamed to say that it would likely convince any jury I'm on.
Pretty easy to photoshop that out, but then you say "Nope, I wore an extra finger"
Why not just a shirt with some gibberish printed on it? Seems like it would be more convenient than an extra non-functional finger.
But the AI will remove the t-shirt…
why not just wear googly eyes?
It's fascinating how much of a gap there is between how this is socially perceived (undressing someone) and what's actually happening (a hallucination of a naked body).
I see how this can lead to genuinely problematic situations, but it's also so dumb I don't think there will be any going back. I have no idea how we'll deal with this. I can't imagine we'll just accept that people have bodies and don't need to be harassed over it, but there must be a way, and we'll find out.
At least people will be able to claim that their authentic nudes were faked.
They could do that before by claiming they were photoshopped. The reality is that in these situations, whether the nudes are fake or real, a claim that they're fake is unlikely to be believed once they've been publicly leaked.
The point isn't that it wasn't possible before, it's that it's 100x faster, easier, and cheaper to fake nudes now than it was previously. I think the reason claims of leaked nudes being fake are scoffed at is that making 100% believable fakes used to take non-negligible time and effort, and maybe even money. Now it's just "upload clothes-on image, download clothes-off image 20 seconds later". I think it's not insane to expect this huge shift in accessibility to eventually cause social changes — in the same way that asking for someone's Instagram is normal now despite Instagram being a relatively new invention, I think it's pretty obvious that a few years from now, nude photos will be assumed by default to be fake rather than real. Social standards are a product of reality; they don't determine reality.
And until your prophecy comes to pass, if it ever does, women are out of luck.
Agreed. The "transition period" between this technology being invented and society finally adapting to its existence is probably going to be quite rough for a lot of people. My only hope is that society adapts quickly.
If God meant for people to be naked, then they'd be born that way.
Society has a lot of adapting to do, before they lose their superstitious hang-ups and religious bigotry. It will be a long time, if it ever happens at all.
If it ever does, what's your plan if it doesn't go as you predict? After all I don't think you're from the future coming to reassure us that all this is actually totally fine.
I mean, I'm just making an educated guess based on how society has historically been quite good at changing its norms based on new technology. We didn't have cars, and then we did, and within a generation it became extremely normal and expected for people to drive. Digital cameras were a big deal when they were invented, but for people like me who grew up in the early 2000s, they have never felt that unusual because I've been exposed to videos of myself as a baby for as long as I can remember. When The Avengers was released, it caused a huge amount of hype and hubbub. Now, Marvel produces a dozen superhero movies a year and nobody cares anymore.
Do I think that society will transform its views on nude images overnight? No, obviously not. Do I think that every person in society will go with the flow and adopt new assumptions about nude images? Also no. But it seems entirely logical to me to imagine that once the average person notices "hey, I am being inundated with 1000x the number of fake nudes that I used to," they will eventually reach the conclusion that "I should start assuming all nudes are fake by default."
You're just restating your prediction with more words... what should we do if it doesn't come to pass?
Like, I get that it feels cozy to think it'll just blow over, but it ends up being a "thought-terminating cliché" that gets repeated ad nauseam in these conversations. I think it's reasonable, even if you think the problem will pass, to have some idea of what to do if it doesn't. And that's more productive!
You're totally right, apologies for my non-answer earlier :)
I feel like education is probably our best bet for sort-of-kind-of dealing with the repercussions of the problem. Ultimately, attempting to legislatively regulate the production and private distribution of AI nudes just doesn't seem enforceable to me (see Brandolini's principle). I think our best hope to mitigate the negative impact of this tech is to talk widely about it and actively try to make my "prophecy" come true. Sex-ed curriculums should start including warnings about how fake nudes are an epidemic and begin to build the new social norm of not assigning moral judgements based on them.
I could also see some government-funded PSAs on popular streaming services/TV channels/radio channels being effective. You'd want to try to disproportionately target such messaging to older members of society, since they're less technologically literate on average and therefore more likely to be emotionally traumatized by tech like this.
Now that I'm writing this, I do wonder if there are some regulatory approaches that could help, especially since we have had some success banning CSAM legislatively. The problem with AI nudes, I think, is that they are destined to become indistinguishable from genuine nudes, so a blanket ban is impossible without establishing a panopticon (which is obviously bad). Blanket banning works for CSAM because there is never any case in which a piece of CSAM should ever be knowingly stored or distributed, regardless of its origin. It is always immoral. This complicates things because nude images of adults can be shared consensually by said adults, and banning this practice would be absurd.
I'd be curious to hear your thoughts as well. What sorts of options are there to combat this new AI-generated nude plague if my optimism about the situation turns out to be misplaced?
It's already happened. Everybody knows the vast majority of celebrity nudes are fake and you can't easily tell which ones aren't. It's not that far of a stretch to extend that mindset to non-celebrity nudes.
It's the Naked Streisand Effect: the ones the celebrities made a big deal about are probably real. The most effective defense is to release more fake ones with outrageously, unbelievably sexy, if not anatomically impossible, features, so people will want to look at those instead of the real ones you're trying to bury, and to make a big deal out of denying and suppressing the fake ones while ignoring the real ones.
The harm here is quite unintuitive. Can anyone take a stab at a first-principles derivation of why this is harmful?
If you look at it as a form of revenge pornography, where the intention of publicly sharing it is to demean and shame the victim, the harm should be pretty obvious. People kill themselves over selfish actions like that.
I believe the parent comment's point is that if the default assumption is that a nude of a real person on the internet is fake, the nude should lose a lot of its power to shame. If nobody believes a picture is real, nobody should believe that the events in the picture happened.
I'm not sure that's true, though. Evil rumours should be assumed to be untrue, yet they're often believed and often cause significant harm.
People have a tendency to keep ahold of their first opinions. And the first opinion will not be "this is obviously fake" - because it looks real, and we're animals first, humans second.
It's not really the same; rumors are more privileged. Rumor is a method of distributing information, and you can't do away with the distribution of information.
I hate that you're being downvoted for this.
Even today, women are judged not on the facts of who they are, but by the image men (and women) create around them. They're already bullied and harassed for their virginity (or lack thereof), "body count" (real or perceived), promiscuity, their conformity to society's standards of beauty, and so on.
Nudes, especially those created specifically to objectify or harass, will harm women. It will harm their self image, it will harm their reputations, it will harm their careers (women are already being fired from jobs because of nude pictures of them on various websites).
People are not rational beings. We can know intellectually that a picture is probably a fake, and it won't change how it impacts us. Especially if it plays on our expectations. We tend to go with our first impression, even if it's later proven wrong.
See: "Biden was shown images of babies being beheaded." An image many of us still have in our mind, even though it was explicitly debunked.
If fake nudes become routine, it will become unreasonable to be hysterical over nudes.
I'll simply repeat: People are not rational beings.
Even today, we believe stories which could easily be faked. We get outraged over stories which have long been proven false. Our belief in stories can even become stronger in the face of such proof.
Unreasonable... humans are entirely unreasonable, and will remain so even in the face of AI.
You can't disbelieve all stories, because there are important stories to believe; that's not irrationality, it's the difficulty of categorization. But you can disbelieve nudes; they don't need accurate categorization.
What do we do if that's not true?
You might as well open Paint and copy the head elsewhere; it will look shopped, but who cares.
This is like complaining that if someone scribbles over a map with blue crayon that they're creating a lake and destroying property. The map is not the territory. And even more than the normal meaning of that in this case the image is really not actually a photo of the person.
The complexity of the algorithms involved in making images obscures that simple fact and enables a lot of hyperbolic nonsense and dangerous calls for use of force. There's no one being "undressed" here.
Not really. If you sell someone a fake map and pass it off as authentic, you could be liable for fraud. The problem isn't that people are being "undressed", it's that the images are circulated as authentic in order to harm people.
There's certainly legitimacy to the public distribution aspect you lay out. I imagine there are laws that'd almost cover that already. But if you look around this very HN post you'll see most of the calls are for making generation itself illegal, not distribution.
edit: you yourself argue for making generation illegal in a different thread up above.
I am less and less inclined to believe that people actually believe such images are authentic.
How about you try it out and send it to your mom and see if she believes it's authentic?
Or how about have someone else send it to your boss, work contacts, and coworkers? Whether or not it's authentic, that's irrevocably harmed your relationship with them and there is nothing you can do to reverse that.
How has it harmed the relationships?
you seriously think if your ex sent your boss fake nudes it wouldn’t have any negative impact?
you must be one of the most understanding employers on the planet if this is the case
people believe the Moon landing and the Holocaust are fake; there's a very low bar, and fakes get better every day
People fight wars over maps, even though they are not the territory. Nine dashes, can you imagine? The maps can be just as risky as the territory. People will feel just as violated by fake nudes as real ones.
People don't fight wars over a random human person drawing over a map. It's not even something you'd think would be made illegal.
The wars happen when nations disagree about borders. Those borders are represented by maps but luckily nation states don't confuse the two (except for political theater and posturing). The wars happen when the real physical objects go over the real physical borders not when a random person scribbles on a map.
I think my analogy holds up pretty well even with your stretching of the scenario.
Do people have an issue with all fake nudes?
For example, there is the sculpture of a nude Britney Spears giving birth:
http://www.arthistoryarchive.com/arthistory/contemporary/Con...
(NSFW: There is nudity).
Is there a difference between that and what an AI generates?
I believe, as with most actions (and crimes), intent matters.
You can take a picture of a clothed teenager and it can be perfectly and socially acceptable. Or it can be potentially illegal, based on intent. The same with images taken of nude babies.
Are you creating nude art with the intent of celebrating the human form, or of sexual objectification? If it's the latter, do you have consent?
Intent matters, and it will probably be the linchpin of legality in these discussions.
The term "fake" implies that it is being passed off as real. No one thinks the sculpture of Britney giving birth is the real thing, and it's not being displayed as such.
Based on where things are at, I feel like there's legitimately no way to stop this. That said, there will be attempts to stop this, and that is something I'm almost more scared about.
What laws will be made attempting to stop what likely can't be stopped?
When something like this becomes commonplace, will society just sort of get over it? Fake nudes are fake. They only have power because of the perceived taboo over the real thing.
I don't want society to "get over" highly realistic fake nudes of high school girls. We have to draw the line somewhere (although I'm not sure how to solve the problem). There have been many cases reported of such images being passed around schools as a form of harassment and bullying. Such incidents are highly damaging to the victims, sometimes essentially forcing them to transfer schools and in a few cases perhaps even contributing to suicides.
If harassment is bad, then make harassment illegal.
That's a non-answer, totally disconnected from the reality of the problem. Harassment is already illegal. Punishing the harassers doesn't undo the damage. And in most cases the harassers are themselves also minors. They often act without really thinking through the consequences, and because they're minors they can't be given severe punishments.
MIC DROP!!!
It's the harassment that's the problem, not the fake nudes.
I dunno about that. If they get realistic enough that you can't differentiate between real and fake people will definitely use them to abuse, bully, etc.
Whatever the computer guesstimates as your nude body will always be an estimate. It cannot know what you actually look like under your clothes.
Being able to differentiate real from fake isn't the point. An occasional nude is special. An occasional fake nude is special. An infinite number of fake nudes might just stop having any real power to abuse.
Sexuality is mostly in the mind, and a real nude image has an element of fantasy that a fake doesn't have. The fact that it's just a made-up bunch of pixels with no real meaning in real life is a factor we haven't yet fully experienced.
There are revenge porn laws. Making fake porn of a person and publicly distributing it could fall under that sort of legislation.
This is stickier than that in the US. I can paint a realistic nude and it's expression. Whether my tool is a paintbrush or neural network, as far as I know right now, makes no difference.
You might have some opportunity exploring a defamation action, but still, it's tenuous.
I don't have an answer, but I think we're looking at constitutional battles.
I think a great example of this at a high level is Kanye West's "Famous" music video, which features fake nudes of
1. George W. Bush
2. Anna Wintour
3. Donald Trump
4. Rihanna
5. Chris Brown
6. Taylor Swift
7. Kanye West
8. Kim Kardashian
9. Ray J
10. Amber Rose
11. Caitlyn Jenner
12. Bill Cosby
https://en.wikipedia.org/wiki/Famous_(Kanye_West_song)#Music...
It ended up being received negatively socially, but there were no legal issues, and the video is still up on YouTube.
Virginia, for example, has made it illegal.
Yes, but I don't think it has survived a challenge. Has SCOTUS heard anything at all in this area yet? I didn't think so, but I could be wrong.
I'm not from the US, but intuitively, haven't most laws never been challenged and heard by the Supreme Court?
I think there are only a few such laws. The Supreme Court also routinely declines to hear a case if it hasn't been challenged enough elsewhere.
I mean most laws in the general rather than the specific context. To be more obvious, I'm pointing out that just because the Supreme Court hasn't offered an opinion doesn't delegitimize the law. Otherwise there'd be thousands(?) of laws in a similar boat.
For further comparison, revenge porn laws have a similar potential for a First Amendment defense but have thus far weathered challenges.
Publicly distributing it is illegal. But creating it currently isn’t. And how do you make a reasonable law against the latter?
Same as we have reasonable laws against possession and production of all sorts of illicit goods from meth to CSAM.
There's nothing new here that couldn't be done manually before; all that has changed is the speed at which it can be done.
And yet both of them have thriving markets. The market for adults being undressed will dwarf both.
I do not see any way a law will stop what is essentially math from being performed.
While everyone is worried about job losses and Skynet, these sorts of side effects of accessible AI will be what fundamentally shift society.
I mean murder is illegal and yet there are still murders. Ipso facto we should legalize murder forthwith! Checkmate pacifists!
All murders are non-consensual; how do you know if nudes are non-consensual? Would it also be illegal to draw a nude picture of someone?
We’re discussing non-consensual images.
Yes, so how do you determine whether an image on someone's computer is consensual or non-consensual? Where do you draw the line? What if I draw an image that looks like someone and it's only on my own computer?
So how do you determine if someone is in possession of non-consensual nude pictures? We as a society properly assume that all underage nudity is non-consensual, because children can't legally give consent to such pictures.
Those are not reasonable laws.
But the creation/use of CSAM and meth harm people other than the person using them. I don't see who's being harmed if a model is used to generate a nude image, and that image isn't distributed.
I wasn't making a qualitative comparison of the harms, merely responding to the question of what kinds of laws and enforcement mechanisms will be used to stop AI pornification of people.
Simply put, I don't think we'll see anything we haven't seen before, because this isn't a new thing that hasn't been done before; only the means of production (using AI) is new.
I responded with a little more detail to a sibling comment, but if laws banning CSAM and meth exist because of the negative externalities of possessing them, that's where I think this is different. If we see laws come about regulating AI private-use content production, I suspect we'll see some unique court cases about it, premised on free speech / lack of harm.
The production and sale of CSAM and meth harms people, mere possession and consumption harms only the user.
I don't think the analogy to generative models holds. The possession and consumption of meth and CSAM _drive_ production of meth and CSAM; it's a supply and demand market. So simply possessing it harms others as well.
A generative model is different. You can produce as much content with it as you want, hallucinated out of thin air. The production of said content doesn't have externalities, other than electricity usage, if the content isn't distributed.
> The possession and consumption of meth and CSAM _drive_ production of meth and CSAM
This is not true; it's the buying of it that drives production. If you make your own meth or pirate CSAM, then you would be harming only yourself.
Possession and production can be done in imagination. After all, it's just information.
I think that people can probably go after anyone marketing such tools:
According to the article, one of the largest purveyors markets it for use of “AI images”.
Perhaps it is the case that our legislators, in their infinite wisdom, have dragged their feet on privacy issues precisely because they knew it would lead to this enforcement nightmare when AI was used to generatively breach our most basic personal boundaries. My god, this whole time I thought they were just corrupt and incompetent, but they were actually protecting us!
Any photo becomes potential porn, so it can be regulated with a photographer license. Likewise any unlicensed photographer becomes illegal.
Doesn't seem so difficult? Make it illegal to produce and distribute nude images of people without their consent. Enforcement can't be perfect, but neither is enforcement of CSAM or revenge porn laws.
Very shortly you won't even need to distribute the "nude" image, though. The reference image just needs to be public, with a one-click way to get the transformed result.
If we really shorten that, it could all fit into a URL with a parameter pointing to the original, and nobody's distributing a specific image; they're all privately generated.
I think he means the attempts to prevent people from doing it privately without distribution.
why would you want to do such a thing? clear 1st amendment situation
that's a problem with tons of laws, I can cook meth at home but I'm unlikely to get caught until I start distributing
A bit like photoshop used to be used in the early days, but with a procedurally generated body instead of a cropped image.
FUD. Women (or any other gender) cannot be undressed in a photo. Only IRL.
What these software programs do is something else entirely. They refactor the photo into something else which is not the same person in a naked state but an entirely fictional work of "art". It is analogous to posing in front of a painter fully clothed, and the painter applying anatomical knowledge and imagination to add artifacts (or even nudity) that never existed IRL.