Deepfakes weaponised to target Pakistan's women leaders

france24.com

74 points by mostcallmeyt a year ago · 32 comments

xhevahir a year ago

They say that "media literacy is poor" in Pakistan but it's not at all clear that education would solve the problem. We've seen in more "media-literate" countries a free-floating distrust of media of all kinds that is exacerbated by the keener awareness of deceptive techniques. Niklas Luhmann has written about this "suspicion of manipulation:" https://monoskop.org/images/6/6c/Luhmann_Niklas_The_Reality_...

  • chii a year ago

    The fact that advertising works, despite many claiming they're not susceptible to it, is good evidence that media manipulation works amazingly well - even on people who think they understand it and would not be manipulated.

    • renox a year ago

      OTOH there's also the saying that there's no such thing as bad publicity... So it's not so obvious!

alephnerd a year ago

Deepfakes are being weaponized against all politicians in Pakistan - not just women.

FYI, Pakistan is also in the process of rolling out its version of China's Great Firewall, and has heavily leveraged the "AI Safety" argument for it [0][1].

Also, it's quite rich that a PMLN minister in Punjab is complaining about electoral safety when her party viciously cracked down on the PTI.

[0] - https://tribune.com.pk/story/2467727/rs20b-sought-to-boost-c...

[1] - https://asia.nikkei.com/Business/Telecommunication/Pakistan-...

_-_-__-_-_- a year ago

I would say that using AI-generated images and videos, built from a person's photographs and media without their consent, to depict that person in sexual, pornographic or criminal acts is morally wrong. It will, however, only get easier to create and manipulate such images. Generation will become automatic and the barriers to entry will keep falling. The social default will be to assume that everything is AI-generated or manipulated. Whether the content is real will not matter.

  • para_parolu a year ago

    We basically get back to the pre-photo era, when you judged all your information by the credibility of the messenger.

    • chii a year ago

      or, for photos, only film evidence can be admitted. Digital content (or even digitized content, unless there's a film backup) is inadmissible.

      • estebank a year ago

        Making film-backed images from edited digital pictures is not beyond the ability of a motivated individual, let alone a state entity. Even something as rudimentary as a film camera pointed at a slightly out-of-focus 4K screen can be made believable, especially on a film stock with heavy grain.

        • chii a year ago

          I guess if you have enough resources, faking something is possible. But making it expensive means fewer people could do it. The advent of AI, and the ease of faking it brings, is what is dangerous.

          Perhaps images taken by digital cameras in the future need to come with an IR depth map as well, which would make them harder to fake.

          • burnerthrow008 a year ago

            But it doesn't even seem expensive?

            4k monitors are cheap these days, as are used 35mm film cameras. I bet you could build the system estebank mentioned for less than $500.

            Adding a depth map to a simulated image sounds cheaper than adding it to a real image.

      • qingcharles a year ago

        I can use film to take a photo of a screen... :/

        The way they solve this issue in court is having the photographer lay a foundation by swearing to the authenticity of the image.
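The film-versus-digital exchange above is essentially the provenance problem that signed-capture schemes (C2PA-style "content credentials") attack with cryptography instead: if the camera signs the pixels at the moment of capture, any later edit invalidates the signature. A minimal Python sketch of the idea, using an HMAC as a simplified stand-in for a real per-device asymmetric signature (the key and function names here are illustrative, not any real camera API):

```python
# Sketch: signing image bytes at capture time makes post-hoc edits
# detectable. HMAC is a symmetric simplification; a real scheme would
# use a per-device private key plus a published certificate.
import hashlib
import hmac

CAMERA_KEY = b"hypothetical-key-burned-into-camera-hardware"

def sign_capture(image_bytes: bytes) -> bytes:
    """Tag the camera would produce at the moment of capture."""
    return hmac.new(CAMERA_KEY, image_bytes, hashlib.sha256).digest()

def verify_capture(image_bytes: bytes, tag: bytes) -> bool:
    """Check that the bytes are exactly what the camera signed."""
    expected = hmac.new(CAMERA_KEY, image_bytes, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

original = b"\x89PNG...raw sensor data..."
tag = sign_capture(original)

assert verify_capture(original, tag)                 # untouched image passes
assert not verify_capture(original + b"edit", tag)   # any edit fails
```

In practice the camera would hold only a private signing key and publish a verification certificate, so verifiers never need the secret; the shared HMAC key just keeps the sketch self-contained.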

  • somedude895 a year ago

    Maybe this will get people to stop taking seriously what they see and read on social media and go engage more with the real world and real people.

  • emporas a year ago

    I would say using goggles/glasses which undress everyone you see on the street is not morally wrong. How about recording and uploading it somewhere?

    Also, not having statues of naked women and/or naked men in public streets is morally wrong.

abdullahkhalids a year ago

While the main fact of this story (that deepfakes were created) is true, the article overall seems like a propaganda piece by a highly unpopular government to create a soft image. Ignore the other stuff in the article.

  • alephnerd a year ago

    My gut says it's probably being used as an argument to deflect from the rollout of a China style Great Firewall.

isr a year ago

Just FYI, for those not in Pakistan who might not be aware: the minister mentioned in the article (yes, deepfakes are a form of sexual criminality!) is a direct underling of another female chief minister (who won despite her party receiving less than 20% of the votes - yay military rigging) who ... get this ...

runs an extortion ring in which hundreds of judges, politicians etc are honey-trapped and copious videos are made. She (and her husband) have OPENLY bragged about this, FOR YEARS. One judge (whose video was released by her team) committed suicide three years ago. The rest of the judges all toe the line (hence the laughable "lawfare" against Imran Khan).

She makes J Edgar Hoover seem like a novice. Just keep that in mind ...

  • isr a year ago

    A few brave souls (e.g. Senator Swati) have openly admitted to being targets of this attempted blackmail (his was an explosive, tearful press conference three years ago).

    She even does it to the military top brass, and they do it to her and her goons. In short, they're all low-rent garbage, so save your tears for the more worthwhile thousands of political prisoners (MANY of whom are female) of this military-led, crime-family-infested, American State Dept-sponsored gang of thieves and cutthroats (e.g. witness the recent Islamabad massacre, helpfully ignored by the mainstream US media).

    • selimthegrim a year ago

      If the State Dept threw out IK, that was wrong - but why has everyone forgotten IK begging the "umpires" to throw the govt out when he was out of power, before he was elected (with some help) the first time?

  • nox101 a year ago

    Deepfakes are not a form of sexual criminality. Geez, Thought Police here we come!

    • zamadatix a year ago

      Whether deepfakes are a form of sexual criminality depends entirely on the jurisdiction: either it is a crime there or it isn't. That's independent of whether individuals or the population agree it should be, whether it's moral, whether it's seen as overreach, or any other consideration beyond what the law there says.

      • nox101 a year ago

        Yeah, sure - being gay, or having anal sex, is still a crime in some places too.

        We can argue it shouldn't be considered a crime, and I'd argue deepfakes should not be one. I'd also argue that considering them a crime is close to "thought police".

        And I'd argue fighting them is a losing cause. They get easier and easier to make daily. Lots of papers at the most recent SIGGRAPH on how to take a single photo and turn it into a deepfake... oh, I mean an avatar for a game, a model for virtual clothes fitting, etc.

        • kristiandupont a year ago

          >I'd also argue considering them so is close to "thought police"

          How do you reach that conclusion? Deepfakes get shared, which has real consequences for people.

    • jimjimwii a year ago

      To me, it's just another form of fiction. It's as much a crime as writing a story where you base a character on someone you know and subject that character to sexual acts. Sure, it's depraved, but it doesn't pass the criminality threshold for me, not by a long shot.

      And no, I do not see the use of real photos and video as material to this specific discussion.

      I think this issue is a great way to evaluate the values and integrity of a society.

DiscourseFan a year ago

The day I see a believable deepfake of myself doing strange sexual acts is the day that I will stop caring what people think of my sexuality

  • tgv a year ago

    What if these people amassed outside your house and demanded you be tried for blasphemy? Or just tried to kill you? In Pakistan, women, religion and sexuality are a heady mix. Not everybody has the luxury of not caring.

    I would also like to point out, again, that the people still developing and productizing deepfake software, or the tools that underpin it, should take the only morally correct path: stop, and retract their work. It's a considerable net negative for society.

    • DiscourseFan a year ago

      >What if these people amassed outside your house and demanded you be tried for blasphemy? Or just tried to kill you? In Pakistan, women, religion and sexuality are a heady mix. Not everybody has the luxury of not caring.

      Hard to do if it's happening to everyone
