Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps. It is another sign that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.

Parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they ran, shows that the company has taken down several of these ads before. But many ads that explicitly invited users to create nudes, and some of the accounts that bought them, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.

  • Jarix@lemmy.world · 2 months ago

    What the fuck do you mean, no?! This is happening right the fuck now. It’s already happening. You DON’T want it to decrease the total number of actual, real children who are used and abused to feed this shit?

    I think you think I’m supporting this in some way. I AM NOT. I’m saying I hope that any pedos out there who are using this instead of taking action against actual children will not have already harmed children, or that it at the very least reduces the total harm done.

    Christ, what the hell are we coming to if we can’t even try to find some fucking sanity in this situation.

    And for all that is good and right in the world, I also very much hope it doesn’t lead to MORE abuse.

    Can we at least hope for the best while trying to fix the worst?

    • BreakDecks@lemmy.ml · 2 months ago

      The pedos out there are using AI to nudify pictures of real kids. That’s just going to drive up the demand for creep shots and child model photosets to exploit.

      There may be a small percentage of offending pedophiles who switch to pure GenAI over pictures of real kids, but I don’t see GenAI ever playing a role in harm reduction given the harm it ultimately enables.

      One of the current sickening trends is for a predator to convince a kid to send underwear or swimsuit pics, and then blackmail them into more hardcore photos with nudified versions of the original pics. An influx of that kind of CSAM, which involves abusing real kids on social media, is already appearing online.

      I just wish America were less puritanical and taught kids about sex and boundaries to protect them, and that we had a functioning mental healthcare system that directly helps people who experience inappropriate sexual attractions like pedophilia before they go down these dark paths.