A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10, if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits. But this user is particularly notable because they plainly and openly demonstrate one of the most severe harms of generative AI tools: easily creating nonconsensual pornography of ordinary people.

  • Ultragigagigantic@lemmy.world · 3 months ago

    It’s gonna suck no matter what now that the technology is available. Perhaps in a few generations there will be a massive cultural shift to something less toxic.

    May as well drink the poison if I’m gonna be immersed in it. Cheers.

    • VinnyDaCat@lemmy.world · 3 months ago

      I was really hoping that with the onset of AI, people would become more skeptical of the content they see online.

      This was one of the reasons. I don’t think there’s anything we can do to stop people from acting like this, but what we can do as a society is adjust to it so that it’s not as harmful. I’m still hoping that once the technology becomes easily accessible and usable, people will look at all content much more closely.

  • Cris@lemmy.world · 3 months ago (edited)

    God, generative ai is such a fucking caustic technology. I honestly don’t see anything positive and not disgusting enabled by this tech.

    Edit: I see people don’t agree, but like why can’t ai stick to translating stuff and being useful, rather than making horrifically unethical porn, taking the humanity out of art, and replacing people’s jobs with statistical content generation. I hate it here.

    • Mubelotix@jlai.lu · 3 months ago

      You can call people disgusting over what they do with a tool, but the tool itself is just a tool; it can’t be disgusting.

      • Cris@lemmy.world · 3 months ago

        The distinction is that I can see worthwhile use cases for non-generative ai, and not for generative ai, and generative ai is built on theft of creative labor

        I’m not angry at people who use generative ai, I’m angry at the people who built it by stealing from creatives to build a commercial tool that can seemingly only be used in awful ways.

  • JackGreenEarth@lemm.ee · 3 months ago

    That’s a ripoff. It costs them at most $0.10 to run a simple Stable Diffusion img2img pass. And most people could do it themselves; they’re purposefully exploiting people who aren’t tech savvy.

    • echo64@lemmy.world · 3 months ago

      The people being exploited are the ones who are the victims of this, not people who paid for it.

    • M500@lemmy.ml · 3 months ago

      Wait, this is a tool built into Stable Diffusion?

      As for people doing it themselves, it might be a bit too technical for some people to set up. But I’ve never tried Stable Diffusion.

        • M500@lemmy.ml · 3 months ago

          Thanks for the link. I’ve been running some LLMs locally, and I’ve been interested in Stable Diffusion. I’m not sure I have the specs for it at the moment, though.

          • TheRealKuni@lemmy.world · 3 months ago

            By the way, if you’re interested in Stable Diffusion and it turns out your computer CAN’T handle it, there are sites that will let you toy around with it for free, like civitai. They host an enormous number of models, and many of them work with the site’s built-in generation.

            Not quite as robust as running it locally, but worth trying out. And much faster than any of the ancient computers I own.

    • roscoe@lemmy.dbzer0.com · 3 months ago

      As soon as anyone can do this on their own machine with no third parties involved all laws and other measures being discussed will be moot.

      We can punish nonconsensual sharing but that’s about it.

    • conciselyverbose@sh.itjust.works · 3 months ago

      Doesn’t mean distribution should be legal.

      People are going to do what they’re going to do, and the existence of this isn’t an argument to put spyware on everyone’s computer to catch it or whatever crazy extreme you can take it to.

      But distributing nudes of someone without their consent, real or fake, should be treated as the clear sexual harassment it is, and result in meaningful criminal prosecution.

      • treadful@lemmy.zip · 3 months ago

        Almost always it makes more sense to ban the action, not the tool. Especially for tools with such generalized use cases.

    • goldteeth@lemmy.dbzer0.com · 3 months ago (edited)

      “Djinn”, specifically, being the correct word choice. We’re way past fun-loving blue cartoon Robin Williams genies granting wishes, doing impressions of Jack Nicholson and getting into madcap hijinks. We’re back into fuckin’… shapeshifting cobras woven of fire and dust by the archdevil Iblis, hiding in caves and slithering out into the desert at night to tempt mortal men to sin. That mythologically-accurate shit.

  • afraid_of_zombies@lemmy.world · 3 months ago

    It’s stuff like this that makes me against copyright laws. To me it is clear and obvious that you own your own image, and it is far less obvious to me that a company can own an image its creator drew multiple decades ago that everyone can recognize. And yet one is protected and the other isn’t.

    What the hell do you own if not yourself? How come a corporation has more rights than we do?

    • Kairos@lemmy.today · 3 months ago

      This stuff should be defamation, full stop. There would need to be a law specifically saying you can’t sign it away, though.

  • guyrocket@kbin.social · 3 months ago

    This is not new. People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me. The result is the same: fake porn/nudes.

    And all the hand wringing in the world about it being non consensual will not stop it. The cat has been out of the bag for a long time.

    I think we all need to shift to not believing what we see. It is counterintuitive, but also the new normal.

    • kent_eh@lemmy.ca · 3 months ago

      People have been Photoshopping this kind of thing since before there was Photoshop. Why “AI” being involved matters is beyond me

      Because now it’s faster, can be generated in bulk and requires no skill from the person doing it.

    • echo64@lemmy.world · 3 months ago

      I hate this: “Just accept it women of the world, accept the abuse because it’s the new normal” techbro logic so much. It’s absolutely hateful towards women.

      We have legal and justice systems to deal with this. It is not the new normal for me to be able to make porn of your sister, or mother, or daughter. Absolutely fucking abhorrent.

        • Jrockwar@feddit.uk · 3 months ago (edited)

          And so is straight male-focused porn. We men are seemingly not attractive, other than for perfume ads. It’s unbelievable that gender roles are still so strongly coded in 2024. Women must be pretty; men must buy the products where women look pretty in the ads. Men don’t look pretty, and women don’t buy products - they clean the house and care for the kids.

          I’m aware of how much I’m extrapolating, but a lot of this is the subtext behind “they’ll make porn of your sisters and daughters” while leaving your good-looking brother or son out of the thought train, when it would be just as hurtful for them and for you.

          • lud@lemm.ee · 3 months ago

            Or your bad-looking brother, or bad-looking me.

            Imo people making ai fakes for themselves isn’t the end of the world but the real problem is in distribution and blackmail.

            You can get blackmailed no matter your gender and it will happen to both genders.

        • echo64@lemmy.world · 3 months ago

          Sorry if I didn’t position this about men. They are the most important thing to discuss and will be the most impacted here, obviously. We must center men on this subject too.

          • Thorny_Insight@lemm.ee · 3 months ago

            Pointing out your sexism isn’t saying we should be talking about just men. It’s you who’s here acting all holy while ignoring half of the population.

            • echo64@lemmy.world · 3 months ago

              Yes yes, #alllivesmatter, amirite? We just ignore that 99.999% of the victims will be women, just so we can grandstand about men.

      • cley_faye@lemmy.world · 3 months ago

        How do you propose to deal with someone doing this on their computer, not posting them online, for their “enjoyment”? Mass global surveillance of all existing devices?

        It’s not a matter of willingly accepting it; it’s a matter of looking at what can be done and what can not. Publishing fake porn, defaming people, and other similar actions are already (I hope… I am not a lawyer) illegal. Asking for the technology that exists, is available, will continue to grow, and can be used in a private setting with no witness to somehow “stop” because of a law is at best wishful thinking.

  • anticurrent@sh.itjust.works · 3 months ago

    We’re acting as if, throughout history, we managed to orient technology so as to keep only the benefits and eliminate the negative effects, while in reality most of the technology we use still comes with both. It’s not going to be different with AI.

  • ulterno@lemmy.kde.social · 3 months ago (edited)

    I say stop antagonizing the AI.
    The only difference between a skilled artist making it with Photoshop and someone using a Neural Net, is the amount of time and effort put into creating the instance.

    If there are to be any laws against these, they need to apply to any and every fake that’s convincing enough, no matter the technology used.


    Don’t blame the gunpowder for the dead person.
    People were being killed by thrown stones before that.

  • GrymEdm@lemmy.world · 3 months ago (edited)

    To people who aren’t sure if this should be illegal or what the big deal is: according to Harvard clinical psychiatrist and instructor Dr. Alok Kanojia (aka Dr. K from HealthyGamerGG), once non-consensual pornography (which deepfakes are classified as) is made public, over half of the people involved will have the urge to kill themselves. There’s also an extreme risk of depression, anger, anxiety, etc. The analogy given is that it’s like watching video the next day of yourself undergoing sex without consent, as if you’d been drugged.

    I’ll admit I used to look at celeb deepfakes, but once I saw that video I stopped immediately, and I avoid it as much as I possibly can. I believe porn can be done correctly, with participant protection and respect. Regarding deepfakes and revenge porn, though, that statistic about suicidal ideation puts it outside anything healthy or ethical. Obviously I can’t make that decision for others or purge the internet, but the fact that there’s such regular and extreme harm for the (what I now know are) victims of non-consensual porn makes it personally immoral. Not because of religion or society, but because I want my entertainment to be at minimum consensual, and hopefully fun and exciting - not killing people or ruining their happiness.

    I get that people say this is the new normal, but it’s already resulted in trauma and will almost certainly continue to do so. Maybe even get worse as the deepfakes get more realistic.

    • spez_@lemmy.world · 3 months ago

      The technology will become available everywhere and run on every device over time. Nothing will stop this

    • lud@lemm.ee · 3 months ago

      once non-consensual pornography (which deepfakes are classified as) is made public over half of people involved will have the urge to kill themselves.

      Not saying that they are justified or anything but wouldn’t people stop caring about them when they reach a critical mass? I mean if everyone could make fakes like these, I think people would care less since they can just dismiss them as fakes.

      • eatthecake@lemmy.world · 3 months ago

        The analogy given is it’s like watching video the next day of yourself undergoing sex without consent as if you’d been drugged.

        You want a world where people just desensitise themselves to things that make them want to die through repeated exposure. I think you’ll get a whole lot of complex PTSD instead.

        • stephen01king@lemmy.zip · 3 months ago

          People used to think their lives were over if they were caught alone with someone of the opposite sex they weren’t married to. That’s no longer the case in Western countries, due to normalisation.

          The thing that makes them want to die is societal pressure, not the act itself. In this case, if societal pressure from having fake nudes of yourself spread is removed, most of the harm done to people should be neutralised.

          • eatthecake@lemmy.world · 3 months ago

            The thing that makes them want to die is societal pressure, not the act itself.

            That’s an assumption that you have no evidence for. You are deciding what feelings people should have by your own personal rules and completely ignoring the people who are saying this is a violation. What gives you the right to tell people how they are allowed to feel?

          • too_much_too_soon@lemmy.world · 3 months ago (edited)

            Agreed.

            I’ve been in HR since ’95, so yeah, I’m old, lol. Noticed a shift in how we view old social media posts? Those wild nights you don’t remember but got posted? If they’re at least a decade old, they’re not as big a deal now. But if it was super illegal, immoral, or harmful, you’re still in trouble.

            As for nudes, they can be both the problem and the solution.

            To sum it up, like in the animated movie ‘The Incredibles’: ‘If everyone’s special, then no one is.’ If no image can be trusted, no excuse can be doubted. ‘It wasn’t me’ becomes the go-to, and nobody needs to feel ashamed or suicidal over something fake that happens to many.

            Of course, this is oversimplifying things, but in the real world society will adjust. People won’t kill themselves over this. It might even be a good thing for those on the cusp of AI and improper real-world behaviour - ‘It’s not me. It’s clearly AI; I would never behave so outrageously.’

      • Drewelite@lemmynsfw.com · 3 months ago

        I think this is realistically the only way forward: delegitimize any kind of nudes that might show up of a person. Which could be good. But I have no doubt that high schools will be flooded with bullies sending porn around of innocent victims. As much as we delegitimize it as a society, it’ll still have an effect. Like social media: though it’s normal for anyone to reach you at any time, it still makes cyberbullying more hurtful.

        • afraid_of_zombies@lemmy.world · 3 months ago

          Well, if you are sending nudes to someone in high school, you are sending porn to a minor, which I’m pretty confident is already illegal. I’d just rather not search for that law.