A Telegram user who advertises their services on Twitter will create an AI-generated pornographic image of anyone in the world for as little as $10 if users send them pictures of that person. Like many other Telegram communities and users producing nonconsensual AI-generated sexual images, this user creates fake nude images of celebrities, including images of minors in swimsuits, but is particularly notable for plainly and openly demonstrating one of the most severe harms of generative AI tools: how easily they can be used to create nonconsensual pornography of ordinary people.
That’s a ripoff. It costs them at most $0.10 to do a simple Stable Diffusion img2img pass. And most people could do it themselves; they’re purposefully exploiting people who aren’t tech savvy.
Wait, this is a tool built into Stable Diffusion?
As for people doing it themselves, it might be a bit too technical for some people to set up. But I’ve never tried Stable Diffusion.
It’s not like deepfake pornography is “built in,” but Stable Diffusion can take existing images and generate new ones based on them; that’s the img2img feature mentioned above, and it’s kinda how the model works anyway. The de-facto standard UI makes it pretty simple, even for someone who’s not too tech savvy: https://github.com/AUTOMATIC1111/stable-diffusion-webui
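If you’d rather see it from code than the web UI, here’s roughly what img2img looks like with Hugging Face’s diffusers library. This is an untested sketch; the model ID, input file, prompt, and parameter values are all illustrative placeholders, not anything specific to what the article describes.

```python
# Rough img2img sketch using the diffusers library;
# model ID, input file, prompt, and parameters are illustrative only.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

# Load a Stable Diffusion checkpoint (downloads weights on first run).
pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative model ID
    torch_dtype=torch.float16,
).to("cuda")

# img2img starts from an existing image instead of pure noise.
init_image = Image.open("photo.jpg").convert("RGB").resize((512, 512))

# strength controls how far the output drifts from the input image:
# near 0.0 barely changes it, near 1.0 mostly ignores it.
result = pipe(
    prompt="a watercolor painting of a mountain lake",
    image=init_image,
    strength=0.6,
    guidance_scale=7.5,
).images[0]

result.save("out.png")
```

Which is kind of the point upthread: the capability is a few lines of library code, not some secret tool.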
Thanks for the link. I’ve been running some LLMs locally, and I’ve been interested in Stable Diffusion too. I’m not sure I have the specs for it at the moment, though.
By the way, if you’re interested in Stable Diffusion and it turns out your computer CAN’T handle it, there are sites that will let you toy around with it for free, like Civitai. They host an enormous number of models, and many of them work with the site’s built-in generation.
Not quite as robust as running it locally, but worth trying out. And much faster than any of the ancient computers I own.
And mechanics exploit people needing brake jobs. What’s your point?
The people being exploited are the victims of this, not the ones who paid for it.
There are many victims, including the perpetrators.