- cross-posted to:
- technology@beehaw.org
Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden on the dark corners of the internet, but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.
While parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they were posted, shows that the company has previously taken down several of these ads, many ads that explicitly invited users to create nudes, and some of the ad buyers behind them, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.
So, if the AI generated tits look real, but they’re not HER tits, is it just less terrible?
I mean literally yes, but that doesn’t mean it isn’t still horrible
It’s all so incredibly gross. Using “AI” to undress someone you know is extremely fucked up. Please don’t do that.
Can you articulate why, if it is for private consumption?
Consent.
You might be fine with having erotic materials made of your likeness, and maybe even of your partners, parents, and children. But shouldn’t they have the right not to be objectified as wank material?
I partly agree with you though, it’s interesting that making an image is so much more troubling than having a fantasy of them. My thinking is that it is external, real, and thus more permanent; even if it isn’t meant to be kept, it could still be saved, lost, hacked, sold, used for defamation and/or just shared.
To add to this:
Imagine someone snuck into your home and stole your shoes, socks, and underwear just to get off on them, or to give them to someone who does.
Wouldn’t that feel wrong? Wouldn’t you feel violated? It’s the same with such AI porn tools. You serve to satisfy the sexual desires of someone else and you are given no choice. Whether you want it or not, you are becoming part of their act. Becoming an unwilling participant in such a way can feel similarly violating.
They are creating and using a picture of you that is not how you would choose to represent yourself. You don’t have control over this and thus feel violated.
This reminds me of that fetish where one person basically acts like a submissive pet and gets treated like one by their “master”. They get aroused by doing that in public, one walking the other on a leash like a dog, on hands and knees. People around them become passive participants in that spectacle. And those people often feel violated. Becoming an unwilling, unasked participant, either active or passive, in someone else’s sexual act, and having little or no control over it, feels wrong and violating to a lot of people.
In principle, that even shares some similarities with rape. There are countries where you can’t just take pictures of someone without asking them beforehand. Also, there are certain rules on how such a picture can be used. Those countries acknowledge and protect the individual’s right to their own image.
Just to play devils advocate here, in both of these scenarios:
Imagine someone snuck into your home and stole your shoes, socks, and underwear just to get off on them, or to give them to someone who does.
This reminds me of that fetish where one person basically acts like a submissive pet and gets treated like one by their “master”. They get aroused by doing that in public, one walking the other on a leash like a dog, on hands and knees. People around them become passive participants in that spectacle. And those people often feel violated.
The person has the knowledge that this is going on. In the situation with AI nudes, the actual person may never find out.
Again, not to defend this at all, I think it’s creepy af. But I don’t think your arguments were particularly strong in supporting the AI nudes issue.
In every chat I find about this, I see people railing against AI tools like this but I have yet to hear an argument that makes much sense to me about it. I don’t care much either way but I want a grounded position.
I care about harms to people and in general, people should be free to do what they want until it begins harming someone. And then we get to have a nuanced conversation about it.
I’ve come up with a hypothetical. Let’s say that you write naughty stuff about someone in your diary. The diary is kept in a secure place and in private. Then, a burglar breaks in and steals your diary and mails that page to whomever you wrote it about. Are you, the writer, in the wrong?
My argument would be no. You are expressing a desire in private and only through the malice of someone else was the harm done. And no, being “creepy” isn’t an argument either. The consent thing I can maybe see but again do you have a right not to be fantasized about? Not to be written about in private?
I’m interested in people’s thoughts, because it bugs me not to have a good answer to this argument.
Yeah it’s an interesting problem.
If we go down the path of ideas in the mind and the representations we create and visualize in our mind’s eye, to forbid people from conceiving of others sexually means there really is no justification for conceiving of people generally.
If we try to seek a justification, where is that line drawn? What is sexual, and what is general? How do we enforce this, or at least how do we catch people in the act and shame them into stopping their behavior, especially if we don’t possess the capability of telepathy?
What is harm? Is it purely physical, or also psychological? Is there a degree of harm that should be allowed, or that is inescapable despite our best intentions?
The angle that you point out regarding writing things down about people in private can also go different ways. I write things down about my friends because my memory sucks sometimes and I like to keep info in my back pocket for when birthdays, holidays, or special occasions come. What if I collected information about people that I don’t know? What if I studied academics who died in the past to learn about their lives, like Ben Franklin? What if I investigated my neighbors by pointing cameras at their houses, or installing network sniffers or other devices to try to collect information on them? Does the degree of familiarity with those people I collect information about matter, or is the act wrong in and of itself? And do my intentions justify my actions, or do the consequences of said actions justify them?
Obviously I think it’s a good thing that we as a society try to discourage collecting information on people who don’t want that information collected, but there is a portion of our society specifically allowed to do this: the state. What makes their status deserving of this power? Can this power be used for ill and good purposes? Is there a level of cross collection that can promote trust and collaboration between the state and its public, or even amongst the public itself? I would say that there is a level where if someone or some group knows enough about me, it gets creepy.
Anyways, lots of questions and no real answers! I’d be interested in learning more about this subject, and I apologize if I steered the convo away from sexual harassment and violation. Consent extends to all parts of our lives, but sexual consent does seem to be a bigger problem given the evidence of this post. Looking forward to learning more!
I think we’ve just stumbled on an issue where the rubber meets the road as far as our philosophies about privacy and consent. I view consent as important mostly in areas that pertain to bodily autonomy right? So we give people the rights to use our likeness for profit or promotion or distribution. And what we’re giving people is a mental permission slip to utilize the idea of the body or the body itself for specific purposes.
However, I don’t think that these things really pertain to private matters. Because the consent issue only applies when there are potential effects on the other person. Like if I talk about celebrities and say that imagining a celebrity sexually does no damage because you don’t know them, I think most people would agree. And so if what we care about is harm, there is no potential for harm.
With surveillance matters, the consent does matter because we view breaching privacy as potential harm. The reason it doesn’t apply to AI nudes is that privacy is not being breached. The photos aren’t real. So it’s just a fantasy of a breach of privacy.
So for instance if you do know the person and involve them sexually without their consent, that’s blatantly wrong. But if you imagine them, that doesn’t involve them at all. Is it wrong to create material imaginations of someone sexually? I’d argue it’s only wrong if there is potential for harm and since the tech is already here, I actually view that potential for harm as decreasing in a way. The same is true nonsexually. Is it wrong to deepfake friends into viral videos and post them on twitter? Can be. Depends. But do it in private? I don’t see an issue.
The problem I see is the public stuff. People sharing it. And it’s already too late to stop most of the private stuff. Instead we should focus on stopping AI porn from being shared and posted and create higher punishments for ANYONE who does so. The impact of fake nudes and real nudes is very similar, so just take them similarly seriously.
The person has the knowledge that this is going on.
Not necessarily, no. It could be that they might just think they’ve misplaced their socks. If you’ve lived in an apartment building with shared laundry spaces, it’s not so uncommon to lose some minor pieces of clothing. But just because they never find out about it doesn’t make it less wrong, or mean it should be less illegal.
In the situation with AI nudes, the actual person may never find out.
Also in connection with my remarks before:
A lot of our laws also apply even if no one is knowingly damaged (yet). (May of course depend on the legislation of wherever you live.)
Already intending to commit a crime can sometimes be reason enough to bring someone to court.
We can argue how much sense that makes of course, but at the current state, we, as a society, decided that doing certain things should be illegal, even if the damage has not manifested yet. And I see many good points to handle it that way with such AI porn tools as well.
Traumatizing rape victims with non-consensual imagery of them naked and doing sexual things with others, and sharing it, is totally not going to fuck up society even more and lead to a bunch of suicides! /s
Ai is the future. The future is dark.
That’s why we need strong legislation. Most countries worldwide are missing the crucial window for making such laws. At least some are catching up, like the EU did recently with its first AI Act.
tbf, the past and present are pretty dark as well
it is external, real, and thus more permanent
Though just like your thoughts, the AI is imagining the nude parts as well, because it doesn’t actually know what they look like. So it’s not actually a nude picture of the person. It’s that person’s face on an entirely fictional body.
But the issue is not with the AI tool, it’s with the human wielding it for their own purposes which we find questionable.
An ex-friend of mine Photoshopped nudes of another friend. For private consumption. But then someone found that folder. And suddenly someone has to live with the thought that these nudes, created without their consent, were used as spank bank material. It’s pretty gross, and it ended the friendship between the two.
You can still be wank material with just your Facebook pictures.
Nobody can stop anybody from wanking on your images, AI or not.
Louis CK has also sexually harassed a number of women (jerking off in front of unwilling viewers)…
That’s already weird enough, but there is a meaningful difference between nude pictures and clothed pictures. If you wanna whack one to my fb pics of me looking at a horse, ok, weird. Don’t fucking create actual nude pictures of me.
It’s creepy and can lead to obsession, which can lead to actual harm for the individual.
I don’t think it should be illegal, but it is creepy and you shouldn’t do it. Also, sharing those AI images/videos could be illegal, depending on how they’re represented (e.g. it could constitute libel or fraud).
I disagree. I think it should be illegal. (And stay that way in countries where it’s already illegal.) For several reasons. For example, you should have control over what happens with your images. Also, it feels violating to become unwillingly and unasked part of the sexual act of someone else.
That sounds problematic though. If someone takes a picture and you’re in it, how do they get your consent to distribute that picture? Or are they obligated to cut out everyone but those who consent? What does that mean for news orgs?
That seems unnecessarily restrictive on the individual.
At least in the US (and probably lots of other places), any pictures taken where there isn’t a reasonable expectation of privacy (e.g. in public) are subject to fair use. This generally means I can use it for personal use pretty much unrestricted, and I can use it publicly in a limited capacity (e.g. with proper attribution and not misrepresented).
Yes, it’s creepy and you’re justified in feeling violated if you find out about it, but that doesn’t mean it should be illegal unless you’re actually harmed. And that barrier is pretty high to protect peoples’ rights to fair use. Without fair use, life would suck a lot more than someone doing creepy things in their own home with pictures of you.
So yeah, don’t do creepy things with pictures of other people, that’s just common courtesy. But I don’t think it should be illegal, because the implications of the laws needed to get there are worse than the creepy behavior of a small minority of people.
Can you provide an example of when a photo has been taken that breaches the expectation of privacy that has been published under fair use? The only reason I could think that would work is if it’s in the public interest, which would never really apply to AI/deepfake nudes of unsuspecting victims.
I’m not really sure how to answer that. Fair use is a legal term that limits the “expectation of privacy” (among other things), so by definition, if a court finds it to be fair use, it has also found that it’s not a breach of the reasonable expectation of privacy legal standard. At least that’s my understanding of the law.
So my best effort here is tabloids. They don’t serve the public interest (they serve the interested public), and they violate what I consider a reasonable expectation of privacy standard, with my subjective interpretation of fair use. But I disagree with the courts quite a bit, so I’m not a reliable standard to go by, apparently.
Fair use laws relate to intellectual property, privacy laws relate to an expectation of privacy.
I’m asking when has fair use successfully defended a breach of privacy.
Tabloids sometimes do breach privacy laws, and they get fined for it.
If you have to ask, you’re already pretty skeevy.
And if you have to say that, you’re already sounding like some judgy jerk.
The fact that you do not even ask such questions shows that you are narrow-minded. Such a mentality leads to people thinking that “homosexuality is bad” and never even trying to ask why, and never having a chance of changing their mind.
They cannot articulate why. Some people just get shocked at “shocking” stuff… maybe some societal reaction.
I do not see any issue in using this for personal consumption. Yes, I am a woman. And yes, people can have my fucking AI generated nudes as long as they never publish them online and never tell me about it.
The problem with these apps is that they enable people to make these at scale and leave them free to publish them wherever. That is where the danger lies. Not in people jerking off to a picture of my fucking cunt alone in a bedroom.
Mhmm. Cool.
Would you like it if someone were to make and wank to these pictures of your kids, wife, or parents? The fact that you have to ask says a lot about you, though.
The fact that people don’t realize how these things can be used for harm and weaponized is insane. I mean, it shows they clearly are not part of the vulnerable groups and have the privilege of never having dealt with it.
The future is amazing! Everyone with apps going to the parks and making some kids nude. Or bullying which totally doesn’t happen in fucked up ways with all the power of the internet already.
There are plenty of things I might not like that aren’t illegal.
I’m interested in the thought experiment this has brought up, but I don’t want us to get caught in a reactionary fervor because of AI.
AI will make this easier to do, but people have been clipping magazines, and celebrities have had photoshopped fakes created, since both mediums existed. This isn’t new, but it is being commoditized.
My take is that these pictures shouldn’t be illegal to own or create, but they should be illegal to profit off of and distribute, meaning these tools specifically designed and marketed for it would be banned. If someone wants to tinker at home with their computer, you’ll never be able to ban that, and you’ll never be able to ban sexual fantasy.
I think it should be illegal even for photoshops of celebs; they too are human and have emotions.
I get that it’s creepy but that’s a dark path you want to walk down. Think about how that would have to be enforced.
I would say like how CP is enforced, as some of these AI fakes might even involve kids.
That’s a great example though, because truthfully digital cp is incredibly difficult to enforce, and some of the laws that have been proposed to make it easier have been incredibly controversial due to how violating they are to people’s privacy.
Would it be any different if you learn how to sketch or photoshop and do it yourself?
Yes, because the AI (if not local) will probably store the images on their servers
This is the only good answer.
good point
This is also fucking creepy. Don’t do this.
I am not saying anyone should do it and don’t need some internet stranger to police me thankyouverymuch.
But yet, we’ve (almost) all imagined someone we know naked.
There is a massive difference between thoughts and action. I’m sure a significant portion of us have thought about murdering someone too, does that make actually going through with murder less bad?
This is a false equivalency. The correct analogy would be: if I think about murdering someone and then draw a picture of it or make a movie about murdering them, is that wrong?
Well, I’d usually say you should go to therapy if you’re having vivid thoughts of murdering other people
I think anyone claiming otherwise would most likely be lying.
You say that as if photoshopping someone naked isn’t fucking creepy as well.
Creepy to you, sure. But let me add this:
Should it be illegal? No, and good luck enforcing that.
You’re at least right on the enforcement part, but I don’t think the illegality of it should be as hard of a no as you think it is
Creepy, maybe, but tons of people have done it. As long as they don’t share it, no harm is done.
I don’t think that many have, dude. Like sure, if you’re talking total number and not percentage, but this planet has so many people you could also claim that tons of people are pedophiles too
Lol you’d be surprised… isn’t this one of those things people would do in private but never admit in public (because of people like you getting all touchy and creeped out by it)?
You say this like we SHOULDN’T be creeped out that you are digitally undressing someone without their permission
You can feel whatever you want.
Same vein as “you should not mentally undress the girl you fancy”. It’s just a support for that. Not that I have used it.
Don’t just upload someone else’s image without consent, though. That’s even illegal in most of Europe.
Why should you not mentally undress the girl you fancy (or one you don’t, what difference does it make)? Where is the harm in it?
Where is the harm in it
there is none, that’s their point
I’m going to undress Nobody. And give them sexy tentacles.
Behold my meaty, majestic tentacles. This better not awaken anything in me…
Interesting how we can “undress any girl” but I have not seen a tool to “undress any boy” yet. 😐
I don’t know what it says about people developing those tools. (I know, in fact)
Make one :P
Then I suspect you’ll find the answer is money. The ones for women simply just make more money.
This
You don’t need to make fake nudes of guys - you can just ask. Hell, they’ll probably send you one even without asking.
In general women aren’t interested in that anyway and gay guys don’t need it because, again, you can just ask.
This is simultaneously a gross generalization and strangely accurate and I hate you for it
I’ve seen a tool like that. Everyone was a bodybuilder and hung like a horse.
I’m going to guess all the ones of women have bolt on tiddies and no pubic hair.
Well of course. Sagging breasts are gross /s
That’s what we all look like.
Don’t check though
Gotta wonder where they get their horse dick training images from
Where does Bad Dragon get them from?
notices ur instance
Can’t judge though, I have a Chance myself lawl
pff no exotic-erotics?
You probably don’t need them. You can get these photos without even trying. Is a bit of a problem really.
Be the change you wish to see in the world
\s
Probably because the demand is not big or visible enough to make the development worth it, yet.
You probably can with the same inpainting stable diffusion tools, it’s just not as heavily advertised.
What it says is that there isn’t any demand for seeing boys naked. We simply look worse because we fucked up evolution.
ITT: A bunch of creepy fuckers who don’t think society should judge them for being fucking creepy
Something that can also happen: require Facebook login with some excuse, then blackmail the creeps by telling them “pay us this ransom or we’re going to send proof of your creepiness to your contacts”
Another something that can also happen: require facebook login with some excuse, plant shit on your enemy’s computer, then blackmail them by threatening to frame them as creeps.
Sometimes the reason a method is frowned upon is that it is equally usable for evil as for good.
This reminded me of those kids who made pornographic videos of their classmates.
Sharing this screenshot again, to drive the point home.
I feel like you’ve just set me up for an FBI visit.
Send 'em to Zuck.
What in the fuck are all these photos of kids? They’re not part of the ad?
This was from a test I did with a throwaway account on IG where I followed a handful of weirdo parents who run “model” accounts for their kids to see if Instagram would start pushing problematic content as a result (spoiler: yes they will).
It took about 5 minutes from creating the account to end up with nothing but dressed down kids on my recommendations page paired with inappropriate ads. I guess the people who follow kids on IG also like these recommended photos, and the algorithm also figures they must be perverts, but doesn’t care about the sickening juxtaposition of children in swimsuits next to AI nudifying apps.
Don’t use Meta products. They don’t care about ethics, just profits.
Coupled with the article about pedos blackmailing kids with their fake nudes to get real ones, this makes my stomach turn and eyes water. So much evil in this world. I am happy to say I deleted my FB and IG accounts a few days ago. WhatsApp is tough to leave due to family though… Slowly getting people to switch over to safer and more ethical alternatives.
This 100%. I can’t even bring myself to buy new content for my Quest now that I’m aware of the issues (no matter how much I want the latest Beat Saber and Synth Riders DLC), especially since Meta’s Horizon, in my experience, puts adults into direct contact with children. At first I just dismissed metaverse games like VRChat or Horizon as being too popular with kids for me to enjoy it, but now I realize that it put me, an adult, straight into voice chats with tweens, which people should fucking know better than to do. My first thought was to log off because I wasn’t having fun in a kid-dominated space, but I have no doubt that these apps are crawling with creeps who see that as a feature rather than a problem.
We need education for parents that sharing pictures of their kids online comes with real risks, as does giving kids free rein to use the Internet. The laissez-faire attitude many people have towards social media needs to be corrected, because real harm is already being done.
Most of the parents that post untoward pics of their kids online are chasing down opportunities for their kids to model, and they’re ignoring the fact that a significant volume of engagement these photos receive comes from people objectifying children. There seems to be a pattern that the most revealing outfits get the most engagement, and so future pictures are equally if not more revealing to chase more engagement…
Parents might not understand how disturbing these patterns are until they’ve already dumped thousands of pictures online, and at that point they’re likely to be in denial about what they’re exposing their kids to, and/or too invested to want to reverse course.
We also need to have a larger conversation, as a society, about using kids as models at all. Pretty much every major manufacturer of children’s clothing is hiring real kids to model the clothes. I don’t think it’s necessary to be publishing that many pictures of kids online, nor is it acceptable to be doing so for profit. There’s no reason not to limit modeling to adults who can consent to putting their bodies on public display, and using mannequins for kids’ clothing. The sheer volume of kids’ swimsuit and underwear pictures hosted on e-commerce sites is likely a contributor to the capability Generative AI models have to create inappropriate images of children, not to mention the actual CSAM found in the LAION dataset most of these models are trained on.
Sorry for the long rant, this shit pisses me off. I need to consider sending 404 Media everything I know since they’re doing investigations into this kind of thing. My small scale investigation has revealed a lot to me, but more people need to be getting as upset as I am about it if we want to make the Internet less of a hellscape.
Wow, that is really disturbing. WTF, IG?
The other day, I had an ad on facebook that was basically lolicon. It depicted a clearly underage anime girl in a sexually suggestive position on a motorcycle with their panties almost off. I am in Germany, Facebook knows I am in Germany and if I took a screenshot of that ad and saved it, it would probably be classed as CSAM in my jurisdiction. I reported the ad and got informed that FB found “nothing wrong” with it a few days later. Fuck off, you child predators.
I logged into my throwaway account today just to check in on it since people are talking about this shit more. I was immediately greeted with an ad featuring hardcore pornography, among the pics of kids that still populate my feed.
I’ll spare you the screenshot, but IG is fucked.
okay enough internet for me today
Shit ain’t right.
The idea that the children in this photo are meant to be seen in the same context as a porn site (or at least something using the Pornhub logo likeness) is disgusting.
DISCLAIMER: I haven’t gone through this myself, but I know what porn addiction feels like. It’s not fun and will warp who you are on the inside.
Anyone lured for any reason to this site, DO NOT ENGAGE, it WILL HURT YOU! If for whatever reason they’ve put their hooks in you and are reeling you in, use strategies that Alcoholics Anonymous uses. LITERALLY ANYTHING is better than using pictures of REAL CHILDREN for sexual gratification.
That makes me sick!! 😠
Google Play and co are allowing similar apps
Bro … WTF. AI CSAM. I hate this timeline so much.
My only hope is it also reduces CSAM of real people. Please at least do that much!
That’s what I was hoping too, and I’m sure for some it does, but I also saw an article the other day about people using these to blackmail children into giving real nudes, so fuck me I guess.
I still hope that for the majority of pedos this is solely something used to deter their urges. I’d like at least that much I mean ffs.
Well fuck. The shittiest timeline
No fucking kidding lmao
No. This path is of darkness itself.
What the fuck do you mean no?! This is happening right the fuck now. It’s already happening. You DON’T want it to decrease the total number of actual real children who are used and abused to feed this shit?
I think you think I’m supporting this in some way. I AM NOT. I’m saying I hope that any of the pedos out there who are using this instead of taking action against actual children will not have already harmed children, or that it at the very least reduces the total harm done.
Christ, what the hell are we coming to if we can’t even try to find some fucking sanity in this situation.
And for all that is good and right in the world, I also very much hope it doesn’t lead to MORE abuse.
Can we at least hope for the best while trying to fix the worst?
The pedos out there are using AI to nudify pictures of real kids. That’s just going to drive up the demand for creep shots and child model photosets to exploit.
There may be a small percentage of offending pedophiles that switch to pure GenAI over pictures of real kids, but I don’t see GenAI ever playing a role in harm reduction given the harm it ultimately enables.
One of the current sickening trends is for a predator to convince a kid to send underwear or swimsuit pics, and then blackmail them into more hardcore photos with nudified versions of the original pics. They’re already seeing an influx of that kind of CSAM online, that involves abusing real kids on social media.
I just wish America was less puritanical and taught kids about sex and boundaries to protect them, and that we had a functioning mental healthcare system that directly helps people who experience inappropriate sexuality attractions like pedophilia before they go down these dark paths.
Good, let all the celebs come together and sue Zuck into the ground
It’s funny how many people leapt to the defense of Title V of the Telecommunications Act of 1996 Section 230 liability protection, as this helps shield social media firms from assuming liability for shit like this.
Sort of the Heads-I-Win / Tails-You-Lose nature of modern business-friendly legislation and courts.
Section 230 is what allows for social media at all, given that the problem of content moderation at scale is still unsolved. Take away 230 and no company will accept the liability. But we will have underground forums teeming with white power terrorist signalling, CSAM, and spam offering better penis pills and Nigerian princes.
The Google advertising system is also difficult to moderate at scale, but since Google makes money directly off ads, and loses money when YouTube content is not brand safe, Google tends to be harsh on content creators and lenient on advertisers.
It’s not a new problem, and nudification software is just the latest version of X-Ray Specs (which is to say we’ve been hungry to see teh nekkid for a very long time). The worst problem is when adverts install spyware or malware onto your device without your consent, which is why you need to adblock Forbes Magazine… or really just everything.
However, much of the world’s public discontent is fueled by information on the internet (some false, some misleading, some true, and a whole lot more that’s simultaneously true and more heinous than we’d like in our society). So most of our officials would be glad to end Section 230 and shut down the flow of camera footage showing police brutality, or starving people in Gaza, or fracking mishaps releasing gigatons of rogue methane into the atmosphere. Our officials would very much love it if we’d go back to being uninformed, with the news media telling us how it’s sure awful living in the Middle East.
Without 230, we could go back to George W. Bush era methods, and just get our news critical of the White House from foreign sources, and compare the facts to see that they match, signalling our friends when we detect false propaganda.
Am I the only one who doesn’t care about this?
Photoshop has existed for some time now, so creating fake nudes just became easier.
Also why would you care if someone jerks off to a photo you uploaded, regardless of potential nude edits. They can also just imagine you naked.
If you don’t want people to jerk off to your photos, don’t upload any. It happens with and without these apps.
But Instagram selling ads for these apps is kinda fucked, since it’s very anti-porn, yet it sells ads for them (to children).
Also why would you care if someone jerks off to a photo you uploaded, regardless of potential nude edits. They can also just imagine you naked.
Imagining and creating physical (even digital) material are different levels of how real and tangible it feels. Don’t you think?
There is an active act of carefully editing those pictures involved. It’s a misuse, and against your intention when you posted such a picture of yourself. You are losing control by that and unwillingly becoming part of someone else’s sexual act.
Sure, those, who feel violated by that, might also not like if people imagine things, but that’s still a less “real” level.
For example: Imagining to murder someone is one thing. Creating very explicit pictures about it and watching them regularly, or even printing them and hanging them on the walls of one’s room, is another.
I don’t want to equate murder fantasies with sexual ones. My point is to illustrate that it feels to me, and obviously to a lot of other people, that there are significant differences between pure imagination and creating something tangible out of it.
Oh no, hanging the pictures on your wall is fucked.
The difference is if someone else can reasonably find out. If I tell someone that I think about them/someone else while masturbating, that is sexual harassment. If I have pictures on my wall and guests could see them, that’s sexual harassment.
If I just have an encrypted folder, not a problem.
It’s like the difference between thinking someone is ugly and saying it.
I think it’s clear you have never experienced being sexualized when you weren’t okay with it. It’s a pretty upsetting experience that can feel pretty violating. And as most guys rarely if ever experience being sexualized, never mind when they don’t want to be, I’m not surprised people might be unable to empathize.
Having experienced being sexualized when I wasn’t comfortable with it, this kind of thing makes me kinda sick to be honest. People are used to having a reasonable expectation that posting safe for work pictures online isn’t inviting being sexualized. And that it would almost never be turned into pornographic material featuring their likeness, whether it was previously possible with Photoshop or not.
It’s not surprising people would find the loss of that reasonable assumption discomforting given how uncomfortable it is to be sexualized when you don’t want to be. How uncomfortable a thought it is that you can just be going about your life and minding your own business, and it will now be convenient and easy to produce realistic porn featuring your likeness, at will, with no need for uncommon skills not everyone has.
Interesting (wrong) assumption there buddy.
But why would I care how people think of me? If it influences their actions, we gonna start to have problems, tho.
Fair enough, I’m sorry for making assumptions about you.
I do think my points stand though
It’s about consent. If you have no problem with people jerking off to your pictures, fine, but others do.
If you don’t want people to jerk off to your photos, don’t upload any. It happens with and without these apps.
You get that that opinion is pretty much the same as those who say if she didn’t want to be harassed she shouldn’t have worn such provocative clothing!?
How about we allow people to upload whatever pictures they want and try to address the weirdos turning them into porn without consent, rather than blaming the victims?
If you have no problem with people jerking off to your pictures, fine, but others do
Of course, but people have been doing this since the dawn of time. So unless the plan is to incorporate the Thought Police, there’s no way to actually stop it from happening.
Maybe, but we can certainly help by, amongst many other things, not advertising AI Nude Apps on Instagram. Ultimately what we shouldn’t be doing is blaming the victims by implying they are somehow at fault for having the audacity to upload pictures of themselves to the Internet.
I agree completely with the last part of your comment at least, that other comment unironically saying that it’s the women’s fault for dressing the way they do is bizarre and archaic.
Nah it’s more like: If she didn’t want people to jerk off thinking about her, she shouldn’t have worn such provocative clothing.
I honestly don’t think we should encourage uploading this many photos as private people, but that’s something else.
You don’t need consent to jerk off to someone’s photos. You do need consent to tell them about it. Creating images is a bit riskier, but if you make sure no one ever sees them, there is no actual harm done.
Isn’t it kinda funny that the “most harmful applications of AI tools are not hidden on the dark corners of the internet,” yet this article is locked behind a paywall?
The proximity of these two phrases meaning entirely opposite things indicates that this article, when interpreted as an amorphous cloud of words without syntax or grammar, is total nonsense.
The arrogant bastards!
So many of these comments are breaking down into arguments of basic consent for pics, and knowing how so many people are, I sure wonder how many of those same people post pics of their kids on social media constantly and don’t see the inconsistency.
There aren’t really many good reasons to post your kid’s picture anyway.
And yet that seems to be 60% of my wife’s facebook feed… I forbade her from posting our kids years ago.
Lemmy rolls worst comment section, asked to leave the Fediverse
Capitalism works! It breeds innovation like this! Good luck getting nonconsensual AI porn under your socialist government
It’s ironic because the “free market” part of capitalism is defined by consent. Capitalism is literally “the form of economic cooperation where consent is required before goods and money change hands”.
Unfortunately, it only refers to the two primary parties to a transaction, ignoring anyone affected by externalities to the deal.
Is there such a thing as a consensual undressing app? Seems redundant
BRB. Got an idea for a start-up
Theoretically any of these apps could be used with consent.
In practice I can’t imagine that would be a particularly large part of their market…
Now I have this image of an OnlyFans girl who just fake nudes all her pictures. Would make doing public nudity style pictures a lot easier.
Y’know, as long as she’s open about it that would be a great use of the tech.
“hey send me some nudes!”
“Ugh… I’m already on the couch in my pajamas. Here’s a pic of me at the coffee shop today, just use the app, it’s close enough.”
I guess it’s really in whether you use it with consent. I used one on my own picture just to see how it worked. It gave me huge tits but other than that was scarily accurate.
Neat. Ladies only or does it do dudes too?
I’m a dude, it’s just a clever name. It’ll do dudes, it’s just gonna give you huge tits. What you’re into is, of course, your business.
My interest in this topic just went from 0 to 10 upon realizing the humour potential of passing it around to see all my bros with huge tits, but only if it worked like a Snapchat filter.
Also I have a friend who already has huge tits, and I’ve seen them IRL so I’m curious what it would do
Also I have a friend who already has huge tits, and I’ve seen them IRL so I’m curious what it would do
Being serious for a moment, it depends on the source image. If it can tell where the contours of the tits are in the source image, they’ll be closer to the right size and shape - otherwise it’s going to find something it thinks are the contours and map out tits that match those, then a generic torso that matches the shape of where it thinks the torso is and the skin tone of the face. It’s not magic, it’s just automating what a horndog with Photoshop, a photo of you, and a big enough porn collection to find someone with a similar body type could do back in the 90s.
There isn’t, but emphasis on why it’s an issue is always a good thing to do. Same reason people get upset when some articles say “had sex with a minor” or “involved in a relationship with a minor” when the accurate crime is “raped a minor.”
If you (the news) are going to use flowery language, at least imply its a crime!
- “Sexually coerced a minor”
- or “groomed a minor for sex”
- or “had a relationship where the power dynamics were so one-sided that the child could not give consent”
- or maybe just say “raped a minor”
It’s not that hard!
that would just be instructions, wouldn’t it?
I assume that’s what you’d call OnlyFans.
That said, the irony of these apps is that it’s not the nudity that’s the problem, strictly speaking. It’s taking someone’s likeness and plastering it on a digital mannequin. With social media, this has become the online equivalent of going through a girl’s trash to find an old comb, pulling the hair off, and putting it on a Barbie doll that you then use to jerk/jill off.
What was the domain of 1980s perverts from comedies about awkward high schoolers has now become a commodity we’re supposed to treat as normal.
Idk how many people are viewing this as normal, I think most of us recognize all of this as being incredibly weird and creepy.
Idk how many people are viewing this as normal
Maybe not “Lemmy” us. But the folks who went hog wild during The Fappening, combined with younger people who are coming into contact with pornography for the first time, make a ripe base of users who will consider this the new normal.
Yeah damn, that’s true.
An obvious answer would be to talk to younger people about it, to explain how gross and violating it is. Even if it doesn’t become illegal, there are plenty of legal things that people avoid and recognize are bad because they were taught correctly.
Unfortunately, due to how puritan our society is, I can’t imagine many parents would be willing to talk to their kids about stuff like this.
You just took my feeling on this issue and put it to words
YouTube has been running these for like 6 or 7 months, even with famous people in the ads. I remember one with Ortega that ran for a while.
Ortega? The taco sauce?
NSFW-ish
I guess that’s an OERGAsm
Don’t use that as lube.
I sat down with tacos as I opened up that reply.
Witch.