A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, a case that highlights how readily generative AI can be turned to nefarious ends.

Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage: officers led McCorkle, still in his work uniform, out of the theater in handcuffs.

  • hexdream@lemmy.world · 21 days ago

    If this thread (and others like it) have taught me anything, it’s that facts be damned, people are opinionated either way. Nuance means nothing and it’s basically impossible to have a proper discussion when it comes to wedge issues or anything that can be used to divide people. Even if every study 100% said AI-generated CSAM always led to a reduction in actual child harm, reduced recidivism, and never needed any real children as training material, the comments would still pretty much look the same. If the studies showed the exact opposite, the comments would also be the same. Welcome to the internet. I hope you brought aspirin.

    • NauticalNoodle@lemmy.ml · edited · 21 days ago

      I was hoping to comment on this post multiple times today after I initially lost track of it, and now I see you’ve covered about 75% of what I wanted to say. The rest is this:

      Show me multiple (let’s say 3+) small-scale independent academic studies, or 1-2 comprehensive large academic studies, that support one side or the other and I may be swayed. Otherwise, I think all that is being accomplished is that one guy’s life is getting completely ruined for now and potentially forever over some fabrications, and as a result he may or may not get help, but I doubt he’ll be better off.

      —My understanding was that CSAM has its legal status specifically because there are victims who are hurt by these crimes, and possession supports a broader market that facilitates said harm to those victims. It’s not as easy to make a morality argument (especially a good one) for laws that affect everybody when there are no known victims.

    • Eezyville@sh.itjust.works · 21 days ago

      My man. Go touch some grass. This place is no good. Not trying to insult you but it’s for your mental health. These Redditors aren’t worth it.

      • SynopsisTantilize@lemm.ee · 21 days ago

        A lot of the places I’ve been to start conversation have been hostile and painful. If there is one thing that stands out that’s holding Lemmy back it’s the shitty culture this place can breed.

        • NauticalNoodle@lemmy.ml · edited · 20 days ago

          I’m convinced that a lot can be inferred from the type of reactions and the level of hostility one receives when trying to present a calm and nuanced argument on a wedge topic, even if it’s not always enjoyable. At the very least, it shows others that they may not be interacting with rational actors once one’s opponents go full mask-off.

          • SynopsisTantilize@lemm.ee · 19 days ago

            Agreed. And I’ve had my share of “being a dick” on the Internet here. But by the end of the interaction I try to at least jest. Or find a middle ground…I commented on a Hexbear instance by accident once…

    • Jimmyeatsausage@lemmy.world · 22 days ago

      Even unrealistic depictions of children in a sexual context are considered CSAM.

      In the United States, the U.S. Supreme Court has defined child pornography as material that “visually depicts sexual conduct by children below a specified age”

      See New York v. Ferber for more details.

    • NotMyOldRedditName@lemmy.world · edited · 22 days ago

      I looked into this awhile back during a similar discussion.

      Creating the content itself is not necessarily illegal if it’s fictitious.

      Transmitting it over a public medium is very illegal, though.

      So if you create your own fictitious child porn locally and it never gets transmitted over the internet, you’d maybe be okay, legally speaking, from a federal perspective anyway.

      This guy transmitted it and broke the law.

    • Cryophilia@lemmy.world · 22 days ago

      Iirc it was for Florida “obscenity” laws, which cover literally anything the state government finds objectionable.

  • beepnoise@piefed.social · 23 days ago

    Honestly, I don’t care if it is AI/not real, I’m glad that the man was arrested. He needs some serious help for sexualising kids.

          • Samvega@lemmy.blahaj.zone · 23 days ago

            You don’t think that reducing testosterone and therefore sex drive will change offending rates? That is contrary to research which has reliably found that this is the best therapy, in terms of effectiveness on recidivism.

            • Farid@startrek.website · edited · 22 days ago

              Cutting off their testicles and straight-up executing them would also reduce offending rates. Even more effectively than chemical castration, I’m sure. But we wouldn’t be calling that helping the offender, would we? And the comment above was specifically talking about helping them.
              What we have now is more of a middle ground between the amount of damage caused to the patient and safety guarantees for society. We obviously prioritize safety for society, but we should be striving for less damage to the patient, too.

              • Samvega@lemmy.blahaj.zone · edited · 22 days ago

                …we should be striving for less damage to the patient, too.

                Can you make someone just not sexually interested in something they find arousing? As far as I know, conversion therapy for non-heterosexual people doesn’t have good success rates. Also, those therapies tended to involve some form of harm, from what I’ve heard.

                • Farid@startrek.website · 22 days ago

                  Can you make someone just not sexually interested in something they find arousing?

                  No, I can’t. Doesn’t mean that we (as a society) shouldn’t be working on finding ways to do it or finding alternative solutions. And it’s necessary to acknowledge that what we have now is not good enough.

                  those therapies also tended to involve some form of harm

                  They probably did. But nobody here is claiming those were good or helping the patients either.

                • redfellow@sopuli.xyz · edited · 22 days ago

                  It’s not about making someone want something less, but about helping them never act on those urges.

                  Computer generated imagery could in theory be helpful, so the itch gets scratched without creating victims and criminals.

                  I’d call that a win-win in terms of societal well-being, since fewer funds are wasted on police work, jailing a perpetrator, and therapy for victims.

            • macniel@feddit.org · 23 days ago

              That guy didn’t even commit anything; he just had AI imagery depicting children.

              That guy has a mental problem that you can’t treat by chemical castration alone. He needs more than that.

              • Samvega@lemmy.blahaj.zone · edited · 23 days ago

                That does not change the fact that chemical castration is the most successful treatment we have to stop CSA recidivism at present.

                 

                That guy didn’t even commit anything just having AI imagery depicting children.

                Possessing and distributing images that sexually objectify children may be a crime, even if generated by AI.

      • treefrog@lemm.ee · 23 days ago

        Depending on the state, yes actually.

        I did time in a medium security facility that also did sex offender treatment (I was there on drug charges). I still have friends that went through that program.

        The men who were actually invested in getting better, got better. The ones invested in staying well, are still well.

    • Cosmonauticus@lemmy.world · 23 days ago

      You and I both know he’s not going to get it. I have a kind of sympathy for ppl attracted to kids **but who refuse to act on it.** They clearly know it’s not normal and recognize the absolute life-destroying damage they can cause if they act on it. That being said, there aren’t many places you can go to seek treatment. Any institution that advertised treatment would have ppl outside with pitchforks and torches.

      Before anyone tries to claim I’m pro-pedo, you can fuck right off. I just wish it was possible for ppl who are attracted to kids and not out touching them to get some kind of therapy and medication to make them normal (or destroy their sex drive) before something terrible happens.

      • Samvega@lemmy.blahaj.zone · 23 days ago

        to get some kind of therapy and medication to make them normal

        Hi, Psychologist here. Does society have strong evidence that therapeutic interventions are reducing rates of, say, the most common disorders of anxiety and depression? Considering that the rates of these are going up, I don’t think we can assume there’s a hugely successful therapy to help those attracted to CSA images to change. Psychology is not a very good science principally because it offers few extremely effective answers in the real world.

        In terms of medication androgen antagonists are generally used. This is because lowering testosterone generally leads to a lower sex drive. Here is an article about those drugs, including an offender who asked for them: https://www.theguardian.com/society/2016/mar/01/what-should-we-do-about-paedophiles

        TW: the article contains discussion of whether offenders are even psychologically disordered, when set within a historical cultural context of child-marriage. This paragraph is two above the illustration of people trapped within concentric circular walls, and starts “In the 2013 edition …”.

        Collis began to research the treatment and decided that it was essential to his rehabilitation. He believes he was born a paedophile, and that his attraction to children is unchangeable. “I did NOT wake up one morning and decide my sexual preference. I am sexually attracted to little girls and have absolutely no interest in sex with adults. I’ve only ever done stuff with adults in order to fit in with what’s ‘normal’.” For Collis, therefore, it became a question of how to control this desire and render himself incapable of reoffending.

        […]

        Many experts support Aaron Collis’s self-assessment, that paedophilia is an unchangeable sexual preference. In a 2012 paper, Seto examined three criteria – age of onset, sexual and romantic behaviour, and stability over time. In a number of studies, a significant proportion of paedophiles admitted to first experiencing attraction towards children before they had reached adulthood themselves. Many described their feelings for children as being driven by emotional need as well as sexual desire. As for stability over time, most clinicians agreed that paedophilia had “a lifelong course”: a true paedophile will always be attracted to children. “I am certainly of the view,” Seto told me, “that paedophilia can be thought of as a sexual orientation.”

        Brain-imaging studies have supported this idea. James Cantor, a psychiatry professor at the University of Toronto, has examined hundreds of MRI scans of the brains of paedophiles, and found that they are statistically more likely to be left-handed, shorter than average, and have a significantly lower density of white matter, the brain’s connective tissue. “The point that’s important for society is that paedophilia is in the brain at all, and that the person didn’t choose it,” Cantor told me. “As far as we can tell, they were born with it.” (Not that this, he emphasised, should excuse their crimes.)

        […]

        Clinical reality is a little more complicated. “There’s no pretence that the treatment is somehow going to cure them of paedophilia,” Grubin told me. “I think there is an acceptance now that you are not going to be able to change very easily the direction of someone’s arousal.” Grubin estimates that medication is only suitable for about 5% of sex offenders – those who are sexually preoccupied to the extent that they cannot think about anything else, and are not able to control their sexual urges. As Sarah Skett from the NHS put it: “The meds only take you so far. The evidence is clear that the best treatment for sex offending is psychologically based. What the medication does is help people have a little bit of control, which then allows them to access that treatment.”

         

        Some research on success rates:

        Prematurely terminating treatment was a strong indicator of committing a new sexual offense. Of interest was the general improvement of success rates over each successive 5-year period for many types of offenders. Unfortunately, failure rates remained comparatively high for rapists (20%) and homosexual pedophiles (16%), regardless of when they were treated over the 25-year period. [https://pubmed.ncbi.nlm.nih.gov/11961909/]

        Within the observation period, the general recidivism and sexual recidivism rates were 33.1% and 16.5%, respectively, and the sexual contact recidivism rate was 4.7%. [https://journals.sagepub.com/doi/abs/10.1177/0306624X231165416 - this paper says that suppressing the sex drive with medication was the most successful treatment]

        Men with deviant sexual behavior, or paraphilia, are usually treated with psychotherapy, antidepressant drugs, progestins, and antiandrogens, but these treatments are often ineffective. Selective inhibition of pituitary–gonadal function with a long-acting agonist analogue of gonadotropin-releasing hormone may abolish the deviant sexual behavior by reducing testosterone secretion. [https://www.nejm.org/doi/full/10.1056/nejm199802123380702 - this paper supports that lowering testosterone works best]

          • LustyArgonianMana@lemmy.world · 21 days ago

            Not always. There are people with brain injuries who suddenly develop an attraction towards kids and it’s not really due to power dynamics or anything else.

          • Rai@lemmy.dbzer0.com · 22 days ago

            For people actually abusing? Spot on, most of the time.

            For non-offending paedos? Nah… a horrible affliction.

        • z3rOR0ne@lemmy.ml · 23 days ago

          Thank you for such a well laid out response and the research to back it up. I rarely see people approaching the subjects of pedophilia, and how best to treat pedophiles, rationally and analytically.

          It’s understandable, considering the harm they can cause to society, that most can only ever view them as monsters; and indeed, those who are incapable of comprehending the harm they cause, or of empathizing with those they have harmed or could harm, are IMHO some of the more loathsome individuals.

          That said, I think people are too often willing to paint others whose proclivities are so alien and antithetical to our own not only as monsters, but as monsters not worth understanding with any degree of nuance, and that we ultimately do ourselves and future generations a disservice by not at least attempting to address the issue at hand, in the hopes that the most harmful parts of our collective psyche are treated palliatively to the best of our ability.

          Your annotated sources indicate that there is not nearly as clear a path forward as detractors of the “pedophiles are simply monsters and there’s no reason to look into their motives further” position would like to believe, while the very existence of the attempted treatments shows that there is more work to be done to hopefully find a more lasting and successful treatment.

          Like many of the psychological ailments plaguing societies today, you cannot simply kill and imprison the problem away. That is always a short-term (albeit at times temporarily effective) solution. The solution to the problem of how to greatly reduce the occurrence of pedophilia will ultimately require more of this kind of research, and more analysis and study towards achieving such ends.

          Again, I thank you for your nuanced post, and commend you for taking your nuanced stance as well.

        • LustyArgonianMana@lemmy.world · edited · 21 days ago

          I don’t understand why we haven’t used inhalable oxytocin as an experimental drug for people attracted to children and animals. It seems intuitive: children and animals generate oxytocin for humans automatically, and it’s possible some people need a stronger stimulus to release oxytocin or may not have much endogenous oxytocin. Oxytocin can be compounded at a pharmacy and has been used successfully for social anxiety.

  • macniel@feddit.org · 23 days ago

    I don’t see how children were abused in this case? It’s just AI imagery.

    It’s the same as saying that people get killed when you play first person shooter games.

    Or that you commit crimes when you play GTA.

    • KillerTofu@lemmy.world · 23 days ago

      How was the model trained? Probably using existing CSAM images. Those children are victims. Making derivative images of “imaginary” children doesn’t negate the exploitation of children all the way down.

      So no, you are making false equivalence with your video game metaphors.

      • macniel@feddit.org · 23 days ago

        Can you or anyone verify that the model was trained on CSAM?

        Besides, an LLM doesn’t need explicit content to derive from in order to create a naked child.

        • KillerTofu@lemmy.world · 23 days ago

          You’re defending the generation of CSAM pretty hard here, with a vague “but no child we know of was involved” as a defense.

          • macniel@feddit.org · 23 days ago

            I just hope that the models aren’t trained on CSAM, which would make generating stuff they can fap to “ethically reasonable,” as no children would be involved. And I hope that those who have those tendencies can be helped in some way that doesn’t involve chemical castration or incarceration.

      • Diplomjodler@lemmy.world · 23 days ago

        While I wouldn’t put it past Meta & Co. to explicitly seek out CSAM to train their models on, I don’t think that is how this stuff works.

      • fernlike3923@sh.itjust.works · 23 days ago

        A generative AI model doesn’t require the exact thing it creates in its datasets. It most likely just combined regular nudity with a picture of a child.

        • finley@lemm.ee · 23 days ago

          In that case, the images of children were still used without their permission to create the child porn in question.

          • MagicShel@programming.dev · 23 days ago

            That’s not really a nuanced take on what is going on. A bunch of images of children are studied so that the AI can learn how to draw children in general. The more children in the dataset, the less any one of them influences or resembles the output.

            Ironically, you might have to train an AI specifically on CSAM in order for it to identify the kinds of images it should not produce.

              • TheRealKuni@lemmy.world · 23 days ago

                Because the world we live in is complex, and rejecting complexity for a simple view of the world is dangerous.

                See You Can’t Get Snakes from Chicken Eggs from the Alt-Right Playbook.

                (Note I’m not accusing you of being alt-right. I’m saying we cannot ignore nuance in the world because the world is nuanced.)

                • finley@lemm.ee · 22 days ago

                  We’re not talking about snakes or chicken eggs, but thanks for the strawman

          • fernlike3923@sh.itjust.works · 23 days ago

            That’s a whole other thing than the AI model being trained on CSAM. I’m currently neutral on this topic, so I’d recommend replying to the main thread.

              • fernlike3923@sh.itjust.works · edited · 23 days ago

                It’s not CSAM; it’s just pictures of children/people that are already publicly available. This goes to the copyright side of AI rather than to illegal training material.

                • finley@lemm.ee · edited · 23 days ago

                  It’s images of children used to make CSAM. No amount of mental gymnastics can change that, nor the fact that those children’s consent was not obtained.

                  Why are you trying so hard to rationalize the creation of CSAM? Do you actually believe there is a context in which CSAM is OK? Are you that sick and perverted?

                  Because it really sounds like that’s what you’re trying to say, using copyright law as an excuse.

          • CeruleanRuin@lemmings.world · 22 days ago

            Good luck convincing the AI advocates of this. They have already decided that all imagery everywhere is theirs to use however they like.

      • grue@lemmy.world · 23 days ago

        But the AI companies insist the outputs of these models aren’t derivative works in any other circumstances!

        • timestatic@feddit.org · 22 days ago

          But this is the US… and it’s kind of a double standard if you’re not arrested for drawing it but are for generating it.

          • oberstoffensichtlich@feddit.org · 22 days ago

            There is a difference between something immediately identifiable as a drawing and something almost photorealistic. If a generated image is indistinguishable from a real photo, it should be treated the same.

            • puppycat@lemmy.blahaj.zone · 21 days ago

              I don’t advocate for either, but it should NOT be treated the same. One doesn’t involve a child being traumatized; I’d rather a necrophiliac make AI-generated pics instead of… you know.

            • ContrarianTrail@lemm.ee · 21 days ago

              The core reason CSAM is illegal is not because we don’t want people to watch it but because we don’t want them to create it which is synonymous with child abuse. Jailing someone for drawing a picture like that is absurd. While it might be of bad taste, there is no victim there. No one was harmed. Using generative AI is the same thing. No matter how much simulated CSAM you create with it, not a single child is harmed in doing so. Jailing people for that is the very definition of a moral panic.

              Now, if actual CSAM was used in the training of that AI, then it’s a more complex question. However it is a fact that such content doesn’t need to be in the training data in order for it to create simulated CSAM and as long as that is the case it is immoral to punish people for creating something that only looks like it but isn’t.

                • ContrarianTrail@lemm.ee · 20 days ago

                  Sure, but the same argument could be made about violent movies / games / books… It’s a rather slippery slope, and as far as I know there doesn’t seem to be a correlation between violent games and real-life violence; in fact, I believe the correlation is negative.

    • Leraje@lemmy.blahaj.zone · 22 days ago

      The difference is intent. When you’re playing a FPS, the intent is to play a game. When you play GTA the intent is to play a game.

      The intent with AI generated CSAM is to watch kids being abused.

        • Leraje@lemmy.blahaj.zone · 22 days ago

          There may well be the odd weirdo playing Call of Duty to watch people die.

          But everyone who watches CSAM is watching it to watch kids being abused.

      • ContrarianTrail@lemm.ee · 21 days ago

        Punishing people for intending to do something is punishing them for thought crimes. That is not the world I want to live in.

            • Leraje@lemmy.blahaj.zone · 20 days ago

              Intent is defined as intention or purpose. So I’ll rephrase for you: the purpose of playing a FPS is to play a game. The purpose of playing GTA is to play a game.

              The purpose of AI generated CSAM is to watch children being abused.

              • ContrarianTrail@lemm.ee · 20 days ago

                I don’t think that’s fair. It could just as well be said that the purpose of violent games is to simulate real-life violence.

                Even if I grant you that the purpose of viewing CSAM is to see child abuse, it’s still less bad than actually abusing them, just like playing violent games is less bad than participating in real violence. Also, despite the massive increase in violent games and movies, actual violence is going down, so implying that viewing such content would increase cases of child abuse is an assumption I’m not willing to make either.

                  • Leraje@lemmy.blahaj.zone · 20 days ago

                  The purpose of a game is to play a game through a series of objectives and challenges.

                  Even if I grant you that the purpose of viewing CSAM is to see child abuse

                  Very curious to hear what else you think the purpose of watching CSAM might be.

                  it’s still less bad than actually abusing them

                    “Less bad” is relative. A bad thing is still bad. If we go by length of sentencing, then rape is “less bad” than murder. That doesn’t make it “not bad”.

                  so implying that viewing such content would increase the cases of child abuse is an assumption I’m not willing to make either.

                  OK?

                  I didn’t claim that AI CSAM increased anything at all. Literally all I’ve said is that the purpose of AI generated CSAM is to watch kids being abused.

                  Neither did I claim that violent games lead to violence. You invented that strawman all by yourself.

    • Samvega@lemmy.blahaj.zone · 23 days ago

      It’s just AI imagery.

      Fantasising about sexual contact with children indicates that this person might groom children for real, because they have a sexual interest in doing so. As someone who was sexually assaulted as a child, it’s really not something that needs to happen.

      • HelixDab2@lemm.ee · 23 days ago

        indicates that this person might groom children for real

        But unless they have already done it, that’s not a crime. People are prosecuted for actions they commit, not their thoughts.

        • Chozo@fedia.io · 23 days ago

          I agree, this line of thinking quickly spirals into Minority Report territory.

          • CeruleanRuin@lemmings.world
            link
            fedilink
            English
            arrow-up
            0
            ·
            22 days ago

            It will always be a gray area, and should be, but there are practical and pragmatic reasons to ban this imagery no matter its source.

      • HubertManne@moist.catsweat.com
        link
        fedilink
        arrow-up
        0
        ·
        23 days ago

        Seems like fantasizing about shooting people or carjacking or such indicates that person might do that activity for real too, then. There are a lot of carjackings nowadays, and you know GTA is real popular. mmmm. /s But seriously, I’m not sure your first statement has merit. Especially when you look at where to draw the line: anime, manga, oil paintings, books, thoughts in one’s head.

        • Samvega@lemmy.blahaj.zone
          link
          fedilink
          arrow-up
          0
          ·
          23 days ago

          If you want to keep people who fantasise about sexually exploiting children around your family, be my guest. My family tried that, and I was raped. I didn’t like that, and I have drawn my own conclusions.

          • HubertManne@moist.catsweat.com
            link
            fedilink
            arrow-up
            0
            ·
            22 days ago

            Yeah, and the same applies if you want to keep around people who fantasize about murdering folks. You can’t say one thing is a thing without saying the other is. I’m sorry you were raped, but I doubt it would be stopped by banning Lolita.

            • Samvega@lemmy.blahaj.zone
              link
              fedilink
              arrow-up
              0
              ·
              edit-2
              22 days ago

              I don’t recall Nabokov’s novel Lolita saying that sexualising minors was an acceptable act.

              Thanks for the strawman, though, I’ll save it to burn in the colder months.

              • HubertManne@moist.catsweat.com
                link
                fedilink
                arrow-up
                0
                ·
                22 days ago

                You can call it a strawman, but whether the evil in question is killing folks or raping folks, the effect should be the same when discussing the non-actual versus the actual. You can say this thing is a special case, but when it comes to freedom of speech (which covers anything not based in actual events: writing, speaking, thinking, art), carving out special circumstances becomes a real slippery slope. (Which can also be brought up as a fallacy, though like all “fallacies” it depends a lot on what else backs them up and how they are presented.)

        • CeruleanRuin@lemmings.world
          link
          fedilink
          English
          arrow-up
          0
          ·
          22 days ago

          If you’re asking whether anime, manga, oil paintings, and books glorifying the sexualization of children should also be banned, well, yes.

          This is not comparable to glorifying violence, because real children are victimized in order to create some of these images, and the fact that it’s impossible to tell makes it even more imperative that all such imagery is banned, because the existence of fakes makes it even harder to identify real victims.

          It’s like you know there’s an armed bomb on a street, but somebody else filled the street with fake bombs, because they get off on it or whatever. Maybe you’d say making fake bombs shouldn’t be illegal because they can’t harm anyone. But now suddenly they have made the job of law enforcement exponentially more difficult.

          • Cryophilia@lemmy.world
            link
            fedilink
            arrow-up
            0
            ·
            22 days ago

            Sucks to be law enforcement then. I’m not giving up my rights to make their jobs easier. I hate hate HATE the trend towards loss of privacy and the “if you didn’t do anything wrong then you have nothing to hide” mindset. Fuck that.

    • CeruleanRuin@lemmings.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      edit-2
      22 days ago

      Not a great comparison, because unlike with violent games or movies, you can’t say that there is no danger to anyone in allowing these images to be created or distributed. If they are indistinguishable from the real thing, it becomes impossible to identify actual human victims.

      There’s also a strong argument that the availability of imagery like this only encourages behavioral escalation in people who suffer from the affliction of being a sick fucking pervert pedophile. It’s not methadone for them, as some would argue. It’s just fueling their addiction, not replacing it.

    • TallonMetroid@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      23 days ago

      Well, the image generator had to be trained on something first in order to spit out child porn. While it may be that the training set was solely drawn/rendered images, we don’t know that, and even if the output were in that style, it might very well be photorealistic images generated from real child porn and run through a filter.

      • MagicShel@programming.dev
        link
        fedilink
        arrow-up
        0
        ·
        23 days ago

        An AI that is trained on children and nude adults can infer what a nude child looks like without ever being trained specifically with those images.

              • Cryophilia@lemmy.world
                link
                fedilink
                arrow-up
                0
                ·
                21 days ago

                No, I’m admitting they’re stupid for even bringing it up.

                Unless their argument is that all AI should be illegal, in which case they’re stupid in a different way.

                • LustyArgonianMana@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  0
                  ·
                  edit-2
                  20 days ago

                  Do you think regular child porn should be illegal? If so, why?

                  Generally it’s because kids were harmed in the making of those images. Since we know that AI is using images of children being harmed to make these images, as other posters have repeatedly sourced (but also, if you’ve looked up deepfakes, most deepfakes are an existing porn video with the face just changed over top; they do this with CP as well and must use CP videos to seed it, because the adult model would be too large)… why does AI get a pass for using children’s bodies in this way? Why isn’t it immoral when AI is used as a middle man to abuse kids?

          • LustyArgonianMana@lemmy.world
            link
            fedilink
            English
            arrow-up
            0
            ·
            edit-2
            21 days ago

            Yes, exactly. People who then excuse this with “well, it was trained on all public images” are just admitting you’re right: there is a level of harm here, since real materials are used. Even if they weren’t being used, or if it was just a cartoon, the morality is still shaky because of the role porn plays in advertising. We already have laws about advertising because it’s so effective, including around cigarettes and prescriptions. Most porn, ESPECIALLY FREE PORN, is an ad to get you to buy other services. CP is not excluded from this rule; no one gets a free lunch, so to speak. These materials are made and hosted for a reason.

            The role that CP plays in most countries is difficult. It is used for blackmail. It is also used to generate money for countries (intelligence groups around the world host illegal porn ostensibly “to catch a predator,” but then why is it morally okay for them to distribute these images but no one else?). And it’s used as advertising for actual human trafficking organizations. And similar organizations exist for snuff and gore btw. And ofc animals. And any combination of those 3. Or did you all forget about those monkey torture videos, or the orangutan who was being sex trafficked? Or Daisy’s Destruction and Peter Scully?

            So it’s important to not allow these advertisers to combine their most famous monkey torture video with enough AI that they can say it’s AI generated, but it’s really just an ad for their monkey torture productions. And even if NONE of the footage was from illegal or similar events and was 100% thought of by AI - it can still be used as an ad for these groups if they host it. Cartoons can be ads ofc.

      • lunarul@lemmy.world
        link
        fedilink
        arrow-up
        0
        ·
        edit-2
        22 days ago

        we don’t know that

        might

        Unless you’re operating under “guilty until proven innocent”, those are not reasons to accuse someone.

        • emmy67@lemmy.world
          link
          fedilink
          arrow-up
          0
          ·
          edit-2
          21 days ago

          It didn’t generate what we expect and know a corn dog to be.

          Hence it missed, because it doesn’t know what a “corn dog” is.

          You have proven the point that it couldn’t generate csam without some being present in the training data.

          • ContrarianTrail@lemm.ee
            link
            fedilink
            English
            arrow-up
            0
            ·
            edit-2
            21 days ago

            I hope you didn’t seriously think the prompt for that image was “corn dog” because if your understanding of generative AI is on that level you probably should refrain from commenting on it.

            Prompt: Photograph of a hybrid creature that is a cross between corn and a dog

            • emmy67@lemmy.world
              link
              fedilink
              arrow-up
              0
              ·
              21 days ago

              Then if your question is “how many Photograph of a hybrid creature that is a cross between corn and a dog were in the training data?”

              I’d honestly say, i don’t know.

              And if you’re honest, you’ll say the same.

              • ContrarianTrail@lemm.ee
                link
                fedilink
                English
                arrow-up
                0
                ·
                21 days ago

                But you do know, because corn dogs as depicted in the picture do not exist, so there couldn’t have been photos of them in the training data, yet it was still able to create one when asked.

                This is because it doesn’t need to have seen one before. It knows what corn looks like and it knows what a dog looks like, so when you ask it to combine the two, it will gladly do so.

                • emmy67@lemmy.world
                  link
                  fedilink
                  arrow-up
                  0
                  ·
                  edit-2
                  20 days ago

                  But you do know because corn dogs as depicted in the picture do not exists so there couldn’t have been photos of them in the training data, yet it was still able to create one when asked.

                  Yeah, except photoshop and artists exist. And a quick google image search will find them. 🙄

        • Saledovil@sh.itjust.works
          link
          fedilink
          arrow-up
          0
          ·
          22 days ago

          Wild corn dogs are an outright plague where I live. When I was younger, me and my buddies would lay snares to catch corn dogs. When we caught one, we’d roast it over a fire to make popcorn. Corn dog cutlets served with popcorn from the same corn dog is a popular meal, especially among the less fortunate, even though some of the affluent consider it the equivalent of eating rat meat. When me pa got me first rifle when I turned 14, I spent a few days just shooting corn dogs.

  • recapitated@lemmy.world
    link
    fedilink
    arrow-up
    0
    ·
    edit-2
    22 days ago

    To be clear, I am happy to see a pedo contained and isolated from society.

    At the same time, this direction of law is something that I don’t feel I have the sophistication to truly weigh in on, even though it invokes so many thoughts for me.

    I hope we as a society get this one right.

  • Mubelotix@jlai.lu
    link
    fedilink
    arrow-up
    0
    ·
    22 days ago

    It’s not really children on these pics. We can’t condemn people for things that are not illegal yet

    • Todd Bonzalez@lemm.ee
      link
      fedilink
      arrow-up
      0
      ·
      21 days ago

      It’s not really children on these pics.

      You are certain about this? If so, where are you getting that info, because it’s not in the article?

      Generative image models are frequently used for their “infill” capabilities, which is how nudifying apps work.

      If he was nudifying pictures of real kids, the nudity may be simulated, but the children are absolutely real, and should be considered victims of child pornography.

    • Microw@lemm.ee
      link
      fedilink
      arrow-up
      0
      ·
      22 days ago

      It’s Florida. They will simply book him and then present him a deal for “only x years prison”, which he’ll take, thereby preventing this from going to court and actually being ruled upon.

    • Nuke_the_whales@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      22 days ago

      I’ve always wondered the same when an adult cop pretends to be a kid only to catch pedos. Couldn’t a lawyer argue that because there actually wasn’t a child, there wasn’t a crime?

    • Phen@lemmy.eco.br
      link
      fedilink
      arrow-up
      0
      ·
      23 days ago

      I would imagine that AI having been trained on both pictures of kids and on adult sexual content would be somewhat enough to mix the two. Even if the output might end up uncanny.

    • ContrarianTrail@lemm.ee
      link
      fedilink
      English
      arrow-up
      0
      ·
      22 days ago

      One doesn’t need to browse AI-generated images for longer than 5 seconds to realize it can generate a ton of stuff that you can know for absolute certainty wasn’t in the training data. I don’t get why people insist on the narrative that it can only output copies of what it has already seen. What’s generative about that?

      • MataVatnik@lemmy.world
        link
        fedilink
        arrow-up
        0
        ·
        22 days ago

        If you took a minute to read the article:

        A 2023 study from Stanford University also revealed that hundreds of child sex abuse images were found in widely-used generative AI image data sets.

        “The content that we’ve seen, we believe is actually being generated using open source software, which has been downloaded and run locally on people’s computers and then modified,” Internet Watch Foundation chief technology officer Dan Sexton told The Guardian last year. “And that is a much harder problem to fix.”

        So not only do the online models have CSAM, but people are downloading open source software and I’d be very surprised if they weren’t feeding it CSAM

        • ContrarianTrail@lemm.ee
          link
          fedilink
          English
          arrow-up
          0
          ·
          22 days ago

          That doesn’t dispute my argument; generative AI can create images that are not in the training data. It doesn’t need to know what something looks like as long as the person using it does and can write the correct prompt for it. The corn dog I posted below is a good example. You can be sure that wasn’t in the training data yet it was still able to generate it.

    • Cryophilia@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      22 days ago

      If that’s the basis for making it illegal, then all AI is illegal.

      Which…eh maybe that’s not such a bad idea

  • hightrix@lemmy.world
    link
    fedilink
    arrow-up
    0
    ·
    21 days ago

    He wasn’t arrested for creating it, but for distribution.

    If dude just made it and kept it privately, he’d be fine.

    I’m not defending child porn with this comment.

    • Todd Bonzalez@lemm.ee
      link
      fedilink
      arrow-up
      0
      ·
      21 days ago

      I’m not defending child porn with this comment.

      This is one of those cases where, even if you’re technically correct, you probably shouldn’t say out loud how you personally would get away with manufacturing child porn, because it puts people in the position of imagining you manufacturing child porn.

  • BilboBargains@lemmy.world
    link
    fedilink
    arrow-up
    0
    ·
    22 days ago

    If no children were involved in the production of porn, how is it pedophilic? That’s like claiming a picture of water has the same properties as water.

    • Revan343@lemmy.ca
      link
      fedilink
      arrow-up
      0
      ·
      edit-2
      21 days ago

      It’s pedophilic because it’s sexual images of children; fake or not does not change that. Drawn images of child pornography are still pedophilic.

      The more important question is, is it CSAM? Whether drawn images that represent no real child are or not depends on the legal jurisdiction. Should drawn and AI generated child porn be legal or banned? I think the actual answer to that would require significant research into whether its existence acts as an outlet to prevent pedophiles from harming actual children, or whether it encourages their proclivities and makes them more likely to hurt actual children.

      Preventing harm to children should be the goal, but actual research into the effects of simulated child porn vis-à-vis pedophiles harming actual children is as unlikely to happen as any other research into pedophilia.

    • derpgon@programming.dev
      link
      fedilink
      arrow-up
      0
      ·
      22 days ago

      However, a picture of water makes me thirsty. But then again, there is no substitute for water.

      I am not defending pedos, or defending Florida for doing something like that.

      • Sarmyth@lemmy.world
        link
        fedilink
        arrow-up
        0
        ·
        21 days ago

        That might be a you thing. Pictures of water don’t make me thirsty. I get the metaphor you are attempting to make, though.

  • NauticalNoodle@lemmy.ml
    link
    fedilink
    arrow-up
    0
    ·
    21 days ago

    Show me multiple (let’s say 3+) small-scale independent academic studies, or 1-2 comprehensive and large academic studies, that support one side or another and I may be swayed. Otherwise, I think all that is being accomplished is that one guy’s life is getting completely ruined, for now and potentially forever, over some fabrications. As a result he may or may not get help, but I doubt he’ll be better off.

    —My understanding was that CSAM has its legal status specifically because there are victims who are hurt by these crimes, and possession supports a broader market that facilitates said harm to those victims. It’s not as easy to make a morality argument (especially a good one) for laws that affect everybody when there are no known victims.

    • emmy67@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      21 days ago

      Are you stupid? Something has to be in the training model for any generation to be possible. This is just a new way to revictimise kids.

        • emmy67@lemmy.world
          link
          fedilink
          arrow-up
          0
          ·
          21 days ago

          No, I am telling you csam images can’t be generated by an algorithm that hasn’t been trained on csam.

          • NauticalNoodle@lemmy.ml
            link
            fedilink
            arrow-up
            0
            ·
            edit-2
            21 days ago

            That’s patently false.

            I’m not going to continue to entertain this discussion but instead I’m just going to direct you to the multiple other people who have already effectively disproven this argument and similar arguments elsewhere in this post’s discusion. Enjoy.

      • ameancow@lemmy.world
        link
        fedilink
        English
        arrow-up
        0
        ·
        19 days ago

        Not necessarily, AI can do wild things with combined attributes.

        That said, I do feel very uncomfortable with the amount of defense of this guy, he was distributing this to people. If he was just generating fake images of fake people using legal training data in his own house for his own viewing, that would be a different story. The amount of people jumping in front of the bullet for this guy when we don’t really know the details is the larger problem.

  • BonesOfTheMoon@lemmy.world
    link
    fedilink
    arrow-up
    0
    ·
    22 days ago

    Could this be considered a harm reduction strategy?

    Not that I think CSAM is good in any way, but if it saves a child would it be worthwhile? Like if these pedos were to use AI images instead of actual CSAM would that be any better?

    I’ve read that CSAM sites on the dark web number into the hundreds of thousands. I just wonder if it would be a less harmful thing since it’s such a problem.

    • Todd Bonzalez@lemm.ee
      link
      fedilink
      arrow-up
      0
      ·
      21 days ago

      Think of it this way: what if the government said one day, “All child porn made before this date is legal; all child porn made after this date is illegal”?

      You would end up with a huge corpus of “legal” child porn that pedophiles could use as a release, but you could become draconian about the manufacture of new child porn. This would, theoretically, discourage new child porn from being created, because the risk is too high compared to the legal stuff.

      Can you see the problem? That’s right, in this scenario, child porn is legal. That’s fucked up, and we shouldn’t do that, even if it is “simulated”, because fuck that.

    • pregnantwithrage@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      22 days ago

      You would think so, but you’re basically making a patchwork version of the illicit actual media, so it’s a dark, dark gray area for sure.

        • medgremlin@midwest.social
          link
          fedilink
          arrow-up
          0
          ·
          22 days ago

          Generative AI is basically just really overpowered text/image prediction. It fills in the words or pixels that make the most sense based on the data it has been fed, so to get AI generated CSAM…it had to have been fed some amount of CSAM at some point or it had to be heavily manipulated to generate the images in question.

          • CommanderCloon@lemmy.ml
            link
            fedilink
            arrow-up
            0
            ·
            edit-2
            22 days ago

            so to get AI generated CSAM…it had to have been fed some amount of CSAM

            No actually, it can combine concepts that aren’t present together in the dataset. Does it know what a child looks like? Does it know what porn looks like? Then it can generate child porn without having ever had CSAM in its dataset. See the corn dog comment as an argument

            Edit: corn dog

            • emmy67@lemmy.world
              link
              fedilink
              arrow-up
              0
              ·
              21 days ago

              A dumb argument. Corn and dog were in the training data, but that’s not a corn dog like what we expect when we think “corn dog”.

              Hence it can’t get what we know a corn dog is.

              You have proved the point for us, since it didn’t generate a corn dog.

            • medgremlin@midwest.social
              link
              fedilink
              arrow-up
              0
              ·
              22 days ago

              Some of the image generators have attempted to put up guard rails to prevent generating pictures of nude children, but the creators/managers haven’t been able to eradicate it. There was also an investigation by Stanford University that showed that most of the really popular image generators had a not insignificant amount of CSAM in their training data and could be fairly easily manipulated into making more.

              The creators and managers of these generative “AIs” have done little to nothing in the way of curation and have routinely tried to fob off responsibility onto their users, the same way Tesla has been doing with its “full self driving”.

          • BonesOfTheMoon@lemmy.world
            link
            fedilink
            arrow-up
            0
            ·
            22 days ago

            Ok makes sense. Yuck my skin crawls. I got exposed to CSAM via Twitter years ago, thankfully it was just a shot of nude children I saw and not the actual deed, but I was haunted.

    • RandomlyNice@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      22 days ago

      Many years ago (about 25), I read an article in a newspaper (idk the name, but it may have been The Computer Paper, which is archived online someplace). This article noted that a study had been commissioned to show that cp access increases child abuse. The study seemed to show the opposite.

      Here’s the problem with even AI generated cp: It might lower abuse in the beginning, but with increased access it would ‘normalise’ the perception of such conduct. This would likely increase abuse over time, even involving persons who may not have been so inclined otherwise.

      This is all very complex. A solution isn’t simple. Shunning things in any way won’t help, though, and that seems to be the currently most popular way to deal with the issue.

      • Facebones@reddthat.com
        link
        fedilink
        arrow-up
        0
        ·
        22 days ago

        Actual pedophiles (a lot of CSA is abuse of power, not pedophilia; though to be clear, fuck abusers either way) have a high rate of suicidal ideation because they think it’s as fucked up as everyone else does. Of course we can’t just say “sure, AI material is legal now”, but I could imagine a regulated system, accessed via doctors, akin to how controlled substances work.

        People take this firm “kill ’em all” stance, but these people just feel the way they do, same as I do towards women or a gay man feels toward men. It just is what it is; we all generally agree being gay isn’t a choice, and this is no different. As long as they don’t act on it, I think we should be sympathetic and be open to helping them live a less tortured life.

        I’m not 100% saying this is how we do it, but we should be open to exploring the issue instead of full stop demonization.

        • HonorableScythe@lemm.ee
          link
          fedilink
          arrow-up
          0
          ·
          22 days ago

          Dan Savage coined the term “gold star pedophile” in a column years ago, referring to people who acknowledge their attraction to children but never act on it by harming a child or accessing CSAM. I do feel bad for these people because there are no resources to help them. The only way they can access actual therapeutic resources for their condition is by offending and going to jail. If the pedophile goes to a therapist and confesses attraction to children, therapists are mandated reporters and will assume they’re going to act on it. An article I read a few years back interviewed members of an online community of non-offending pedophiles who essentially made their own support group since no one else will help them, and nearly all research on them is from a forensic (criminal) context.

          There’s a pretty good article by James Cantor talking about dealing with pedophiles in a therapeutic context here.

          Don’t get me wrong - I think offenders need to be punished for what they do. I unfortunately have a former best friend who has offended. He’s no longer in my life and never will be again. But I think we could prevent offenders from reaching that point and hurting someone if we did more research and found ways to stop them before it happened.

          • Liz@midwest.social
            link
            fedilink
            English
            arrow-up
            0
            ·
            22 days ago

            We really gotta flip the standard and make therapist sessions 100% confidential. We should encourage people to seek help in stopping their bad behavior, no matter what it is, and they’re less likely to do that if they think a therapist could report them.

            • LustyArgonianMana@lemmy.world
              link
              fedilink
              English
              arrow-up
              0
              ·
              edit-2
              21 days ago

              You’re asking therapists to live with that information. It’s not so easy to hear that a child is being actively raped and not legally being allowed to report it.

              We already lose tons of social workers in CPS because they can’t help those kids much or save them. Most normal adults can’t really mentally handle child torture without doing something about it. How many unreported child abuse cases before a therapist kills themselves?

              • Liz@midwest.social
                link
                fedilink
                English
                arrow-up
                0
                ·
                21 days ago

                I mean, how many children get abused because people are too afraid to seek help? It’s not an area with an easy answer, and I don’t have hard numbers on how much harm either scenario would produce.

                • LustyArgonianMana@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  0
                  ·
                  21 days ago

                  Well, if all the therapists kill themselves, then that system will be worse than the current one because no one will be getting help

        • fine_sandy_bottom@lemmy.federate.cc
          link
          fedilink
          arrow-up
          0
          ·
          22 days ago

          I agree for the most part, particularly that we should be open minded.

          Obviously we don’t have much reliable data, which I think is critically important.

          The only thing I would add is that I’m not sure treating a desire for CSAM would be the same as treating substance abuse. Like, “weaning an addict off CSAM” seems like a strange proposition to me.

          • Facebones@reddthat.com
            link
            fedilink
            arrow-up
            0
            ·
            22 days ago

            Maybe I was unclear: when I relate potential generated material to controlled substances, I mean in relation to how you would obtain it.

            You go see a psych, probably go through some therapy or something, and if they feel it would be beneficial, you would be able to get material via strictly controlled avenues, like how you need a prescription for Xanax and it’s a crime to sell or share it.

            (And I imagine some sort of stamping, whether in the imagery or in the files, to trace any leaked material back to the person who shared it, but that’s a different conversation.)

      • Cryophilia@lemmy.world
        link
        fedilink
        arrow-up
        0
        ·
        22 days ago

        “Normalized” violent media doesn’t seem to have increased the prevalence of real world violence.

        • Spacehooks@reddthat.com
          link
          fedilink
          English
          arrow-up
          0
          ·
          22 days ago

          I actually think video games reduce crime in general. Bad kids are now indoors getting their thrills.

    • xta@lemmy.world
      22 days ago

      By the same logic, I wonder why we don’t let convicted murderers and psychopaths work at slaughterhouses.

    • JovialMicrobial@lemm.ee
      22 days ago

      I guess my question is does access to regular porn make people not want to have real sex with another person? Does it ‘scratch the itch’ so to speak? Could they go the rest of their life with only porn to satisfy them?

      It depends on the person. I feel like most people would be unsatisfied with only porn, but that’s just anecdotal.

      I honestly think AI-generated CSAM isn’t something the world needs to be producing. It’s not contributing to society in any meaningful way. Pedophiles who don’t offend or hurt children need therapy, and the ones who do need jail time (and therapy, but I’m in the US, so that’s a whole other thing). They don’t “need” porn.

      My own personal take is that giving pedophiles AI-generated CSAM is like showing alcohol ads to alcoholics, or going to the strip club if you’re a sex addict. It probably doesn’t lead to good outcomes.

  • Nollij@sopuli.xyz
    22 days ago

    This creates a significant legal issue: AI-generated images have no age, nor is there consent.

    The difference in appearance between age 16 and 18 is minimal, but the legal difference is immense. This is based entirely on a concept that cannot apply.

    How do you define what depicts a fictional child, especially without including real adults? I’ve met people who believe that preferring a shaved pubic area is pedophilia, even though the vast majority of adult women shave. On the flip side, teenagers from the 70s and 80s would be mistaken for 40+ today.

    Even the extremes aren’t clear. Adult star “Little Lupe”, who was 18+ in every single appearance, lacked most secondary sex characteristics. Experts testified in court that she could not possibly be an adult. Except she was, and there’s full documentation to prove it. Would AI trained exclusively on her work be producing CSAM?

    • CeruleanRuin@lemmings.world
      22 days ago

      To paraphrase someone smarter than me, “I’ll know it when I see it.”

      But naturally I don’t want to see it. One of the things I miss least about reddit is the constant image posts of anime characters, who may be whatever age the poster claims but are clearly representations of very young girls with big tiddies bolted on. It’s gross, but it is also a problem that’s more widespread and nebulous than most people are willing to admit.

      • Nollij@sopuli.xyz
        22 days ago

        Just try guessing someone’s age (we’ll assume completely family-friendly and above board): think back to high school. How old did you and your peers look? Now go take a look at high schoolers today. They probably seem a lot younger than you did. The longer it’s been (i.e. the older you are), the younger they look. Which means “when I see it” depends entirely on the age of the viewer.

        This isn’t even just about perception and memory: modern style is based on, and heavily influenced by, youth, and it’s continuing to move in that direction. This is why actors in their 30s, with carefully managed hair, skin, makeup, and wardrobe, have been able to convincingly portray high schoolers. It’s not just you: teens really are looking younger each year. But they’re still the same age.

        • LustyArgonianMana@lemmy.world
          21 days ago

          Wtf. Style is what makes kids look young or old to us, because we have been heavily marketed to and follow trends. That’s why, when the mullet/porn-stache style came back, those Dahmer kids looked like they were in their 40s.

          You’re getting older each year so teens look younger to you.

          Name even one actor in their thirties who convincingly played a high schooler. Literally who

      • Xatolos@reddthat.com
        22 days ago

        “I’ll know it when I see it.”

        I can’t think of anything scarier than that when dealing with the legality of anything.