Thanks ahead of time for your feedback

  • Tarquinn2049@lemmy.world · 2 months ago

    It’s a blend: it has always been a big deal, and it is indeed more of a big deal now because of how easy, accessible, and believable AI fakes can be. Even nowadays, Photoshop hits only one point of that triangle, and it was even less capable back in the day, when it could hit half of one of those points at any given time.

    Basically, a nude generated by a good AI has to be proven false, because it doesn’t always immediately read as fake. If you have seen obvious AI fakes, they are just that: obvious. There are many non-obvious ones that you might have seen without knowing they were fake. That is, of course, assuming you have looked.

    The other reason it can be more of a big deal now is that kids have been doing it to other kids. And since the results can be believable, the parents didn’t know they were fake to start with, so it would blow up as if it were real before anyone found out it was AI. And anything involving that is gonna be a big deal.

      • Tarquinn2049@lemmy.world · 2 months ago

        I mean, that was an issue in the first month or so, though I could see the automated tools people use for this specific purpose not staying up to date; I haven’t specifically interacted with those. But proper AI tools have in-filling to correct mistakes like that: you can keep the rest of the image and just “reroll” a section of it until whatever you didn’t like about it is fixed. Super quick and easy.
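        That keep-everything, reroll-one-region loop can be sketched in miniature. The snippet below is a toy illustration only: plain NumPy noise stands in for an image model, and the `reroll_region` helper and its acceptance test are invented for the example, not any real tool’s API.

```python
import numpy as np

def reroll_region(image, mask, accept, max_tries=10):
    """Keep `image` fixed outside `mask`; regenerate the masked patch
    with a fresh seed until `accept(patch)` is satisfied."""
    for seed in range(max_tries):
        rng = np.random.default_rng(seed)
        candidate = image.copy()
        # Only the masked pixels are replaced; everything else is untouched.
        candidate[mask] = rng.uniform(0.0, 1.0, size=int(mask.sum()))
        if accept(candidate[mask]):
            return candidate
    return image  # no acceptable reroll found; keep the original

# A 4x4 "image" with a 2x2 patch marked for rerolling.
image = np.zeros((4, 4))
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True

# Stand-in acceptance test for "the patch looks right now".
result = reroll_region(image, mask, accept=lambda patch: patch.mean() > 0.2)
```

        A real in-filling tool does the same bookkeeping, except a generative model fills the masked patch instead of random noise.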

  • simple@lemm.ee · 2 months ago

    Because now teenagers can do it with very little effort, whereas before it at least required a lot of time and skill.

  • I Cast Fist@programming.dev · 2 months ago

    Doctored photos have always been a problem and, legally speaking, could lead to the faker being sued for defamation, depending on what was done with the person’s image.

    AI photos are only part of the problem. Faking the voice is also possible, as is making “good enough” videos where you just replace the head of the actual performer.

    Another part of the problem is that this kind of stuff spreads like wildfire within groups (and it’s ALWAYS groups the victim belongs to), and any voices pointing out that it’s fake will be drowned out by everyone else.

  • snooggums@midwest.social · 2 months ago

    In addition to the reduced skill barrier mentioned, the other side effect is the reduced time spent finding a matching photo and actually doing the work. Anyone can create one in their spare time, quickly and easily.

  • southsamurai@sh.itjust.works · 2 months ago

    Well, I think everyone has already covered that it was a big deal at the time, it simply wasn’t something we could wipe out as a society.

    And it’s still a big deal.

    However, I don’t think anyone has touched on why fake nudes, even ones that are obviously fake or labeled as fake by their creator, are a problem.

    It comes back to the entire idea of consent. That’s for anyone, but women in particular are heavily sexualized, even well before they’re women. There is a constant, unending pressure on women of knowing that they are going to be sexually objectified. It might not be every day, by everyone alive around them, but it is inescapable.

    One can debate whether or not nudity should be a big deal, or whether it is sexualized only because of the rules a given culture has around it, but the hard truth is that nudity is sexualized. Ergo, images of a woman’s body are something she deserves to have control over access to. If someone consents to images being available, great! If they don’t, then there’s a problem.

    Fakes, even obvious and declared fakes, violate that barrier of body autonomy. They directly ignore the person’s wishes regarding their naked body.

    The better the fake, the worse the violation, because (as others said) once a fake is good enough, the subject is put in the position of having to deny it’s them. They should never have to be in that position, no matter who they are.

    Even a porn performer should have the ability to be free of fakes, because they didn’t consent to them; they also have a very valid claim that fakes infringe on their income. Now, I’m certain that legal fakes will someday be a thing: there will be contracts for likeness rights to produce fake porn. Bet on it. If I had spare income, I would immediately invest in such an endeavor, because I guarantee it will make money.

    But, as things stand, fakes are no better than someone taking a picture through a window shade, or using infrared to see through clothing. It’s digital, and it’s fake, but it is the direct equivalent of violating someone’s privacy and body autonomy.

    That’s why it’s a big deal to begin with.

    And, yeah, it is something that’s here to stay; it’s unavoidable. Someone is bound to comment that they wouldn’t care. Great, good for you. That doesn’t obligate others not to care. But put it to the test: provide a few pictures of yourself in your comment so that someone can make a fake nude of you, then plaster it online with zero context, labeled with at least your username, so that everyone running across it can direct responses to you.

    It’s all about personal privacy, consent, and body autonomy.

  • lmaydev@lemmy.world · 2 months ago

    It was always a big deal. But back then it was often pretty obvious when it was a fake. It’s getting harder and harder to tell.

  • DrownedRats@lemmy.world · 2 months ago

    Because now, anyone can do it to anyone with zero effort and a single photo.

    Sure, before, anyone with decent Photoshop skills could put together a halfway convincing fake nude, but it still took significantly more effort and time than most would bother with, and even then it was fairly easy to spot and dispute a fake.

    Most people weren’t concerned when a celebrity’s fake nudes spread around before, but now that a colleague, student, teacher, family member, or even a random member of the public could generate a convincing photo, the threat has become far more real and far more conceivable.

  • EveryMuffinIsNowEncrypted@lemmy.blahaj.zone · 2 months ago

    Honestly? It was kind of shitty back then and is just as shitty nowadays.

    I mean, I get why people do it. But in my honest opinion, it’s still a blatant violation of that person’s dignity, at least if it’s distributed.

    • Zorque@lemmy.world · 2 months ago

      It’s not that it’s bad only now… it’s that now it’s actually being addressed, whereas before it was just something people would sweep under the rug as distasteful but not worthy of attention.

  • Cyteseer@lemmy.world · 2 months ago

    It’s always been a big deal; it just died down as Photoshop became normalized and people grew accustomed to it as a tool.

  • Azzu@lemm.ee · 2 months ago

    It’s only a big deal in the first place because of puritanical societies like the US or the UK. There are societies where nudity is not a big deal anyway, so nude photos of someone are also no big deal.

    Look at Germany for example. Lots of FKK (nude) areas. No one really bats an eye. Of course there nudity is also not perfectly normalized, especially in some circles. But still, not many are concerned about nude pictures of themselves.

    Obviously AI makes more nude pictures faster than someone skilled at Photoshop. So if your society has a problem with nudity, there will be more problems than before.

    But really, there shouldn’t be a problem anyway.

    • Don_alForno@feddit.org · 2 months ago

      Look at Germany for example. Lots of FKK (nude) areas. No one really bats an eye.

      We still don’t appreciate having our nudes posted online, fake or not.

        • Azzu@lemm.ee · 2 months ago

        Of course there nudity is also not perfectly normalized, especially in some circles.

        Also obviously because of privacy reasons people don’t like their pictures posted online, nude or not.

    • Etterra@lemmy.world · 2 months ago

      They’re not even actual nudes; they’re fake. It seems to me no different than putting someone’s head on a Big Bird photo.

      That said, nobody gets to decide what’s offensive to other people. Just do as they ask and don’t do the fake nudes. It’s not like there’s a shortage of porn done by willing adults.

        • Azzu@lemm.ee · 2 months ago

        I’m not saying people should do it. I’m just talking about a fundamental principle to keep yourself happy: to not be hurt by what other people are doing. In the end, you can’t control what other people will do. But you can control your reaction to what they do.

        Of course, everyone is also allowed to disagree with this advice. I’m just sharing what works for me. If someone wants to feel bad/offended about someone making a fake nude of them, I don’t want to stop anyone doing that (and I can’t).

  • HubertManne@moist.catsweat.com · 2 months ago

    I sorta feel this way. Before this, people would make cutout mashups, or artistic types might depict something. I do get that it’s getting so real that folks may think the people actually did the thing.

  • Ziggurat@sh.itjust.works · 2 months ago

    I have a similar opinion. People have been forging/editing photographs and movies for as long as the techniques have existed.

    Now any stupid kid can do it; with AI, the hard part is actually not getting porn. Maybe it can teach everyone that fake photos are a thing and make nudes worthless (what’s the point of a nude anyway? Genitals look like… genitals).

            • jet@hackertalks.com · 2 months ago

              People who are thinking of dating, or who are already dating, have been known to send nudes to each other.

              This serves several purposes: it indicates interest, it’s meant to entice interest, it’s a sign that things are going well, and it sets the mood and the tone. It is a form of sexual communication which is necessary, and not unheard of, when people are dating.

              • Azzu@lemm.ee · 2 months ago

                Oh, I thought you were talking about where fake nudes matter. I didn’t think we were talking about “legitimate” uses of nudes, because this whole thread is about fakes :D

                • jet@hackertalks.com · 2 months ago

                  It’s not necessary.

                  This whole thread was in response to “What’s the point of a nude anyway?”

  • Melvin_Ferd@lemmy.world · 2 months ago

    I got a few comments pointing this out. But the media is hell-bent on convincing people to hate AI tools and advancements. Why? I don’t know.

    The tin-foil-hat take is that it can be an equalizer. Powerful people who own media like to keep powerful tools to themselves and want the regular folk to fear them and regulate ourselves out of using them.

    Like, could you imagine if common folk rode dragons in GoT? Absolutely disgusting. People need to fear them, and only certain people may use them.

    Same idea. If you’re skeptical, go look up all the headlines about AI from the past year and compare them to right-wing media’s headlines about immigration. They’re practically identical.

    “Think of the women and children.”

    “They’re TAKING OUR JOBS”

    “Lot of turds showing up on beaches lately”

    “What if they kill us”

    “THEY’RE STEALING OUR RESOURCES”

    • Randomgal@lemmy.ca · 2 months ago

      You’re looking for a fifth leg on the cat: there is no conspiracy. It’s just new technology, and what’s new is scary, especially big leaps, which this new age of machine learning seems to be part of.

  • shastaxc@lemm.ee · 2 months ago

    If AI fakes are so convincing, why would nudes be controversial at all anymore? You can just assume they’re always fake. If everything is fake, why would anyone care?