• Marxism-Fennekinism@lemmy.ml
    10 months ago

    The last quote danced around it, but if the implication is that they were seeking out and collecting CSAM, which is a sex crime to access, possess, and distribute, why the fuck are the boards of both companies not in prison and on the sex offender list?!

    I mean, I know why, but

    • smooth_tea@lemmy.world
      10 months ago

      I really find this a bit alarmist and exaggerated. Consider the motive and the alternative. Do you really think companies like that have any option other than dealing with this material?

    • SacrificedBeans@lemmy.world
      10 months ago

      I’m sure there’s some loophole there, maybe between different countries’ laws. And if there isn’t, hey! We’ll make one!

    • Clbull@lemmy.world
      10 months ago

      Isn’t CSAM classed as images and videos which depict child sexual abuse? Last time I checked, written descriptions alone did not count, unless they were being forced to look at AI-generated images produced from prompts describing such acts?

      • Strawberry@lemmy.blahaj.zone
        10 months ago

        That month, Sama began pilot work for a separate project for OpenAI: collecting sexual and violent images—some of them illegal under U.S. law—to deliver to OpenAI. The work of labeling images appears to be unrelated to ChatGPT.

        This is the quote in question. They’re talking about images.