Not a good look for Mastodon - what can be done to automate the removal of CSAM?

    • duncesplayed@lemmy.one · 1 year ago

      If I can try to summarize the main findings:

      1. Computer-generated (e.g., Stable Diffusion) child porn is not criminalized in Japan, so many Japanese Mastodon servers don’t remove it
      2. Porn involving real children is removed, but not immediately, since removal depends on instance admins catching it, and they have other things to do. Also, when an account is banned, the Mastodon server software does not send out a “delete” for all of that account’s posted material (which would signal other instances to delete it)

      Problem #2 can hopefully be improved with better tooling. I don’t know what you do about problem #1, though.
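      For what it’s worth, here’s a rough sketch of the missing step in problem #2: when an account is banned, the instance could federate an ActivityPub “Delete” for each of the account’s posts so other instances know to drop their cached copies. The shape below is illustrative only (the URLs and IDs are made up), not Mastodon’s actual implementation.

      ```python
      # Illustrative only: the shape of an ActivityPub "Delete" a server could
      # federate for every status of a banned account (URLs/IDs are made up).
      import json

      def build_delete_activity(actor_url: str, status_url: str) -> dict:
          """Return a Delete activity telling remote servers to drop a cached post."""
          return {
              "@context": "https://www.w3.org/ns/activitystreams",
              "id": f"{status_url}#delete",
              "type": "Delete",
              "actor": actor_url,
              "to": ["https://www.w3.org/ns/activitystreams#Public"],
              "object": {"id": status_url, "type": "Tombstone"},
          }

      activity = build_delete_activity(
          "https://example.social/users/banned_user",
          "https://example.social/users/banned_user/statuses/12345",
      )
      print(json.dumps(activity, indent=2))
      # A real server would sign this (HTTP Signatures) and POST it to each
      # follower's inbox so federated copies actually get removed.
      ```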

  • priapus@sh.itjust.works · 1 year ago

    Seems odd that they mention Mastodon as a Twitter alternative in this article, but make no mention of the fact that Twitter is also rife with these problems, all the more so as it loses employees and therefore moderation capacity. These problems have been around on Twitter for far longer, and not nearly enough has been done.

    • Dave@lemmy.nz · 1 year ago (edited)

      The actual report is probably better to read.

      It points out that you upload to one server, and that server then sends the image to thousands of others. How do those thousands of others scan for this? In theory, using the PhotoDNA tool that large companies use, but then you have to send every image to PhotoDNA thousands of times, once for each server (because how do you trust another server telling you it’s fine?).
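      To make the duplication concrete, here’s a stand-in sketch of the check each receiving server would have to run. PhotoDNA itself is proprietary and only available under agreement, so this just uses a plain SHA-256 lookup against a hypothetical hash list to show where the check would sit; a real deployment would use a perceptual hash or call out to a scanning service.

      ```python
      # Illustrative stand-in only: PhotoDNA is proprietary and gated behind
      # agreements, so a plain SHA-256 lookup against a hypothetical hash list
      # is used here just to show where the per-instance check would happen.
      import hashlib

      KNOWN_BAD_HASHES: set[str] = set()  # would be populated from a vetted hash list

      def incoming_media_is_flagged(image_bytes: bytes) -> bool:
          """Check a federated image against the local hash list before storing it."""
          digest = hashlib.sha256(image_bytes).hexdigest()
          return digest in KNOWN_BAD_HASHES

      # Every one of the thousands of receiving instances would have to run a
      # check like this (or call out to a scanning service) for every piece of
      # media it ingests, which is the duplication problem the report describes.
      ```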

      The report provides recommendations on how servers can use signatures and public keys to trust scan results from PhotoDNA, so images can be federated with a level of trust. It also suggests large players entering the market (Meta, WordPress, etc.) should collaborate on building tools that all servers can use.
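      The signature idea boils down to the scanning server attaching a signed attestation to the media, so downstream servers can verify the scan happened instead of redoing it. A minimal sketch, assuming Ed25519 keys and a made-up attestation format (not anything specified in the report or implemented in Mastodon):

      ```python
      # Minimal sketch of a signed scan attestation; the format and field names
      # are invented for illustration, not taken from the report or Mastodon.
      import json
      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

      signing_key = Ed25519PrivateKey.generate()  # held by the scanning server
      verify_key = signing_key.public_key()       # published for other servers

      def sign_scan_result(media_hash: str, verdict: str) -> tuple[bytes, bytes]:
          """Scanning server signs (hash, verdict) so others can trust it."""
          payload = json.dumps({"media": media_hash, "verdict": verdict}).encode()
          return payload, signing_key.sign(payload)

      def verify_scan_result(payload: bytes, signature: bytes) -> bool:
          """Receiving server checks the attestation instead of re-scanning."""
          try:
              verify_key.verify(signature, payload)
              return True
          except InvalidSignature:
              return False

      payload, sig = sign_scan_result("sha256:abc123", "clean")
      assert verify_scan_result(payload, sig)
      ```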

      Basically the original report points out the ease of finding CSAM on Mastodon and addresses the challenges unique to federation, proposing solutions along the way. It doesn’t claim centralised servers have it solved, it just addresses the additional challenges federation has.

  • whatsarefoogee@lemmy.world · 1 year ago

    Mastodon is a piece of software. I don’t see anyone saying “phpBB” or “WordPress” has a massive child abuse material problem.

    Has anyone in history ever said “Not a good look for phpBB”? No. Why? Because it would make no sense whatsoever.

    I’m kind of at a loss for words because of how obvious it should be. It’s like saying “paper is being used for illegal material. Not a good look for paper.”

    What is the solution to someone hosting illegal material on an nginx server? You report it to the authorities. You want to automate it? Go ahead and crawl the web for illegal material and generate automated reports. Though you’ll probably be the first to end up in prison.

    • redcalcium@lemmy.institute · 1 year ago

      I get what you’re saying, but due to the federated nature of the network, that CSAM can easily spread to many instances without their admins noticing. Hosting even one piece of CSAM on your server is a huge risk for the server owner.

      • MinusPi (she/they)@pawb.social · 1 year ago

        I don’t see what a server admin can do about it other than defederate the instant they get reports. Otherwise how can they possibly know?

        • krimsonbun@lemmy.ml · 1 year ago

          This could be a really big issue though. People can make instances for really hateful and disgusting crap, but even if everyone defederates from them, it’s still giving them a platform, a tiny tiny corner of the internet to talk about truly horrible topics.

          • priapus@sh.itjust.works · 1 year ago

            Those corners will exist no matter what service they use and there is nothing Mastodon can do to stop this. There’s a reason there are public lists of instances to defederate. This content can only be prevented by domain providers and governments.