Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers from Stanford University. In just two days, the researchers found 112 instances of known CSAM across roughly 325,000 posts, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time due to CSAM being posted. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • flashgnash@lemm.ee

    Does this come as a surprise to anyone? This is the case on large social networks too, and decentralised networks are by definition free and open for anyone to moderate as they please. Unfortunately, that includes pedos.

  • Thomas Gray@lemmy.dbzer0.com

    Hi, since Mastodon is no longer acceptable due to the 0.04 percent of posts found to contain abusive material, would someone please suggest an alternative social network with 0 percent of these incidents? Companies like Facebook and Twitter are driven by shareholders and greed; Mastodon is a community effort, and you’ll certainly find bad actors there, but I feel less dirty contributing to a community project than helping billionaires like Zuck and Elon line their pockets by harvesting my data.

  • Jordan Lund@lemmy.one

    “massive child abuse material problem”

    “112 instances of known CSAM across 325,000 posts”

    While any instance is unacceptable, does 112/325,000 constitute a “massive problem”?

    112 / 325,000 ≈ 0.034% of posts are unacceptable! Massive problem!

  • teawrecks@sopuli.xyz

    I for one am all for instances being forcibly taken down by police if they can’t moderate CSAM appropriately.

    Moderation is a very real challenge. The internet at large tried to solve it by centralizing everything into a few megacorps with AI moderation. The fediverse aims to solve it by keeping instances small and holding both mods and users accountable.

  • stravanasu@lemmy.ca

    I’m not fully sure about the logic here, or the conclusions it seems to hint at. The internet itself is a network with major CSAM problems (so maybe we shouldn’t use it?).

  • 🦊 OneRedFox 🦊@beehaw.org

    Yeah, I recall that the Japanese instances have a big problem with that shit. As for the rest of us, Facebook actually open-sourced some efficient image-hashing algorithms (PDQ for photos, TMK+PDQF for video) for dealing with CSAM; Fediverse platforms could implement these, which would just leave the issue of getting an image-hash database to check against. The big platforms could probably chip in to get access to one of those private databases and then release a public service for the ecosystem.
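The general idea behind this kind of hash matching can be sketched as follows. Note this is a toy 64-bit average hash, not Meta’s PDQ (which is a more robust DCT-based 256-bit hash); the function names and the 8×8 grayscale input are made up for the illustration:

```python
# Toy sketch of perceptual-hash matching against a known-image database.
# NOT Meta's PDQ; an average hash shown only to illustrate the approach:
# near-duplicate images produce hashes within a small Hamming distance.

def average_hash(pixels: list[list[int]]) -> int:
    """Hash an 8x8 grayscale image: one bit per pixel, set if above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    h = 0
    for p in flat:
        h = (h << 1) | (1 if p > mean else 0)
    return h

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_hash(h: int, known_hashes: set[int],
                       max_distance: int = 4) -> bool:
    """Flag a hash that is within max_distance bits of any known-bad hash."""
    return any(hamming_distance(h, k) <= max_distance for k in known_hashes)
```

A slightly re-encoded copy of an image flips only a few bits of its hash, so it still matches; an unrelated image lands far away in Hamming distance.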

      • Paradoxvoid@aussie.zone

        As much as we can (and should) lambast Facebook/Meta’s C-Suite for terrible decisions, their engineers are generally pretty legit.

    • zephyrvs@lemmy.ml

      That’d be useless though, because first, it’d probably be opt-in via configuration settings, and even if it weren’t, people would just fork and modify the code base or simply switch to another ActivityPub implementation.

      We’re not gonna fix society using tech unless we’re all hooked up to some all knowing AI under government control.

      • crystal@feddit.de

        That’s not the point. Yes, child porn sites can host child porn, and other sites/instances can’t stop that. But what other instances can stop is redistributing said child porn, and for that purpose such technology would be useful.
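A minimal sketch of what such an instance-side gate could look like, assuming a vetted hash list is available. All names here are hypothetical, and the cryptographic digest is only a stand-in for a perceptual hash like PDQ (an exact-match digest catches byte-identical copies but not re-encodes):

```python
# Hypothetical upload/federation gate: media matching a known-bad hash
# database is rejected before it is ever stored or re-served. The sha256
# digest is a stand-in; a real deployment would use a perceptual hash.

import hashlib

KNOWN_BAD_HASHES: set[str] = set()  # would be populated from a vetted hash list

def compute_hash(media_bytes: bytes) -> str:
    """Stand-in matcher: exact-copy detection via a cryptographic digest."""
    return hashlib.sha256(media_bytes).hexdigest()

def accept_media(media_bytes: bytes) -> bool:
    """Return False (reject the upload) if the media matches the database."""
    return compute_hash(media_bytes) not in KNOWN_BAD_HASHES
```

The point is exactly the one made above: the hosting site may be beyond an instance’s control, but an instance can refuse to store and redistribute flagged media that reaches it.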

  • while1malloc0@beehaw.org

    While the study itself is a good read and I agree with its conclusions (Mastodon and decentralized social media in general need better moderation tools), it’s hard not to read the Verge headline as misleading. One of the study’s authors gives more context here: https://hachyderm.io/@det/110769470058276368. Basically, most of the hits came from a large Japanese instance that no one federates with; the author even notes that the blunt instrument most Mastodon admins use is to blanket-defederate from instances hosted in Japan, due to Japan’s more lax (than the US) laws around CSAM. But the headline seems to imply that there’s a giant seedy underbelly to places like mastodon.social[1] that is rife with abuse material. I suppose that’s a marketing problem of federated software in general.

    1. There is a seedy underbelly of mainstream Mastodon instances, but it’s mostly people telling you how you’re supposed to use Mastodon if you previously used Twitter.
    • jherazob@beehaw.org

      The person outright rejects defederation as a solution when it IS the solution: if an instance is in favor of this kind of thing, you don’t want to federate with them, period.

      I also find the number of calls for a “Fediverse police” in that thread worrying. Scanning every image that gets uploaded to your instance with a third-party tool is an issue too: on one hand you definitely don’t want this kind of shit to even touch your servers, but on the other you don’t want anybody dictating that, say, anti-union or similar memes get flagged and the people who made them marked, targeted, and treated to a nice Pinkerton visit.

      This is a complicated problem.

      Edit: I see somebody suggested checking the observations against the common, well-used Mastodon blocklists, to see if the material is contained on defederated instances, and the author said this was something they wanted to check, so I hope there’s a follow-up.
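That follow-up check could be as simple as partitioning the flagged posts’ home domains by whether they already appear on a shared denylist; a hypothetical sketch with made-up names and data:

```python
# Hypothetical sketch of the suggested follow-up: of the posts the study
# flagged, how many originate from instances that common Mastodon
# blocklists already defederate from?

def partition_by_blocklist(flagged_domains: list[str],
                           blocklist: set[str]) -> tuple[int, int]:
    """Return (already_blocked, not_blocked) counts for the flagged posts."""
    blocked = sum(1 for d in flagged_domains if d in blocklist)
    return blocked, len(flagged_domains) - blocked
```

If most hits land in the already-blocked bucket, that supports the point above: defederation is containing the material for the bulk of the network.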