For instance, say I search for “The Dark Knight” on my Usenet indexer. It returns to me a list of uploads and where to get them via my Usenet provider. I can then download them, stitch them together, and verify that it is, indeed, The Dark Knight. All of this costs only a few dollars a month for me.
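
That search step is just an HTTP call. A rough sketch, assuming a Newznab-style indexer (the hostname and API key are placeholders, and the JSON shape mirrors the usual Newznab RSS feed, so adjust the parsing for your particular indexer):

```python
# Sketch of the indexer search step. Hostname and API key are made up;
# the response shape follows the common Newznab JSON mirror of its RSS
# feed, so tweak the parsing for your indexer.
import requests

INDEXER = "https://indexer.example.com/api"
API_KEY = "your-api-key"

resp = requests.get(INDEXER, params={
    "t": "search",            # Newznab free-text search
    "q": "The Dark Knight",
    "apikey": API_KEY,
    "o": "json",              # JSON instead of RSS XML
})
resp.raise_for_status()

# Each item carries a title and a link to its .nzb file.
for item in resp.json()["channel"]["item"]:
    print(item["title"], item["link"])
```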

My question is, why can’t copyright holders do this as well? They could follow the same process and then send takedown requests for each individual article that makes up the movie. We already know they try to catch people torrenting, so why don’t they target Usenet too?

I can think of a few reasons, but they all seem pretty shaky.

  1. The content is hosted in countries where they don’t have to comply with takedown requests.

It seems unlikely to me that literally all of it is hosted in places like this. Plus, the providers wouldn’t be able to operate at all in countries like the US without facing legal repercussions.

  2. The copyright holders feel the upfront cost of indexer and provider access is greater than the cost of people pirating their content.

This also seems fishy. It’s cheap enough for me as an individual to do this, and if Usenet weren’t an option, I’d have to pay for 3+ streaming services to watch everything I currently do. They’d break even on this scheme even if it only cut off access for me.

  3. They do actually do this, but it’s on a scale small enough for me not to care.

The whole point of doing this would be to make Usenet a non-viable option for piracy. If I don’t care about it because it happens so rarely, then what’s the point of doing it at all?

  • _number8_@lemmy.world · 10 months ago

    why is the DMCA the one fucking law that actually gets enforced at a high rate when there are literally billions of things more important that we could spend money on

  • DosDude👾@retrolemmy.com · 10 months ago

    As far as I know, they do get DMCA’d, but only a single file gets deleted, so the upload is incomplete. If you have 2 different newsgroup providers, though, they usually didn’t delete the same file, so you can still download it.

    But I could be totally wrong because I haven’t really looked into this, and this is all from a very old memory.

  • Darkassassin07@lemmy.ca · 10 months ago

    They do receive takedown notices; however, files uploaded to usenet are mirrored across many providers in many jurisdictions while also being split into many parts, as you noted. Usenet’s implementation of file sharing is quite robust, being able to rebuild a file that’s missing a significant portion of its data. To successfully take down a file, you need to remove many of these parts across almost all of the usenet backbones, which requires cooperation across many nations/jurisdictions governed by varying laws. It’s not an easy task.
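
    To make the rebuilding point concrete: real uploads ship PAR2 recovery volumes built on Reed-Solomon codes, which can restore many missing parts. A toy XOR parity block shows the principle, though XOR can only recover one lost part:

    ```python
    # Toy illustration of parity-based repair. Real usenet uploads use
    # PAR2 (Reed-Solomon); XOR parity can rebuild exactly one lost block,
    # but the idea is the same: redundancy makes takedowns lossy.

    def xor_blocks(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    # Split a "file" into equal-sized parts and compute one parity block.
    parts = [b"The Dark", b" Knight ", b"(2008)  "]
    parity = parts[0]
    for part in parts[1:]:
        parity = xor_blocks(parity, part)

    # A takedown removes one part...
    lost = 1
    survivors = [p for i, p in enumerate(parts) if i != lost]

    # ...but XOR-ing the survivors with the parity block rebuilds it.
    rebuilt = parity
    for part in survivors:
        rebuilt = xor_blocks(rebuilt, part)

    assert rebuilt == parts[lost]
    print("recovered:", rebuilt)
    ```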

    Here’s a somewhat limited map of usenet providers: [image]

  • Rikudou_Sage@lemmings.world · 10 months ago

    Off topic, but is there any tutorial on how to do this Usenet thing? Feel free to contact me on Matrix, it’s on my profile.

    • hoanbridgetroll@midwest.social · 10 months ago

      No expert, but here’s my quick and dirty version:

      1. Find an unlimited Usenet provider that works with your budget and location. There’s plenty of debate out there on which are best, and on whether you need a second pay-per-GB provider for filling in missing parts.

      2. Spin up SABnzbd+ or a similar Usenet client on a local PC/NAS/etc.

      3. The hard part: find a quality private Usenet indexer site that you can get an invite to, or one that has open registration.

      4. Download the nzb file for the Linux ISO that you want from your indexer and open it with your Usenet client. (There are ways to feed the nzb file directly to your client; see the sketch after these steps.)

      5. Client looks for all of the parts listed in the nzb file on your Usenet provider, then downloads and unpacks them.

      6. Et Voila - Linux ISO appears in your downloads directory.

      A VPN is probably unnecessary, as most Usenet providers don’t log who downloads which files. Also, you can often hit your ISP’s max download rate from your Usenet provider, and there is no “seeding” to worry about.

      Good luck!
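
      And the sketch promised in step 4: SABnzbd exposes an HTTP API, so you can hand it an nzb URL directly. A minimal sketch, with placeholder host, API key, and nzb URL (your real key is in SABnzbd under Config > General):

      ```python
      # Sketch: queueing an NZB in SABnzbd via its HTTP API.
      # Host, API key, and the nzb URL below are placeholders.
      import requests

      SAB = "http://localhost:8080/sabnzbd/api"
      API_KEY = "your-sabnzbd-api-key"

      resp = requests.get(SAB, params={
          "mode": "addurl",     # queue an NZB by URL
          "name": "https://indexer.example.com/get/12345.nzb",
          "apikey": API_KEY,
          "output": "json",
      })
      resp.raise_for_status()
      print(resp.json())        # e.g. {"status": true, "nzo_ids": [...]}
      ```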

    • Agathon 🏴‍☠️@lemmy.dbzer0.com · 10 months ago

      Find a Usenet provider. A quick web search and some reading should get you to the right place. I’m not sure if any good free servers are available anymore, but there’s probably one that’s cheap enough.

      Looks like https://sabnzbd.org/ is a free and open source Windows/MacOS/Linux client that can download files. I haven’t tried it, but it’s highly rated on alternativeto.net

    • LazerDickMcCheese@sh.itjust.works · 10 months ago

      I’d like to know too, but people are so cryptic about it every time this shit is brought up that I’m overwhelmed before I even begin. So I just stick to the tried and true methods I know

    • Darkassassin07@lemmy.ca · 10 months ago

      You’ll need 3 things:

      A usenet client such as SABnzbd. This is equivalent to a torrent client like qbittorrent.

      An NZB indexer such as NZBGeek, again equivalent to torrent indexers, but for nzb files.

      And finally a usenet provider such as FrugalUsenet. This is where you’re actually downloading articles from. (there are other providers listed in the photo in my other comment here)

      Articles are individual posts on usenet servers. NZB files contain lists of articles that together result in the desired files. There are also additional articles included so that if some are lost (taken down due to DMCA/NTD) they can be rebuilt from the remaining data. Your nzb client handles the process of reading nzb files, trying to download the articles from each of your configured usenet providers, then decompressing, rebuilding lost data, and finally stitching it all together into the files you wanted.
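
      If you’re curious what an NZB actually holds, it’s just XML listing the message-ID of every article. A quick sketch of reading one (the filename is a placeholder):

      ```python
      # Sketch: listing the articles an NZB points at. An NZB is plain
      # XML; each <segment> is one usenet article, fetched by message-ID.
      import xml.etree.ElementTree as ET

      NS = {"nzb": "http://www.newzbin.com/DTD/2003/nzb"}
      root = ET.parse("linux-iso.nzb").getroot()   # placeholder filename

      for f in root.findall("nzb:file", NS):
          print(f.get("subject"))
          for seg in f.findall("nzb:segments/nzb:segment", NS):
              print("  article:", seg.text, f"({seg.get('bytes')} bytes)")
      ```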

  • Nollij@sopuli.xyz · 10 months ago

    First, a massive amount of content is removed. You won’t find a lot of popular, unencrypted content these days on usenet. It’s all encrypted and obfuscated now to avoid the bots.

    Speaking of bots, I don’t think you realize how much of this process is automated, or how wide of a net is being used. The media corporations all have enormous collected libraries of material. It gets posted constantly to all sorts of places. This includes public torrents, public usenet, YouTube, PornHub (yes, really, even for non-porn), Facebook, TikTok, Tumblr, GNUtella, DDL sites…

    The list goes on and on. Each one gets scanned for millions of potentially infringing items, often daily. No actual people are doing those steps.

    Now, throw in things like private torrents, encrypted usenet posts, invite-only DDL, listings that use ‘3’ instead of ‘e’ or other character substitutions… These require actual humans to process. Humans that cost money, and a considerable amount of it. As a business, you have to show a return on investment. Fighting piracy, even at its theoretical best, doesn’t increase revenues by a lot.
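
    As an aside, a tiny sketch of the normalization such a scanner would need before it can match those listings (the titles are made up; real matchers are far more involved):

    ```python
    # Sketch: why "Th3 D4rk Kn1ght" slips past naive string matching,
    # and the leet-normalization + fuzzy scoring a smarter bot needs.
    import difflib

    LEET = str.maketrans("013457@$", "oieastas")   # 0->o, 1->i, 3->e, ...

    def normalize(title: str) -> str:
        return title.lower().translate(LEET)

    target = normalize("The Dark Knight")
    listings = ["Th3 D4rk Kn1ght 2008 1080p", "Totally Legal Linux ISO"]

    for name in listings:
        score = difflib.SequenceMatcher(None, target, normalize(name)).ratio()
        print(f"{name!r}: {score:.2f}")   # the leet rename scores high
    ```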

    You mention revenue and breaking even, but you left out an important detail: your time is free. They don’t just have to pay $10/month; they have to pay $10/month + $20/hour for someone to deal with it. And most pirates of that level will just find another method.

    • Darkassassin07@lemmy.ca · 10 months ago

      > You won’t find a lot of popular, unencrypted content these days on usenet. It’s all encrypted and obfuscated now to avoid the bots

      That’s not been my experience at all. Pretty much everything I’ve looked for has been available and I rarely come across encrypted files. I do regularly have to try 2 or 3 nzbs before I find a complete one, but I almost always find one.

      • Nollij@sopuli.xyz · 10 months ago

        Are they obfuscated in any way? Depending on your client, you may not be able to see the names and subjects. But if you didn’t have the NZB, is there any real chance you could find it otherwise?

        • Darkassassin07@lemmy.ca · 10 months ago

          > But if you didn’t have the NZB, is there any real chance you could find it otherwise?

          No, but that’s just the nature of NZB file sharing. The individual articles aren’t typically tagged/named with the actual file names; that info is pulled from the NZB and from the decompressed, stitched-together articles.

          I’m not using any special indexers, just open public registration ones. The NZBs aren’t hard to find, for me or for IP claimants.
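
          To illustrate how opaque the wire format is: a minimal NNTP fetch of a single article by message-ID, where nothing in the exchange names the real file. This uses the stdlib nntplib (removed in Python 3.13, so use an older interpreter or a third-party NNTP package); the host, credentials, and message-ID are placeholders:

          ```python
          # Sketch: fetching one usenet article by message-ID over NNTP.
          # Only the NZB ties these opaque IDs back to real file names.
          import nntplib

          with nntplib.NNTP_SSL("news.example.com", 563,
                                user="username", password="password") as srv:
              # Message-IDs come from the NZB's <segment> entries.
              resp, info = srv.article("<abc123xyz@obfuscated.example>")
              print(resp)                  # server status line
              print(len(info.lines), "lines of yEnc-encoded article data")
          ```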