It’s kind of silly, but I still really dig the idea behind torrenting and peer-to-peer sharing of data. It’s cool to think about any old computer helping pass along some odd bits and bytes, whether it’s a goofy drawing or a strange story.

  • zerakith@lemmy.ml · 4 months ago

    It’s a really interesting question. I wonder what economics and ideologies are at play in its decline. Economies of scale for large server farms? Desire for control of the content and copyright? The structure and shape of the network?

    I guess it has some implications for streaming versus download approaches to content?

    • ShittyBeatlesFCPres@lemmy.world · 4 months ago

      If I recall, Spotify moved away from it just because the client/server model got way cheaper and the P2P model had some limitations for their future business plans. I remember them mentioning that offering a family plan was a challenge with their P2P architecture when people on the same network/account were using it at the same time.

      It was probably also part of the move to smartphones. Spotify was just a desktop program for a long time and, while I’m not an expert, I would guess the P2P model made a lot more sense on desktop with a good connection than early smartphones on flaky 2G/3G connections. They might have had to run a client/server model for iOS and/or Android anyway.

  • XTL@sopuli.xyz · 4 months ago

    One funny use I discovered when I was cloning a lot of computers: even on a closed LAN, BitTorrent with local peer discovery was stupidly fast at distributing a big set of files across a pile of computers, much faster than rsync. It was also much easier to set up.

  • LWD@lemm.ee · 4 months ago

    This might be stretching the definition of “common” and “torrenting,” but BitTorrent created BitTorrent Sync with similar tech for personal file synchronization. It was later rebranded Resilio and still exists today.

    https://www.resilio.com/

    An open-source alternative that works in a similar fashion, SyncThing, also exists.

    https://syncthing.net/

    • AdamEatsAss@lemmy.world · 4 months ago

      I would consider this to be one of the intended functions of torrent files. Torrents started as a faster way to share files peer to peer. If a few people had a large file on their machines, they could each upload part of it to someone who needed it, essentially multiplying their upload bandwidth. This became less popular as internet speeds increased, except for “illegal” stuff. I would definitely try one of these…if I had more than one computer.
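The bandwidth-multiplying effect described above can be sketched with a toy model. It’s an idealisation (it assumes every peer that holds a piece can upload exactly one copy of it per round; real swarms are messier), but it shows why a swarm finishes in roughly log₂(N) steps instead of N:

```python
# Toy model of swarm distribution: one seeder, N peers wanting a piece.
# Each round, every peer that already holds the piece uploads it to one
# peer that doesn't, so the number of sources roughly doubles per round.

def rounds_to_saturate(peers: int) -> int:
    """Rounds until all peers hold the piece, sources doubling each round."""
    have = 1          # just the original seeder
    rounds = 0
    while have < peers:
        have = min(peers, have * 2)   # every current holder uploads once
        rounds += 1
    return rounds

# A single server uploading to peers one at a time needs ~N rounds;
# the swarm needs only about log2(N).
print(rounds_to_saturate(64))   # 6 rounds instead of 64
```

This is the same reason seeding matters: every completed downloader adds another source.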

      • LWD@lemm.ee · 4 months ago

        A common use case for SyncThing is keeping a password file up to date between, say, your PC and your phone. It’ll even work remotely, thanks to the presence of relays.

        (The downsides include pretty heavy battery usage.)

  • CharlesReed@kbin.run · 4 months ago

    I torrent old out of print books that I can’t find anywhere else. The scans are usually pretty good. There was also a podcast I used to listen to called Caustic Soda. When they ended it, they released all of their episodes through torrenting so the fans could have them.

  • Max-P@lemmy.max-p.me · 4 months ago

    I think a good chunk of the Internet Archive is available as torrents, at least the software collections and public domain media.

    You can also download a torrent of the whole of Wikipedia, with and without images.

      • Barry Zuckerkorn@beehaw.org · 4 months ago

        As of last year, English Wikipedia, articles only, text only, was about 22GB compressed (text compresses pretty efficiently), according to the current version of this page:

        As of 2 July 2023, the size of the current version of all articles compressed is about 22.14 GB without media

        Some other sources describe the uncompressed offline copies as being around 50 GB, with another 100 GB or so for images.

        Wikimedia, which includes all the media types, has about 430 TB of media stored.
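The quoted figures can be sanity-checked with some quick arithmetic (the byte counts below are the ones quoted above; the resulting ratio is just an estimate, not an official number):

```python
# Back-of-the-envelope check on the quoted Wikipedia dump sizes.
compressed_gb = 22.14   # current articles, text only, compressed
uncompressed_gb = 50    # approximate uncompressed offline text
images_gb = 100         # approximate additional images

ratio = uncompressed_gb / compressed_gb
total_offline_gb = uncompressed_gb + images_gb

print(f"text compresses roughly {ratio:.1f}:1")            # ~2.3:1
print(f"full offline copy with images: ~{total_offline_gb} GB")
```

So a complete offline English Wikipedia with images is on the order of 150 GB — tiny next to the ~430 TB of media across all of Wikimedia.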

  • Rentlar@lemmy.ca · 4 months ago

    Transferring files to several other computers. I did this in the past, before I switched to KDE Connect, rather than using FTP or just memory sticks. It would be useful at a LAN party for distributing several copies of the software. (Kinda piracy, but it doesn’t have to be if the game is free or everyone owns it legitimately.)

  • FiskFisk33@startrek.website · 4 months ago

    I don’t think they do it anymore, but Spotify started out with a P2P network on the backend.
    A super smart way of bootstrapping such a thing without having to pay huge server costs up front.

    • NaN@lemmy.sdf.org · 4 months ago

      They took it out ten years ago. It was super smart, and there are still situations where it would be helpful, like when a new Taylor Swift album drops and takes the service offline.

  • HarriPotero@lemmy.world · 4 months ago

    PeerTube uses WebTorrent to offload hosting of huge files.

    Odysee claims to do something similar, but the last time I dug into it, the files seemed to be hosted conventionally.

    Spotify famously had its own P2P thing going in its desktop apps in the early days. It saved them a pretty penny back when hosting was expensive.

    Coming to a browser near you is IPFS.

  • QuarterSwede@lemmy.world · 4 months ago

    Any large file is going to come through much quicker over BitTorrent as long as there are enough seeders: OS distros, patches, 4K video, etc.

  • rtxn@lemmy.world · 4 months ago (edited)

    Clonezilla uses bittorrent for one of its massive deployment modes. I work at a university, and whenever we have to deploy an OS image, the ten gigabit uplink between the storage server and the classroom switches always gets saturated in unicast/interactive mode. Using bittorrent mode gets around this issue because once a computer has downloaded a chunk of the image, it can seed it for the rest of the computers within the subnet. One massive limitation is that the target computer has to have enough storage space for both the downloaded image and the deployed OS too.
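A back-of-the-envelope comparison shows why the swarm mode avoids the saturated uplink. All the numbers below are hypothetical, chosen for illustration; they are not Clonezilla settings or measurements from the setup described above:

```python
# Compare naive unicast deployment time with a BitTorrent-style swarm.
# Unicast: every client's copy funnels through the server's one uplink.
# Swarm: once chunks leave the server, clients seed each other, so each
# client is limited mainly by its own NIC.

image_gb = 40      # size of the OS image (hypothetical)
uplink_gbps = 10   # server-to-classroom uplink
client_gbps = 1    # per-client NIC speed
clients = 30

# Unicast: 30 clients * 40 GB all share one 10 Gb/s link.
unicast_s = clients * image_gb * 8 / uplink_gbps

# Idealised swarm: aggregate bandwidth grows with the number of peers,
# so each client can run its own NIC at full speed.
swarm_s = image_gb * 8 / client_gbps

print(f"unicast: ~{unicast_s:.0f}s, swarm: ~{swarm_s:.0f}s")
```

The unicast time grows linearly with the number of clients, while the idealised swarm time stays flat — which matches the saturated-uplink behaviour described above. The storage cost is the trade-off: each client temporarily holds both the image chunks and the deployed OS.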

  • Dave@lemmy.nz · 4 months ago

    If you just mean peer to peer, I feel like magnet links (often using BitTorrent) still show up for downloading large files from time to time (not just ISOs): things like open source games and software. Though if I’m being honest, I can’t think of a single one that still uses them. You used to find magnet links all over the open source scene, but I guess with GitHub offering free hosting it’s not so common anymore.