My server currently has a 9th-gen Intel i7 CPU with integrated Intel graphics.

I don’t use or need AI or LLM stuff, but we use Jellyfin extensively in the family.

So far Jellyfin has always worked perfectly fine, but I could add (for free) an NVIDIA 2060 or a 1060. Would it be worth it?

And as for power consumption, will the increase be noticeable? Should I do it or pass?

  • InverseParallax@lemmy.world · 5 days ago

    Intel has excellent transcoding, even in their iGPUs.

    I use an Arc A750 specifically for transcoding; AV1 runs at ludicrous speeds. But don’t go with an NVIDIA card: they kind of suck for this because they don’t support VAAPI, only NVENC/NVDEC and VDPAU.
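    A minimal sketch of what a full-hardware VAAPI transcode looks like with ffmpeg (the render node path, codec, and quality setting are assumptions; adjust for your card and content):

    ```shell
    # Decode and encode entirely on the GPU via VAAPI.
    # /dev/dri/renderD128 is the usual render node; yours may differ.
    ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
           -hwaccel_output_format vaapi \
           -i input.mkv \
           -c:v h264_vaapi -qp 23 \
           -c:a copy \
           output.mkv
    ```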

        • Shimitar@feddit.it (OP) · 5 days ago

          Is Jellyfin getting enshittified? Why do you say that? It doesn’t seem like it’s following Plex.

          • interdimensionalmeme@lemmy.ml · 5 days ago

            As long as it’s not released under a copyleft license, it remains a possibility. That’s not to say they’ve actually done anything untoward, but this is a warning for people who have been bitten by this dynamic one too many times.

            • Shimitar@feddit.it (OP) · 5 days ago

              If and when, I will have plenty of time to migrate. I hate plexification and will indeed switch to Emby IF and WHEN.

              So far, it seems Jellyfin is under a copyleft license, so unless they change that, I think I’m fine.

              And sincerely, Jellyfin totally rocks.

              • interdimensionalmeme@lemmy.ml · 4 days ago

                I have fucked up!!! It is Emby that is closed source, fuckkkkkk!!!

                “Emby has been relicensed and is now closed-source, while open source components will be moved to plugins. Due to this, a free open source fork of Emby was created called Jellyfin.”

  • monkeyman512@lemmy.world · 5 days ago

    If the iGPU is getting the job done, I would leave that alone. You could add a GPU and pass it through to a gaming VM. But that is an entirely different project.

    • Shimitar@feddit.it (OP) · 5 days ago

      Could be an interesting project though, will definitely think about that. Not top priority, but why not, since the hardware is free?

  • Eskuero@lemmy.fromshado.ws · 6 days ago

    For an old NVIDIA card it might be too much of an energy drain.

    I was also using the integrated Intel for video re-encodes, and I got an Arc A310 for 80 bucks, which is about the cheapest you’ll get a new card with AV1 support.

      • Eskuero@lemmy.fromshado.ws · 5 days ago

        Is x266 actually taking off? With all the AOMedia members that control graphics hardware (AMD, Intel, NVIDIA) banded together, it feels like MPEG will need to gain a big partner to stay relevant.

    • Shimitar@feddit.it (OP) · 6 days ago

      Thanks!

      Neither the 2060 nor the 1060 supports AV1 anyway, so I guess it’s pointless for me.

  • sugar_in_your_tea@sh.itjust.works · 6 days ago

    I only have a GPU because my CPU doesn’t have integrated graphics. I don’t use the graphics anyway, but I need it to boot. So I put in our crappiest spare GPU (a GTX 750 Ti) and call it good.

    I wouldn’t bother. If you end up needing it, it’ll take like 15 minutes to get it installed with drivers set up and everything. No need to bother until you actually need it.

  • variants@possumpat.io · 6 days ago

    Host steam-headless and use the GPU for that, so you can have remote gaming on your phone anywhere you have 5G.

    • model_tar_gz@lemmy.world · edited · 6 days ago

      Holy shit that’s awesome.

      My work gives me unlimited uptime on a dedicated A10 (AI engineer working for NVIDIA). I doubt they’d care if I set something like this up, or that they’d even know.

      But I have a 4090 at home anyway, so like, do I even need it, or is this just another way to explore the infinite hobby of tinkering with computers?

  • kevincox@lemmy.ml · 6 days ago

    Most Intel GPUs are great at transcoding: reliable, widely supported, and quite a bit of transcoding power for very little electrical power.

    I think the main thing I would check is what formats are supported. If the other GPU can support newer formats like AV1 it may be worth it (if you want to store your videos in these more efficient formats or you have clients who can consume these formats and will appreciate the reduced bandwidth).
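    One quick way to check what a GPU actually supports under Linux is to query the VAAPI driver (this assumes the `libva-utils` package; the render node path varies by system, and output varies by driver):

    ```shell
    # List VAAPI profiles and entrypoints; "VAEntrypointEncSlice"
    # lines indicate hardware encode support for that codec profile.
    vainfo --display drm --device /dev/dri/renderD128

    # Or ask ffmpeg which hardware encoders it was built with:
    ffmpeg -hide_banner -encoders | grep -E 'vaapi|qsv|nvenc'
    ```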

    But overall I would say if you aren’t having any problems no need to bother. The onboard graphics are simple and efficient.

  • precarious_primes@lemmy.ml · 6 days ago

    I ran a 1650 Super for a while. At idle it added about 10 W, and it would draw 30-40 W while transcoding. I ended up taking it out because the increased power draw wasn’t worth the slight performance gain for me.
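    If you want to measure this yourself on an NVIDIA card, the stock tooling can sample it (proprietary driver assumed):

    ```shell
    # Print board power draw and GPU utilization once per second.
    nvidia-smi --query-gpu=power.draw,utilization.gpu --format=csv -l 1
    ```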

  • exu@feditown.com · 6 days ago

    QuickSync is usually plenty for transcoding. You will get more performance with a dedicated GPU, but the power consumption will increase massively.

    NVIDIA also limits how many streams can be transcoded at the same time on consumer cards. There are driver patches to circumvent that.
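    You can see how many encoder sessions are currently active with the stock NVIDIA tooling (proprietary driver assumed; the session cap applies to consumer GeForce cards, and community driver patches exist to lift it, at your own risk):

    ```shell
    # List currently active hardware encoder sessions on the card.
    nvidia-smi encodersessions
    ```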