In my server I currently have a 9th-gen Intel i7 CPU with integrated Intel graphics.

I don’t use or need AI or LLM stuff, but we use Jellyfin extensively in the family.

So far Jellyfin has always worked perfectly fine, but I could add (for free) an NVIDIA 2060 or a 1060. Would it be worth it?

And as for power consumption, will the increase be noticeable? Should I do it or pass?

  • exu@feditown.com · 23 points · 2 months ago

    QuickSync is usually plenty to transcode. You will get more performance with a dedicated GPU, but the power consumption will increase massively.

    Nvidia also has a limit on how many streams can be transcoded at the same time. There are driver hacks to circumvent that.
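    For reference, a QuickSync transcode through ffmpeg looks roughly like this. This is a sketch, not a Jellyfin config: the filenames and quality value are placeholders, and it assumes an ffmpeg build with QSV support; the guard makes it a safe no-op where ffmpeg, the Intel render node, or the input file is missing.

```shell
# Hypothetical H.264 -> HEVC transcode via Intel QuickSync (QSV).
# input.mkv/output.mkv are placeholder names; -global_quality 25 is
# an example setting. Skips itself when the prerequisites are absent.
if command -v ffmpeg >/dev/null 2>&1 \
   && [ -e /dev/dri/renderD128 ] \
   && [ -f input.mkv ]; then
  ffmpeg -y -hwaccel qsv -c:v h264_qsv -i input.mkv \
    -c:v hevc_qsv -global_quality 25 -c:a copy output.mkv
else
  echo "skipped: ffmpeg, /dev/dri/renderD128, or input.mkv not available"
fi
```

    Jellyfin drives the same QSV encoders through its bundled ffmpeg when Intel QuickSync is selected as the hardware acceleration backend in the dashboard.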

  • Eskuero@lemmy.fromshado.ws · 8 points · 2 months ago

    For an old Nvidia card it might be too much of an energy drain.

    I was also using the integrated Intel for video re-encodes, and I got an Arc A310 for 80 bucks, which is the cheapest you’ll get a new card with AV1 support.

      • Eskuero@lemmy.fromshado.ws · 2 points · 2 months ago

        Is x266 actually taking off? With all the AOMedia members that control graphics hardware (AMD, Intel, Nvidia) together, it feels like MPEG will need to gain a big partner to stay relevant.

        • InverseParallax@lemmy.world · 2 points · 2 months ago

          Google is pushing AV1 because of patents, but H.266 is just plain better tech, even if it’s harder to encode.

          This same shit happened with H.265 and VP9, and before that with Vorbis/Opus/AAC.

          They’ll come back because it’s a standard and has higher quality.

          Maybe this is the one time AV1 somehow wins out on patents, but I’m encoding AV1 and I’m really not impressed; it’s literally just dressed-up HEVC, maybe a 10% improvement at most.

          I’ve seen VVC and it’s really flexible: it shifts gears on a dime between high motion and deep detail, which is basically what your brain sees most. AV1 is actually kind of worse than HEVC at that to me; it’s sluggish at the shifts, even if it is better overall.
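          For concreteness, a software AV1 encode of the kind described above might look like this. The filenames and settings are hypothetical, and it assumes an ffmpeg built with the libsvtav1 (SVT-AV1) encoder; the guard skips the encode when that prerequisite or the input file is missing.

```shell
# Hypothetical software AV1 encode with SVT-AV1 via ffmpeg.
# -crf trades quality for size; -preset trades speed for compression
# efficiency (lower = slower but better). Placeholder filenames.
if command -v ffmpeg >/dev/null 2>&1 \
   && ffmpeg -hide_banner -encoders 2>/dev/null | grep -q libsvtav1 \
   && [ -f input.mkv ]; then
  ffmpeg -y -i input.mkv -c:v libsvtav1 -crf 30 -preset 6 -c:a copy output.mkv
else
  echo "skipped: ffmpeg with libsvtav1 or input.mkv not available"
fi
```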

  • InverseParallax@lemmy.world · 7 points · 2 months ago

    Intel has excellent transcoding, even in their iGPUs.

    I use an Arc A750 specifically for transcoding, and AV1 runs at ludicrous speeds. But don’t go with Nvidia; they kind of suck because they don’t support VAAPI, only NVENC/NVDEC and VDPAU.
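    As a sketch, a VAAPI-based AV1 transcode on an Arc card could look like this. Filenames are placeholders, and it assumes a recent ffmpeg (6.0+) with the av1_vaapi encoder and that the Arc is at /dev/dri/renderD128, which may differ on multi-GPU systems; the guard makes it a no-op elsewhere.

```shell
# Hypothetical full-hardware VAAPI transcode to AV1 on Intel Arc.
# Decode and encode both stay on the GPU (-hwaccel_output_format vaapi),
# so frames are never copied back to system memory.
if command -v ffmpeg >/dev/null 2>&1 \
   && [ -e /dev/dri/renderD128 ] \
   && [ -f input.mkv ]; then
  ffmpeg -y -init_hw_device vaapi=va:/dev/dri/renderD128 \
    -hwaccel vaapi -hwaccel_output_format vaapi -i input.mkv \
    -c:v av1_vaapi -c:a copy output.mkv
else
  echo "skipped: ffmpeg, /dev/dri/renderD128, or input.mkv not available"
fi
```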

    • Shimitar@feddit.it (OP) · 3 points · 2 months ago

      Thanks!

      Both the 2060 and the 1060 don’t support AV1 either way, so I guess it’s pointless for me.

  • monkeyman512@lemmy.world · 5 points · 2 months ago

    If the iGPU is getting the job done, I would leave that alone. You could add a GPU and pass it through to a gaming VM. But that is an entirely different project.

    • Shimitar@feddit.it (OP) · 6 points · 2 months ago

      Could be an interesting project though, I’ll definitely think about that. Not top priority, but why not, since the hardware is free?

        • Shimitar@feddit.it (OP) · 1 point · 2 months ago

          Yes, but if I can stream games to my mobile device that could be an acceptable trade-off, if the card doesn’t drain too much power when idle.

      • lowdude@discuss.tchncs.de · 1 point · 2 months ago

        I would avoid it if you care at all about availability and downtime. The result will probably not be great: you need to ensure the server side gets enough resources under load, and setting it up may require constant restarts if things aren’t immediately working as expected.

        Nonetheless, here is a link where someone did essentially exactly that on NixOS: https://astrid.tech/2022/09/22/0/nixos-gpu-vfio/
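        Before going down that road, it helps to check how the IOMMU groups fall, since clean passthrough needs the GPU (and its audio function) isolated in a group with no unrelated devices. A minimal sketch using the standard Linux sysfs layout; it prints nothing when the kernel exposes no IOMMU groups (IOMMU disabled or unsupported):

```shell
# List each IOMMU group and the PCI addresses of the devices in it.
# Safe to run anywhere: the guard skips the loop body when the glob
# does not match (i.e. no IOMMU groups are exposed).
for g in /sys/kernel/iommu_groups/*; do
  [ -d "$g" ] || continue          # unmatched glob: IOMMU is off
  echo "IOMMU group ${g##*/}:"
  for d in "$g"/devices/*; do
    [ -e "$d" ] || continue
    echo "  ${d##*/}"
  done
done
```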

  • sugar_in_your_tea@sh.itjust.works · 2 points · 2 months ago

    I only have a GPU because my CPU doesn’t have integrated graphics. I don’t use the graphics anyway, but I need it to boot. So I put in our crappiest spare GPU (a GTX 750 Ti) and call it good.

    I wouldn’t bother. If you end up needing it, it’ll take like 15 minutes to get it installed with drivers set up and everything. No need to bother until you actually need it.

  • variants@possumpat.io · 1 point · 2 months ago

    Host steam-headless and use the GPU for that, so you can have remote gaming on your phone anywhere you have 5G.