• Bandicoot_Academic@lemmy.one
    7 months ago

    Most people probably don’t have a dedicated GPU, and an iGPU is probably not powerful enough to run an LLM at a decent speed. A decent model also requires something like 20 GB of RAM, which most people don’t have.
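    The “20 GB” figure follows from simple arithmetic: weights-only memory is roughly parameter count times bytes per parameter. A minimal sketch (the model sizes and precisions below are illustrative assumptions, not from the thread):

```python
# Back-of-envelope LLM memory estimate: weights only, ignoring
# activations and KV cache. Figures here are illustrative assumptions.

def model_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate weights-only memory footprint in GB."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# A hypothetical 13B-parameter model at 16-bit precision (2 bytes/param):
print(model_memory_gb(13, 2))    # 26.0 GB
# The same model quantized to 4 bits (0.5 bytes/param):
print(model_memory_gb(13, 0.5))  # 6.5 GB
```

    Quantization is why smaller models can still fit on consumer hardware, though the thread’s point stands for full-precision “decent” models.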

    • douglasg14b@lemmy.world
      7 months ago

      It doesn’t just require 20 GB of RAM; it requires that much in VRAM, which is a much higher barrier to entry.

        • T156@lemmy.world
          7 months ago

          Not exactly. Most integrated chips have a small pool of dedicated VRAM, plus a bit more that they share with system memory, though it’s generally only a portion of it, not all. As far as I’m aware, it’s only Apple’s unified memory, and maybe some other mobile chips, that share the entire memory pool, for better or worse.

          But it is worth noting that if you don’t have enough VRAM and have to offload part of the model into system RAM, the rule of thumb is to budget twice the offloaded amount. So if you have a GPU with 4 GB of VRAM and need to offload the rest of a 20 GB model to the system, you don’t need 16 GB of RAM, you need 32 GB.
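          That rule of thumb can be sketched directly. A hedged example, assuming the “budget twice the spilled amount” heuristic from the comment (the function name and figures are illustrative):

```python
# Sketch of the offload rule of thumb from the comment above: when model
# weights spill out of VRAM into system RAM, budget roughly twice the
# spilled amount (working copies, buffers, OS headroom). Heuristic only.

def ram_budget_gb(model_gb: float, vram_gb: float) -> float:
    """System RAM to budget when part of the model must be offloaded."""
    spill = max(model_gb - vram_gb, 0)
    return 2 * spill

# A ~20 GB model on a 4 GB card leaves 16 GB to offload,
# so budget ~32 GB of system RAM:
print(ram_budget_gb(20, 4))  # 32
```

          If the model fits entirely in VRAM, the spill is zero and no extra system RAM budget is needed under this heuristic.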