Abacus.ai:

We recently released Smaug-72B-v0.1, which has taken first place on the Open LLM Leaderboard by HuggingFace. It is the first open-source model to achieve an average score above 80.

  • TheChurn@kbin.social
    5 months ago

    Every billion parameters needs about 2 GB of VRAM when using bfloat16 representation: 16 bits per parameter, 8 bits per byte -> 2 bytes per parameter.

    1 billion parameters ~ 2 billion bytes ~ 2 GB.

    From the name, this model has 72 billion parameters, so it needs ~144 GB of VRAM.
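
The arithmetic above can be sketched as a small helper (a minimal sketch; the function name and its weights-only assumption are illustrative, not from any library):

```python
# Hypothetical helper illustrating the estimate above; names are illustrative.
def vram_estimate_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Weights-only VRAM estimate in GB (ignores KV cache and activations)."""
    params = params_billion * 1e9          # total parameter count
    return params * bytes_per_param / 1e9  # bytes -> GB (decimal)

print(vram_estimate_gb(72))  # 144.0 -> ~144 GB for a 72B model in bfloat16
```

Note this counts only the weights: inference also needs memory for the KV cache and activations, and quantized formats (e.g. 8-bit or 4-bit) shrink `bytes_per_param` accordingly.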