Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

  • PlasmaDistortion@lemm.ee
    11 months ago

My RTX 4060 Ti has 16GB of VRAM. What on earth makes them think people would go for 12GB?

    • elvith@feddit.de
      11 months ago

      I have a 2060 Super with 8GB. The VRAM is currently enough for FHD gaming - or at least isn’t the bottleneck - so 12 GB might be fine for that use case, BUT I’m also toying around with AI models, and some of the current models already ask for 12 GB of VRAM to run the complete model. It’s not that I would never get a 12 GB card as an upgrade, but you can be sure I’d research all the alternatives first, and then it wouldn’t be my first choice but a compromise, as it wouldn’t future-proof me in this regard.

      • AProfessional@lemmy.world
        11 months ago

        Do you think there is a large overlap of people who buy $600-$900 cards and like 1080p?

        My 3080 10GB runs out of VRAM at 1440p. Personally, I would never get <16GB again.

      • AA5B@lemmy.world
        11 months ago

        Thanks, that was going to be exactly my question. I don’t see anyone choosing low memory for video, but I had no idea what AI needs.

        • elvith@feddit.de
          11 months ago

          You can run Stable Diffusion XL on 8GB of VRAM (to generate images). For beginners there’s, e.g., the open source software Fooocus, which handles quite a lot of the work for you - it sends your prompt to a GPT-2 model (running locally on your PC) to do some prompt engineering for you, then uses the result to generate your images. It also features several presets, etc., to get going easily.

          Jan (basically open source software that resembles ChatGPT and lets you use several AI models) can run in 8GB, but only for 3B models or quantized 7B models. They recommend at least 16GB for regular 7B models (which they consider the “minimum usable models”). Then there are larger, more sophisticated models that require even more.

          Jan can also run on the CPU in your regular RAM. Since it’s chatting with you, it’s not too bad when it spits out words slowly, but a GPU is / would be nice here…
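
          The VRAM figures above can be sanity-checked with simple arithmetic: a model’s weights take roughly (parameter count × bits per weight ÷ 8) bytes, before any overhead for the KV cache, activations, and the runtime itself. A minimal sketch (the `weight_vram_gb` helper is hypothetical, not part of Jan):

          ```python
          # Back-of-envelope estimate of the VRAM needed just to hold model weights.
          # Real usage adds several GB of overhead (KV cache, activations, runtime),
          # which is why recommendations run higher than these raw numbers.

          def weight_vram_gb(n_params_billion: float, bits_per_weight: float) -> float:
              """GB of memory occupied by the weights alone."""
              bytes_total = n_params_billion * 1e9 * bits_per_weight / 8
              return bytes_total / (1024 ** 3)

          # fp16 7B model: ~13 GB of weights -> needs a 16 GB card
          print(round(weight_vram_gb(7, 16), 1))  # 13.0
          # 4-bit quantized 7B model: ~3.3 GB -> fits in 8 GB with room to spare
          print(round(weight_vram_gb(7, 4), 1))   # 3.3
          # fp16 3B model: ~5.6 GB -> also workable on an 8 GB card
          print(round(weight_vram_gb(3, 16), 1))  # 5.6
          ```

          This is why quantized 7B models are the practical ceiling on an 8 GB card, and why a 12 GB card already gets tight for the larger models.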