Graphics card memory, or VRAM, has been a hot topic of discussion as of late because we're not getting enough of it. Here's our latest testing of GPUs and VRAM capacity at 8GB, 12GB and 16GB.
Picked up an RTX 4060 Ti 8GB for $300 during the winter. I know the card's limitations, but it plays all the games I want to play at 1440p ultra settings, and if I need to dial down a setting I will. I also own the RTX 3060 12GB, and the 4060 Ti easily outperforms it. When the card no longer gives me the performance I want, I'll upgrade to a reasonably priced, better-performing 12GB card, but until then I'm not going to waste a bunch of money just so I can run a benchmark faster.
> Yeah, TechSpot perpetuates the whole "8GB is not enough" thing a lot. I guess it garners them clicks or something.

Did you see the video showing the textures disappearing on the 8GB GPU? Performance isn't just about framerate.
I feel like the poor performance of the 4060 Ti 8GB has more to do with the 128-bit bus than with the amount of VRAM.
Every time I see a story like this, I instantly know who wrote it: Steve. He seems to just dislike Nvidia.
> Seeing how they use compressed frame buffers etc., is the VRAM allocation the same for AMD, Nvidia, and Intel in the same scenario?

You make a good point. Steve needs to verify that his observations apply to the other vendors' cards (even if they will almost certainly end up about the same).
> I think this review proves the point that the ideal GPU should have more than 8GB of VRAM. This is especially so when most games are focusing on offering higher texture quality.

Most gamers don't mess with settings. Also, just because an amount of VRAM shows as allocated doesn't mean that much is necessary; most games run fine at 1080p medium-to-low settings with 4GB of VRAM, and even a 2015 980 Ti or 2016 Titan X with 6GB of VRAM does well at 4K medium-to-high. I'm still impressed by how many games my 780 Ti and 290X can play as well.
What I do think is unrealistic is expecting that most gamers who buy a mid- or low-end card will run most games at max/ultra settings, regardless of resolution. If a game runs well at max settings, that's a bonus. For example, when I was using a mid-tier GPU to run AAA titles at 4K, I would naturally start off with medium settings and DLSS/FSR at Balanced quality. Some people may try to max things out initially, but they'll adjust downwards if the performance isn't satisfactory. Still, it's good to know the limitations of 8GB of VRAM.
With the exception of entry-level options, you shouldn't be buying 8GB graphics cards anymore. 12GB is now the bare minimum, and 16GB is the ideal target.
> Every time I see a story like this, I instantly know who wrote it: Steve. He seems to just dislike Nvidia.

Of course he does, because he knows how badly NVIDIA is screwing over the people who purchase their cards.
> I promise I'm not trying to be an ***, but to me, all those graphs are useless without FPS data. Just because a game will allocate the memory doesn't mean it's actually using it or actually needs it.

You must be new, because he has benchmarked all those games numerous times. Maybe try looking them up? Those benchmark articles are posted on this site too. LOL
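The allocated-versus-needed distinction raised here is real: tools like `nvidia-smi` report how much VRAM is allocated, not how much a game actively touches each frame. As a minimal sketch, you can query and parse that counter like this (the sample numbers below are made up for illustration, not real measurements):

```python
# Parse output in the format produced by:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
# "sample" is a hypothetical captured line: 6144 MiB allocated on an 8GB card.
sample = "6144, 8192"
used_mib, total_mib = (int(x) for x in sample.split(","))
print(f"{used_mib} MiB used of {total_mib} MiB ({used_mib / total_mib:.0%})")
# → 6144 MiB used of 8192 MiB (75%)
```

Keep in mind this still shows allocation, so a reading near 100% doesn't by itself prove the game needed that much, only that it asked for it.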
> Excellent test! I said it years ago, 16GB is the minimum. One reason I went with the AMD RX 6800.

After Steve told me that I needed 16GB+ on my next graphics card, I went with the AMD Radeon RX 7900 XT, upgraded from my old workhorse, the NVIDIA GeForce GTX 1080 Ti.