AMD Pokes Fun at Nvidia for Offering Less VRAM Than Radeon GPUs

AMD Radeon
Credit: AMD

Over the past few months, VRAM has become much more important for gamers playing AAA titles. Speaking from personal experience, both Hogwarts Legacy and The Last of Us Part I on PC crushed my 10GB RTX 3080, causing poor frame rates, stuttering, and an overall bad experience. If you're one of the gamers who assumed an 8GB card would last a few years, you're in the same boat. Now AMD is capitalizing on this trend with a new marketing push highlighting the generous amounts of VRAM on both its previous-generation and newest GPUs. The campaign is likely timed to land just ahead of tomorrow's review embargo lift for Nvidia's 12GB GeForce RTX 4070.

In its latest blog post, AMD points out that most of its upper-midrange GPUs offer much more video memory than their Nvidia counterparts. Hardest hit on the Green Team are owners of the RTX 3070 and 3070 Ti, which provide only 8GB of GDDR6/GDDR6X memory. Those GPUs launched at $499 and $599, respectively, sold for about twice that during the pandemic, and remain very capable 1440p cards. However, it's now becoming apparent they will be held back significantly by their VRAM allotment, especially in titles with heavy ray tracing. For example, RTX 3070 owners probably cannot play Cyberpunk 2077's new Overdrive mode with path tracing.

AMD Marketing
AMD recommends at least 12GB of VRAM for any title played at 1440p and above. Credit: AMD

AMD attempts to illustrate the difference more VRAM can make by comparing its GPUs against Nvidia's in 4K benchmarks across a wide range of titles. However, few gamers currently play at that resolution even on high-end GPUs, opting instead for 1440p at 144Hz or higher. Still, AMD also notes you can get its RX 6800 with 16GB of VRAM for just $500, while Nvidia offers no GPU in that price range with equivalent memory. The closest Nvidia comes is the one-off 12GB RTX 3060, but that card is generally incapable of 4K gameplay.

As much as we don't enjoy passing along PR spin, AMD has a point here: it has long been known for offering more VRAM than Nvidia. Its current 7900 XTX flagship matches Nvidia's RTX 4090 at 24GB, while the 7900 XT carries 20GB compared with the RTX 4080's 16GB, making it a bit more future-proof. Nvidia would likely counter that even 16GB of VRAM won't necessarily make ray tracing run any faster, since the two companies approach the problem differently. Nvidia is also a generation ahead of AMD when it comes to ray tracing, and AMD has so far declined to go to the mat at the very high end of the market.

RIP 8GB VRAM (meme)
Not even 10GB VRAM is enough for some of the latest games with ray tracing. Credit: Reddit

Subreddits are already lamenting the early death of cards like the RTX 3070 series (above). The 20- and 10-series cards appear officially DOA for high-resolution gaming unless you stick to older or less demanding titles. AMD does have the upper hand here, assuming it can convince gamers to buy last-generation GPUs. The campaign also underscores that AMD is now four months out from RDNA 3's launch with only two GPUs under its belt, and an RX 7800 or 7700 XT is nowhere in sight.
