Here’s why PC gamers aren’t upgrading GPUs as often as they used to

Is there any point in upgrading?




The latest Steam Hardware Survey reveals that more PC gamers are choosing to stick with older GPU hardware than upgrade to the likes of the RTX 40 or AMD Radeon RX 7000 series. Despite the technological advances both generations bring, and the best part of two years of availability, they don't appear to have made any real impact.

In fact, according to the March 2024 Hardware Survey results, no RTX 40 series or RX 7000 series GPU even cracks the top five. The most popular GPU remains the RTX 3060, followed by the RTX 2060, GTX 1650, RTX 3060 Ti, RTX 3070, and GTX 1060. Not only do Ampere and Turing dominate the rankings, but even Pascal is represented before Ada finally gets on the board at a distant eighth place.

When we do finally get to Ada, the leading model is the mainstream RTX 4060, which accounts for 2.59% of the client's 120 million monthly active users (around 3.1 million people). While an impressive card for the money, this budget offering is far from a powerhouse like the RTX 4090 or RTX 4080, which suggests that most PC gamers aren't chasing bleeding-edge performance, high refresh rates, or higher resolutions, but are instead prioritizing value.

Based on this data, it's hard to paint an encouraging picture for the future of GPU technology when most people are content running older gear that offers better value for money, and it's hard to blame them. A combination of factors is likely responsible, which I'll touch upon further down the page: the semiconductor shortage that hit during the pandemic, the price increases that followed, and proprietary technology locked to new hardware. The result is that people are no longer convinced, and the days of upgrading every GPU generation may finally be done for good.

GPUs became scarce and then significantly more expensive

The Nvidia RTX 30 series debuted back in 2020 but was hard to find for several years (Source: Nvidia)

To map out how things ended up this way, you'll need to cast your mind back to the paper launch of the RTX 30 series. Everything was scheduled to happen just as it always had: Nvidia announced that Ampere hardware would arrive in September 2020. However, the then-fledgling semiconductor shortage meant these video cards were in seriously short supply.

What was previously as easy as walking into a retail store or ordering online became a logistics nightmare. Everything from top-end RTX 3090s to entry-level RTX 3060 Tis became incredibly difficult to track down, and when you did find one, you often faced the grim reality of paying over the odds for the privilege. It wasn't just scalpers getting in on the action, either; some retailers even facilitated secondary sellers through their own platforms, listing GPUs at grossly inflated rates.

I covered the launch of Ampere at the time and closely followed the restocks as retailers fought back with everything from virtual queues to strict purchase limits in an effort to stop scalpers hoarding graphics cards. Anyone hoping to buy an RTX 3080 or RTX 3070 Ti for MSRP during this window was frankly out of luck unless they paid close attention to Telegram pages, Discord servers, and stock trackers to get ahead. It was a truly horrific time to be a PC gamer, and it won't be forgotten any time soon.

GPU manufacturers seized the opportunity and gamers paid

Various RTX 40 series cards from some of Nvidia’s partners (Source: Nvidia)

Fast forward to the end of 2022, two years later, and RTX 30 series cards were finally available at their respective MSRPs, right up until the RTX 40 series debuted. That's where the problems started. Nvidia had seen that consumers were willing to pay over the odds for its hardware, and this was met with price increases going from Ampere to Ada. For example, the RTX 3080 ($699) became the RTX 4080 ($1,199). This extended to the 70-class, too, with the RTX 3070 Ti ($599) giving way to the RTX 4070 Ti ($799).

Simply put, it was a sting in the tail for people who had held out for GPUs to become cheaper and more available, only for the latest and greatest to roll out with a price hike. Nvidia itself would later attempt to course correct earlier this year with the RTX 40 series Super refresh, in particular the RTX 4080 Super, which knocked $200 off the lofty MSRP of the original, but for some it was too little, too late. It goes to show that this generation wasn't exactly pro-consumer.

Asking gamers and creators to pay significantly more for the new equivalents of cards they had been pining after for years had to sting. It was around this time that I watched friends who had upgraded every generation decide, in real time, that their RTX 3070 Ti or RX 6800 XT was good enough after all. And that was before they realized that proprietary tech was locked behind a paywall, too.

DLSS 3 Frame Generation is locked behind new GPUs

How DLSS 3’s Frame Generation works with the RTX 40 series GPUs (Source: Nvidia)

There's no faulting the performance of the RTX 40 series as the best graphics cards for gaming. However, one thing that burned a ton of people is that DLSS 3's Frame Generation was only available by upgrading. While Team Green has typically delivered performance increases generation over generation, not since the introduction of RTX with Turing in 2018 had it walled off an innovation behind new hardware. Even now, if you want Frame Generation, you need to upgrade, meaning older cards are artificially left out.

It paints a picture in which Nvidia could usher in a new technology and lock it solely behind the RTX 50 series, which is rumored to be releasing at the end of this year. In a sense, that takes away gamers' agency and replaces it with a sense of urgency. Before, you would upgrade your hardware to play your games at higher framerates or resolutions; now, a company is telling you that if you don't upgrade, you'll be left behind, removing the choice from you.

That's not exactly the best sell at a time when many countries are grappling with inflation and few people can justify splashing out on pricey hardware. It also casts doubt on exactly how long a flagship graphics card can stay relevant. The RTX 4090 is a truly incredible GPU, but it could be made to feel obsolete by a successor with its own proprietary features, which would be a bitter pill to swallow for anyone spending anywhere from $1,599 to $2,000+ right now.

Where this leaves us in 2024

Looking at how older graphics cards dominate the hardware survey, it's clear these factors have had an impact on consumer spending habits. If you're not being given organic reasons to upgrade, and instead feel essentially forced into it, you're going to be less likely to open your wallet, especially in trying times. Combine that with the price increases and buying a new graphics card every generation becomes a harder sell, even at the mainstream end.

How often should you upgrade your GPU?

Generally speaking, you should upgrade your GPU when the games you enjoy can no longer hit playable framerates at your chosen target resolution, whether that's 60fps at 1080p, 1440p, or 4K. Alternatively, you may be upgrading your monitor to a model with a higher resolution and refresh rate and find yourself needing something more powerful to drive it. We recommend upgrading every five years or so to stay roughly in step with console hardware.

Aleksha McLoughlin is Hardware and News Editor for PC Guide and she oversees buying guides, reviews, news, and features on site. She was previously Hardware and Affiliates Editor at VideoGamer.