Older GPUs still reign supreme among PC players as prices remain high
By Daniel Davis
The latest Steam Survey numbers are out, and they find that gamers are still holding strong to the 10-series and 16-series GPUs.
The GTX 1650 reigns supreme, making up about 5% of the overall player base, which speaks to the inflated GPU pricing right now. In second place is the RTX 3060, followed by the GTX 1060 and the RTX 2060, with the RTX 3060 Laptop GPU in the mix as well.
This brings into question how the GPU market is doing as far as sales go. The new 40-series offers only marginal performance upgrades over the 30-series for a much higher price tag. The only time an upgrade makes sense is if you are jumping from a mid-tier 10- or 16-series card straight to a 40-series. Even then, $1,500+ for a GPU is ridiculous.
All this being said, console players are here for it. For years, console players have endured PC gamers' obnoxious boasting about graphics and the minimal improvements those graphics actually bring to games. Meanwhile, console players have largely been issue-free. Even when PC gamers were complaining about frame rate problems in Jedi: Survivor, console players had no such trouble.
Why aren’t gamers upgrading to the 40 series yet?
There are two reasons why gamers aren't upgrading right now. One is cost, and the other is practicality. My PC is an older build that I upgraded in 2016 with a GTX 980, a Ryzen 5 2600X, a B450 motherboard, and 16GB of DDR4 RAM. Even with this, I can run Elden Ring at 1080p and 30 frames per second on medium settings just fine. I've also had no problem playing Battlefield 2042, Assassin's Creed: Odyssey, and Horizon: Zero Dawn, as well as The Master Chief Collection. In fact, I've been flirting with finally upgrading my GPU, as I'd like to get into the 80-100FPS range. Even then, I can find a 3060 on Facebook Marketplace for less than $200, and a 2070 for $150.
Not to mention, the main issue with PC gaming is the lack of support developers give to PC ports. The Last of Us is a perfect example of developers ignoring PC gamers in favor of leaning on DLSS or FSR upscaling to make up for failures during development. It's also true that the power and resolution you can get out of computer hardware is nearing the limits of known technology. So developers have two choices: they can develop games for the mass of people using mid-tier cards, or they can rely on FSR and DLSS to upscale their games. Developers have chosen the latter.
Are GPU manufacturers encouraging developers to release poorly optimized games?
Developers have relied on GPU software to make up for their failings instead of just making games that run well on all hardware. That said, I'm not a programmer or a developer, so I'm not privy to the behind-the-scenes details. But what I can say is that I'm noticing a trend, going all the way back to the technical issues of Assassin's Creed: Unity, through the Battlefield series with its many launch problems, and more recently The Last of Us, which needed many updates to run smoothly even on higher-end hardware.
Gamers aren't hard to please, at least the majority of us. All we want is for our $70 not to be wasted while we wait for patches before a game runs smoothly. Jedi: Survivor launched with a 45GB day-one patch, which is as close to unacceptable as you can get. I built my PC as the Xbox One was being phased in after the decade-long dominance of the Xbox 360. When Microsoft announced I could no longer play my vast library of games on my Xbox One, I was done with console gaming.
As it stands, gamers are perfectly fine waiting around until prices come down, myself included.