Well I am shocked, SHOCKED I say! Well, not that shocked.
It’s like how banks figured out there was more money in catering to the super rich and just shit all over the rest of us peasants: GPU manufacturers that got big because of gamers have now turned their backs on us to cater to the insane “AI” agenda.
Also, friendly advice: unless you need CUDA cores and have to upgrade, try avoiding Nvidia. Nvidia doesn’t really care about the high-end gamer demographic nearly as much as it used to, because that’s no longer its bread and butter. Nvidia’s cash cow at this point is supplying hardware for ML data centers. It’s an order of magnitude more lucrative than serving the consumer and enthusiast market.
So my next card is probably gonna be an RX 9070XT.
Even the RX 9070 is running around $900 USD; I can’t fathom affording even state-of-the-art gaming from years ago at this point. I’m still using a GTX 1660, playing games from years ago that I never got around to, and having a grand time. Most adults I know are in the same boat and are either not even considering upgrading their PC or playing their kid’s console games.
Every year we say “Gonna look into upgrading,” but every year prices go up and wages stay the same (or disappear entirely as private equity ravages the business world, digesting every company that isn’t also a private equity predator), and the cost of just living and eating is insane. At this rate, a lot of us might start reading again.
It makes me wonder if this will bring more people back to consoles. The library may be more limited, but when a console costs less than just a GPU, it’ll be more tempting.
For me it’s the GPU prices, the stagnation of the technology (most performance gains come at the cost of stupid power draw), and, importantly, being fed up with AAA games. Most games I’ve played recently were a couple of years old, indie titles, or a couple-of-years-old indie titles, and I don’t need a high-powered graphics card for that. I’ve been playing far more on my Steam Deck than on my desktop PC, despite the latter having significantly more powerful hardware. You can’t force fun through sheer hardware performance.
It literally costs $3000.
That’s almost 4 times the cost of my 3090.
That’s almost a year of work in my country lol…
One two-week take-home paycheck for me. AMD is where I stay: a $650 card lasting me 5 years on a 2K UWHD monitor at more than 180 FPS.
I’ll never understand NVIDIA owners.
The good games don’t need a high end GPU.
Terraria minimum specs: “don’t worry bro”
Absolutely. Truly creative games are made by smaller dev teams that aren’t forcing ray tracing and lifelike graphics. The new Indiana Jones game isn’t the kind of game that sells GPUs, and it’s the only game I’ve personally had poor performance in with my 3070 Ti at 1440p.
Clair Obscur runs like shit on my 3090 at 4K :(
Problem is preordering has been normalized, as has releasing games in pre-alpha state.
Anyone who preorders a digital game is a dummy. Preorders were created to ensure you got some of the limited physical stock.
I just looked up the price, and my reaction was “Yikes!” You can get a PS5 Pro + the optional Blu-ray drive, a Steam Deck OLED, and a Nintendo Switch 2, and still have plenty of money left to spend on games.
I remember when high-end GPUs were around €500.
Ah capitalism…
Endless infinite growth forever on a fragile and very much finite planet where wages are suppressed and most money is intentionally funneled into the coffers of a small handful of people who are already so wealthy that their descendants 5 generations down the line will still be some of the richest people on the planet.
Unfortunately, gamers aren’t the real target audience for new GPUs; it’s AI bros. Even if nobody buys a 4090/5090 for gaming, they’re always out of stock, as LLM enthusiasts and small companies use them for AI.
Ex-fucking-actly!
Hahaha, gamers are skipping. Yeah, they are. And yet the 5090 is still somehow out of stock, no matter the price or the state of gaming. We all know major tech went in the AI direction, disregarding whether the average Joe wants AI or not. The prices are not for gamers. The prices are for whales, AI companies, and enthusiasts.
The 5090 is kinda terrible for AI, actually. It’s too expensive. It only just got support in PyTorch, and if you look at ‘normie’ AI bros trying to use them online, shit doesn’t work.
The 4090 is… mediocre, because it’s expensive for 24 GB. The 3090 is basically the best AI card Nvidia ever made, and tinkerers just opt for banks of them.
Businesses tend to buy RTX Pro cards, rent cloud A100s/H100s, or just use APIs.
The server cards DO eat up TSMC capacity, but the insane 4090/5090 prices are mostly Nvidia’s (and AMD’s) fault for literally being anticompetitive.
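For anyone wondering whether their card is even supported by their PyTorch install, here’s a minimal sketch, assuming a CUDA build of torch is installed; the “sm_120” arch string used for Blackwell below is my own assumption, not something from this thread.

```python
# Rough check: does the installed PyTorch wheel ship kernels for this GPU?
# Newer cards (e.g. a 5090) may report an arch the wheel wasn't built for,
# in which case things either error out or fall back to slow PTX JIT.
import torch

if not torch.cuda.is_available():
    print("PyTorch can't see a CUDA device at all.")
else:
    major, minor = torch.cuda.get_device_capability(0)
    arch = f"sm_{major}{minor}"                      # e.g. "sm_120" (assumed) for Blackwell
    compiled_for = torch.cuda.get_arch_list()        # archs this wheel was compiled with
    print(f"{torch.cuda.get_device_name(0)} reports {arch}")
    print(f"Wheel was compiled for: {compiled_for}")
    if arch not in compiled_for:
        print("No native kernels for this arch; expect errors or a slow JIT fallback.")
```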
It doesn’t help that the gains have been smaller, and the prices higher.
I’ve got an RX 6800 I bought in 2020, and nothing but the 5090 is a significant upgrade, and I’m sure as fuck not paying that kind of money for a video card.
I’m in the same boat.
In general, there’s just no way I could ever justify buying an Nvidia card in terms of performance per dollar; it’s absolutely ridiculous.
I’ll fork over 4 digits for a gfx when salaries go up by a digit as well.
I just picked up a used RX 6800 XT after doing some research and comparing prices.
The fact that a GPU this old can outperform or match most newer cards at a fraction of the price is insane, but I’m very happy with my purchase. Solid upgrade from my 1070 Ti.
Not to mention the cards have gotten huge and you just about need a nuclear reactor to power them. Melting cables and all.
Well, that depends on your definition of significant. Don’t get me wrong, the state of the GPU market is not consumer-friendly, but even an RX 9070 provides over a 50% performance uplift over the RX 6800.
I have a 6700 XT and a 5700X, and my PC can do VR and play Star Citizen; those are the most demanding things I do on my PC. Why should I spend almost £1000 to get a 5070 or 9070 plus an AM5 board and processor?
I don’t think they’re actually expecting anyone to upgrade annually. But there’s always someone due for an upgrade, however long it’s been for them. You can compare what percentage of users upgraded this year to previous years.
I just finally upgraded from a 1080 Ti to a 5070 Ti. At high-refresh-rate 1440p the 1080 Ti was definitely showing its age, and certain games would crash (even with no GPU overclock). Fortunately I was able to get a PNY 5070 Ti for only ~$60 over MSRP at the local Micro Center.
The 5000 series is a pretty shitty value across the board, but I got a new job (and a pay increase), so it was the right time for me to upgrade after 8 years.
Sticking with 1440p on desktop has gone very well for me. 2160p isn’t worth the costs in money or perf.
It’s never been normal to upgrade every year, and it still isn’t. Every three years is probably still more frequent than normal. The issue is that there haven’t been reasonable prices for cards for like 8 years, and it’s gotten worse recently. People who are “due” for an upgrade aren’t upgrading because it’s unaffordable.
If consoles can last 6-8 years per gen so can my PC.
Your PC can run 796 of the top 1000 most popular games listed on PCGameBenchmark - at a recommended system level.
That’s more than good enough for me.
I don’t remember exactly when I built this PC but I want to say right before covid, and I haven’t felt any need for an upgrade yet.
“When did it just become expected that everybody would upgrade GPUs every year and that’s supposed to be normal?” - that’s a really good question, because I don’t think normal PC gamers have ever been like that, and they still aren’t. It’s basically part of the culture to stretch your GPU as long as it’ll last, so idk who you’re complaining about. Yeah, GPU prices are bullshit rn, but let’s not make stuff up.
Nah, there was a time when you’d get a new card every two years and it’d be twice as fast for the same price.
Nowadays the new cards are 10% faster for 15% more money.
I bought a new card last year after running a Vega 64 for ages and I honestly think it might last me ten years because things are only getting worse.
When did it just become expected that everybody would upgrade GPUs every year, and that’s supposed to be normal?
Somewhere around 1996 when the 3dfx Voodoo came out. Once a year was a relatively conservative upgrade schedule in the late 90s.
Those cards were like what though, $199?
That’s still not cheap when you account for inflation. Of course there’s a world of difference between “not cheap” and what they charge these days.
Still rocking a GTX 1070, and I plan on using my GrapheneOS Pixel 8 Pro till 2030 (only bought it (used, ofc) bc my Huawei Mate 20 Pro died on me in October last year 😔).
It’s just that I’m not impressed; the raster performance bump at 1440p was just not worth the price jump at all. On top of that, they have manufacturing issues and issues with their stupid 12-pin connector? And all the shit on the business side, like not providing drivers to reviewers, etc. Fuuucccckk all that, man. I’m waiting until AMD gets a little better with ray tracing and then switching to team red.
Yeah, I have a 3080 Ti. If I had an older card, I would 100% be buying AMD right now though.
I stopped maintaining a AAA-capable rig in 2016. I’ve been playing indies since and haven’t felt left out whatsoever.
Don’t worry, you haven’t missed anything. Sure, the games are prettier, but most of them are designed and written more poorly than 99% of indie titles…
The majority, sure, but there are some gems though.
Baldur’s Gate 3, Clair Obscur: Expedition 33, Doom Eternal, Elden Ring, God of War, … for example.
You can always wait a couple of years before playing them, but saying they didn’t miss anything is a gross understatement.
It’s funny, because often they aren’t prettier. Well-optimized and well-made games from 5 or even 10 years ago often look on par with or better than the majority of AAA slop pushed out now (obviously with exceptions for some really good-looking games like Space Marine and some others), and the disk size is still 10x what it was. They’re just unrefined and unoptimized, and they try to use computationally expensive filters, lighting, sharpening, and anti-aliasing to make up for the mediocre quality.
The irony is that it is optimized in several notable cases, like Cyberpunk 2077 and most major UE5-based games. It’s just that all the mipmap levels, from distant to 4K up close, really add up when the game actually has a decent amount of content.
I wonder how many people really run games at settings that require the highest-detail assets. I bet a lot of people would gladly take half the download size or more just to leave ’em out and disable ‘ultra’ settings.
Indies are great. I can play AAA titles but don’t really ever… It seems like that’s where the folks with the most creativity are focusing their energy anyway.
GPU prices are what drove me back to consoles. It was time to overhaul my PC, as it was getting painfully out of date. The video card alone was gonna be $700. Meanwhile, a whole-ass PS5 that plays the same games was $500.
It’s been 2 years since, and I don’t regret it. I miss mods, but not nearly as much as I thought. It’s also SOOO nice to play multiplayer games without cheaters everywhere. I actually used to be one of those people who thought controllers gave an unfair advantage, but… you can use an M/KB on PS5, and guess what? I do just fine! Turns out the problem was never controllers; it was the cheaters.
But then there is that. The controller. Oh my lord it’s so much more comfortable than even the best gaming mouse. I’ve done a complete 180 on this. So many game genres are just so terrible to play with M/KB that I now tell people whining about controller players this:
Use gaming equipment for gaming and leave office equipment in the office.
Uhhh, I went from a Radeon 1090 (or whatever they’re called, it’s an older numbering scheme from ~2010) to an Nvidia 780 to an Nvidia 3070 Ti. Skipping upgrades is normal. Console gamers effectively do that as well. It’s normal to not buy a GPU every year.
Ain’t nobody got time (money) for that!
As long as you make an upgrade that’s equivalent to or better than the current console generation, you’re basically good to go until the next generation of consoles comes out.
I don’t really care if my current graphics are better or worse than the current console generation; it was just an illustration comparing PC gaming to console gaming.
I have a 3080 and am surviving lol. Never had an issue.
Still running a 1080; between Nvidia and Windows 11, I think I’ll stay where I am.
I have a 3080 also. It’s only just starting to show its age with some of these new UE5 games. A couple of weeks ago I discovered dlssg-to-fsr3, and honestly I’ll take the little bit of latency for some smoother gameplay.
Pretty wise; that’s the generation before the 12VHPWR connectors started burning up.
Afaik the 2080 was the last FE with a regular PCIe power connector.
3090s weren’t burning up, though.