News: Nvidia announces RTX 50-series at up to $1,999

That's fair then. We can agree to disagree and I'll stop here.

Just as a final comment: I can't stand graphical glitches in any capacity, and when testing both FG and upscaling they just worsen the experience for me way too much. That's not even talking about the latency from FG. This reminds me of back when the first 120Hz monitors were popping up and people were like "nah, 60Hz is alright", or the usual "the eye can't see more than 24 FPS", except in this case it's on the opposite extreme of the spectrum.

Regards.

So Frame Gen isn't rendering anything the game is doing. Instead it's just looking at the previous few frames and rendering a third based on what's most likely to follow.

You're walking forward, then turn left, except it's already rendered the frame of you still walking forward, so only on the next cycle does it render the frame of you turning left. Then an enemy target is moving one way only for the game to move it the other way, and the AI frame ends up being wrong again. What you see on the screen is not only behind what the game is processing, but might not even reflect what you should be seeing.
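To make that concrete, here's a deliberately crude Python sketch of the "guess the next frame from the last few" idea described above. It's purely illustrative and nothing like the real pipeline (DLSS Frame Generation works from motion vectors, optical flow and a trained model), but it shows the same failure mode: if the motion reverses, the generated frame is simply wrong.

```python
import numpy as np

def fake_next_frame(prev_frame: np.ndarray, observed_shift: int) -> np.ndarray:
    """Toy 'generated' frame: shift the last real frame by whatever motion
    was seen between the last two real frames, assuming it continues."""
    return np.roll(prev_frame, observed_shift, axis=1)

# A 1x8 grayscale strip with a bright "object" that moved one pixel to the
# right between the last two real frames.
f0 = np.array([[255, 0, 0, 0, 0, 0, 0, 0]], dtype=np.uint8)
f1 = np.array([[0, 255, 0, 0, 0, 0, 0, 0]], dtype=np.uint8)

generated = fake_next_frame(f1, observed_shift=1)
print(generated)  # object drawn at index 2, i.e. still moving right

# If the game actually reversed the object's direction this tick, the real
# next frame would show it back at index 0 - the generated frame on screen
# doesn't match what the game is doing, which is exactly the mismatch
# described above.
```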

DLSS upscaling is a bit better: the game renders at a lower resolution and a trained model (drawing on past frames) guesses what the higher-res image should look like. Better than some of the other upscaling methods out there, and decent for someone with a weaker system trying to push a 4K display.
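And a similarly crude sketch of the upscaling side: render far fewer pixels, then fill in the rest. The repeat-each-pixel upscaler below is obviously not what DLSS does (that's a trained network fed with the low-res frame, motion vectors and previous frames); it just shows why rendering low and displaying high is such a big performance win.

```python
import numpy as np

def naive_upscale(low_res: np.ndarray, factor: int) -> np.ndarray:
    """Toy upscaler: repeat every rendered pixel. DLSS instead uses a trained
    model to guess the missing detail; the point here is only that the game
    shaded a fraction of the pixels actually displayed."""
    return np.repeat(np.repeat(low_res, factor, axis=0), factor, axis=1)

# "Render" a 1080p-sized frame and display it at 2160p: the game only shaded
# a quarter of the pixels the monitor shows.
rendered = np.zeros((1080, 1920), dtype=np.uint8)
displayed = naive_upscale(rendered, 2)
print(rendered.size, displayed.size)  # 2073600 8294400
```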
 
I would not be so sure. The trick here is the proliferation of AI-based techniques, especially given that all the vendors are now playing ball, with AMD joining the fun.

It's not only that a generational step up would mildly increase raster performance; it will also considerably increase AI compute, AND there will be two more years of game releases and updates introducing those techniques into games.

I actually think that the next two years will result in a major step up there because of that. And then you will end up having a new console generation too, which is almost certainly going to utilize those capabilities as well.
That's the thing. 8 out of the top 10 GPUs support DLSS, which means they are going to last people a lot longer than pre-DLSS cards. The third most-used card is still a 1650. Think about that. A 4060Ti is 4th. How long is that card going to last people with DLSS 3.5 support, when so many people are still somehow gaming on a 1650?

There's no new meaningful console generation coming out soon. The PS5 Pro launched 2 months ago; we're not going to see a PS6 for another 4 or 5 years. Switch 2 games aren't getting ported to PC, and it's still going to be way slower than PC hardware. Early rumors are that Xbox may see a new console towards the end of 2026, but no one really cares. Xbox is typically outsold 2:1 or more vs PlayStation.
 

The whole point is exactly that - it's no longer a novelty and every half-decent game that is at least somewhat GPU demanding is now not only coming with DLSS or alternatives but also relying on it.

In other words, just about every decent title going forward will have AI-based rendering techniques baked in, and what's more, it's much easier for devs to go from DLSS 2 to DLSS 3/4 than from nothing to DLSS.

Going forward, AI processing capabilities for GPUs will matter a lot more, and so will frame generation, at least in some capacity, especially for budget GPUs.

IMO, the most impressive tech demo we did not see would be maxed-out CP77 running at 4K 100+ FPS on something like a 5060. And in 2-3 years from now we will get to the point where game devs start to assume you have at least a 40-series card of some sort, to enable the visuals they want to provide that would be impossible with pure base rendering for mainstream gamers.
 
A ton of good opinions here. Made my morning reading worthwhile with some good coffee.

The future of gaming is changing, and has been for the last few years. Accept it and be part of the growth, or deny it and play at 1080p 60Hz, just because... reasons.

I see a lot of good points and agree with parts of everyone's arguments. But we are hitting a ceiling with current hardware, and gamers are wanting... no, demanding more. Nvidia just happens to be the best at pulling off this "trickery".

It is what it is, all we can do is wait for all the review goodness coming.
 
The whole point is exactly that - it's no longer a novelty and every half-decent game that is at least somewhat GPU demanding is now not only coming with DLSS or alternatives but also relying on it.
How many games without raytracing currently require DLSS to run medium settings at 1080p on a 3060 level card?

If you include raytracing, 9 out of the top 10 cards aren't going to perform well if they perform at all. Game devs know this.
 
It was a simple question: what is a mainstream gamer today? As you referenced, Steam shows two-thirds of gamers game at 1080p or lower and 70% currently have a GPU with 8GB or less VRAM. 1080p, probably 60Hz, maybe creeping up to 100Hz. That's a mainstream gamer.

Yep. I've read that before. It's kinda shocking actually, because I'm one that embraces technology. 1080p debuted in like 2007 or something. I remember getting my first Blu-ray player and saying goodbye to DVD.

My first 4K display for my PC was somewhere around 2015... and every TV in my house as well as my PC display is a 4K OLED.

I'd be willing to bet those same people from the Steam survey are rocking a 4K panel in their living room... so why not their PC? They all love their fps, obviously. 🤣
 
Yep. I've read that before. It's kinda shocking actually... I'd be willing to bet those same people from the Steam survey are rocking a 4K panel in their living room.
Really, you're shocked that mainstream typical gamers aren't using high-end components?

Or you think that, given the choice between two monitors at the same kind of price, 4K but low-spec (in colour reproduction, response, contrast etc.) or 1080p but high-spec, it's better to buy the 4K?

Typical people spend a lot more time watching TV than playing games, plus younger gamers often don't have the budget (either themselves or through their parents) to justify spending high on a gaming monitor.

You are not typical, and have lost sight of who a typical gamer is.
 
Really, you're shocked that mainstream typical gamers aren't using high-end components?
justify spending high on a gaming monitor.

You are not typical, and have lost sight of who a typical gamer is.

High end? I'm looking at 32" 4K 165hz panels right now for $400. I see 77" 4K panels for less than $1000. The only panels that are even remotely "expensive" are OLEDs.

My entire point was 1080p is approaching 20 years old... and 4K panels are as cheap as they ever have been... so yeah... it's shocking that more haven't made the jump to 4K resolutions. As I said... it has to be the fps because it's obviously not about the eye candy that comes with higher resolutions.

I haven't lost sight of jack... but I did just get a text from Mr. 2007... he told me to let you know that he wants his 1080p resolution back.
 
My entire point was 1080p is approaching 20 years old... and 4K panels are as cheap as they ever have been... so yeah... it's shocking that more haven't made the jump to 4K resolutions. As I said... it has to be the fps because it's obviously not about the eye candy that comes with higher resolutions.
The cost of entry into 4K may have gone down, but the cost to stay there compared to remaining at lower resolutions may be keeping folks from switching over.

I've gotten the impression that folks who play 4K are less interested in turning down settings when age starts to show, and more likely to go and buy the next big thing(s), so they can keep playing at those high settings... and those playing at lower resolutions are more willing to make compromises with lower settings before spending the money on hardware.

So, one who entered the 4K realm back yonder with a 2080Ti is more likely to keep spending on the higher tiers of CPU and GPU up 'til now and later on, vs another who stayed at 1080p with something like a 2060 and 10600K, turning down settings and/or skipping a couple of generations before their next upgrade.

TL;DR: Could be a money issue.
 

Looking at their signature says it all. Within a few months it'll list a 5090.

The thing with 2160p is that, unlike previous resolution jumps, this one was ridiculous. 1080p to 2160p is 4x the number of pixels that need to be rasterized, and while the required compute doesn't scale quite linearly, it still goes up pretty significantly with resolution. This is the reason upscalers and fake-FPS generators are all the talk now; they allow underpowered cards to kinda sorta do "4K gaming".

1080p is the norm and has been for a while. 1440p is starting to catch on more and more; it's only about 77% more pixels and a much easier target to hit.
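For anyone who wants to sanity-check those numbers, the raw pixel counts are simple arithmetic (nothing vendor-specific, just the standard 16:9 resolutions):

```python
# Pixel counts for the standard 16:9 resolutions discussed above.
res = {"1080p": (1920, 1080), "1440p": (2560, 1440), "2160p": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in res.items()}

print(pixels["2160p"] / pixels["1080p"])  # 4.0   -> 2160p is 4x the pixels of 1080p
print(pixels["1440p"] / pixels["1080p"])  # ~1.78 -> roughly 77-78% more pixels than 1080p
```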
 
High end? I'm looking at 32" 4K 165hz panels right now for $400.
Must be a crap panel, or not available here, as the only 32" 4K screen a store here has is 240Hz and $1500, but it is an OLED... even then, the least expensive IPS 4K screen is a 32" 144Hz quantum dot @ $980.

Not to mention... you also kind of need a higher-end video card to be able to use that 4K screen at 4K... which most may not have... hence the 1080p res, or maybe 2K...
 
TL;DR: Could be a money issue.
Money is definitely one factor. I'd argue the bigger issue, pun intended, is that people don't want large monitors on their desk, and this will never change. It's not the living room, where there is more space to wall mount or have a TV stand up against a wall, and bigger is always better for a TV. 1440p shouldn't go below 27", and 4K shouldn't go below 32", if you want to use it without text scaling. The normal office desk people have in their home won't fit a 32", and if it does, they don't want such a large screen monopolizing so much space on their desk. I've said it for years that for this reason 4K will never be mainstream for desktop computer users.
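For what it's worth, those size cutoffs map to pixel density, which is easy to work out (plain geometry; how much density you can live with before needing scaling is of course subjective):

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch along the diagonal of a flat panel."""
    return hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 24)))  # ~92  PPI - 24" 1080p
print(round(ppi(2560, 1440, 27)))  # ~109 PPI - 27" 1440p
print(round(ppi(3840, 2160, 32)))  # ~138 PPI - 32" 4K
print(round(ppi(3840, 2160, 27)))  # ~163 PPI - 27" 4K, much denser, hence the scaling complaint
```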
 
I've gotten the impression that folks who play 4K are less interested in turning down settings when age starts to show, and more likely to go and buy the next big thing(s), so they can keep playing at those high settings... and those playing at lower resolutions are more willing to make compromises with lower settings before spending the money on hardware.

TL;DR: Could be a money issue.
I mean, many people here are the enthusiasts or gamers that can't fathom the idea of anything less than no compromise visuals. We do not represent the mainstream here with our thousands of posts in this forum.

For example, I have a 3080Ti and a 1440p OLED ultrawide, and for a hot second or two I considered a 5080 or even a 5090; the only things stopping me are the 16GB of VRAM on the 5080 and the $2k+ MSRP on the 5090. That is despite the 3080Ti still being really all you need to run 99% of existing titles at Ultra on a 1440p ultrawide.

If they'd pop out a 5080Ti - somewhere midway between the 5090 and 5080 - I'd bite in a split second and then giggle happily at that remaining 1% of games where it would actually be useful to me over the 3080Ti.

And I suspect they eventually will pull a 5080Ti out, whether it comes from enough faulty big Blackwell dies or as a 5080 Super with 24GB of VRAM thanks to the availability of 3GB modules.
 
High end? I'm looking at 32" 4K 165hz panels right now for $400. I see 77" 4K panels for less than $1000. The only panels that are even remotely "expensive" are OLEDs.
Feel free to link to those panels. I'll bet they're mediocre, and/or something substantially better can be found at 1080p or 1440p at the same price.

Quick example without trying too hard: a Samsung Odyssey G7 28-inch 144Hz 1ms IPS 4K gaming monitor is what I found here for £399. Review. £399 because it's three years old now. The review generally praises it, but then there are little things like "I think it's disingenuous to call it a 1ms monitor when the real average performance is more like 5ms." and "Samsung advertises the Odyssey G7 S28 as an HDR monitor...a terrible HDR experience with massive blooming and an insufficient contrast ratio...The HDR experience is bad and miles off a true HDR panel".

Starting at $400 for a monitor...there are people in the forums here trying to build or upgrade their whole PC with that as their budget. But the fact that you don't even consider up to $1000 expensive for a computer monitor pretty much says everything about your perspective in these things.

The large majority of people on Steam aren't obsessive rich gamers: some will be kids with little income and parents who have other things they need to spend their money on. Plenty will be ordinary people who play a game for a few hours now and then, don't post in the communities or forums, don't know or care when the next Nvidia card is coming out... they play a game on Steam for a bit and then go back to doing other stuff. And as others have said above, typical people don't stick massive monitors on their desks to sit three feet from, don't have any interest in forking out for the hardware required to run a 4K resolution, and aren't willing to pay the same money for a poorer experience just to say "4K". Latest Steam survey results: 56% 1080p; 20% 1440p. A miserable 4% running 4K. People aren't interested.
 
TL;DR: Could be a money issue.

With some salt mixed in...

Money is definitely one factor. I'd argue the bigger issue, pun intended, is that people don't want large monitors on their desk, and this will never change.
I've said it for years that for this reason 4k will never be mainstream for desktop computer users.

Well you know what they say.... don't knock it till you try it. I ran a 27" for years... and in 2017 went to a 48" 4K OLED on the desk.

I will never go back. The real estate alone is worth it... and let's be real... it's not like I'm sitting 3 ft from a 65 or 80 inch screen. 48" is amazing... and 42" might be as well... it wasn't available in OLED back in 2017.

Doesn't matter what I'm doing... Word, CAD, gaming... the additional real estate paired with the eye candy is something I will never give up.

Anyway... it will be interesting to see if 1080p is still mainstream in another 20 years. 🤣

Starting at $400 for a monitor...there are people in the forums here trying to build or upgrade their whole PC with that as their budget.
Latest Steam survey results: 56% 1080p; 20% 1440p. A miserable 4% running 4K. People aren't interested.

I like how you claim that Steam is indicative of the gaming world as a whole. Not all of us live at home working at McD's... and what kind of PC can you build with $400?

Nothing else to see in this thread. Haters gonna hate. I'm out. 😍
 
High end? I'm looking at 32" 4K 165hz panels right now for $400. I see 77" 4K panels for less than $1000. The only panels that are even remotely "expensive" are OLEDs.

My entire point was 1080p is approaching 20 years old... and 4K panels are as cheap as they ever have been... so yeah... it's shocking that more haven't made the jump to 4K resolutions. As I said... it has to be the fps because it's obviously not about the eye candy that comes with higher resolutions.

I haven't lost sight of jack... but I did just get a text from Mr. 2007... he told me to let you know that he wants his 1080p resolution back.
And you can find excellent 1080p panels for under $200. If you are on a budget, you aren't looking at the best stuff out there; you settle for less. A monitor resolution that can be driven by the same mid-level GPU for years is far more appealing to the vast majority of people than some high-resolution monitor you have to buy a new card for every generation. When was the last time you left your ivory tower? Life isn't getting cheaper; most people's money is better spent elsewhere, not on a stupid gaming rig that really only has to be so powerful because of the gaming and might not see much use otherwise. And most people, believe it or not, do not need or want, and certainly can't afford, a system that costs as much as several months of groceries.

I helped build a PC for someone who dearly needed it the other day; they were still gaming on an old Xeon and a GTX 980. The whole system cost maybe $1100-1200. Please get me a system at that price that can drive a 4K monitor without having to drop graphics to low. My nephew also built a new system recently for around that money. Guess what his monitor resolution is...
 
That's Founders. Aftermarkets will be the same or higher, depending on product tier. Expect scalped ones to be even higher.

And who knows, maybe Nvidia moved the goalposts again, and it's actually a 5070. Those model names don't mean jack anymore.



But they didn't..?
$1,999. Come on, there's likely 99 cents behind that.
The AIO and custom block models would come close to that $2,500 though.
I don't waste time with water coolers.
 
Think I might wait for better VRAM on a 5080...

A 5090 would be a better buy, be it crazy more expensive; if it's going to destroy the 4090 it's a card I could sit on for a long time, but it's just TOO expensive!!

AIB cards in AUS will be around the 5k mark, and for a GPU that's stupid pricing!!

I'll look at what AMD is offering, but I think a solid 5080 with better VRAM will be a replacement for my 7900 XTX at some point!!

If AMD isn't going to offer a true (at the very least) 5080 competitor at $1000 USD, then sorry AMD, you've lost my business.
 
Generally speaking, the most vocal whining about Nvidia tends to be from the Nvidia fans, not the neutral fans. Whining on Nvidia's behalf, not against them. I don't expect this to change, and thank you for the validation. 😀
whining: "making or characterized by a long, high-pitched cry or sound."
"a whining voice"

Nowhere in that definition does it say showing support for a product that you use and appreciate.