Nvidia GeForce RTX 5080 Founders Edition review: Incremental gains over the previous generation

Careful with straw men: "humans can't fly, but if we grow wings and lower our bone density we can". Yes, to my eyes there are some things which, in the current state, can't be fixed unless the approach changes so that the engine takes on more of the "interpolation" than nVidia's or AMD's drivers do for it. Much like upscaling: graphics engines had been doing it for years before nVidia, AMD and Intel came up with DLSS/FSR/XeSS. I just don't have enough in-depth knowledge of how the interpolation (FrameGen) works at the driver level and why it isn't being included in engines the way upscalers are. Point is: upscalers can run before the UI is composited, but so far FrameGen can't; it only works after the UI data is already in the scene. That needs to change, and it means moving the "interpolation" technique closer to the game engines themselves.
Whatever needs to be done will be done, it's a simple matter of where the hardware is going.

If every GPU starting this year is capable of FG/MFG, and every console starting next gen is as well, you can bet the game engines will catch up to that.

It would be one thing if only Nvidia were pushing it, but from what we know about what Team Red is cooking for next-gen consoles/GPUs, they are in on the party too. Their FSR4 is the first step of them catching up to the reality of diminishing returns in raster, as opposed to mostly unexplored ML/AI-based techniques.

As for the game engines - in my opinion the other shoe will drop with next-gen consoles, which will all but certainly use both upscaling and frame generation at least to some extent. The game engines will fall in line or be left behind, because they just won't keep up with the visual fidelity achieved through those techniques.
 
In the meantime, it looks like all the MSRP 5090s at Newegg are already out of stock.

Yeah I haven't seen anything in stock anywhere except for eBay scammer listings selling a photo of a 5090 for $2300.

The ones that have legit (maybe?) 5090s in hand have bids in the $5k-$6k range. Whatever...

Good news is I don't have to take Newegg's $1350 trade offer... I got a buyer here at work who wants my 4090 for $1500. Maybe I could get more but I don't care. It's cash in hand.


It opens a lot more doors... now I can buy the first 5090 I see from any vendor. 🤣

I could absolutely tell the difference between 60, 120, and 240 in the demos. They could switch it and let you try. 240 and 360, and later 480? No, I didn't usually notice much gain there. But I could still spot the differences given time. The other hardware tech people I was with could also tell the difference.

Anyone who can't tell the difference between 60 and 120+ isn't really a PC gamer is my take

Really? No offense taken but I'll tell you right now I am 100% a PC gamer. I tested the 60/120 fps thing recently on my 120hz OLED running 4K resolution.

I saw 0 difference between 60hz and 120hz. Is it my eyes? I am 50 years old. So for that reason I run at 60hz for all my PC gaming which is mainly RPG titles like Diablo 4 and BG3 as well as MS Flight Simulator.

My resume consists of an Atari 2600 for Christmas in 1979... Colecovision gaming around 1981... NES in 1985... first computer gaming was Amiga 500 in 1987... and first PC was a Pentium 75mhz in 1997. Ultima Online! I played that religiously until WoW launched in 2004.

I'm definitely a PC gamer. 🤣
 
1440p 240hz is where I will stick for the next several years. Can't get most of my games up to 240 FPS, but that doesn't matter so much with G-Sync/FreeSync. OLED though for the extra fast response times. But really I like darker screens, so a low brightness mode OLED is right up my alley.

Also bought 1440p 165hz when that was brand new, lasted me 8 years.

Back in the day I actually lamented having to go from my 85hz CRT to a 60hz LCD, but the screen was quite a bit bigger. (17" CRT was a lot smaller than a 17" LCD at the time) Was a noticeable drop in responsiveness, but I got used to it.

That first 144hz panel was quite the change and took some getting used to. The move to 240hz hasn't really impacted me too much, but it is definitely there when I play the likes of Borderlands. Haven't really picked up too many FPS titles the last few years.
 
Really? No offense taken but I'll tell you right now I am 100% a PC gamer. I tested the 60/120 fps thing recently on my 120hz OLED running 4K resolution.

I saw 0 difference between 60hz and 120hz. Is it my eyes? I am 50 years old. So for that reason I run at 60hz for all my PC gaming which is mainly RPG titles like Diablo 4 and BG3 as well as MS Flight Simulator.
Okay, go to https://www.testufo.com/ and tell me you see 0 difference there.

The object is moving at the same velocity in all the rows; what differs is the FPS.
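The rows look different even at identical velocity because of the per-frame step size. A quick back-of-the-envelope calculation (the speed here is an assumed example, not TestUFO's exact setting):

```python
# Pixel step per frame for an object moving at a fixed on-screen speed.
# The 960 px/s figure is an assumed example, not TestUFO's actual setting.
velocity_px_per_s = 960

for fps in (30, 60, 120, 240):
    step = velocity_px_per_s / fps  # how far the object jumps between refreshes
    print(f"{fps:>3} FPS -> {step:5.1f} px jump per frame")

# At 60 FPS the object jumps 16 px every frame; at 120 FPS only 8 px,
# so the motion your eye tracks looks visibly smoother.
```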
 
Okay, go to https://www.testufo.com/ and tell me you see 0 difference there.

The object is moving at the same velocity in all the rows; what differs is the FPS.

Difference in 15-30-60? Yeah, it's obvious. I'm on a work PC with a 24" screen, so I'll have to take a look when I get home on the 48" OLED I use as a PC display, because I was talking about 60-120.

I notice the difference between 60Hz and 120Hz at the desktop, just moving the mouse around. And I'm 51. 🤷‍♂️

I didn't check the mouse movement... I was just looking at the game I was playing at the time which was D4. At any rate... maybe it is my eyes. My up close vision has tanked in the last 2 years and I need bifocals/readers to see anything clearly up close.

FPS has never been a huge concern of mine regardless. I've always been into the high-resolution eye candy. Outside of Wolfenstein: Enemy Territory and Medal of Honor back around 2001, I've never cared for the competitive shooters which "require" high frame rates. The RPG/MMO/flight-sim gaming I do doesn't require it. MSFS 2024 is flawless at 4K 60Hz... same for D4 and BG3.

Anyone that claims you need 240 fps for flight simulator is delusional.
 
Funny thing is that on a 240 Hz display, if I take a photo of the screen, it actually looks opposite of what my eyes see due to the speed of the camera and pixel persistence.

Yeah that's another thing... looking at a game at 60hz and then refreshing the display at 120hz and looking for differences isn't quite the same as what is shown on that link... IMO.

At any rate... as said in the previous comment I'm not a fps junkie and am definitely not trying to debate the experts. 🤣

The 5090 upgrade is a no brainer when I've got a 4090 buyer for $1500... just need to find one that is at MSRP.
 
Difference in 15-30-60? Yeah, it's obvious. I'm on a work PC with a 24" screen, so I'll have to take a look when I get home on the 48" OLED I use as a PC display, because I was talking about 60-120.
Once you're at home with the 120 Hz screen you will see. It will give you your max-refresh FPS and a step below it.

It is probably the most obvious representation of how FPS matters for motion on screen beyond 60FPS. It will be obvious.
 
Once you're at home with the 120 Hz screen you will see. It will give you your max-refresh FPS and a step below it.

It is probably the most obvious representation of how FPS matters, beyond 60FPS. It will be obvious.

Yeah I hear all the time how it matters in competitive gaming shooters... which I don't do outside of COD zombie solo mode. Don't have the time or the desire to try to compete with the 12 year olds who can snipe me from 500 yards out the moment I spawn. Been there done that years ago in W:ET and MoH. :cheese:

Anyway... wasn't trying to start a fps debate. I'm getting a 5090 but it's not because I need more fps. 🤣
 
Really? No offense taken but I'll tell you right now I am 100% a PC gamer. I tested the 60/120 fps thing recently on my 120hz OLED running 4K resolution.

In virtually all situations you won't, because your visual cortex is disposing of the additional information. I did a whole breakdown a while ago, but basically the visual-processing part of your brain is only really looking for contrast changes. Everything else gets smeared and filled in with what you previously saw. It's how optical illusions work.

Now if there is a sudden large change in contrast, your visual cortex will absolutely notice that and prioritize the details around that change. I've had people absolutely swear they could "tell the difference"; I had them look at two monitors and they "identified the 120 one", only for me to show they were both locked at 60. The trick is to use backgrounds that don't contrast heavily with the mouse cursor, since what they are really looking for is duration of contrast.

Fake frames are fake frames; it's interpolation, plain and simple. If tomorrow AMD released a driver update that rendered a frame, changed one pixel slightly, and resent the frame to the output, effectively doubling the measured "FPS", would people claim how amazing the doubled performance was? Or would they say it was BS? Yeah, thought so.

But remember guys, the company with over 80% gaming GPU market share has "advised" people to benchmark their cards "differently" this time around.

https://www.tomshardware.com/pc-com...will-require-some-changes-according-to-nvidia

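For what it's worth, the "interpolation" being argued about is conceptually a blend between two already-rendered frames. The sketch below is a deliberately naive version (real DLSS/FSR frame generation uses motion vectors and optical flow, not a plain average); it just shows why a generated frame contains no new input or game-state information:

```python
import numpy as np

def naive_interpolated_frame(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    """Blend two rendered frames 50/50. A crude stand-in for frame generation:
    the result is derived entirely from frames that already exist."""
    blended = (prev_frame.astype(np.float32) + next_frame.astype(np.float32)) / 2
    return blended.astype(np.uint8)

# Toy example with two tiny 2x2 RGB "frames".
prev_frame = np.zeros((2, 2, 3), dtype=np.uint8)            # black frame
next_frame = np.full((2, 2, 3), 255, dtype=np.uint8)        # white frame
middle = naive_interpolated_frame(prev_frame, next_frame)   # mid-gray frame
print(middle[0, 0])  # [127 127 127]
```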
 
In virtually all situations you won't, because your visual cortex is disposing of the additional information. I did a whole breakdown a while ago, but basically the visual-processing part of your brain is only really looking for contrast changes. Everything else gets smeared and filled in with what you previously saw. It's how optical illusions work.

Very interesting. I swear I didn't see a difference when I looked at D4 in 60 and 120 fps... and this conversation has me wanting to check it again when I get home.

The fact you said people have failed the dual 60 fps monitor test says a lot too. I swear I saw a meme recently about how some guy back in the day brought up "you need high fps" and PC gamers bought into it and have never looked back. Maybe it was Nvidia... 🤣
 
In virtually all situations you won't, because your visual cortex is disposing of the additional information. I did a whole breakdown a while ago, but basically the visual-processing part of your brain is only really looking for contrast changes. Everything else gets smeared and filled in with what you previously saw. It's how optical illusions work.

Now if there is a sudden large change in contrast, your visual cortex will absolutely notice that and prioritize the details around that change. I've had people absolutely swear they could "tell the difference"; I had them look at two monitors and they "identified the 120 one", only for me to show they were both locked at 60. The trick is to use backgrounds that don't contrast heavily with the mouse cursor, since what they are really looking for is duration of contrast.

Fake frames are fake frames; it's interpolation, plain and simple. If tomorrow AMD released a driver update that rendered a frame, changed one pixel slightly, and resent the frame to the output, effectively doubling the measured "FPS", would people claim how amazing the doubled performance was? Or would they say it was BS? Yeah, thought so.

But remember guys, the company with over 80% gaming GPU market share has "advised" people to benchmark their cards "differently" this time around.

https://www.tomshardware.com/pc-com...will-require-some-changes-according-to-nvidia

Hm... Not quite as I understand how it works, but fair.

Humans are incredibly fast at recognizing motion changes, but bad at "understanding" what they're looking at. Peripheral vision in humans is top notch, as we evolved as pack hunters. We can detect very fine-grained motion outside of our focused view very easily, even if we can't make it out immediately until we focus our vision on it. This includes focal-point latency and such. Ironically enough, humans' inherent "latency" in their peripheral view is way lower than in their focused view. If you "diffuse" your view (don't focus on any particular element in front of you), your reaction times will improve a lot based on image information.

Or at least, that's how I've always operated and acted. Seems to be true for myself and I'm totally average.

Regards.
 
Whatever needs to be done will be done, it's a simple matter of where the hardware is going.

If every GPU starting this year is capable of FG/MFG, and every console starting next gen is as well, you can bet the game engines will catch up to that.

It would be one thing if only Nvidia were pushing it, but from what we know about what Team Red is cooking for next-gen consoles/GPUs, they are in on the party too. Their FSR4 is the first step of them catching up to the reality of diminishing returns in raster, as opposed to mostly unexplored ML/AI-based techniques.

As for the game engines - in my opinion the other shoe will drop with next-gen consoles, which will all but certainly use both upscaling and frame generation at least to some extent. The game engines will fall in line or be left behind, because they just won't keep up with the visual fidelity achieved through those techniques.
Just be careful with the two intertwined topics (thanks, nVidia!) when talking FSR or DLSS. Upscaling with "AI help" is already a solved and understood problem. As I said, game engines have had upscaling in one form or another for decades; supersampling is just the reverse, and it's been around for 30+ years. Point is: I agree from a higher-level perspective. Game engines need to adapt, and that is nothing new. My argument stems from a very simple observation: which engines are doing "native" RT? It's been how many years? Not even what nVidia calls ray tracing is "proper" ray tracing, and even with that limitation, there are no engines out there which have implemented RT in a native way. I remember John Carmack talking about ray tracing with id Tech 4 (DOOM 3) and how it could actually do native RT if implemented, and future hardware could accelerate it accordingly. Without giving my memory too much credit, the gist was: "we're years and years away from being able to do so, but it's there". I think using Quake 2 as the first RT showcase was no coincidence, but nVidia won't help Epic or any other game-engine developer (Unity, Source, Bethesda on id Tech, etc.) rewrite theirs to use RT over traditional raster any time soon.

I guess the main point is: I don't recall any engine that can do accelerated ray tracing natively, but I'm happy to be corrected. What we have now is a lame acceleration path via libraries and the DX API. I'd say it needs an overhaul.

And this brings me back to the FrameGen part: for game engines to adopt it, nVidia would first need to drive a "standard" other vendors could use, but we know that is NOT going to happen. With no standard, no game dev will rewrite their engine to accommodate it, unless nVidia showers them with money. Which they have, to be fair. HairWorks, anyone?

Time will tell, but I won't hold my breath to see FrameGen implemented in such a way that will remove the drawbacks in a way I'm personally comfortable with.

Regards.
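Picking up the earlier point that upscalers can run before the UI while driver-level FrameGen only sees the finished frame, here is a minimal sketch of the two pipeline orderings (my own illustration; none of these function names correspond to a real engine or driver API):

```python
# Hypothetical, simplified frame pipelines for illustration only.
# None of these function names correspond to a real engine or driver API.

def engine_frame_with_upscaling(render_scene, upscale, draw_ui, present):
    scene = render_scene(internal_res=(1707, 960))    # low-res 3D pass
    scene = upscale(scene, target_res=(2560, 1440))   # DLSS/FSR/XeSS-style step; UI not drawn yet
    frame = draw_ui(scene)                            # HUD composited at native res, stays crisp
    present(frame)

def driver_frame_generation(prev_frame, curr_frame, interpolate, present):
    # The driver only receives fully composited frames, so the HUD is
    # interpolated along with the scene and can smear or ghost.
    present(interpolate(prev_frame, curr_frame))
    present(curr_frame)
```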
 
But remember guys, the company with over 80% gaming GPU market share has "advised" people to benchmark their cards "differently" this time around.

Please stick to the facts.

What Nvidia advised is to use a different metric that actually works with MFG and also provides a better representation of the user experience. And of course it wants reviewers to evaluate MFG, but the difference between MsBetweenPresents and MsBetweenDisplayChange (as I point out in that very article) is less about making Nvidia look better and more about making the measurements work at all. Intel has said the same thing, and I wouldn't be surprised if AMD agrees as well. If you want a proper view into when framegen is sending the actual frames to the display, you need to measure at the right point in time: when the frame gets sent to the display.

The "fake frame" memes aren't important because we can actually test the hardware and experience it. I have done so in the past and am working to do so with MFG. And for all the "knowledgeable" people that claim people can't see or do certain things because that's not the way our eyes or brains work, it's a whole lot of FUD. Like the university professor that insisted humans can't see more than 20 Hz. See it in discrete chunks? No. But see it in analog accumulation of data? Absolutely. As pointed out above, TestUFO clearly shows differences between 15/30/60/120/240. TLDR: Brains are complex and can be trained, along with sight.

I've repeatedly said that framegen providing a 50-70 percent boost to the frames-to-monitor rate is not the same as increasing the rendering rate by the same amount, but that — depending on the game — it can also absolutely look and feel better, even if the sampling rate drops a bit. MFG isn't really any different, in that a hypothetical doubling from framegen running at 80 to MFG4X running at 160 feels more like a ~20% improvement rather than 100%. How much better framegen or MFG feels is more subjective, but anyone who claims it's universally bad? They're not acting or reporting in good faith.
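To make the metric distinction concrete, here is a rough sketch of how the two measurements could be compared from a PresentMon-style capture. It assumes a CSV with MsBetweenPresents and MsBetweenDisplayChange columns (as in older PresentMon output) and a hypothetical file name, so treat it as an illustration rather than the exact review methodology:

```python
import csv
import statistics

def avg_fps(frame_intervals_ms):
    """Average FPS from a list of per-frame intervals in milliseconds."""
    return 1000.0 / statistics.mean(frame_intervals_ms)

# Hypothetical capture file; column names follow older PresentMon CSV output.
with open("presentmon_capture.csv", newline="") as f:
    rows = list(csv.DictReader(f))

presents = [float(r["MsBetweenPresents"]) for r in rows]
# Frames that were never shown report 0 for the display-change interval.
displayed = [float(r["MsBetweenDisplayChange"]) for r in rows
             if float(r["MsBetweenDisplayChange"]) > 0]

print(f"FPS from MsBetweenPresents:      {avg_fps(presents):.1f}")
print(f"FPS from MsBetweenDisplayChange: {avg_fps(displayed):.1f}")
# With frame generation, the present-based number can misrepresent what the
# monitor actually receives; the display-change interval is measured at the
# point where the frame is handed to the display.
```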
 
How much better framegen or MFG feels is more subjective, but anyone who claims it's universally bad? They're not acting or reporting in good faith.
I think this is the gist of it.

While I'm not thrilled to open this can of worms, I have a feeling a lot of it is yet another case of the partisan issues we've had in gaming graphics hardware for an eternity. All the way back to the "32-bit color vs. big FPS numbers" three-way Mexican standoff between ATI, Nvidia and 3dfx, just before GeForce was a thing.

That and yeah, AI bad of course too.
 
I think this is the gist of it.

While I'm not thrilled to open this can of worms, I have a feeling a lot of it is yet another case of the partisan issues we've had in gaming graphics hardware for an eternity. All the way back to the "32-bit color vs. big FPS numbers" three-way Mexican standoff between ATI, Nvidia and 3dfx, just before GeForce was a thing.

That and yeah, AI bad of course too.
You know what it really is? It's that completely bogus "RTX 5070 is as fast as an RTX 4090" marketing BS. Because Nvidia 100% knows that's not remotely true. It was one of the most annoying claims Nvidia made at CES for the 50-series launch.

Even if MFG4X could match the number of frames delivered to the monitor with a 4090, as stated it's not a fair comparison. MFG4X getting "160 FPS" would be better than framegen getting "80 FPS", but MFG4X getting 120 FPS would feel obviously worse than framegen getting 120 FPS.
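A quick bit of arithmetic shows why matching the displayed frame rate is not the same as matching performance (the figures are illustrative, not measured results):

```python
# Rendered (input-sampling) rate implied by a given displayed rate, per mode.
# The 120 FPS displayed figure is an illustrative example, not a benchmark.
modes = {
    "no framegen": 1,   # every displayed frame is a rendered frame
    "2x framegen": 2,   # one generated frame per rendered frame
    "MFG 4x": 4,        # three generated frames per rendered frame
}

displayed_fps = 120
for name, factor in modes.items():
    rendered_fps = displayed_fps / factor
    print(f"{name:>12}: {displayed_fps} FPS on screen, ~{rendered_fps:.0f} FPS actually rendered")

# At the same 120 FPS on the monitor, MFG 4x samples your input and the game
# state at only ~30 FPS, which is why it feels worse than 2x framegen at the
# same displayed rate.
```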
 
Pretty darn disappointing.
Sit back and let the impatient suckers get fleeced.
I'll happily wait for March and see if AMD can offer something worth buying at $500.
Ditto, I was thinking of switching from a Team Red 6900 XT to Team Green with the 5080, but this just seems underwhelming after the reviews. I'll probably go with a slight upgrade to a 9070 XT for the RT and better AI support for far less $$$, and wait for the next generation. I'm also not keen on the multi-PCB design and the issues it could potentially bring.
 
The fact you said people have failed the dual 60 fps monitor test says a lot too.
It's called the placebo effect and medical research has to account for it when they do human trials. It all comes down to human confirmation bias and our tendency to see what we want to see.

Knowing the biology helps understand what is going on with the whole "ultra refresh rate" crowd.
 
You know what it really is? It's that completely bogus "RTX 5070 is as fast as an RTX 4090" marketing BS. Because Nvidia 100% knows that's not remotely true. It was one of the most annoying claims Nvidia made at CES for the 50-series launch.
The issue is calling it performance in the first place. It's a quality of life feature for low FPS situations.


Things like DLSS upscaling and interpolation allow you to take a situation where you are at 30 FPS and make it "feel" smoother. Nvidia is asking you to focus on it because of the minuscule generational difference experienced otherwise. Nvidia knows this; it's why they level-set and baited the market with the monstrous datacenter-GPU-lite that is the 5090, which caused quite a lot of hype and buzz. If they had just released the 5080 (along with the other lower tiers) without focusing on fake frames, everyone would have rated it into the ground as more of a product refresh than a new generation.
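Rough numbers help here. The sketch below uses a simplified latency model (an assumption, not a measurement: frame generation holds back roughly one rendered frame before presenting, and input is still sampled once per rendered frame) to show why going from 30 rendered FPS to 60 displayed FPS smooths motion without making the game respond like native 60 FPS:

```python
# Simplified, assumed model: framegen buffers ~1 rendered frame, and input is
# still sampled once per rendered frame. Real pipelines add other overheads.
def frame_interval_ms(fps):
    return 1000.0 / fps

rendered_fps = 30
displayed_fps = 60  # 2x frame generation

native_step = frame_interval_ms(rendered_fps)       # 33.3 ms between new images without FG
framegen_step = frame_interval_ms(displayed_fps)    # 16.7 ms between displayed images with FG

input_sample_interval = frame_interval_ms(rendered_fps)  # still ~33.3 ms with framegen
extra_buffer_latency = frame_interval_ms(rendered_fps)   # ~1 held frame (assumption)

print(f"Displayed frame interval: {native_step:.1f} ms -> {framegen_step:.1f} ms")
print(f"Input still sampled every ~{input_sample_interval:.1f} ms, "
      f"plus ~{extra_buffer_latency:.1f} ms of added buffering")
```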
 
All this upscaling, FG, and MFG stuff done gave me the confusion status, with a side of a headache.
-FG and MFG add a little input latency, slightly negating some of the potential frame increase.
-If a game is already at a playable frame rate, FG and MFG don't add a whole lot more to the experience. If the game isn't... same deal. Or: when someone needs them, they're 'meh'; when they don't, still 'meh'. [HWUB video by Tim]
-The raster techniques... the ray/path tracing techniques... it's all fake anyway.
...

How about this? Until "Everything just works", I won't even bother with 'em.
 
All this upscaling, FG, and MFG stuff done gave me the confusion status, with a side of a headache.

Please, don't take this the wrong way, but... thank God!

For a second there, I thought I was the only one confused around here! 🤣
 
I could absolutely tell the difference between 60, 120, and 240 in the demos. They could switch it and let you try. 240 and 360, and later 480? No, I didn't usually notice much gain there. But I could still spot the differences given time. The other hardware tech people I was with could also tell the difference.

Anyone who can't tell the difference between 60 and 120+ isn't really a PC gamer is my take, and anyone that thinks 24 or 30 fps is smooth has never played games at 60 or 120+ FPS. 24 FPS is garbage, 30 FPS is barely playable. Even in Flight Simulator, 24 FPS just isn't a good experience. My brothers probably couldn't tell the difference between 60 and 120, because they're not gamers, at all. I meanwhile grumble when we play four player Mario Kart on the Wii (yeah, the original Wii) because it drops from 60 FPS down to 30 and feels crappy.

It frankly feels like you're making stuff up to argue the point now. Because on the one hand you're complaining about "fake frames" with framegen, and on the other you're spouting nonsense about people not being able to tell the difference between 60 and 120 FPS, while 24 or 30 FPS is "smooth enough."

Have you used framegen on an RTX 4080 or faster, or even on an RTX 4070 at reasonable settings (i.e. something like 1440p with upscaling where you go from 50 to 80 FPS)? Or are you just spouting off what others have said online? And if you've only used FSR3 framegen, that doesn't count. It is provably inferior, in most games. The image artifacting with FSR3 is so much worse than DLSS3 framegen that they're not even comparable other than in terms of how many FPS you might be able to push out.

So which is the correct point? Because if 24 to 30 FPS is "smooth enough" then MFG 4X pulling 150+ FPS will definitely be smooth enough and framegen is fine. But if it's "fake frames" then higher rendered FPS and input matter a lot and 60 FPS is the bare minimum we should aim for.
Read carefully… I said I tried limiting the fps to 60 on 144 Hz monitors. I tested it in CS with a bunch of friends, and none of them could reliably tell, run to run, when I had switched on the 60 fps limit. In the demos I personally think I could feel the difference, but I suspect that's confirmation bias. The issue with 60 fps is that when the average is around 60, the lows drop much lower, which is where the stutters kick in. Maybe you could try limiting a 240+ fps game to 60, maybe 70, and see whether you can feel it in a blind test where someone flips a coin to decide whether or not to apply the limit.
The "24-30 is smooth enough" point isn't for 90% of games; I said it applies to extreme cases like flight simming. Go look at MSFS 2024 addons: for the airliners, people are literally targeting 24-30 fps lows, because the cockpit view in a flight sim moves very slowly, essentially slow-motion video for the most part. You can see in the clip below that in the 30s you won't notice the lag.
View: https://youtube.com/shorts/6-6Br8xcrBc?si=6n-jltFWLkLa7yWf


Of course any game with action needs 60+, but that's good to go for a display, IME. I'm using a 3070 Ti and a 4070 with DLSS in games like Starfield and COD6, on an Asus 144 Hz TUF Gaming monitor. Clearing the campaign on the highest difficulty, I really can't tell any difference once the fps counter goes above 60, but sadly when FG is on I could see artifacts on the red-dot aiming sight, so I ended up turning on only DLSS upscaling and not FG.
 