Nvidia GeForce RTX 5080 Founders Edition review: Incremental gains over the previous generation

Read carefully… I said I tried limiting the fps to 60 on 144 Hz monitors. I tested this in CS with a bunch of friends, and none of them could reliably tell, run to run, when I had switched to the 60 fps limit. In the demos I personally think I could feel the difference, but I suspect that's down to confirmation bias.
Okay, try this: https://www.testufo.com/

Tell me it's "confirmation bias" there.
 
Okay, try this: https://www.testufo.com/

Tell me it's "confirmation bias" there.
This is exactly the type of test I'm referring to when I say confirmation bias comes into play: it labels which FPS is which and puts a really stuttery one at the bottom, so the brain literally fills in "fastest" -> "fast" -> "laggy". Try the trick of finding someone to randomly flip a 60 fps limiter on and off in the same game with the same settings, and try again; maybe then you can reliably tell when it is limited to 60 or running at 144.
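If anyone actually wants to run that blind test, here's a rough sketch of how the flipping and scoring could be scripted (the flow and trial count are just an illustration; the cap itself still has to be applied in whatever limiter you use, and only the operator should see this screen):

```python
import random

# Hidden schedule of fps caps for a blinded test. The operator applies each
# cap in the game's limiter without telling the tester, then records the guess.
TRIALS = 10
schedule = [random.choice([60, 144]) for _ in range(TRIALS)]

correct = 0
for trial, cap in enumerate(schedule, start=1):
    input(f"Trial {trial}: set the limiter to {cap} fps, then press Enter...")
    guess = int(input("Tester's guess (60 or 144): ").strip())
    correct += (guess == cap)

print(f"{correct}/{TRIALS} correct; pure guessing averages about {TRIALS // 2}.")
```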

Just fyi,
View: https://www.youtube.com/watch?v=zDaGKLDAn4E
Around the 6-minute mark: this is the example of what I mean about flight sims, where lower FPS is perfectly fine. Look how 4x frame generation looks smooth in the external view; all we target in simming is a minimum of 24-30, of course.
 
In virtually all situations you won't, because your visual cortex is discarding the additional information. I did a whole breakdown a while ago, but basically the visual processing part of your brain is only really looking for contrast changes. Everything else gets smeared and filled in with what you previously saw. It's how optical illusions work.

Now if there is a sudden large change in contrast, your visual cortex will absolutely notice that and prioritize the details around that change. I've had people absolutely swear they could "tell the difference"; I had them look at two monitors and they "identified the 120 Hz one", only for me to show them they were both locked at 60. The trick is to use backgrounds that don't contrast heavily with the mouse cursor, since what they are really looking for is duration of contrast.

Fake frames are fake frames; it's interpolation, plain and simple. If tomorrow AMD released a driver update that rendered a frame, then changed one pixel slightly and re-sent the frame to the output, effectively doubling the measured "FPS", would people rave about the amazing doubled performance? Or would people say it was BS? Yeah, thought so.
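To make that thought experiment concrete, here's a toy sketch (nothing to do with any real driver) of how a naive present counter would happily report double the "FPS":

```python
import copy

# Toy illustration: an FPS counter that only counts presents can't tell a
# rendered frame from a copy with one pixel nudged.
def render_frame(i):
    # stand-in for real rendering work: a tiny 4x4 "framebuffer"
    return [[(i % 256, 0, 0) for _ in range(4)] for _ in range(4)]

presented = 0
for i in range(60):               # pretend the GPU renders 60 real frames in one second
    frame = render_frame(i)
    presented += 1                # the real frame is presented

    dupe = copy.deepcopy(frame)   # hypothetical driver copies the frame...
    dupe[0][0] = (0, 0, 1)        # ...tweaks a single pixel...
    presented += 1                # ...and presents it again

print(f"Frames actually rendered: 60, frames the counter saw: {presented}")  # 120
```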

But remember guys, the company with over 80% gaming GPU market share has "advised" people to benchmark their cards "differently" this time around.

https://www.tomshardware.com/pc-com...will-require-some-changes-according-to-nvidia

This was the whole issue for me personally: it literally tells you to ignore the induced latency and call it a 300%+ victory... when smoothing out the occasional dip on top of an already high base FPS is what frame generation does best.
 
The issue is calling it performance in the first place. It's a quality of life feature for low FPS situations.


Things like DLSS upscaling and interpolation allow you to take a situation where you are at 30 FPS and make it "feel" smoother. NVidia is asking you to focus on it because of the minuscule generational difference experienced otherwise. NVidia knows this; it's why they level-set and baited the market with the monstrous datacenter-GPU-lite that is the 5090, which caused quite a lot of hype and buzz. If they had just released the 5080 (along with the other lower tiers) without focusing on fake frames, everyone would have rated it into the ground as more of a product refresh than a new generation.
Actually, even marketing it the way they did this time, the 5080 is still rated into the trash bin... and the 5090, in all seriousness, ignoring the scalpers, the unavailability, and the crazy cost, still failed to impress without the MFG-jacked-up numbers.

For the 30x0 and 40x0 generations, people literally looked at their last-gen top-of-the-line GPU and wanted the massive upgrade; now it's "meh, I'll be sticking with the 4090". The only impressive design this gen, IMO, is the Founders Edition 5090 cooler, which sadly didn't trickle down to the 5080. But while that cooler is impressive for its size and performance, it cooks your room and your other components.
 
Actually, even marketing it the way they did this time, the 5080 is still rated into the trash bin... and the 5090, in all seriousness, ignoring the scalpers, the unavailability, and the crazy cost, still failed to impress without the MFG-jacked-up numbers.

For the 30x0 and 40x0 generations, people literally looked at their last-gen top-of-the-line GPU and wanted the massive upgrade; now it's "meh, I'll be sticking with the 4090". The only impressive design this gen, IMO, is the Founders Edition 5090 cooler, which sadly didn't trickle down to the 5080. But while that cooler is impressive for its size and performance, it cooks your room and your other components.

Ya, prices are so far out of whack I'm out for the 5000 series lol.
 
And if you've only used FSR3 framegen, that doesn't count. It is provably inferior, in most games.
That is your subjective statement!
"Look, even their FG is worse" 😂
Personally, I have used both (4090 and 7900 XTX) and did not notice the difference.
And in general, what does comparing FG implementations have to do with anything, given that frame generation has very limited use and shouldn't be used in hardware comparisons, let alone in advertising?
 
Why bother getting a 5090? Just get the 6090 in a year or two; it's basically the iPhone at this point. The NVIDIA CEO is worth billions while you lot can't pay rent but somehow have enough dollars for a worthless RTX 9999 Ultra Max Pro.

Watch those cables melt, and the low-quality connectors with them.
 
Shifting the measurement of a fake frame, one not made via traditional calculations, onto how fast that fake frame is served to the display won't suddenly make it a real, calculated frame.

No matter how much Nvidia wants you to think otherwise. A fake frame being served to the display at a fast rate is still a fake frame. It was not drawn the old way, one frame at a time.

MsBetweenDisplayChange for a fake frame is still just a time measurement of that fake frame: how long or short it takes to reach the display.

Does MsBetweenDisplayChange care whether it is served a fake frame or a real one?

No. Its job is only to observe the time needed to serve the frame. It does not care whether it is tracking fake or real frames. To MsBetweenDisplayChange they are both the same.

And of course they want you to use that benchmarking method now. It validates their fake-frame FPS marketing: "We are reaching X frames in X ms", while in reality 2/3 of said frames were regurgitated, copied-and-renamed frames. Ctrl+C, Ctrl+V. Job done. $3,600 please.

If the shoe were on the other foot, however, they would insist on raw calculation being the method for drawing frames and not some marketing-spin fake-frame crap. And that is guaranteed.

A benchmark should measure the actual calculations needed to produce the frames, not count copied-and-renamed frames that took zero calculation to make.
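As a rough sketch of that distinction, assuming you had per-frame data with a flag marking generated frames (the flag and the sample numbers here are made up purely for illustration):

```python
# Each entry: (milliseconds between display changes, whether the frame was
# generated rather than rendered). Hypothetical data, 4x-MFG-like pattern.
frames = [
    (4.2, False), (4.1, True), (4.3, True), (4.2, True),   # one rendered + three generated
    (4.4, False), (4.2, True), (4.1, True), (4.3, True),
]

total_s = sum(ms for ms, _ in frames) / 1000.0
displayed_fps = len(frames) / total_s
rendered_fps = sum(1 for _, generated in frames if not generated) / total_s

print(f"Displayed FPS: {displayed_fps:.0f}, rendered-only FPS: {rendered_fps:.0f}")
```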

Nvidia stated that they have zero interest in a two-tier, forked GPU market: one for China with their D models and one for the rest of us "lucky" chaps.

So, if they can convince the rest of you to buy their D-model-strength GPUs at non-D-model prices, then they are laughing all the way to the bank. It seems to be working.

I am still on a 3060. I was planning on "upgrading" my GPU this year, but I am still on PCIe-limited AM4. I won't be getting a 5000-series card. Not because I don't like it.

If the marketing spin and the actual real-world results to our eyes really did make a 5070 = 4090, then I'd actually be OK with this.

But the reason I am going back to AMD is that AMD plays better with Linux. And since I am ditching Windows come October, I will need a card that can play ball in Linux.

Nvidia can't. Neither my Pascal 1060 nor my modern 3060 can. Their drivers on Linux are a royal PITA to deal with and nothing works for me. At least not with the hardware I chose.

I don't need bleeding-edge tech and eye-popping eye candy. I play a game from 1999 on a 3060 12GB. I could fit nearly 30 whole copies of the game in my GPU's VRAM alone. The game will let you multi-client copies of itself as long as you have the resources to do so. A well-known fella ran 40 or more copies of the game at once. All bots, mind you.

Asheron's Call was the game, in case you are interested. It has been almost totally recreated since the servers shut down. The players, some of whom are serious professional C++ devs in the real world, built an emulator from scratch starting 4 years ago, and it is now about 98% done. 18 years' worth of content was reproduced and packaged into an emulator you can run at home on your own machine, offline.

So eye candy is nice... but when you play a game from 1999, you don't really need a 3060.

Anyway. As much as I like the idea of the 5070 supposedly giving me 4090 performance, I am moving to Linux. So I'm holding out for a 9000 series.
 
So eye candy is nice... but when you play a game from 1999, you don't really need a 3060.

Anyway. As much as I like the idea of the 5070 supposedly giving me 4090 performance, I am moving to Linux. So I'm holding out for a 9000 series.

I get the reasons you're opting for AMD. What I don't get is why you would specifically pick a 9000-series GPU for a 1999 game.

Shouldn't a 7000- or 6000-series AMD GPU be more than enough?

Wouldn't it also be a lot cheaper?
 
5090 is on the way! I secured one via eBay for $6000!

(Actually I didn't... but I am laughing at the people bidding at those prices)

I do have a buyer here at work for my 4090... $1500 cash which means I don't have to worry about Newegg's $1350 trade offer. I can buy from any vendor but unfortunately the nearest Microcenter is 1000 miles away so I am at the mercy of online vendors.

On a side note... anyone else getting the "5090 in stock" advertisements in their social media feeds from Newegg and Nvidia? Funny because they were never in stock according to the comments.
 
5090 is on the way! I secured one via eBay for $6000!

(Actually I didn't... but I am laughing at the people bidding at those prices)

Look at the bright side: when you finally do purchase a 5090, and provided prices have stayed the same, you can always sell it for $6,000 on eBay and make some profit out of the damn thing! 🤣

On a side note... anyone else getting the "5090 in stock" advertisements in their social media feeds from Newegg and Nvidia? Funny because they were never in stock according to the comments.

Haven't seen one so far, but it wouldn't surprise me.

In Greece, we only have one retail store that claims to have 5090s in stock.

The prices are scalper-like, reaching as high as $3,500.

However, anyone who makes the mistake of ordering a 5090 finds himself waiting 'til mid-February, at best.
 
5090 is on the way! I secured one via eBay for $6000!

(Actually I didn't... but I am laughing at the people bidding at those prices)

I do have a buyer here at work for my 4090... $1500 cash which means I don't have to worry about Newegg's $1350 trade offer. I can buy from any vendor but unfortunately the nearest Microcenter is 1000 miles away so I am at the mercy of online vendors.

On a side note... anyone else getting the "5090 in stock" advertisements in their social media feeds from Newegg and Nvidia? Funny because they were never in stock according to the comments.

This is why I reserved my 7900 XTX from a nearby Microcenter the moment I saw the 5080 was mediocre. Picked it up on Friday afternoon; now I'm just waiting on my waterblocks and the new water distribution reservoir to arrive so I can rebuild my system. This should last me another three or more years, past this generation and maybe the next.
 
Well, this was very informative:
View: https://www.youtube.com/watch?v=Nh1FHR9fkJk


Looks like Nvidia is doing Nvidia things in order to measure performance in a way they won't allow anyone else to be measured against. Shocking, LOL. Well, the "shocking" was sarcasm, for any reader without their sarcasm detector on.

Regards.

Yeah, I pretty much figured this would happen. Now we know what the "special review guidance" from NVidia was: to use their own special benchmarking tool.
 
I get the reasons you're opting for AMD. What I don't get is why you would specifically pick a 9000-series GPU for a 1999 game.

Shouldn't a 7000- or 6000-series AMD GPU be more than enough?

Wouldn't it also be a lot cheaper?
Asheron's Call is not the only game I play. I also play modern games that require a halfway decent GPU. I say modern; Shadow of the Tomb Raider is now, what, eight years old? It still makes this 3060 work hard with all the eye candy on, though.

I have two systems side by side here next to my desk. While you can dual-boot Windows and Linux on the same drive, or even on separate drives in the same system, Windows will never play nice with Linux on the same machine. Windows thinks it owns the system. And GRUB tends to use the Windows disk for the EFI info unless told otherwise. PITA.

So a Windows system and a separate Linux system it is. Neither can screw the other up. Neither OS likes to play with the other, to be honest. A true dual-boot environment seems to be a pipe dream. It does work, but in my experience something always goes wrong.

Whether it is Windows messing with Linux or Linux messing with the Windows drive, it's always one or the other. So two systems it is.
 
Yeah, I pretty much figured this would happen. Now we know what the "special review guidance" from NVidia was: to use their own special benchmarking tool.
It gets more disgusting when they do this to try to hide the lack of progression this gen while still trying to push their profit margins ever higher. Textbook case of the harm of a monopoly.
 
Yeah, I pretty much figured this would happen. Now we know what the "special review guidance" from NVidia was: to use their own special benchmarking tool.
Seriously, just stop. The guidance was effectively this: MsBetweenPresents will not yield useful data with MFG on the 50-series cards and you need to use MsBetweenDisplayChange to get anything useful.

That does not mean MFG is the same as rendered frames, of course, but if anyone is going to try to analyze the MFG frametimes and pacing then they need to use the new metric. Otherwise it’s like trying to talk about a car’s gearing and transmission by letting it idle in a parking lot.
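For anyone poking at capture logs themselves, here's a minimal sketch of what that means in practice, assuming a PresentMon-style CSV with those column names (adjust the column to whatever your capture tool actually writes out; the file name is hypothetical):

```python
import csv
import statistics

# With frame generation active, compute frametime stats from
# MsBetweenDisplayChange rather than MsBetweenPresents.
def frametime_stats(path, column="MsBetweenDisplayChange"):
    with open(path, newline="") as f:
        times = [float(row[column]) for row in csv.DictReader(f) if row.get(column)]
    avg_fps = 1000.0 / statistics.mean(times)
    worst = sorted(times)[max(0, int(0.99 * len(times)) - 1)]  # rough 99th-percentile frametime
    return avg_fps, worst

# Hypothetical usage:
# fps, p99 = frametime_stats("capture.csv")
# print(f"avg {fps:.1f} fps, 99th-percentile frametime {p99:.2f} ms")
```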
 
Seriously, just stop. The guidance was effectively this: MsBetweenPresents will not yield useful data with MFG on the 50-series cards and you need to use MsBetweenDisplayChange to get anything useful.
To be fair, that video unpacks what Nvidia has done, which certainly seems shady. This is not to say FrameView is wrong, bad, or providing false information, though. For MFG, Nvidia claims to have made some optimizations to/for MsBetweenDisplayChange; however, they've not contributed this to PresentMon. That means there's no way to independently verify results from FrameView.

PresentMon had already moved past using MsBetweenPresents and MsBetweenDisplayChange, which is what allowed GN to figure out roughly which iteration FrameView was likely based on. This supposedly isn't sufficient for MFG results, according to Nvidia. Meanwhile, PresentMon appears to properly support AMD/Intel FG and has added new features for identifying generated frames.

Again, I'm not saying there's something wrong with FrameView, but this is bad behavior on the part of Nvidia. Of course, this also seems to be par for the course when it comes to Nvidia and the open-source community.
 
In other news... just saw the highest price yet online for a 5000 series GPU.

Dude got his 5090 from Microcenter apparently... and it's on eBay for the low price of...

$13,999.

Totally worth it for a 27% uplift from the 4090. 🤣
 
In other news... just saw the highest price yet online for a 5000 series GPU.

Dude got his 5090 from Microcenter apparently... and it's on eBay for the low price of...

$13,999.

Totally worth it for a 27% uplift from the 4090. 🤣
What? Did I mistake that for HKD or RMB? What is he smoking?
 
To be fair, that video unpacks what Nvidia has done, which certainly seems shady. This is not to say FrameView is wrong, bad, or providing false information, though. For MFG, Nvidia claims to have made some optimizations to/for MsBetweenDisplayChange; however, they've not contributed this to PresentMon. That means there's no way to independently verify results from FrameView.

PresentMon had already moved past using MsBetweenPresents and MsBetweenDisplayChange, which is what allowed GN to figure out roughly which iteration FrameView was likely based on. This supposedly isn't sufficient for MFG results, according to Nvidia. Meanwhile, PresentMon appears to properly support AMD/Intel FG and has added new features for identifying generated frames.

Again, I'm not saying there's something wrong with FrameView, but this is bad behavior on the part of Nvidia. Of course, this also seems to be par for the course when it comes to Nvidia and the open-source community.
I do wonder about PresentMon vs OCAT vs FrameView. Intel created (and open sourced) PresentMon back in the day. It was usable but also severely lacking in features for a long time. When OCAT came out, I switched to that as it was more easily usable. Then Nvidia did FrameView and integrated support for its PCAT and I was sold — real in-line power data from every gaming benchmark I run! Intel has talked about doing a similar power capture device with a branch of PresentMon, and that would be fine by me as well. Really, I just want something that works.

Anyway, back to my thought: Intel "owns" the original PresentMon. If Nvidia were to come along and submit a bunch of changes to support some feature it wants, would people be happy or would they just reject the changes? Probably a bit of both, but the main owner of the GitHub repository has final say, right? So it's probably a lot easier and less politically charged for Nvidia to fork off PresentMon and then do everything it wants and not worry about it.

AMD did that with OCAT. CapFrameX did it as well. So did Nvidia with FrameView. There must be a good reason these alternatives aren't just submitting direct code changes to PresentMon itself, right?

Now, there's also potentially stuff going on in Nvidia's code that it wants to keep secret. I have no idea. But the company does have a penchant for not open sourcing a lot of stuff. And I really wish it would, because FrameView isn't a shining pillar of UI either. It works, but you can't change the font size, you can't change the text color, you can't put a transparent but dark background on it to make the green text readable, you can only choose between two shortcut keys, etc.... There are a lot of things missing from FrameView. Not that I'd have the time to try and fix them, but it would be nice to have someone that would fix these things.

There have been requests to Nvidia to allow the use of a different font color and change the font size for years, basically since FrameView first came out. I have no idea why it doesn't just get fixed. Possibly it's just something that only gets maintained by one person in their spare time. I just wish something else supported the PCAT. (I'm not aware of any support in the base PresentMon, but maybe I'm wrong on that?)

And now of course we need support for MsBetweenDisplayChange or a similar metric, because with the new MFG stuff you get garbage data in MsBetweenPresents. Like this:

(attached screenshot: erratic MsBetweenPresents frametime capture with MFG enabled)
 