>As stated, the score is tentative based on other testing that I’m still working on completing.
A numeric score will always be controversial regardless of how "accurate" it is. People emphasize different things. It's like movie reviews: some reviews have a score, some don't. I suppose a score is good for drive-by readers who spend less than a minute reading the pros/cons and the TL;DR.
I find some aspects of the review superfluous, if not irrelevant. The notion of "value" is one. Gamers who pay $2K+ for a GPU don't care about bang/buck or perf/watt--or at least, those are bottom priorities. Non-gamers, i.e. professionals who buy a 5090 for productivity work, don't care either. $2K may be expensive for gaming purposes, but it is cheap for AI or any use that earns you an income.
I get that everything PC-related revolves around gaming here. But the 5090, like the 4090, is a multi-purpose product, and gaming is arguably the least important use for the main target demographic. I thought the $2K pricing was on the low end. Nvidia could've priced it at $2.5K and it still would've been well received, albeit not by gamers.
Below is a review from a mainly non-gaming perspective. I also agree with its take on MFG: regardless of how useful one thinks MFG is, it's better to have it than not. (Also note that nowhere is the notion of "value" raised. Value for pros is different from value for gamers.)
>Obviously, at $2000 this is going to be way out of reach of most gamers. And I really don’t think Nvidia cares that much. There will be plenty of businesses and professionals and AI researchers that will pay $2000 or more and consider themselves lucky.
Spot on. The problem you face is that 99% of your readers here are gamers. Your review, like most other reviews, focuses on gaming. For a gaming review, then, I think your readers would've preferred a more downbeat tone to match their negative sentiment. We like reviews more when they agree with our preconceived notions. (I don't care either way, as I don't have a dog in this fight.)
A couple of other reviews call the 5090 a "4090 Ti," which I think is an accurate and succinct take. But it's just a label, like people (you?) calling the 4060 a "4050." The 4060 is selling well, just as the 5090 will sell well, despite the moaning & groaning from a handful of malcontents (who likely aren't in the target market anyway).
>Now, if we were combining something like Reflex 2 in-painting and projection and time warp with frame generation? That’s what we actually need to see.
>Render frame one, sample user input while projecting frames two, three, and four. Then render frame five and project with user input sampling and projecting the next three frames… That would actually make a game look and feel responsive, rather than just smoothing frames.
I disagree, at least conceptually. For FG to work, you have to know the end point, i.e. the next actual frame. IOW, it has to be interpolation, not just (linear) projection from current and past motion. Your argument from a previous post, that with a high enough frame rate the projection errors would be fleeting enough not to be noticeable, is wrong. The 3:1 ratio of generated-to-rendered frames is the same regardless of framerate, so the same share of displayed frames carries projection errors, and those will definitely be noticeable as visual artifacts. The only way to mitigate these errors is to use interpolation, i.e. to take the actual end frame into account. A faster framerate does nothing in this regard.
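To make that concrete, here's a toy 1-D sketch of the difference (my own illustration, not any vendor's actual FG algorithm): an object reverses direction between two rendered frames. Interpolation, which knows the end frame, lands on the in-between positions exactly; projection from past motion overshoots on every generated frame, and measured against how far things move per rendered frame, the error is identical at 30 fps and 120 fps. The same 3 of every 4 displayed frames carry it either way.

```python
# Toy 1-D illustration of interpolation vs projection for generated frames.
# My own sketch, not how DLSS / FSR frame generation actually works internally.

def true_position(t):
    """Ground truth: object moves right, then reverses direction at t = 1.0."""
    return t if t <= 1.0 else 2.0 - t

def generate_frames(t0, t1, mode, prev_dt):
    """Produce the 3 generated frames between rendered frames at t0 and t1."""
    p0, p1 = true_position(t0), true_position(t1)
    frames = []
    for k in (1, 2, 3):
        f = k / 4.0
        if mode == "interpolate":
            # Needs the real end frame p1 -- this is the latency cost.
            frames.append(p0 + f * (p1 - p0))
        else:
            # "Project": extrapolate from current and past motion only.
            velocity = (p0 - true_position(t0 - prev_dt)) / prev_dt
            frames.append(p0 + velocity * f * (t1 - t0))
    return frames

for fps in (30, 120):                 # base (rendered) framerate
    dt = 1.0 / fps
    t0, t1 = 1.0, 1.0 + dt            # rendered frames straddling the reversal
    truth = [true_position(t0 + k / 4.0 * dt) for k in (1, 2, 3)]
    per_frame_motion = dt             # distance moved per rendered frame
    for mode in ("interpolate", "project"):
        frames = generate_frames(t0, t1, mode, dt)
        worst = max(abs(a - b) for a, b in zip(frames, truth))
        print(f"{fps:3d} fps  {mode:11s}  worst error = "
              f"{worst / per_frame_motion:.2f}x the per-frame motion")
```

The projected frames miss because the reversal isn't in the past motion; only knowing the end frame fixes that, which is exactly the interpolation latency trade-off.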
We've discussed this before. I've argued for FG to become a larger factor in gaming, and to a large extent that has come true. FG is a recommended setting in many game optimization guides, and it has become a default setting in a number of AAA titles. Common sense says that latency can be further mitigated, e.g. by generating frames using data from the end frame while it is still being rendered, rather than waiting for the end frame to finish.
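Back-of-the-envelope version of that idea (the pipeline split and all numbers are my assumptions, not anything Nvidia has described): if generated frames can start from the end frame's early passes, say motion vectors and depth, instead of its final image, the added latency drops from roughly one full frame time to the cost of those early passes.

```python
# Hypothetical latency arithmetic only; the pipeline split and numbers are
# assumptions for illustration, not documented DLSS / MFG behavior.

render_ms = 10.0       # full render time of one real frame (100 fps base)
early_passes_ms = 3.0  # assumed point where the end frame's motion/depth exist

# Plain interpolation: generated frames between N and N+1 can only be shown
# once frame N+1 has fully rendered, so presentation is held back ~render_ms.
wait_for_full_frame = render_ms

# Speculative variant: start generating once the end frame's early data exists.
use_early_data = early_passes_ms

print(f"wait for full end frame : ~{wait_for_full_frame:.0f} ms added latency")
print(f"use early end-frame data: ~{use_early_data:.0f} ms added latency")
```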
Maybe Nvidia will do this, or maybe not. I think we can agree that Nvidia sets the pace here, and AMD & Intel are content to be followers, at least for gaming. I don't think we should expect much innovation in the gaming sphere, as most of the incentive to differentiate (and improve) will be on AI, for both consumer and business use. The AI accelerator cum GPU will be a thing, and my projection is that's where Intel & AMD will try to make their mark.