Purely opinion
Most people don’t buy x90 cards, and they don’t buy x80 cards either. The x70 and x60 cards are the ones that dominate surveys such as Steam’s hardware survey.
The x90 cards have given an indication of what the x70 and x60 cards will look like a few years down the line.
Developers produce AAA games that need ever stronger GPUs. They make games that require the latest and greatest to run at 4K with maximum details, and as such they exclude the majority from experiencing their products with full eye candy.
The inclusion of “fake frames” (a horrible idea) mitigates the shortfall in performance to some extent, but the reported problem is an apparent sluggishness in response to inputs. This makes me think of my old 486SX: in the games I played at the time, that sluggishness disappeared when I got a P120.
Upscaling is less problematic. In fast-moving scenes it is acceptable so long as it doesn’t smear; in steadier scenes more effort could and should be placed on fidelity. Ideally there would be both fidelity and smooth motion, with the balance struck to maintain frame rates. Current hardware is capable of achieving that ideal, or at least getting very close.
Target FPS
With the 486, people were looking for a solid 30fps, and on the CRT monitors of the time that frame rate looked good. By the Pentium/Athlon era the target hit 60fps, and the display hardware stayed pretty much locked there for a long time; it was all that commodity LCD displays could present.
Question: what can your monitor display?
I don’t need more than 165Hz to saturate my displays (not a flex, any frame rate greater than that is wasted).
Twitchy FPS shooters and MMORPGs typically need a fast real refresh rate: see the bad guy and unalive him, or lose. Fake frames don’t help with this. Going by the Nvidia slide, a quarter of the screen’s pixels are rendered and upscaled, then interpolated three times until the next rendered quarter arrives, so you won’t see the changes in between. New data is needed: real frames, whether rasterised or ray traced, are needed to follow an unpredictable enemy.
(Hence my preference for raster and no “features” such as upscaling and fake frames)
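To put rough numbers on why generated frames don’t fix responsiveness, here is a back-of-the-envelope sketch in Python. The three-generated-frames-per-real-frame figure and the latency floor are illustrative assumptions, not measurements of any specific card:

```python
# Frame generation multiplies the displayed frame rate, but new input can
# only influence the *rendered* frames, so responsiveness is still bound
# to the real render rate. All figures here are illustrative assumptions.

def frame_gen_sketch(real_fps: float, generated_per_real: int) -> tuple[float, float]:
    """Return (displayed fps, rough input-latency floor in ms)."""
    displayed_fps = real_fps * (generated_per_real + 1)
    # Interpolation-style generation holds back at least one real frame,
    # so assume a floor of roughly two real frame times.
    latency_floor_ms = 2 * 1000.0 / real_fps
    return displayed_fps, latency_floor_ms

for real_fps in (30, 60, 120):
    shown, lag = frame_gen_sketch(real_fps, generated_per_real=3)
    print(f"{real_fps:>3} real fps -> {shown:.0f} shown, ~{lag:.0f} ms input floor")
```

The displayed number quadruples, but the sluggishness only eases when the real frame rate itself rises, which matches the 486SX-to-P120 experience above.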
Ray tracing is becoming usable across all newly equipped cards, but the implementation lies in the hands of the devs. Do they optimise the render path for Nvidia and throw a token effort at AMD/Intel, or do they properly implement it for all? (Remembering the tessellation of occluded objects a few years ago.) Could Windows ENFORCE a common path on all manufacturers and level the playing field so that the best hardware shines?
People in the forum have been saying that AMD/Intel are copying Nvidia; ray tracing, upscaling, fake frames, and AI implementations of all three are the examples I can think of. Assume that ray tracing gives photorealistic images (it can already look really close): what else is there to develop?
AMD and Intel have developed techniques to implement similar features, and they are improving. It will soon get to the point where the results are indistinguishable regardless of whose hardware produced them. Improvements in visual fidelity will be in the realm of diminishing returns, and what follows is a race for FPS again.
A few assumptions:
1, the monitors settle at a refresh rate of 240Hz (a future target?).
2, the GPUs are generating photorealistic images at 4K.
3, the games can feed the render pipelines at 240Hz and the GPUs can process that amount of data (see the quick arithmetic below).
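As a quick sanity check on those assumptions, here is the raw arithmetic. It is purely illustrative; real display links add blanking overhead and typically lean on DSC compression at rates like these, and the 10-bit colour depth is an assumption:

```python
# Raw pixel throughput for 4K at 240Hz, assuming 10-bit-per-channel RGB.
width, height, refresh_hz = 3840, 2160, 240
bits_per_pixel = 30                      # assumed 10-bit RGB

pixels_per_second = width * height * refresh_hz
raw_gbit_per_s = pixels_per_second * bits_per_pixel / 1e9

print(f"{pixels_per_second / 1e9:.2f} Gpixels/s")       # ~1.99 Gpixels/s
print(f"{raw_gbit_per_s:.1f} Gbit/s before overheads")  # ~59.7 Gbit/s
```

Roughly two billion photorealistic pixels every second is the bar those three assumptions set.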
If photorealism is truly achieved there are no real improvements left to make: the subtleties of detail, the near-infinities of colour and shade, the difference between the edge of a leather jacket and a woollen sweater, the diffuse shadow from indirect light contrasting with the hard edge of sunlight, the quality of light from sun, incandescent, fluorescent or LED sources, realistic skin… all rendered in real time will be incredible.
Once the hardware can do this at the assumed refresh rate and resolution across the board, what else does a gamer want? The trio are relatively close to doing this, and once it is achieved, incremental improvements will give diminishing returns.
It is a matter of time and commercial pressure, as you will need to upgrade less often if/when some degree of stability is achieved. Nvidia has an out: AI. AMD has an out: ML. Does Intel have enough goodwill and market penetration to get into AI in a significant way?
GPUs are still interesting, but what makes them interesting is becoming commonplace. They won’t be boring, but I think the best times have passed. From simply being an adapter to drive a monitor, to gaining rudimentary acceleration (Rage, 3dfx, TNT), to the basic GPU processing ability of transform and lighting; then 18 years of iterative advancement until, in 2018, RTX. We are now 7 years into ray tracing and it is close to usable maturity across the manufacturers.