Who cares about ray tracing when on Nvidia it's STILL unplayable other than for showing off to a friend? It sucks so much that I can't believe there are people saying they actually use it for real gaming. Getting like 50-70 FPS sounds to me like going 10 years back to a low-end machine. All monitors now are 144 Hz+, and I personally hate dropping below 100 FPS in any game at 2K.
Also, I'm really happy that AMD is finally on par with Nvidia in rasterization, which is the only thing that matters for real gamers right now, and finally there is some real choice. We already saw that new competition in action with the pricing of the 3000 series. I might switch to AMD for my next GPU.
60 Hz is more than enough for many people (not all).
All monitors are NOT 144 Hz+, and lots and lots of people, including me, are playing on 55-inch TVs at 4K and 60 Hz. I can even play at 120 Hz (1080p) natively (not interlaced) if I want to, but I don't.
I did a test some time ago, and the highest framerate where I could still see a difference was about 47 FPS; beyond that, no difference for me.
So for me, 60 Hz is indistinguishable from higher refresh rates.
That is not to say some people aren't more sensitive to it. And even if you aren't, you may still benefit from higher refresh rates in competitive online gaming, where lag and latency mean everything. But I never play multiplayer games! And 4K on a 55-inch TV at max settings beats a lower-res 27-inch monitor at 144 Hz big time for me.