
Is the RTX 2080 Ti Overkill for 2560 x 1440 at 144-165 Hz?

Alyus

I have a 2560x1440 G-Sync 144-165 Hz monitor and would like the highest frame rates possible; would the RTX 2080 Ti be overkill for my monitor? I'm not concerned about ray tracing, as only a few games will use that capability.
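For a sense of what those refresh rates actually demand, the per-frame time budget is just 1000 ms divided by the refresh rate. A minimal sketch (illustrative only; real G-Sync behavior also depends on the monitor's variable-refresh window and frame pacing):

```python
# Per-frame time budget at the refresh rates discussed in this thread.
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render each frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 165):
    print(f"{hz:>3} Hz -> {frame_budget_ms(hz):.2f} ms per frame")

# 60 Hz -> 16.67 ms, 144 Hz -> 6.94 ms, 165 Hz -> 6.06 ms
```

Driving 165 Hz leaves the GPU barely a third of the per-frame time that a 60 Hz target allows, which is why "overkill" is always relative to the refresh rate you chase.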
 
Solution
That flagship GPU is aimed at gamers who want a "close to reality" visual experience and at content creators who want a faster GPU-based compute machine; this market segment is relatively marginal.

It's not fair to say software-side development is bound by hardware prerequisites. Nvidia's Tensor cores were their solution for "cheap" deep-learning convolutional neural networks, and real-time ray tracing was their answer for better image fidelity at lower power cost than the long-used anti-aliasing sampling methods.

Time will tell whether AI-driven rendering and real-time ray tracing become the new standard in the gaming industry. It would also be good to relocate CPU work to the GPU, so that perhaps a low-end CPU could be paired with a flagship GPU.


That would be great if every game could be maxed out in terms of frames on my 2560 x 1440 165 Hz monitor; I don't recall any single Ti card that could max out every game at 1440p @ 165 Hz.

The 1080 Ti can only run Shadow of the Tomb Raider on Ultra at around 79 FPS at 1440p; we'll just have to wait and see how the 2080 Ti performs in a few weeks.
 
So far, from the YouTube reviewers, the 2080 Ti is running popular games in 4K, such as Shadow of the Tomb Raider at 60 FPS, and passing 90 FPS in other games. This is a very good sign!
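A rough way to read those 4K numbers against a 1440p target is to scale by pixel count. This is a back-of-the-envelope sketch only; the linear-scaling assumption is mine, not from any review, and real frame rates scale worse than linearly with resolution:

```python
# Back-of-the-envelope: scale a 4K benchmark result to 1440p by pixel count.
# Assumes the game is fully GPU-bound and FPS scales inversely with pixels
# rendered -- an optimistic ceiling, not a prediction.

PIXELS_4K = 3840 * 2160      # 8,294,400 pixels
PIXELS_1440P = 2560 * 1440   # 3,686,400 pixels

def estimate_fps_at_1440p(fps_at_4k: float) -> float:
    return fps_at_4k * (PIXELS_4K / PIXELS_1440P)

print(round(estimate_fps_at_1440p(60)))  # ~135 FPS ceiling from 60 FPS at 4K
```

So a 60 FPS 4K result hints at triple-digit 1440p frame rates at best; in practice, CPU limits and engine overhead will pull that number down.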
 


That's what I thought; I wouldn't think any single card would ever be overkill, especially at 1440p. 100 FPS is certainly better than 60 FPS by a wide margin. However, 144 or higher is even better.

 


I get better than that with my GTX 1080 Ti on Ultra settings, playing with everything maxed out: between 80 and 90 FPS at 1440p. It does dip into the 70s once in a while, depending on the scene.

I could lower the anti-aliasing from SMAA T2x and the FPS would go up a lot. It would be over 100 FPS....

It would still be the Ultra preset too...
 


SOTR is a monster of a game with everything turned up to max.

And that is putting it lightly.

But the graphics are incredible.

No, I doubt a 2080Ti would be overkill at 1440P/Ultra....
 
Just to show what I'm saying. However, if you drop the obsession with running Ultra, things look very different. This is just from one older review, so it doesn't even cover the latest games.

[Benchmark charts at 1080p from the review: Primal, F1, Deus Ex, Division, Hitman]
 
I know, it all depends on the system and the games. 1080p uses a lot more CPU power than 1440p, though; 1440p shifts the load more to the GPU.
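That shift can be pictured with a toy bottleneck model: the delivered frame rate is roughly the minimum of what the CPU can prepare and what the GPU can render at the chosen resolution. The numbers below are invented placeholders to show the shape of the effect, not measurements from either machine:

```python
# Toy bottleneck model: delivered FPS ~= min(CPU limit, GPU limit).
cpu_fps_limit = 160  # hypothetical frames/s the CPU can prepare

gpu_fps_limit = {    # hypothetical GPU throughput by resolution
    "1080p": 220,
    "1440p": 130,
}

for res, gpu_fps in gpu_fps_limit.items():
    delivered = min(cpu_fps_limit, gpu_fps)
    bound = "CPU-bound" if gpu_fps > cpu_fps_limit else "GPU-bound"
    print(f"{res}: ~{delivered} FPS ({bound})")

# 1080p: the GPU could outrun the CPU, so the CPU is the limit.
# 1440p: the GPU becomes the limit, so CPU headroom matters less.
```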

I play at both 1080p (GTX 1080 and 8700K) and 1440p (GTX 1080 Ti and 8086K); yeah, I have two machines.

I play with everything maxed out because I can. 😉
 
Every new generation of cards, we get people excited that their new card can play 4K without lowering many, if any, settings, and within a few months they are back to the same situation as always. The top-end cards are generally good to go at 2560x1440 with only some reduction of settings, and that will remain true for generations to come.

The cycle is very consistent:
1) GPU manufacturers make a faster card;
2) game developers make more demanding games.
 


Developers prioritize developing games for consoles such as the PS4 and XB1; they earn a living by selling games, and to maximize their sales they design games for the systems the majority of gamers own. Those with above-average systems can turn the settings up.

Most PC gamers are casual, and the average casual gamer will not spend $1,000 or more on a single GPU; my guess is that gaming enthusiasts (gamers who spend $2,000+ on their gaming PC) are outnumbered by at least 8 to 2. The data is easily found in Steam's hardware survey: only 1.5% of Steam gamers own a 1080 Ti, and 1080p is the dominant resolution.

 


Most people buy prebuilt machines and not exactly expensive ones at that.

As far as gaming PCs go, nothing much has changed over the years: MOST people buy on the cheap (the cheapest crap they can get), so they end up with lower-midrange-or-under machines, or old lower-midrange-or-under machines, that they try to play games on.


Then they come to places like here and ask why their POS box won't play games worth a crap.

We get the old "I want to play (enter new AAA game here) at good frame rates and don't want to spend any money." :sarcastic:

Then complain when people tell them what they really need.

It doesn't help that some morons with YT channels cater to those types of people and post BS like low-end machines playing the newest games. That, and those POS refurbished SFF systems that idiots seem to buy, thinking they can turn them into gaming machines.

Then they come here and ask why THAT POS won't work the way the one in the BS YT video did?

And then complain when we tell them what reality is.

Realistically people don't need to spend $3,000 on a gaming machine, but they do need to be realistic based on what they want to do.
 


It doesn't matter; the devs will still present the user with settings that give a good balance across the systems that are available. The lower-end systems play on lower settings, and they always change the high-end settings to challenge the high-end systems. Nothing ever changes. The 2080 Ti will handle new games the way the 1080 Ti handled new games during its time at the top.
 


Developers design for what the majority of systems can handle. The hardware people have dictates the direction game developers can go, not the other way around. Here's an example, though not a perfect one: just prior to digital downloads, did you see cassettes and 8-track tapes as the major medium? No; only an idiot would release his music on those media, because only a minuscule number of people in the world still have players for them.

 


Small edit: developers design for what the majority of systems can handle, as well as for what the low end can handle and what the high end can handle. Low is for low-end systems, Medium for average systems, High for high-end, and Ultra for the highest end.

The result is still the same: the highest-end, average, and low-end systems handle the latest games the same way in every generation. It is directly related to the fact you just presented: developers design their games around the current hardware (they often design around what is expected to be available when their games release, and they often tweak the available settings based on how things pan out at release).
 
