This has probably been addressed before, but I have a few questions and a major gap in my understanding.
I'm looking to upgrade towards the end of this year. My intent isn't to upgrade for higher resolutions; it's to achieve higher FPS to match higher refresh rates while still being able to visually enjoy the game (as opposed to downgrading graphics to hit those refresh rates consistently).
I've been told over and over that both the 3080 and 3090 would be overkill for 1080p gaming, but nobody has really explained the "why".
My main confusions can be summed up in a few questions:
- Let's set up a hypothetical: if I were playing CoD Warzone and wanted to run the game at nuts graphics, absolutely everything maxed out, while also aiming for 144Hz / 240Hz, my current rig would suffer. I typically have to lower settings somewhat to hit around 120 FPS on my 1080 Ti, and in that game I notice a more than marginal difference between graphics settings. If I were to hypothetically upgrade from the 1080 Ti to a 3080 / 3090, where the raw specs are almost double, would I see higher FPS at the maximum possible settings in a game like Warzone than I get on my 1080 Ti at medium settings?
- Why would, say, a 3080 or 3090 be overkill for achieving the maximum possible frame rate in a modern title while also running high graphics settings?
- Is CPU bottlenecking an important factor? If so, how much would it affect potential gains at 1080p? (My rough mental model is sketched after this list.)
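To make that last question concrete, here's the toy model I have in my head, as a quick Python sketch. All the numbers are made up for illustration, not benchmarks; the idea is just that the FPS you actually get is capped by whichever of the CPU or GPU takes longer per frame:

```python
# Toy bottleneck model (made-up numbers, not benchmarks): the frame rate you
# actually see is capped by the slower of the two stages, CPU or GPU.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Delivered FPS is limited by whichever stage is slower."""
    return min(cpu_fps, gpu_fps)

# Hypothetical figures for illustration only:
cpu_cap = 150          # frames/s the CPU can prepare, roughly fixed per game
gtx_1080ti_max = 90    # old GPU at max settings (invented number)
rtx_3080_max = 170     # new GPU at max settings (invented number)

print(effective_fps(cpu_cap, gtx_1080ti_max))  # 90  -> GPU-bound, upgrade helps
print(effective_fps(cpu_cap, rtx_3080_max))    # 150 -> now CPU-bound: the GPU's
                                               # extra headroom goes unused
```

If that model is roughly right, is "overkill" just shorthand for saying that at 1080p a 3080/3090 would push the GPU number so far past the CPU number that the extra performance goes to waste?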