News RTX 4090 Twice as Fast as RTX 3080 in Overwatch 2 Without DLSS 3

salgado18

Distinguished
Feb 12, 2007
977
434
19,370
Of course it's not cheaper; the real competitor (RDNA3) is still a couple of months away. Until then, they will sell for as much as they can, using the new technologies and "50% more performance" as a marketing push. Once AMD makes its move, prices will slowly adjust to the new reality (slowly, because I also think AMD won't push prices any lower than necessary).
 
Anyone else looking at these new cards and feel there is little point in upgrading with current games? At 1440p 240Hz my 3080 is doing a cracking job still, to the point I doubt I'd notice any benefit from an upgrade. It feels like GPU performance has accelerated faster than game requirements?
 

lmcnabney

Prominent
Aug 5, 2022
192
190
760
In general, the higher your FPS is, the lower your input lag will be.
You know input lag is the delay between the input device and the pixels on the screen, right? It is going to be the same if you are playing an old game at 240fps or a brand new one with everything turned up to the max at 40.
 
  • Like
Reactions: KyaraM

salgado18

Distinguished
Feb 12, 2007
977
434
19,370
Anyone else looking at these new cards and feel there is little point in upgrading with current games? At 1440p 240Hz my 3080 is doing a cracking job still, to the point I doubt I'd notice any benefit from an upgrade. It feels like GPU performance has accelerated faster than game requirements?
Upgrade when you need to, not when they tell you to. If your card is doing what you need it to do, why change it? Until heavier games come and it can't perform as you expect, don't give in to the buyer's impulse.

I say perform as you need because you mention 240Hz, but if you are happy with newer games running at 120 fps and your card does it, it still works, right?

My idea is: I'll get a 1440p card to play 1080p content for at least 4-5 years, more if possible. If my current R9 280 is still running games at 1080p medium/high today, the next card can live just as long, right?
 
You know input lag is the delay between the input device and the pixels on the screen, right? It is going to be the same if you are playing an old game at 240fps or a brand new one with everything turned up to the max at 40.

False; input lag takes CPU and GPU rendering time into account as well. The faster your CPU and GPU spit out frames, the lower your input delay will be.
 
  • Like
Reactions: TJ Hooker

lmcnabney

Prominent
Aug 5, 2022
192
190
760
False; input lag takes CPU and GPU rendering time into account as well. The faster your CPU and GPU spit out frames, the lower your input delay will be.

The CPU/GPU are a factor only if scaling to a non-native resolution on the monitor. That's why the same input lag is detectable when typing in Word, pointing and clicking on the desktop, playing Half-Life 2 at 360 fps, or playing Cyberpunk with everything turned on. Using a wired mouse and upgrading to low-latency displays is how you combat it. Yeah, the MB/CPU/GPU can cause problems, but modern gear is just not that variable. It's all about as good as it can get.
 
  • Like
Reactions: KyaraM
The CPU/GPU are a factor only if scaling to a non-native resolution on the monitor. That's why the same input lag is detectable when typing in Word, pointing and clicking on the desktop, playing Half-Life 2 at 360 fps, or playing Cyberpunk with everything turned on. Using a wired mouse and upgrading to low-latency displays is how you combat it. Yeah, the MB/CPU/GPU can cause problems, but modern gear is just not that variable. It's all about as good as it can get.

I'm not sure where you're getting your information from, but that's all false. Any 3D game that needs rendering time to output a frame will have varying levels of input lag, depending on how fast the CPU and GPU can render frames.

Frames per second is just the inverse of frame time. The slower your FPS, the more lag you will have, because the image takes longer to be displayed on screen. This can be combated somewhat by frame limiters, but again, in general, higher FPS yields lower input lag due to faster frame rendering.
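A quick way to see that relationship, assuming nothing about the rest of the pipeline (the numbers are illustrative, not measurements):

# Frame time is the inverse of frame rate; it is the minimum delay one
# rendered frame adds between your input and the pixels updating.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (40, 60, 144, 240):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
# 40 fps adds ~25 ms per frame, while 240 fps adds only ~4.2 ms.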
 

DougMcC

Reputable
Sep 16, 2021
183
127
4,760
I'm not sure where you're getting your information from, but that's all false. Any 3D game that needs rendering time to output a frame will have varying levels of input lag, depending on how fast the CPU and GPU can render frames.

Frames per second is just the inverse of frame time. The slower your FPS, the more lag you will have, because the image takes longer to be displayed on screen. This can be combated somewhat by frame limiters, but again, in general, higher FPS yields lower input lag due to faster frame rendering.

You've got it right. Here's a way to think about it:
Input lag = Start timer when you move your mouse. End timer when an effect from that is rendered on screen.
In between those two times, you have:
Signal travels via USB/Bluetooth to memory/CPU.
OS processes the signal and shares it with the active application (the game).
Game changes state in response.
* Game renders a new frame via DX API calls.
Frame is transferred/swapped to active video memory.
Display renders the image.

It is during * that the frame rate of the game impacts the latency. Unless you are already running faster than twice the display's refresh rate, getting faster improves your latency outcome.
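As a rough numeric sketch of that chain, assuming made-up placeholder values for every stage except the render step (which is the only one tied to frame rate):

# Hypothetical input-to-photon latency budget, in milliseconds.
def input_latency_ms(fps: float, refresh_hz: float) -> float:
    usb_poll    = 1.0                  # input device -> CPU (placeholder)
    os_and_game = 1.0                  # OS routing + game state update (placeholder)
    render      = 1000.0 / fps         # the * step: one frame of render time
    scanout     = 1000.0 / refresh_hz  # display refresh/scanout interval
    return usb_poll + os_and_game + render + scanout

print(input_latency_ms(fps=40,  refresh_hz=240))  # ~31 ms, render term dominates
print(input_latency_ms(fps=240, refresh_hz=240))  # ~10 ms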
 

bigdragon

Distinguished
Oct 19, 2011
1,142
609
20,160
Anyone else looking at these new cards and feel there is little point in upgrading with current games? At 1440p 240Hz my 3080 is doing a cracking job still, to the point I doubt I'd notice any benefit from an upgrade. It feels like GPU performance has accelerated faster than game requirements?
Agreed. 3D content creation is what is pushing me to upgrade right now. Games have stagnated. We haven't had anything push performance requirements since the release of Half-Life: Alyx. There's a lack of performance-demanding and innovative gameplay experiences in the PC space. Too many console port sequels!
 
  • Like
Reactions: KyaraM

aberkae

Distinguished
Oct 30, 2009
132
43
18,610
I'm not sure where you're getting your information from, but that's all false. Any 3D game that needs rendering time to output a frame will have varying levels of input lag, depending on how fast the CPU and GPU can render frames.

Frames per second is just the inverse of frame time. The slower your FPS, the more lag you will have, because the image takes longer to be displayed on screen. This can be combated somewhat by frame limiters, but again, in general, higher FPS yields lower input lag due to faster frame rendering.
Frame latency and input lag are different, but equally important.
Some argue that capping your frame rate to match your monitor's maximum refresh rate helps twofold: 1) it mitigates tearing when the frames per second go above the monitor's maximum refresh rate, and 2) by capping performance you can still hit the GPU's best latency output while preventing it from throttling with frame-time variance.
So, in conclusion, the term input lag should refer specifically to monitors, mice, keyboards, etc., and frame latency should refer to the time it takes a GPU to render a frame. The two should be kept separate and not mixed into the same thing.
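A minimal sketch of the capping arithmetic being argued here, with hypothetical numbers (the "cap a few fps below refresh" figure is a common rule of thumb, not a standard):

# Capping slightly below the refresh rate keeps each frame time at or
# above the display's refresh interval, so frames are not produced
# faster than the panel can show them (less tearing/queuing).
refresh_hz = 240
cap_fps = refresh_hz - 3  # e.g. a 237 fps cap on a 240 Hz panel

refresh_interval_ms = 1000.0 / refresh_hz
capped_frame_ms = 1000.0 / cap_fps

print(f"refresh interval : {refresh_interval_ms:.2f} ms")  # ~4.17 ms
print(f"capped frame time: {capped_frame_ms:.2f} ms")      # ~4.22 ms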
 
  • Like
Reactions: KyaraM
Theoretically, you could take your monitor away and still game if you memorized every single input needed and/or had audio cues.
For example, a speed run of the first level of Super Mario Bros. 3.

If your GPU can output 500 fps but your monitor is only 300 Hz, then you simply do not see 2 out of every 5 frames.

If you could see 301 fps on a 300 Hz screen, advertisers would just market it as a 301 Hz screen.
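A quick check of that 500 fps on a 300 Hz panel example:

# The display can only show refresh_hz frames per second; the rest are
# rendered but never make it to the screen.
fps, refresh_hz = 500, 300
shown = min(1.0, refresh_hz / fps)
print(f"{1.0 - shown:.0%} of rendered frames are never displayed")  # 40%, i.e. 2 of every 5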
 
  • Like
Reactions: KyaraM

PEnns

Reputable
Apr 25, 2020
702
747
5,770
Why is Ngridia comparing the 4090 to the 3080 and not the 3090 (its true upgrade path)??

Might as well compare it against the 3060 and pat themselves even more on the back!!
 
  • Like
Reactions: Thunder64

JamesJones44

Reputable
Jan 22, 2021
853
786
5,760
The CPU/GPU are a factor only if scaling to a non-native resolution on the monitor. That's why the same input lag is detectable when typing in Word, pointing and clicking on the desktop, playing Half-Life 2 at 360 fps, or playing Cyberpunk with everything turned on. Using a wired mouse and upgrading to low-latency displays is how you combat it. Yeah, the MB/CPU/GPU can cause problems, but modern gear is just not that variable. It's all about as good as it can get.

It's the way the games are written. Most FPS games use a single game loop to control inputs and outputs (the inputs are processed, then the assets are rendered, then the loop restarts), so the faster the loop runs, the faster you can process input. It's not universally true because it depends on how the game is written, but it holds for the vast majority of games, especially FPS titles. Games that do this wait for the assets to render before processing the next loop iteration, so it is true that the faster they render, the faster the loop runs.

Edit: Fixed typo.
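A minimal sketch of that kind of single-threaded loop, using illustrative stub functions rather than any particular engine's API:

import time

def poll_input():
    """Read pending mouse/keyboard events (stub)."""
    return []

def update_game_state(events, dt):
    """Advance the simulation by dt seconds (stub)."""
    pass

def render_frame():
    """Submit draw calls and present; in a real game this waits on the GPU."""
    time.sleep(0.004)  # pretend the GPU needs ~4 ms per frame

# Classic single game loop: input is only sampled once per iteration, so
# a slow render step directly delays the next time input gets processed.
previous = time.perf_counter()
for _ in range(10):  # a few iterations instead of "while True" for the example
    now = time.perf_counter()
    dt, previous = now - previous, now
    update_game_state(poll_input(), dt)
    render_frame()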
 
Just a thing to mention about the "latency" debacle, and something I've also mentioned a thousand times before: human reaction times average around 250 ms. The best recorded reaction times are in the 150 ms range (fighter pilots and such).

Let that sink in for a moment.

After that, consider (as some have mentioned) network latency and server ticks (CS has the highest at 64 Hz, or 15.6 ms per tick, IIRC; Ticks Explained). After that, your local latency, when it's under 30 ms (exclusively for inputs), becomes absolutely and completely moot and placebo. At that point, what starts counting is thinking ahead (strategy, for short), which is something no amount of hardware advantage can bring.

So what is the realistic "FPS" that guarantees you're there? Well, the panel will matter more once you're over 120 FPS, which is a frame time of just 8.3 ms for the frame to be rendered. Then 240 FPS is 4.17 ms. Do you guys believe it matters once you're down to single digits of delay per frame?

Well, after all that, the nuance here comes down to two things: pace (1% lows) and consistency (0.1% lows). As long as your frame-time variance (1%, or 0.1% if you want to be mega strict) is not too far from the average, your latency won't suffer. Any spike caused by the GPU will put you well above any network, input, or server-tick latency, so for competitive games, if you all want to be super pedantic, you should never talk about averages and always talk 1% or 0.1% lows. What I mentioned before this "nuance" section still applies if you talk 1% or 0.1% lows instead of averages.
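For reference, a hedged sketch of how those 1% low figures are commonly derived from per-frame times (benchmarking tools differ slightly in the exact convention):

# Average FPS vs. "1% low" FPS (the FPS implied by the slowest 1% of frames).
def fps_stats(frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct_fps

sample = [4.2] * 990 + [25.0] * 10  # mostly ~240 fps with a few big spikes
avg, low = fps_stats(sample)
print(f"average: {avg:.0f} fps, 1% low: {low:.0f} fps")  # the spikes only show up in the 1% low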

Regards.
 

_dawn_chorus_

Distinguished
Aug 30, 2017
563
56
19,090
Anyone else looking at these new cards and feel there is little point in upgrading with current games? At 1440p 240Hz my 3080 is doing a cracking job still, to the point I doubt I'd notice any benefit from an upgrade. It feels like GPU performance has accelerated faster than game requirements?


The kind of games you're playing that already hit 240 Hz at 1440p are definitely not worth the boost for. But for games with worthwhile ray tracing, like Cyberpunk or Dying Light 2, it would be nice to finally hit a solid 120 Hz with RT enabled.
But at the price they are asking, I'll wait... I just can't justify spending over a grand on a GPU; that's nuts.

I got my 3080 FE just after launch for MSRP, so I also treasure it on a sentimental level... lol
 

Newoak

Commendable
Jul 26, 2021
12
1
1,515
In new Overwatch 2 gaming benchmark data, shared by Nvidia, the company's RTX 4090 is over twice as fast as its previous generation RTX 3080.

RTX 4090 Twice as Fast as RTX 3080 in Overwatch 2 Without DLSS 3 : Read more

I noticed the 12 GB 3080 was mentioned in the prices, but it only said 3080 in the benchmarks. Are you sure Nvidia used the 12 GB model for the comparison?
One game is not enough to be sure of benchmarks.
By the looks of the prices on eBay, Nvidia appears to be willing not to sell any video cards for a few quarters. In G-d I trust.
 

watzupken

Reputable
Mar 16, 2020
1,176
660
6,070
Why is Ngridia comparing the 4090 to the 3080 and not the 3090 (its true upgrade path)??

Might as well compare it against the 3060 and pat themselves even more on the back!!
I was thinking the same thing. The fact that their marketing material is always trying to avoid an apples-to-apples comparison shows that they have very little confidence in their product. I feel even a 60 to 70% performance uplift is a good number for a generational upgrade, but Nvidia chose to showcase things like DLSS 3 to inflate the numbers. With XeSS and FSR around, I am not certain developers are willing to spend even more time optimising for DLSS 3. It's like 3x the effort on top of the time spent on game development.
 
  • Like
Reactions: PEnns