[SOLVED] how close in milliseconds the shots have to be for double ko in games?

sxk1277

Great
Mar 19, 2020
129
3
85
So in shooting games, when both players die at the same time from each other's bullets, how close in milliseconds do their shots have to be for that to happen? Would having a 1000/2000 Hz ultra-polling mouse make a difference in me getting the kill and not dying?
 

sxk1277

Great
Mar 19, 2020
129
3
85
Is it possible to find out this kind of information for any game, like Apex or Battlefield, or even Soul Calibur 6? Finding information on even side games like Counter Strike or Call of Duty would help give an idea. Thanks
 

USAFRet

Titan
Moderator
Is it possible to find out this kind of information for any game, like Apex or Battlefield, or even Soul Calibur 6? Finding information on even side games like Counter Strike or Call of Duty would help give an idea. Thanks
You'd have to analyze and understand the underlying code for the game engine.

Unless they've actually published the source code, and you can read and understand it... no.
 

sxk1277

Great
Mar 19, 2020
129
3
85
Is there anyone out there who has done a test, though? Like, you could design an app that makes both computers shoot each other at exactly the same time, and then slowly offset the shot timing in milliseconds until one player is left standing.
 

MasterMadBones

Distinguished
There are several factors involved in this, and mouse polling is one of the least impactful.

First, games can use either "hitscan" or "projectile" hit registration, or a mix of both depending on the weapon. Some games like Overwatch also have beam (continuous damage) type weapons, which can also be programmed in different ways. Beams can have a lingering hitbox, which is intended to help the player with tracking, but also means that a player can keep doing damage for a couple of milliseconds after they die. If both players are using projectile weapons, the travel time alone can be enough for both players to land a shot on each other. There is also the situation where one player, who has a projectile weapon, fires first, and the second player, who has a hitscan weapon, kills the first while the projectile is still traveling.
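
To make that concrete, here's a rough sketch with made-up numbers (not taken from any particular game), showing how projectile travel time alone can produce a mutual kill:
```python
# Purely illustrative numbers: two players 30 m apart, both firing 600 m/s projectiles.
DISTANCE_M = 30.0
PROJECTILE_SPEED_MS = 600.0  # metres per second

travel_time_ms = DISTANCE_M / PROJECTILE_SPEED_MS * 1000  # one-way flight time: 50 ms

player_a_fires_at = 0.0   # ms
player_b_fires_at = 35.0  # ms, before A's projectile has arrived

a_hits_b_at = player_a_fires_at + travel_time_ms  # 50 ms
b_hits_a_at = player_b_fires_at + travel_time_ms  # 85 ms

# B dies at 50 ms, but B's projectile is already in flight; whether it still
# counts depends on the game, but in many games it lands and both players die.
print(f"A's projectile lands at {a_hits_b_at:.0f} ms, B's at {b_hits_a_at:.0f} ms")
```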

When two hitscan weapons are involved, it all depends on the game's netcode. Most games these days use a "shooter's advantage" hit registration model, which means that, for example, when you shoot an enemy who is moving to hide behind a corner, your hit will still register even if your enemy was already behind that corner on their own screen. In some games this also leads to a "lagger's advantage", where it's possible to shoot people who have moved a considerable distance from where they appear to be on your screen because your ping is high.

The shooter's advantage has the knock-on effect that if both game clients send a "shoot and hit" event to the server at roughly the same time, it takes time for that event to reach the server (due to ping). So if you have 40ms ping and you shoot your enemy and kill them, it's still possible for them to fire a shot in those 40ms, regardless of their ping.
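
As a back-of-the-envelope illustration of that 40ms window (again, made-up numbers rather than any real game's netcode):
```python
# Back-of-the-envelope version of the 40 ms example above (made-up timeline).
shooter_one_way_ping_ms = 40.0  # shooter's client -> server

shooter_fires_at = 0.0
kill_reaches_server_at = shooter_fires_at + shooter_one_way_ping_ms  # 40 ms

# The victim fires before anything could have told them they are dead.
victim_fires_at = 25.0

# Under a shooter's-advantage model both hits can be honoured.
print(victim_fires_at < kill_reaches_server_at)  # True -> a double KO is possible
```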

Input lag can play a role in when the hit is registered at the client level, and mouse polling is part of that, but the majority of input lag is a result of the signal from your mouse having to go through the CPU first and then the GPU as they process the frames. That is why most competitive gamers play at 144 or 240 fps.

All in all, there is no one definitive answer to this question as it depends on the game, the hardware and your internet connection.
 
Solution

sxk1277

Great
Mar 19, 2020
129
3
85
There are several factors involved in this, and mouse polling is one of the least impactful.

First, games can use either "hitscan" or "projectile" hit registration, or a mix of both depending on the weapon. Some games like Overwatch also have beam (continuous damage) type weapons, which can also be programmed in different ways. Beams can have a lingering hitbox, which is intended to help the player with tracking, but also means that a player can keep doing damage for a couple of milliseconds after they die. If both players are using projectile weapons, the travel time alone can be enough for both players to land a shot on each other. There is also the situation where one player, who has a projectile weapon, fires first, and the second player, who has a hitscan weapon, kills the first while the projectile is still traveling.

When two hitscan weapons are involved, it all depends on the game's netcode. Most games these days use a "shooter's advantage" hit registration model, which means that, for example, when you shoot an enemy who is moving to hide behind a corner, your hit will still register even if your enemy was already behind that corner on their own screen. In some games this also leads to a "lagger's advantage", where it's possible to shoot people who have moved a considerable distance from where they appear to be on your screen because your ping is high.

The shooter's advantage has the knock-on effect that if both game clients send a "shoot and hit" event to the server at roughly the same time, it takes time for that event to reach the server (due to ping). So if you have 40ms ping and you shoot your enemy and kill them, it's still possible for them to fire a shot in those 40ms, regardless of their ping.

Input lag can play a role in when the hit is registered at the client level, and mouse polling is part of that, but the majority of input lag is a result of the signal from your mouse having to go through the CPU first and then the GPU as they process the frames. That is why most competitive gamers play at 144 or 240 fps.

All in all, there is no one definitive answer to this question as it depends on the game, the hardware and your internet connection.


Lol... that first paragraph was a troll. Of course I'm referring to two hitscan weapons. I wasn't aware of shooter's advantage. This explains so much! In BF3, when I would hide but still die... now I know why! So annoying :mad: But doesn't that mean higher ping is advantageous? lol

I didn't understand this paragraph; could you elaborate on it, please?

"The shooter's advantage has the knock-on effect that if both game clients send a "shoot and hit" event to the server at roughly the same time, it takes time for that event to reach the server (due to ping). So if you have 40ms ping and you shoot your enemy and kill them, it's still possible for them to fire a shot in those 40ms, regardless of their ping. "

On the monitor I'm buying, the input lag is 2 ms, and a mouse with 2000 Hz polling will have an effective lag of 0.5 ms, so I think in that sense it makes a significant difference to input lag, without taking processing input lag into account (which could be classified under fps, since a faster GPU & CPU give higher fps).
 

MasterMadBones

Distinguished
But doesn't that mean higher ping is advantageous?
Yes, but most games also have mitigations in place that prevent this. For example, the game could be programmed in such a way that a player can only gain shooter's advantage if the server receives the player's input less than 100ms after it was sent by the client.

I didn't understand this paragraph; could you elaborate on it, please?
In a shooter's advantage model, it is the shooter's game client that is primarily responsible for registering hits on targets. That means it sends a "shoot" event as well as a "hit/miss" event. Because there is network latency, it's possible for both clients to send a "hit" event close enough together that the second player's "hit" was sent before the first one reached the server. In that case, it is decided that both players die.
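
Here is a minimal sketch of how a server might make that decision, assuming a shooter's-advantage model with a rewind cap like the ~100ms limit I mentioned above. All of the names and numbers are illustrative, not any real engine's code:
```python
from dataclasses import dataclass
from typing import Optional

MAX_REWIND_MS = 100.0  # illustrative cap on how stale a "hit" the server will honour

@dataclass
class HitEvent:
    shooter: str
    target: str
    fired_at_ms: float    # when the shot was fired (already converted to server clock, for simplicity)
    arrived_at_ms: float  # server time when the event was received

def server_accepts(hit: HitEvent, target_died_at_ms: Optional[float]) -> bool:
    """Decide whether a hit counts under a shooter's-advantage model."""
    latency = hit.arrived_at_ms - hit.fired_at_ms
    if latency > MAX_REWIND_MS:
        return False  # too laggy: no shooter's advantage for this player
    if target_died_at_ms is None:
        return True   # target was still alive on the server, normal hit
    # The target is already dead on the server, but this shot was fired before
    # that death was registered, so it still counts -> both players die.
    return hit.fired_at_ms < target_died_at_ms

# Player B's hit arrives after A's kill was processed at 40 ms, but B fired at 25 ms:
b_hit = HitEvent(shooter="B", target="A", fired_at_ms=25.0, arrived_at_ms=60.0)
print(server_accepts(b_hit, target_died_at_ms=40.0))  # True -> double KO
```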

On the monitor I'm buying, the input lag is 2 ms, and a mouse with 2000 Hz polling will have an effective lag of 0.5 ms, so I think in that sense it makes a significant difference to input lag, without taking processing input lag into account (which could be classified under fps, since a faster GPU & CPU give higher fps).
It's very dependent on the game engine, but inputs are generally processed over two frames (one CPU, one GPU). This results in at least 33.3ms of input lag at 60fps, and 13.9ms at 144fps. Add to that things like monitor response time and input lag (which are unrelated to the "double kill" scenario we're talking about) and input device latency. Going from a monitor with 5ms advertised response time to 2ms only reduces your visual input lag by less than 10% at 60fps and has no effect on input-to-server lag, whereas stepping up to 144fps improves visual input lag by nearly 50%, and also helps input-to-server lag slightly. This doesn't even take into account that monitor manufacturers often advertise their response time in an optimal grey-to-grey scenario. A 2ms advertised VA panel can have up to 50ms of response time in dark scenes. This will lead to tremendous ghosting at high refresh rates.

For input, most games use an internal polling rate that is either tied directly to the framerate or a separate internal clock. Each time the polling window is open, the game will take the last available input from the mouse and keyboard. Extreme polling rates do not have the same effect that high framerates do. A 500Hz mouse is only 1.5ms slower than a 2000Hz mouse when both are in a worst-case scenario. When both are in a best-case scenario, they have effectively 0 input lag.
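
If you want rough numbers for both effects, the arithmetic looks like this (under the two-frame assumption above, not a measurement of any specific game):
```python
# Rough arithmetic only: assume input is processed over two frames
# (one CPU frame + one GPU frame), and that a mouse adds at most one
# polling interval of delay in the worst case.
def pipeline_lag_ms(fps: float) -> float:
    return 2 / fps * 1000           # two frames' worth of time, in ms

def worst_case_poll_delay_ms(polling_hz: float) -> float:
    return 1 / polling_hz * 1000    # you just missed a poll

for fps in (60, 144, 240):
    print(f"{fps} fps: ~{pipeline_lag_ms(fps):.1f} ms pipeline lag")
    # 60 fps: ~33.3 ms, 144 fps: ~13.9 ms, 240 fps: ~8.3 ms

for hz in (500, 1000, 2000):
    print(f"{hz} Hz mouse: up to {worst_case_poll_delay_ms(hz):.1f} ms added")
    # 500 Hz: 2.0 ms, 1000 Hz: 1.0 ms, 2000 Hz: 0.5 ms
```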
 

sxk1277

Great
Mar 19, 2020
129
3
85
It's very dependent on the game engine, but inputs are generally processed over two frames (one CPU, one GPU). This results in at least 33.3ms of input lag at 60fps, and 13.9ms at 144fps. Add to that things like monitor response time and input lag (which are unrelated to the "double kill" scenario we're talking about) and input device latency. Going from a monitor with 5ms advertised response time to 2ms only reduces your visual input lag by less than 10% at 60fps and has no effect on input-to-server lag, whereas stepping up to 144fps improves visual input lag by nearly 50%, and also helps input-to-server lag slightly. This doesn't even take into account that monitor manufacturers often advertise their response time in an optimal grey-to-grey scenario. A 2ms advertised VA panel can have up to 50ms of response time in dark scenes. This will lead to tremendous ghosting at high refresh rates.

So if you buy a really powerful CPU that doesn't need a GPU, like an APU, would your inputs be processed in 1 frame? Where would the input lag be at 240 fps for a separate CPU & GPU? Would playing games in grey mode improve response time, since it would be closer to the advertised grey-to-grey number?
 
Last edited:

MasterMadBones

Distinguished

So if you buy a really powerful CPU that doesn't need a GPU, like an APU, would your inputs be processed in 1 frame? Where would the input lag be at 240 fps for a separate CPU & GPU? Would playing games in grey mode improve response time, since it would be closer to the advertised grey-to-grey number?
APUs don't have any input lag benefit, because the GPU part still has to wait for the CPU to complete its work. A slight advantage may be that the CPU and GPU share the same memory space, so data from the CPU doesn't need to be moved to VRAM every frame, but you will sacrifice framerate because a discrete GPU is simply faster. The amount of data to be moved is rather small as well, since all the assets should already be in VRAM.

Input lag at 240 fps would be a minimum of 8.3ms. The calculation is 2/framerate.

Grey mode doesn't work like that. The pixels in monitors are divided into red, green and blue subpixels, which are combined to make other colours. If everything on the screen is on a greyscale, it means that each individual pixel has the same brightness for each subpixel, but that brightness is different from pixel to pixel. In theory, if a monitor has faster red subpixels for example, completely disabling green and blue could improve response time, but all you will see is shades of red.

Some monitors (especially VA panels) have a gaming mode, which adjusts the screen's overall brightness to minimise response time by avoiding slow brightness levels. This comes at the expense of colour accuracy. Many monitors also have a separate "quick response time" mode, which allows the pixels to "overshoot" the target brightness a little if that helps them reach it faster. For example, say you need a pixel to go from 20% to 80% brightness, which takes 8ms. However, it takes 5ms to go from 20% to 100% and 2ms to go from 100% to 80%. So instead of going straight from 20% to 80%, it's faster to do 20%-100%-80%. A byproduct of this is that fast-moving objects can have oddly coloured edges.
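
Using the made-up numbers from that example:
```python
# The illustrative numbers from the example above: overshooting is faster.
direct_20_to_80_ms = 8.0        # 20% -> 80% directly
via_overshoot_ms = 5.0 + 2.0    # 20% -> 100% (5 ms) then 100% -> 80% (2 ms)

print(direct_20_to_80_ms, via_overshoot_ms)  # 8.0 vs 7.0 ms, so overdrive saves 1 ms
```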
 

MasterMadBones

Distinguished
Try the grey mode yourself though. You will notice YouTube videos run significantly smoother. Why do you think that is?

I think that is human perception. We might be more sensitive to colour stutter than to brightness stutter, because the central part of the retina (the macula lutea) is more sensitive to colour.

Another explanation is that YouTube uses VP9 encoding, which uses the YCbCr colour space. Grey mode effectively eliminates the need to decode the Cb and Cr channels, which could make playback smoother.