Ryzen Versus Core i7 In 11 Popular Games


HavocdotEZ



Lol, if you have a 1440p monitor you are probably playing at 1440p. I doubt your computer can hold 144 frames per second in any new game at that resolution, so of course you will not see an improvement. If your graphics card and CPU can keep your frame rate around 144, you will notice exactly what I am talking about: smoother mouse movement, significantly reduced motion blur, and it has even been shown that objects appear on a high refresh rate monitor fractions of a second before they do on a 60 Hz monitor. Whether or not you can react to that is your problem, but it definitely gives you a competitive advantage.
 

prudovik

You say, "We don't expect the Ryzen 7 processors to beat Intel's Core i7-7700K and Core i5-5600K in most games."
Maybe you mean the Core i5-6600K or Core i5-7600K, but a Core i5-5600K... Really?
 


Anything over 60 FPS has less of an effect if you're using a 60 Hz monitor, though.

And the thing is, most gamers still use 60 Hz monitors.
 

HavocdotEZ



Yes, I know... But benchmarks over 60 fps still matter for people who have high refresh rate monitors. People are discrediting them as if they don't matter at all.
 


Actually, there are a few things wrong with your argument.
First, if frames over 60 Hz don't matter, then why is the VR industry so focused on 90 or even 120 Hz? I know, different applications have different requirements, but it is proof enough that it makes a difference.

Second, the part about reaction time:
Highly competitive FPS players can have reaction times around 200 ms (taking into account mouse/input lag, and being quite conservative).
So take two highly competitive players with similar skill levels and equipment, and place them against each other.
Add to that 16 ms for a 60 Hz display, or 7 ms for a 144 Hz one. Now you have a reaction time of 216 ms versus 207 ms on the pull of a trigger. A 4-5% reaction time difference may not mean much at low skill levels or between very dissimilar players, but at high skill levels it can actually help someone quite a bit.
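
To make that arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. The 200 ms baseline and the "one frame interval of added display latency" model are assumptions taken from this post, not measured values.

```python
# Back-of-the-envelope estimate: baseline reaction time plus one refresh
# interval of display latency. The 200 ms figure is an assumption from the
# post above, not a measurement.

BASELINE_REACTION_MS = 200.0  # assumed reaction time of a highly practiced player

def total_reaction_ms(refresh_hz: float) -> float:
    """Baseline reaction time plus one full refresh interval of display latency."""
    return BASELINE_REACTION_MS + 1000.0 / refresh_hz

for hz in (60, 144):
    print(f"{hz:>3} Hz: {total_reaction_ms(hz):.1f} ms total "
          f"({1000.0 / hz:.1f} ms frame interval)")

slow, fast = total_reaction_ms(60), total_reaction_ms(144)
print(f"Difference: {slow - fast:.1f} ms (~{(slow - fast) / slow * 100:.1f}%)")
# Prints roughly 216.7 ms vs 206.9 ms, a difference of ~9.7 ms (~4.5%),
# which is where the 216 ms vs 207 ms and 4-5% figures above come from.
```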

As for the subjective part: there IS a difference.
The difference might not be significant for some people, while it is vital for others. If I had to choose, I'd rather play at ~20 fps on high settings than 60 fps on low settings, and some consider that crazy.
Just like some people are really bothered by flicker, and some don't even notice it.
I am extremely bothered by the grainy look of anti-glare panels, and I'd much rather have a glossy, anti-reflective panel.
I also prefer a high-DPI monitor (>120 DPI) 40-50 cm away from me, rather than a big screen 2 m away. It is all about preferences, and everyone has their own. How much is it worth? That depends on how much you care about that particular aspect.
 
First, if frames over 60 Hz don't matter, then why is the VR industry so focused on 90 or even 120 Hz? I know, different applications have different requirements, but it is proof enough that it makes a difference.

Can't even ... what the ...


So much is wrong with what you wrote... this is basic biology. Human eyes don't see in "FPS" or "frames", and there aren't any superhuman cyborgs with special vision. The only difference higher refresh rates make is in extreme contrast scenarios, because they allow for multiple transitional frames and a gradual shift in light pattern versus a stark White -> Black -> White or Bright Red -> Blue -> Red type shift. It provides absolutely zero "competitive" advantage; it's one of those placebos that's about as effective as wearing your special underwear and socks.
 


My computer is a custom water-cooled rig with dual GTX 980 Tis; it can do the required frame rates at that resolution. The entire reason I put that kind of money into it was 3D gaming, hence the Asus ROG monitor paired with overpowered graphics and CPU.

Mouse blur... seriously, that's a Windows artifact... and that's what you're basing your assumptions on.
 

HavocdotEZ



There is something wrong with your eyes. You can ABSOLUTELY see the difference between 60 Hz and 144 Hz. If you even bothered to search for videos on YouTube you wouldn't look like a fool right now. Below is just one of MANY videos that demonstrate the difference.
https://www.youtube.com/watch?v=928VyYQxKKo
 
The article above explains the biology better than I could.

“The middle part of your vision, the foveal region, which is the most detailed, is actually pretty much garbage when it comes to detecting motion, so if you’re watching things in the middle of the screen moving, it’s not that big a deal what the refresh rate is; you can’t possibly see it with that part of your eye.”

But out in the periphery of our eyes we detect motion incredibly well. With a screen filling their peripheral vision that’s updating at 60 Hz or more, many people will report that they have the strong feeling that they’re physically moving. That’s partly why VR headsets, which can operate in the peripheral vision, update so fast (90 Hz).

Chopin looks at the subject very differently. “It’s clear from the literature that you cannot see anything more than 20 Hz,” he tells me. And while I admit I initially snorted into my coffee, his argument soon began to make a lot more sense.

He explains to me that when we’re searching for and categorising elements as targets in a first person shooter, we’re tracking multiple targets, and detecting motion of small objects. “For example, if you take the motion detection of small object, what is the optimal temporal frequency of an object that you can detect?”

And studies have found that the answer is between 7 and 13 Hz. After that, our sensitivity to movement drops significantly. “When you want to do visual search, or multiple visual tracking or just interpret motion direction, your brain will take only 13 images out of a second of continuous flow, so you will average the other images that are in between into one image.”

Discovered by researcher Rufin vanRullen in 2010, this literally happens in our brains: you can see a steady 13 Hz pulse of activity in an EEG, and it’s further supported by the observation that we can also experience the ‘wagon wheel effect’ you get when you photograph footage of a spinning spoked object. Played back, footage can appear to show the object rotating in the opposite direction. “The brain does the same thing,” says Chopin. “You can see this without a camera. Given all the studies, we’re seeing no difference between 20hz and above. Let’s go to 24hz, which is movie industry standard. But I don’t see any point going above that.”

The important thing here is that Chopin is talking about the brain acquiring visual information which it can process and on which it can act. He’s not saying that we can’t notice a difference between 20 Hz and 60 Hz footage. “Just because you can see the difference, it doesn’t mean you can be better in the game,” he says. “After 24 Hz you won’t get better, but you may have some phenomenological experience that is different.” There’s a difference, therefore, between effectiveness and experience.

And while Busey and DeLong acknowledged the aesthetic appeal of a smooth framerate, none of them felt that framerate is quite the be-all and end-all of gaming technology that we perhaps do. For Chopin, resolution is far more important. “We are very limited in interpreting difference in time, but we have almost no limits in interpreting difference in space,” he says.

Your brain's got a clock rate...

Sensing a flicker, which is just a sharp Black -> White -> Black transition, isn't the same as sensing motion, nor the same as synthesizing visual information into something the brain can act on. Higher frame rate monitors mostly minimize the effects of those sharp contrast transitions by providing more intermediate frames for a smoother transition. Pulsing light versus a strobe effect won't do jack shit for "competitive ability". I'm typing this on an Asus ROG Swift at 144 Hz and right next to it is a large Acer monitor at 60 Hz; the Asus is much easier on my eyes after hours of reading and working (black text on a white background).
 


My post was mainly about the added lag from the information (server) to the frame on screen. There is a difference, and that difference is about as big as comparing a 1 ms mouse with a 10 ms mouse. Not extreme, not even big, but enough to give an extra "edge" in a closely matched competition, where every single detail matters and where a 4-5% speed difference can be significant.
 


It doesn't add any lag whatsoever, because your brain only works so fast and any additional information just gets interpolated with the rest before the next action is taken.

What you and every other half-informed person are thinking of isn't the path from the monitor to you, but from you to the monitor and back again. Consoles from the PS2 era and earlier (maybe even the PS3) had their input buffers wired to be read every time a frame was drawn, and this practice was kept in those shoddy console ports. Thus raising the FPS also raised how often the input buffer was read. What's worse, many of those titles actually read the buffer every 4th frame instead of every frame, and thus created massive lag at low or inconsistent FPS that magically "went away" at really high FPS. This problem has mostly been resolved by handling input buffers on a separate thread from physics and rendering, though some games might still cheat and not actually DO anything with the input until a specific number of frames has passed. That is poor software design, not your eyeballs having superhuman mutant capabilities that let you frag 6 ms faster than the next guy (see the sketch below). Frame rate consistency is far more important than absolute frame rate: a consistent 60 Hz will do better than a rate jumping between 60 and 80, precisely because the eyeball doesn't interpret information in "frames" but as a steady stream that it analyzes at predetermined intervals.
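
For illustration only, here is a hypothetical Python sketch (not taken from any real engine) of why tying input polling to rendering couples input latency to frame rate, as described above; the 4-frame polling interval and the frame rates are assumed values.

```python
# Hypothetical sketch: when the input buffer is only read every Nth drawn
# frame, worst-case input lag is N frame intervals, so it shrinks as the
# frame rate rises.

def worst_case_input_lag_ms(fps: float, poll_every_n_frames: int) -> float:
    """A press that lands just after a poll waits N full frames to be seen."""
    frame_time_ms = 1000.0 / fps
    return poll_every_n_frames * frame_time_ms

# Old-style port: input buffer read every 4th drawn frame (assumed interval).
for fps in (30, 60, 120):
    print(f"{fps:>3} fps, poll every 4 frames: "
          f"{worst_case_input_lag_ms(fps, 4):.0f} ms worst-case input lag")

# Prints roughly 133 ms -> 67 ms -> 33 ms, which is why the lag seemed to
# "go away" at high frame rates. Reading input on its own thread at a fixed
# rate decouples input latency from frame rate entirely.
```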

We are humans, and humans have reaction times measured in hundreds of milliseconds. A 200 ms response is stupidly fast even for a well-trained fighter pilot; normally we'd be talking about 300~500 ms for someone's brain to observe input, analyze it, decide on an action, and then transmit those commands to the muscle groups. What happens is that the brain is capable of batch processing: it can create an action consisting of multiple impulses and then hand it off to the muscle control system, which executes all of them without needing any further input. That is how we can anticipate things and create the appearance of "faster" reaction times.

The science on the human visual system is pretty well known, and it's conclusive that a high refresh rate doesn't do jack shit for reaction times and is thus about as helpful for "competitive advantage" as lucky underwear.
 


If that is true, then I have to admit my mistake. From my understanding, it IS a steady stream of information, but I understood it as being constantly processed, not in batches.
Meaning, if we get an impulse 10 ms earlier (t = -10 ms), it is analyzed (and processed, etc.) 10 ms earlier than if we hypothetically got it at t = 0 ms.

I find this to be a very interesting topic, and I must admit to being incorrect. Although I will also say that I am not convinced yet, you seem to be more informed than I am.
So I will grant that you are right for now, but I will read more on the subject to confirm it.

Thinking about it, what we need to process (in games) is actually an 'event', like the sudden appearance of an enemy. Our brain would be waiting for that and react once it sees (and processes) it. Wouldn't that 'event' act like a 'trigger'? It's hard to explain; I hope you get what I mean and can shed more light on this.



To further explain what I was saying, I will also add that no, I was not thinking about "behind the monitor". It was about the fact that you have a constantly changing stream of information: if you have double the frame rate (or update cycle of that information on your screen), you are getting that information faster.
To add to this specific point, I realised that, since this is a completely discrete stream of information, doubling the update rate actually means a 50% chance of getting the information one cycle (frame) earlier and a 50% chance of getting it at the same time (depending on when the specific change happens), so the effect would be cut in half on average (if it was 5%, it would only apply to 50% of cases, so 2.5% on average). I can give examples of what I'm trying to say if necessary; there is a quick sketch of the averaging below.
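
As a quick illustration of that averaging, here is a small Monte Carlo sketch in Python; the event timing model (an event landing at a uniformly random moment and only becoming visible on the next refresh) is an assumption for the example, not a claim about any specific game.

```python
# Monte Carlo sketch: an event happens at a random moment within a frame
# interval and only becomes visible on the next refresh, so the *average*
# added delay is about half a frame interval.

import random

def mean_display_delay_ms(refresh_hz: float, trials: int = 100_000) -> float:
    frame_ms = 1000.0 / refresh_hz
    # Delay is the time remaining until the end of the current frame interval.
    return sum(frame_ms - random.uniform(0.0, frame_ms) for _ in range(trials)) / trials

d60, d144 = mean_display_delay_ms(60), mean_display_delay_ms(144)
print(f"60 Hz:  ~{d60:.1f} ms average added delay")   # ~8.3 ms (half of 16.7 ms)
print(f"144 Hz: ~{d144:.1f} ms average added delay")  # ~3.5 ms (half of 6.9 ms)
print(f"Average advantage: ~{d60 - d144:.1f} ms")     # roughly half the full-frame
                                                      # difference, as argued above
```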
 
Another thing to note is that rods and cones each have their own sensitivity and response times. Cones, which handle your color sensitivity and virtually all of your detail, only work on light from the dead-center focus and only transmit 10~14 times per second. Slower response time is the price our color vision pays for having so much detail. Rods, which handle luminance, can only transmit light as intensity and are basically monochromatic (black-white). They operate dozens of times per second and are extremely asynchronous, yet they mostly cover our peripheral vision and transmit little detail.

Essentially our visual system operates on two levels. The first is motion sensing, which is primarily a job of the rods in our peripheral vision. After motion is detected, the eyes automatically focus and lock onto it, and the cones then sense much more information about what we are seeing so the brain can process it. The first part happens really fast and is the reason people *claim* to "notice 144 Hz!!!!", while the second part happens much more slowly and is what drives our OODA loop (Observe, Orient, Decide, Act).
 
If that is true, then I have to admit my mistake. From my understanding, it IS a steady stream of information, but I understood it as being constantly processed, not in batches.
Meaning, if we get an impulse 10 ms earlier (t = -10 ms), it is analyzed (and processed, etc.) 10 ms earlier than if we hypothetically got it at t = 0 ms.

The visual subsystem doesn't immediately act that way; the only t = 0 moment is when you wake up and open your eyes. Different parts of your eyeball interpret information differently and transmit it to the brain differently, and your brain interprets that information in different ways and reacts at different speeds. Your consciousness is really slow, while your subconscious responds a bit faster (fight or flight says hi). The batch processing has to do with your consciousness/subconsciousness and your muscle control center. Tap your finger as fast as you can and then try to suddenly, randomly stop. You'll notice your finger tries to move a few more times even after the stop command is issued; that is because your brain originally sent a sequence of commands to your muscle control center (I can't remember the exact names for all these regions) that basically said "move these muscles a lot without any further input", and it did exactly that. You didn't have to individually think about each tap. In fact, you can feel the difference in your brain's speed by trying the tapping again, but this time don't tap until after you have registered that the previous tap completed. If you're honest, you'll notice that paying attention to your taps makes them slower; that is your brain's processing speed, and the reason that any claim of "less than 16 ms matters for my games" is pure bullshit. The human brain simply isn't fast enough to process that increased information intake.

What people *notice* about higher-Hz displays is their rods reacting to the contrast changes between white and black display elements, like a white mouse cursor over a dark desktop. This is not what happens inside a competitive FPS, where the strobe effect of a white -> black -> white transition doesn't matter.
 


I will concede the point about competitive advantage, but I will insist on the point of being able to notice refresh rates higher than 60 Hz, even in gaming. And I mean "notice" in two senses:
1- Being able to tell it is there if actively trying to see it (not really important)
2- Affecting your subconscious state of immersion, especially if you are used to high refresh rates. This can be seen when someone who is used to a higher refresh setup goes back to 60. I guess low-detail peripheral vision plays a role here.

"Being used to it" makes a huge difference, and to be honest, I spent years not being able to notice 24fps from 60fps. I played everyday in a ~20fps setup, and one day many years ago I upgraded to a constant 60fps one, and didn't notice any difference. After a few months, wen't back to a fixed constant (locked) 24fps setup, and noticed it a lot (something I had never noticed before).

And, to be honest, I can very clearly see constant/fixed 24 fps as "slow", where you can actually see the "frames" if you pay attention (more like a jittery movement), so I guess the 10-14 times/sec figure only applies in some situations?

I just want to clarify: I'm not "arguing" or "antagonizing", I'm trying to discuss a topic I consider interesting, getting some good info out of this, and presenting my views as well.
 


Points 1 and 2 are easy to understand if you remember that your eyes' rods respond much faster than your eyes' cones. The rods sense changes in light intensity, which is what happens when something moves in your peripheral vision. It's already known that rods can sense flickering lights at 60 Hz and up to about 90 Hz, which is what you and everyone else are noticing. Cones, which sense color and detail, work at a fraction of the speed of rods and are clustered so that the center of your vision gets their attention. Movies are 24 fps because they're static and rarely contain rapid flashes of light or sudden rapid contrast shifts, unless the director wants that unpleasant effect. Because of this, the area where a really high refresh rate helps the most isn't video games but reading black-and-white text for hours on end, after you turn down the brightness of course. If you want to see this in action, make a black circle surrounded by a smooth non-black area and set your mouse cursor to white. Now move the mouse around the lighter area, then around the black area: there should be a noticeable difference at 50~60 Hz, and that difference goes away as you approach 90 Hz.

Try to remember that different parts of your eyes work at different speeds, so just because your rods detect changes in light intensity near the edges of your vision doesn't mean your cones are reporting detailed color changes in the center of your vision.
 



Then I will insist that, even based on the points you gave now, frame rates higher than 60 fps actually have an effect and are not "imperceptible".
As I said several posts above, how much they impact your experience is up to each person; different people are annoyed (or pleased) by different things. But it certainly has a perceptible effect.

 


For eye comfort, sure, nobody should be arguing that; the strobe effect from flickering can make your eyes tired over time. What people are saying is that their super bionic eyes can react faster and interact better at higher refresh rates, which is complete bullshit. The center of the eye, which is where detailed information comes from, simply doesn't react that fast. "It looks smoother", "it feels faster", "I can react quicker" are all placebo. Humans can sense artifacts at very high speed, but we can't draw information from them because they get smeared by all the incoming light after they happen.

And back to the subject at hand: this was all about "fps in gaming", which means drawing useful information out of an image to form a mental picture of what's happening on the screen. The limiting factor there isn't your monitor's refresh rate or your graphics card but your eyeball's ability to sense color and your brain's ability to derive useful information from it. This happens about a dozen times per second on average, fourteen to fifteen at most for people with trained eyes. In order to create an optical illusion you need to go twice as fast as the maximum perception rate, so 20~30 individual images drawn sequentially every second. That's why movies are at 24 fps; that number wasn't random, and production studios don't spend billions of dollars every year to create slideshows. That rate is sufficient to create the illusion of motion, but it doesn't handle random artifacts very well: any unforeseen interruption or dramatic contrast shift will be felt by the audience as discomfort. To handle that properly we need to insert a few extra frames to create more samples and less light smear in the eye, and so we get to 50~60 fps for a fluid image with minimal smearing. That number as the fps target for games wasn't random either. Past that point any additional visual information just gets smeared in with the rest, and the only part of the eye that even notices is the rods, which detect changes in light intensity up to about 45 times per second (so you need 90 fps to completely fool them). Again, we're not detecting color shifts or objects moving around; we're only detecting light intensity changing, and we don't even have information on exactly what changed, only that ~something~ changed and a general direction (inferred because each eye senses a different intensity shift).

Humans didn't evolve to play video games, we evolved to avoid danger in the jungle and hunt smaller animals.
 

HavocdotEZ



Clearly the article interviewed highly knowledgeable professors in the field of vision and the brain, and in the conclusion they all stated that you CAN see a difference in smoothness. So who the heck are you to say they are incorrect? What are your credentials?

"Whether that plateaus at 120 Hz or whether you get an additional boost up to 180 Hz, I just don’t know.”
“I think typically, once you get up above 200 fps it just looks like regular, real-life motion,”
"But in more regular terms he feels that the drop-off in people being able to detect changes in smoothness in a screen lies at around 90Hz."

1. Professor Thomas Busey, associate department chair at Indiana University’s Department of Psychological and Brain Sciences
2. Jordan DeLong is assistant professor of psychology at St Joseph’s College in Rensselaer, and the majority of his research is on visual systems

 
Read it again: they were speaking about your eyes' rods, not the cones. Rods detect changes in light intensity, not detail; the cones, which detect color and detail, operate much more slowly.

You don't "see" faster then 10~15 fps but your peripheral vision can detect sudden light changes (flicker) at 45+ times per second. In general you need at 2~3x the maximum detect rate for the illusion of motion and thus 24~30fps for controlled preproduced motion images and 45~60 for spontaneously generated ones.

Picture a proto-human stalking the African savanna: suddenly his rods notice a change of light in his peripheral vision, which notifies his brain that something moved; he then turns his head or refocuses his eyes onto the movement, and his much slower cones scan the object and provide color and detail on what it is. The human visual center only processes 7~14 individual images per second; any additional information since the last processing pass is just added together and creates a smeared image. You can notice this effect if you wave your hand really fast in front of your face: those after-images are the result of light being smeared and added together inside your visual center before your brain can process it.

This is well-known biology, but keep believing that gamers have special bionic eyes that see and react before the brain has even processed the information.
 

HavocdotEZ



No one is arguing about the reaction time of gamers. We are all saying that a 144 Hz monitor is smoother than a 60 Hz monitor, which you keep denying. Thus a gaming experience on a 144 Hz monitor would be better than on a 60 Hz monitor for most people.

You keep trying to break down the eye into individual components as if we were robots. It's like trying to tell someone that purple is not a color and that they are actually just seeing blue and red. Nobody cares; it's purple to us, and nobody cares what your rods and cones detect: a 144 Hz monitor is smoother than a 60 Hz monitor. End of story.
 


Please stop cluttering the thread with this misinformation. It's easy to tell the difference just between 60 FPS and 90 FPS, and VR headsets had to have these higher refresh rates (and the system to render enough frames) to avoid giving people VR sickness. If you couldn't see the difference, that wouldn't have been necessary, but the fact that it was necessary conclusively proves that you can see the difference.
 


The thing is, motion in film is depicted on-screen differently than it is in games. When a film is recorded, the shutter for each frame remains open for a large part of that 1/24th of a second (typically about half), recording all motion during that window. As a result, a still of the frame appears blurred, but when played back, each frame smoothly leads into the next and your brain sees it as smooth motion. And of course, there's no need to interact with the scene at all, let alone in a quick and accurate manner.

A game, on the other hand, only depicts each frame as an instantaneous moment in time. The scene isn't a smooth blur from one frame to the next, but more like a series of photographs taken at a high shutter speed. If you play those perfectly sharp photos back at 24 fps, the result looks rather choppy, since there's no transition between one photo and the next. If an object zips across the screen, it looks like it's teleporting between a handful of locations, rather than smoothly blurring as it would in real life (the rough numbers below show how large those jumps get). Some games try to mask this with an effect that vaguely resembles natural motion blur, but it only does a rough job of emulating it and can make things look worse at low frame rates. Now, imagine you're trying to aim at a small, moving target in the distance, and you see that target skipping between locations as you move your mouse to redirect your aim. Your brain has a harder time judging the movement of things in that situation, and you likewise won't get as smooth feedback between the motion of your hand and your view on-screen.
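
As a quick numeric illustration of how large those jumps get, here is a short Python sketch; the on-screen speed (2000 px/s, roughly a fast camera pan) is an assumed value for the example, not a measured one.

```python
# With no motion blur, a moving object simply jumps by one step per frame.
# Assumed on-screen speed for the example: 2000 pixels per second.

SPEED_PX_PER_S = 2000.0

for fps in (24, 60, 144):
    step_px = SPEED_PX_PER_S / fps
    print(f"{fps:>3} fps: object jumps ~{step_px:.0f} px between frames")

# 24 fps: ~83 px jumps (visibly "teleporting" across the screen)
# 60 fps: ~33 px jumps
# 144 fps: ~14 px jumps (much closer to continuous motion)
```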

From your posts, I got the distinct impression that you don't actually play any competitive multiplayer shooters, and a quick look at your Steam profile seems to verify this. Virtually every game you play is either an RPG or a strategy game, with the occasional platformer thrown in: all relatively slow-paced games that don't require particularly fast reaction times or precise, split-second aiming with rapid camera movement. Those kinds of games don't benefit as much from increased frame rates, so I can see why you might not notice frame rates above 60 fps as easily in them. If you have hardware capable of pushing higher frame rates than you find useful, you might be better off with a higher-resolution monitor to make your games look a little nicer at lower frame rates.

I can say, though, that in competitive shooters where I can maintain sufficient frame rates, I can tell the difference between 60 and 75 Hz on my monitor, and I play a bit better at the higher refresh rate. The 25% increase in frames can make an actual difference and help smooth out aiming. The benefit is small, but it's there. Increasing the refresh rate further could help a bit more, although there are diminishing returns the higher you go, and a difference of 10 or 20 frames per second is likely to be pretty unnoticeable in the 100 Hz+ range.
 