G-Sync Technology Preview: Quite Literally A Game Changer

Page 7
Status
Not open for further replies.


So you believe that anyone who thinks they can see a difference between 24 FPS and 60 FPS is lying, an idiot, or experiencing a placebo effect?

You do know that most people here can verify the difference.
 

pepe2907

Distinguished
Aug 24, 2010
643
0
19,010
@bystander
What I believe is that it was scientifically proven /quite a long time ago/ that people are unable to see what is drawn in a picture shown for up to 1/24 sec between other frames.
Meaning - if you are able to notice a frame with such a problem, it's being presented to you for longer than is needed to notice it. Meaning what you see is stuttering mid-frame /and I don't know of a particular reason for stuttering to happen on full frames only/.
Meaning the problem you have is with stuttering, and this won't help much with it /it may even make it worse/, but at least you'll be able to enjoy a perfect frame /although for a bit longer/ when that happens /and then it will jump more/.
I should also mention that I have a few years of experience in real-time visual presentation & simulation of events - mostly for safety & hazards management and training /at least part of it provable/.
And also - do I believe in advertising? Yes, I do. Do I believe in /the accuracy of/ everything that's being advertised? No, I don't. But I also believe there are people who do /believe/ - I usually call those people idiots /:sarcasm/, but that's a personal opinion.
Ah, and I can't help mentioning that I remember watching Windows Vista get mass-advertised /and "tested"/ before release /and the same again with the new magic Windows 8 Metro UI/ - at the time I was even banned from a "tech" site /it's ExtremeTech/ for opposing the advertised opinion.
;)
 

Adroid

Distinguished


Dude, it sounds like you have an opinion like everyone else, but anything less than 60 frames per second IS noticeable, and less than 30 is choppy as hell.

How do you explain subliminal messages in movies that were outlawed because they were effective, even though people couldn't "see" them?

I don't know whether you can or can't see more than 30 FPS as some claim, but having another 120 frames between the 30 that you do perceive makes for a much more fluid experience. Unfortunately, the conscious mind cannot "choose" which frames you perceive. I think N-Sync (or G-Sync, as some may call it) helps with that, at least in theory.

The human mind and body are a bit more complicated than you and science make them out to be.
 


I recall that in about 1991 I participated in a study that flashed images for fractions of a second and later tested whether I could recall those images. This sounds a bit like the kind of study you may have been part of. The problem is, this is NOT the same as being able to notice that something is there, or being able to see a disruption.

Like I said, almost everyone here has played games at both 30 FPS and 60 FPS, and they can tell the difference quite clearly. Most people here can tell the difference in tearing between 30 and 60 FPS. According to you, we can't.

Then there are other problems with your facts. A lot of people get headaches or become nauseated when frame rates are uneven, when they see tearing, or when they experience latency. I've found that I require 80+ FPS on a 120 Hz monitor without v-sync, or I get nauseated while playing mouse-driven first-person games for more than an hour. I've run into a few others with similar issues. If there were no perceptible difference between 24 FPS and higher, this would not happen.

There have been lots of studies on what we can see. Google away - they constantly contradict each other. The problem they run into is the parameters of these experiments. Some show we can tell differences in frame rate up to 1000 FPS; some say 24, which you have to know sounds absurd to any gamer, or to anyone who has watched The Hobbit in its original format.
 

 


Well, I don't really have to say much. Everyone here knows better.

But here, this should help you understand. See for yourself the difference between 30 and 60 FPS. If you have a 120 Hz monitor, it shows the differences all the way up to 120 Hz; you need at least a 60 Hz monitor to see the 30 vs. 60 difference.
http://www.testufo.com/#test=framerates

Another thing you might not know: with motion blur due to slow pixel response, the differences can be harder to see, because everything gets blurred. However, as long as your monitor isn't horrible in terms of motion blur, it is clear as day which looks smoother.

Now here is one that shows side-by-side video of these differences. You tell me, after watching this, what effect v-sync vs. non-v-sync has. http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-Visual-Effects-Vsync-Gaming-Animation

This one raised a lot of question marks as to which was better, v-sync or non-v-sync, at frame rates between 30 and 60 FPS. One had stutter; the other had tearing, which has a slightly different stutter feel. G-Sync removes this question, as it takes the best of both.
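A rough sketch of why the question even arises: with fixed-refresh v-sync, a game running below 60 FPS has to wait for refresh ticks, so frame-to-frame intervals alternate instead of staying even, while a variable-refresh scheme presents each frame the moment it is ready. All the timings below are illustrative, not measurements of any particular hardware:

```python
import math

# Frame pacing for a game that renders at 45 fps on a 60 Hz panel.
# With fixed-refresh v-sync, each finished frame waits for the next
# refresh tick; with a variable-refresh scheme the panel refreshes
# the moment a frame is ready.

REFRESH_MS = 1000 / 60   # fixed scan-out interval (~16.7 ms)
RENDER_MS = 1000 / 45    # time to render one frame (~22.2 ms)

def vsync_present_times(n_frames):
    """Presentation times when every frame waits for a refresh tick."""
    times, t = [], 0.0
    for _ in range(n_frames):
        t += RENDER_MS                           # frame finishes here
        tick = math.ceil(t / REFRESH_MS - 1e-9)  # next refresh boundary
        times.append(tick * REFRESH_MS)
    return times

def adaptive_present_times(n_frames):
    """Presentation times when the display refreshes on frame completion."""
    return [RENDER_MS * (i + 1) for i in range(n_frames)]

def deltas(times):
    """Frame-to-frame intervals in ms, rounded for readability."""
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

# v-sync alternates short and double-length intervals (visible judder),
# while the adaptive schedule delivers an even ~22.2 ms cadence.
print("v-sync:  ", deltas(vsync_present_times(8)))
print("adaptive:", deltas(adaptive_present_times(8)))
```

The uneven cadence in the first list is the stutter people describe at sub-60 FPS with v-sync on; the even second list is what a variable-refresh display delivers.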
 

mdrejhon

Distinguished
Feb 12, 2008
71
10
18,645
This is very true: 60 Hz versus 120 Hz is less distinct on a QNIX QX2710 (slower transitions) than it is on an ASUS VG248QE or BenQ XL2420TE (faster transitions).

There are two different response measurement standards:

1. GtG response: the time a pixel spends mostly in a TRANSITIONING state
2. MPRT response (aka "persistence" or "sample-and-hold"): the time a pixel spends mostly in a STATIC state

(There's a lot of fuzziness between the two, since a pixel transition often slows down gradually as it approaches its final value.)

Google "MPRT response" ("Moving Picture Response Time") for science papers on scholar.google.com, for example. The surprising thing today is that many monitors with 1-2 ms GtG are actually at 16.7 ms persistence - and manufacturers don't reveal this. Manufacturers have been quoting the easier rating (GtG response), while motion blur is now mostly dictated by persistence, since persistence values are much larger than GtG values on modern, faster LCDs.

Incidentally, solving this has recently involved some form of strobing, which is finally becoming more popular in high-end gaming monitors (e.g. LightBoost, EIZO Turbo240 strobing, BenQ Blur Reduction in the upcoming XL2720Z, and the G-Sync strobe mode noted by Tom's Hardware and John Carmack). More brands of post-LightBoost-era strobe backlights are coming too, from what I know. For those, most of the pixel transition needs to fit within less than one refresh cycle in order to gain further motion-clarity improvements via strobing. Inter-refresh crosstalk (remnant pixel transitions leaking into the next refresh) is becoming fainter, and is almost nonexistent on some of the better strobe-backlight models I've seen.
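The GtG vs. persistence distinction translates directly into blur width: while the eye tracks a moving object, each static frame is smeared across the retina for as long as it stays lit. A minimal sketch of that arithmetic (sample values are illustrative):

```python
# Perceived motion blur on a sample-and-hold display: the eye keeps
# moving while a static frame stays lit, so the smear width depends on
# persistence (MPRT), not on GtG pixel-transition speed.

def blur_px(persistence_ms, speed_px_per_s):
    """Blur width in pixels: distance the eye moves while one frame is lit."""
    return persistence_ms / 1000 * speed_px_per_s

# A 60 Hz sample-and-hold LCD holds each frame ~16.7 ms even with 1 ms GtG:
print(blur_px(16.7, 1000))  # ~16.7 px of smear at 1000 px/s
# Strobing the backlight for ~1 ms per refresh cuts persistence instead:
print(blur_px(1.0, 1000))   # ~1 px of smear
```

This is why a 1-2 ms GtG monitor can still look blurry in motion, and why strobe backlights attack persistence rather than transition speed.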
 

mdrejhon

Distinguished
Feb 12, 2008
71
10
18,645
I agree that human vision is more complicated than most think.
I think both you and bystander are right, because these measured different aspects of human vision:

- FACT: Most human eyes CAN'T tell 30 Hz from 60 Hz flicker when staring at it with stationary gaze (no eye movement) in a darkened room. Scientifically proven.

- FACT: Most human eyes CAN tell apart 500fps@500Hz versus 1000fps@1000Hz indirectly, during motion tests in ideal circumstances, via indirect effects such as motion blur and stroboscopic effects (e.g. the wagon-wheel effect, the mouse-dropping effect). Scientifically proven. There are people who see DLP rainbows, PWM dimming artifacts, wagon-wheel effects, motion blur effects (1 ms persistence = 1 pixel of motion blur at 1000 pixels/second), and many other indirect side effects of finite-refresh-rate displays.

Sounds contradictory? No -- the two facts above measure completely separate things. They are separate scientific effects, which means you are correct AND I am correct -- but for very different reasons. Humans cease to see the stop-motion effect above roughly 30 fps (the exact number varies by science paper); everything above that looks like real motion instead of stop motion. That's why a lot of people consider playing below 30 fps "unplayable" while 30 fps and above is "playable" (but obviously, the higher the better!). However, there are other effects through which going well above 30 fps, even well above 120fps@120Hz, still continues to benefit human vision in different ways, for different reasons.

Let's consider the extreme theoretical holodeck Turing test ("I can't believe I was actually standing in a holodeck!"): attempting to distinguish virtual imagery from real-life imagery. In such a test, you really need refreshrate-less displays or ultra-high-refresh-rate displays. Frame rates and refresh rates are human inventions, as there is no other practical way to display motion on a screen. Human eyes do not operate with a refresh rate, and real life does not have a frame rate -- objects in continuous motion are never static at any instant.
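The stroboscopic side of this is easy to put in numbers: at any finite refresh rate, a moving object is drawn at discrete positions, jumping a fixed distance per refresh. A small sketch (values illustrative):

```python
# Stroboscopic artifacts on finite-refresh displays: a moving object
# jumps a fixed distance between refreshes instead of moving smoothly.
# That step size (not flicker fusion) is why very high refresh rates
# remain distinguishable during motion.

def step_px(speed_px_per_s, refresh_hz):
    """Distance an object jumps between successive refreshes."""
    return speed_px_per_s / refresh_hz

# At 1000 px/s, even 1000 Hz still leaves 1-pixel jumps:
for hz in (30, 60, 120, 1000):
    print(f"{hz:>4} Hz -> {step_px(1000, hz):.1f} px per refresh")
```

The step never reaches zero at any finite refresh rate, which is why the holodeck test above asks for refreshrate-less or ultra-high-refresh displays.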
 


Now I have...
Interesting, BUT:

"an eventual AMD driver update that enables control over this function."

:heink:

 


I didn't want to point out that bit. :ange:
 


Maybe because you have been branded an Nvidia fanboy a hundred times over. :D

Good thinking. But damn this could take FOREVER to be released in an open standard.
 


But at least it shows that they have taken note and feel that they have an answer.
 


Except, of course, that for their answer they have already declared they don't even have a plan to make it into a product; it only works on very specific monitors; and since it doesn't actually control v-sync but only has the ability to alter the sync speed slightly, you will still have to run with v-sync on and use triple buffering, which will still cause input lag and stuttering - it will just be reduced.

It's also just in the concept phase, which in AMD terms means it will be 2016 before it's remotely usable, and 2017 before it actually works after a few patches.

Don't get me wrong, I would love to see an open standard that supports what G-Sync does, but to call this even remotely a competitor to G-Sync at this point is not just grossly optimistic, it's downright loony.
 


Seen that, but I want AMD to show it running real games. When can we expect it?
 


Your guess is as good as mine mate! :lol:
 