News: When 240Hz Just Isn't Enough: Hands-on With the 360Hz Asus ROG Swift 360

360hz...
Oh, for fudge's sake...

There are already people swinging their massive e-peens around, gloating or whining about getting - or not getting - 200+fps while running monitors that can't even actually display it.
And of those who are interested in owning such a panel, only some are financially prepared for such an investment.

Not to mention, the benefits depend on one's individual reaction speed. Contrary to popular belief, a 240Hz monitor - and now a 360Hz one - will not help someone dominate in Fortnite, or snag all those headshots in CS:GO or COD.
If you don't have the reaction speed of Neo or Agent Smith, then fuggedaboutit!

Oh, this was funny though: "Okay, maybe calling 280Hz "modest" is crazy, but you trying staring intensely at a Dota map speeding by at 360 and 240 frames per second simultaneously and see if you don’t feel a little wacky too."
Don't forget LoL, as people need ludicrous fps for bloody PVP STRATEGY GAMES!!!
 
360hz...
Oh, for fudge's sake...
[...]

Yeah this is getting ridiculous.
 
Maybe in 2-3 GPU generations you will be able to achieve 360FPS to use that monitor. Totally pointless.

360hz...
Oh, for fudge's sake...
[...]
There are still other benefits of 240Hz+ that don't have anything to do with reaction time.

-- Higher Hz means frames are transmitted to the monitor faster, even at lower frame rates. 100fps has less lag on a 360Hz monitor than 100fps on a 144Hz monitor (Quick Frame Transport -- see the quick sketch after this list).
-- It's best to have a VRR range larger than your frame rate range, for no-compromises GSYNC/FreeSync.
-- Blurless sample-and-hold. Strobeless ULMB. (See Blur Busters Law: The Amazing Journey To Future 1000Hz Displays). Basically brute Hz as motion blur reduction.
-- Fewer stroboscopic artifacts. See Stroboscopic Effect of Finite Frame Rates.
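
A rough back-of-the-envelope sketch of the Quick Frame Transport point above, in Python. The numbers are purely illustrative (real monitors add processing and GtG overhead on top of raw scanout time):

```python
# One refresh cycle is the time it takes to transmit/scan out a frame to the
# panel, so a higher refresh rate delivers the same 100fps frame sooner.
def scanout_ms(refresh_hz: float) -> float:
    """Duration of one refresh cycle, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (144, 240, 360):
    print(f"{hz:>3} Hz: one refresh cycle = {scanout_ms(hz):.2f} ms")

# 144 Hz: one refresh cycle = 6.94 ms
# 240 Hz: one refresh cycle = 4.17 ms
# 360 Hz: one refresh cycle = 2.78 ms
# So a frame from the same 100fps game reaches the screen roughly 4 ms sooner
# on a 360Hz panel than on a 144Hz panel, independent of the game's frame rate.
```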

That said, you really do want to double your refresh rate in order to see the benefits much more clearly.
Comparing 360Hz vs 720Hz, or 120Hz vs 240Hz, is much more noticeable than 240Hz vs 360Hz.
 
[sigh]
Of course, someone has to defend the very niche products...

-- Higher Hz means frames are transmitted to the monitor faster, even at lower frame rates. 100fps has less lag on a 360Hz monitor than 100fps on a 144Hz monitor (Quick Frame Transport).
Lemme guess, lag in a unit of measurement (single-digit ms) that the Average Joe, or most users, won't notice? Like the difference between two DDR4-3200 memory kits, one CL16 and the other CL18?
Also, I'd be hella mad spending $500+ USD on that and running 100fps out of 360fps - I wouldn't give a derp about QFT (it would've slipped my mind) - I want to see mah big numbahs because it makes my e-peen bigger!
[^My own impression of those fps-obsessed folks. I don't actually think like that.]
100 on 144? That wouldn't bother me nearly as much.
Let's not forget the hardware required to drive 360Hz vs 144Hz...

-- It's best to have a VRR range larger than your frame rate range, for no-compromises GSYNC/FreeSync.
Ok. Can't really argue with that other than the pricing on the larger range models.

-- Blurless sample-and-hold. Strobeless ULMB. (See Blur Busters Law: The Amazing Journey To Future 1000Hz Displays). Basically brute Hz as motion blur reduction.
-- Fewer stroboscopic artifacts. See Stroboscopic Effect of Finite Frame Rates.
Well, I learned I'm not sensitive to stroboscopic effects, so that's a bonus, I guess.
But who benefits from ultra-high-refresh displays? Or perhaps it's better to ask how many people will benefit from them? The weakest link, or 'bottleneck', is an individual's eyes; it's even mentioned in the two articles.
There are people who can't even tell the difference beyond 60Hz.

That said, you really do want to double your refresh rate in order to see the benefits much more clearly.
Comparing 360Hz vs 720Hz, or 120Hz vs 240Hz, is much more noticeable than 240Hz vs 360Hz.
Perhaps, to your eyes, it is.
Ultra-high-refresh 1080p won't be a concern of mine. 120Hz is fine for me; I'll be moving up in resolution instead.

There are still other benefits of 240Hz+ that don't have anything to do with reaction time.
Still hasn't changed the displays from being niche products.
I guess the manufacturers milk enough from the fps-gamer whales to keep 'em coming... I take that back; the technology simply wouldn't progress as fast without them.

Seriously, how many game genres benefit? #1 is fps shooters, hands down... racing, and... the list drops off right there?


I wonder if these new displays are even true 280 and 360Hz. Perhaps Asus is using doublers and triplers (is that even a thing?) in those panels?
Welp, that one's in the air until monitor reviews are out.
 
Well, I learned I'm not sensitive to stroboscopic effects, so that's a bonus, I guess.
But who benefits from ultra-high-refresh displays? Or perhaps it's better to ask how many people will benefit from them? The weakest link, or 'bottleneck', is an individual's eyes; it's even mentioned in the two articles.
There are people who can't even tell the difference beyond 60Hz.
The bigger the difference, the larger the percentage of the population that notices it.

I certainly remember the DVD vs HDTV era: many couldn't tell the difference until they became familiar with it, and now most people can. We've found that people acclimate themselves to new high-Hz displays, too. Witness the high-Hz phones.

4K used to cost thousands. Now retina screens cost almost nothing extra. The same thing will happen to refresh rates, albeit at a slower pace. Ultra-high-Hz is not fundamentally expensive to build, so the costs will filter down.

Perhaps, to your eyes, it is.
Ultra-high-refresh 1080p won't be a concern of mine. 120Hz is fine for me; I'll be moving up in resolution instead.
The silver lining is that higher resolutions make motion blur more noticeable -- more pixels to blur across the same inches. So 120Hz vs 240Hz is more noticeable at 1440p than at 1080p. But you need more GPU power.
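
A minimal sketch of the "more pixels to blur over the same inches" point, assuming a fixed physical panning speed expressed in screen-widths per second (the speed and resolutions below are illustrative assumptions only):

```python
# For the same physical motion speed, per-frame persistence blur measured in
# *pixels* scales with horizontal resolution and shrinks with refresh rate.
def blur_px_per_frame(screen_widths_per_s: float, width_px: int, hz: float) -> float:
    return screen_widths_per_s * width_px / hz

speed = 0.5  # half a screen width per second (assumed example speed)
for width in (1920, 2560, 3840):
    b120 = blur_px_per_frame(speed, width, 120)
    b240 = blur_px_per_frame(speed, width, 240)
    print(f"{width}px wide: {b120:.1f} px blur @ 120Hz, {b240:.1f} px @ 240Hz")

# 1920px wide: 8.0 px blur @ 120Hz, 4.0 px @ 240Hz
# 2560px wide: 10.7 px blur @ 120Hz, 5.3 px @ 240Hz
# 3840px wide: 16.0 px blur @ 120Hz, 8.0 px @ 240Hz
```

So the same 120Hz-to-240Hz jump removes more pixels of blur at 1440p than at 1080p, which is the point being made.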

The good news is that frame rate amplification technologies will fill that gap in the coming years.

Still hasn't changed the displays from being niche products.
That's to be expected. 4K was a niche product. Retina was a niche product. Now it's widespread and even in basic phones.

Now 120Hz is becoming more mainstream, thanks to things like the new 120Hz iPads and coming 120Hz phones in 2020. The bleeding edge Hz research is filtering down.

Our expectation is that 240Hz becomes mainstream within a decade or two, with 480Hz and 1000Hz sometime beyond -- though 1000Hz monitors will become available to the niche well before then; there are lab tests of 1000Hz already. It also depends on whether GPUs keep up, thanks to new technology such as frame rate amplification, and on the point at which higher Hz becomes almost a $0 cost-add. The refresh rate race to retina refresh rates is a slow one, but all the canary signs of gradual mainstreaming are there.

Stages of bleeding-edgeness:
  • Lab prototype
  • Ultra expensive niche product
  • Semi expensive prosumer product
  • High end mainstream product
  • Low end mainstream product
4K is already hitting the low end, with Walmart $299 Boxing Day sale 4K UHD HDTVs. It's no longer the IBM T221 costing five figures in the early 2000s. See?

Not everyone can see the benefit of 4K, but it's there for nearly free. The same thing is happening to Hz.

Right now, 120Hz is slowly transitioning from "semi expensive prosumer product" to "high end mainstream product", with reports of less expensive phones considering the feature for 2021-2022. 240Hz is still a stage behind, 360Hz a stage behind that, and 480Hz is literally at the lab-prototype stage.

Also, in tests, more people could tell the difference between 60Hz and 120Hz than between 720p and 1080p when presented with a Google Maps panning test in TestUFO. Research shows benefits in surprising ways.

Inevitably there were people who laughed at 4K (much like you're laughing at 360Hz today). See more information about the refresh rate race to retina refresh rates.

240Hz vs 360Hz is way too incremental IMHO; it's 240Hz vs 480Hz that's a much bigger difference. One needs to go geometrically up the diminishing curve of returns for the jump to remain noticeable to average humans, up to their maximum eye-tracking speed.

Seriously, how many game genres benefit? #1 is fps shooters, hands down... racing, and... the list drops off right there?
No, it does not. It just takes a better understanding of the benefits. Even browser scrolling benefits -- that much has been said of the new 120Hz phones and prototype 240Hz phones. 240Hz scrolling has 1/4 the motion blur of 60Hz scrolling (see the quick check below).
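
A quick check of that 1/4 figure, under the usual sample-and-hold assumption (no strobing or black frame insertion):

```python
from fractions import Fraction

# On sample-and-hold, persistence blur scales with frame visibility time (1/Hz),
# so the relative blur of two refresh rates is the ratio of their frame times.
blur_ratio = Fraction(1, 240) / Fraction(1, 60)
print(blur_ratio)  # 1/4 -> 240Hz scrolling has one quarter the blur of 60Hz scrolling
```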

I wonder if these new displays are even true 280 and 360hz. Perhaps Asus are using doublers and triplers(is that even a thing?) in those panels?
Welp, that one's in the air until monitor reviews are out.
I have done high-speed videos and confirmed it's definitely doing true 360Hz. I also have an experimental 480Hz display, and I do work for manufacturers, so I know what I am talking about.

Also, GPUs will catch up thanks to frame rate amplification technologies.

The 360Hz monitor is probably not for you (but who knows!).

It is likely going to be built into your low-cost screen a couple of decades from now anyway. The cow is fully milked on spatial resolution, partially milked on HDR and blacks (FALD, OLED), and still being milked on retina refresh rates -- and will be for a couple (or more) decades to come, as long as there are noticeable benefits and demand.

Also, scientists have confirmed that passing the Holodeck Turing Test (not being able to tell VR apart from real life) requires retina refresh rates in addition to retina resolution. So that's yet another source of technological pressure, on top of direct-view displays.
 
Oh, and by the way, ASUS and NVIDIA have now roadmapped 1000Hz already.
https://www.pcmag.com/news/372914/i-tried-the-asus-360hz-monitor-and-it-made-me-a-better-game
Quote:

At our meeting with Asus, a rep told me that 360Hz is just the beginning of what the company calls its "road to 1000Hz," so who knows? Maybe in a few years I'll be looking back at 360Hz and wondering how I ever played on a monitor so "slow," and scoring 10 out of 10 every time.

Definitely not going to be niche forever. 4K was niche. People laughed about 4K. Many have stopped laughing about high Hz, especially since high Hz makes CRT-clarity possible strobelessly.

Also, diminishing curve education time:

60Hz -> 144Hz = 2.4x difference
144Hz -> 360Hz = 2.5x difference
240Hz -> 360Hz = only 1.5x difference
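
For the curious, the same ratios alongside the frame-time savings in milliseconds -- a quick, hedged way to see why 240Hz -> 360Hz feels incremental (sample-and-hold assumed):

```python
# Refresh-rate ratio and how much the frame time (1000/Hz) shrinks per step.
steps = [(60, 144), (144, 360), (240, 360)]
for lo, hi in steps:
    ratio = hi / lo
    saved_ms = 1000 / lo - 1000 / hi
    print(f"{lo}Hz -> {hi}Hz: {ratio:.1f}x, frame time shrinks by {saved_ms:.1f} ms")

# 60Hz -> 144Hz: 2.4x, frame time shrinks by 9.7 ms
# 144Hz -> 360Hz: 2.5x, frame time shrinks by 4.2 ms
# 240Hz -> 360Hz: 1.5x, frame time shrinks by 1.4 ms
```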

Worse, GtG limitations (~1ms GtG) slightly reduce that 1.5x difference, which gives luddite news websites an excuse to shame 240Hz vs 360Hz.

One needs to go dramatically up the diminishing curve of returns to get benefits. And when new ultra-Hz comes out, it tends to reduce the price of lower Hz. I've known for a long time that NVIDIA is working on frame rate amplification technologies, which will enable much less expensive frame rate increases in the coming decade.

We'll be publishing a "Blur Busters Roast" article listing news websites that shame the high-Hz stuff, pointing to the 4K laughers of the past as examples. 😀

Blur Busters' job is to 100% mythbust this <Mod Edit> and be scientific and factual. We work on 2030s road-map stuff. Reporters need to understand the refresh rate race better. Right now, it's like 4K in the 1990s -- "Why?" -- but a lot fewer people are laughing nowadays.

After all, as resolution gets higher, there are more pixels to motion-blur over the same inch at the same physical motion speed. Higher resolutions amplify the need for higher Hz. Eventually we'll need 8K 1000Hz later this century, so it will be a long refresh rate race.

And more education pieces help people understand how the refresh rate race needs to be properly milked while consumers win (lower Hz becomes free, such as the free 120Hz features coming to many displays of the 2020s). Consumers save money when those lower refresh rates become commoditized. Win-win.
 
Oh, and by the way, ASUS and NVIDIA have now roadmapped 1000Hz already.
https://www.pcmag.com/news/372914/i-tried-the-asus-360hz-monitor-and-it-made-me-a-better-game
[...]

So what GPU are you gonna use for that 360Hz on Modern Warfare to run the game at a stable 360FPS?
 
Definitely not going to be niche forever. 4K was niche. People laughed about 4K. Many have stopped laughing about high Hz, especially since high Hz makes CRT-clarity possible strobelessly.
[...]
We'll be publishing a "Blur Busters Roast" article listing news websites that shame the high-Hz stuff, pointing to the 4K laughers of the past as examples. 😀

After all, as resolution gets higher, there are more pixels to motion-blur over the same inch at the same physical motion speed. Higher resolutions amplify the need for higher Hz. Eventually we'll need 8K 1000Hz later this decade, so it will be a long refresh rate race.
4K gaming still is niche. Playing modern PC games at 4K requires expensive hardware that puts it out of the reach of the majority of people. And current "4K" on consoles isn't real 4K from my understanding, just upscaled from a lower resolution. And even then, I don't think they manage 60 fps reliably.

Regarding 8K 1000 Hz happening this decade, we've had 4K displays readily available for what, 6-7 years? And 4K @ 120+ Hz is still reserved for a small number of high end products. So it took us 6-7 years to double the refresh rate, but you think we'll more than quadruple the refresh rate again (while also quadrupling the resolution) in under 10 years? I don't see it, except maybe some proof of concept design that gets presented at a tech conference that isn't actually available for purchase (or is obscenely expensive). And of course, having hardware powerful enough to make those kinds of resolutions/refresh rates relevant is another matter.
 
4K gaming still is niche. Playing modern PC games at 4K requires expensive hardware that puts it out of the reach of the majority of people. And current "4K" on consoles isn't real 4K from my understanding, just upscaled from a lower resolution. And even then, I don't think they manage 60 fps reliably.

Regarding 8K 1000 Hz happening this decade, we've had 4K displays readily available for what, 6-7 years? And 4K @ 120+ Hz is still reserved for a small number of high end products. So it took us 6-7 years to double the refresh rate, but you think we'll more than quadruple the refresh rate again (while also quadrupling the resolution) in under 10 years? I don't see it, except maybe some proof of concept design that gets presented at a tech conference that isn't actually available for purchase (or is obscenely expensive). And of course, having hardware powerful enough to make those kinds of resolutions/refresh rates relevant is another matter.

The only way 8K is going to go mainstream this decade is if it reaches price parity with 4K, making it a free upgrade. 8K is to video what the Super CD and HD Audio CDs were to audio: the general public will not be able to tell the difference from the last generation and thus won't care. The recommended viewing distance for a 32" 8K display is 13 inches; for a 65" screen, it's 2 feet. A typical 24-27" PC monitor? Duct-taped to your face. The only way to experience 8K at a typical living-room seating distance would be a 100+ inch projector, and I can't see that going mainstream. Other features besides resolution and refresh rates will have to drive TV sales this decade.
 
So what GPU are you gonna use for that 360Hz on Modern Warfare to run the game at a stable 360FPS?
Today it's a challenge, indeed.

But that's what drives progress. Tomorrow, these framerates will be easier, thanks to future GPUs with frame rate amplification technology (F.R.A.T.).



NVIDIA is already working on it. H.265 video uses some very similar mathematics, and this is simply an extension of sorts into three dimensions. 10:1 framerate multiplication ratios are feasible in a GPU ten years from now.

Remember, back in the MS-DOS days on 386 computers, many true-3D games had difficulty playing more than 15 frames per second. The fact that we are even managing 100 frames per second in some complex video games on an RTX 2080 is quite impressive.

Also, even at this early juncture, one doesn't need ALL 360 frames per second to benefit:
  • Quick Frame Transport (refresh cycles transmitted & displayed in 1/360 sec), regardless of a low frame rate
  • A VRR range bigger than the framerate range means esports players don't need VSYNC OFF as much anymore
100fps at 360Hz feels like much lower lag than 100fps at 144Hz, especially with VSYNC ON (which can have a lag of 3 frames). And you don't get the sudden lag-increase effect when framerates hit the top of the VRR range, because the top of the range is (A) almost out of reach and (B) even if it is hit, refresh times of ~2.8ms keep frame-queue buffer latency very low.
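
A very rough illustration of that lag point, using the post's own "VSYNC ON can buffer about 3 frames" figure -- a sketch under that assumption, not a measurement (game/engine latency is ignored):

```python
# Queued frames under VSYNC ON drain one per refresh cycle, so the same 100fps
# game feels snappier when the panel refreshes faster.
def vsync_queue_lag_ms(refresh_hz: float, buffered_frames: int = 3) -> float:
    return buffered_frames * 1000.0 / refresh_hz

for hz in (144, 240, 360):
    print(f"100fps on {hz}Hz, VSYNC ON: up to ~{vsync_queue_lag_ms(hz):.1f} ms of queue lag")

# 100fps on 144Hz, VSYNC ON: up to ~20.8 ms of queue lag
# 100fps on 240Hz, VSYNC ON: up to ~12.5 ms of queue lag
# 100fps on 360Hz, VSYNC ON: up to ~8.3 ms of queue lag
```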

Also, with the help of F.R.A.T. being developed by the industry later this decade, framerates eventually approach retina refresh rates and all the sync technologies converge: VSYNC ON ends up with virtually the same lag as VSYNC OFF, or whatever sync tech you use (Fast Sync, Enhanced Sync, G-SYNC, FreeSync). Ultra-Hz is the great sync unifier.

Eventually, even VRR becomes obsolete when refresh cycle granularity is so tiny that it behaves like per-pixel VRR -- where concurrently running windows at 24fps, 25fps, 75fps and 100fps all look equally smooth. There is no need for evenly divisible Hz or global variable Hz at such fine (per-pixel) refresh cycle granularities, beyond detectable Nyquist aliasing effects.

By the end of the century, retina refresh rates will merge with retina frame rates. It's unobtainium power but the engineering path has become visible to today's graphics researchers.

For now we have to pick our poison. We are stuck with going retina spatially (e.g. 8K) or retina temporally (ultra Hz at lower rez). But remember, higher rez amplifies Hz limitations. 60Hz limitation is more obvious at 4K than at 480p.

Yes, yes, this is Hertz Kool-Aid. But this, too, is 100% science -- unabashedly and fully obviously so -- and we mythbust this <Mod Edit>. 480p was once niche too. Then 1080p. Then 1440p. Then 4K. Then 8K. 480p gaming is no longer niche. 1080p gaming is no longer niche.

P.S. I have a peer-reviewed display motion blur testing conference paper -- co-authored with NIST.gov, NOKIA, and Keltek -- that is highly relevant here too: the technique is another method of scientifically measuring how motion blur halves when frame rates and refresh rates double (as long as GtG pixel response is not a limiting factor).

Yeah, it's unfortunate that we have to go geometrically up the diminishing curve of returns -- but many sites don't understand that. The incremental coverage (144Hz->165Hz, 240Hz->360Hz) neglects to even acknowledge this fact.

See, we stop 90% of the laughing. (The final 10% are inconsolable). But even 90% is a sufficient vaccine.
 
Today it's a challenge, indeed. But that's what drives progress. Tomorrow, these framerates will be easier, thanks to future GPUs with frame rate amplification technology.
[...]

Unfortunately, I was only talking about what we can use now and in 2-3 years. I agree with you that progress moves fast and who knows what we will have in 10 years, but right now... it's pretty useless unless you have a 4080 Ti that doesn't exist yet.
 
Unfortunately, I was only talking about what we can use now and in 2-3 years. I agree with you that progress moves fast and who knows what we will have in 10 years, but right now... it's pretty useless unless you have a 4080 Ti that doesn't exist yet.
Ha. We can bitch and moan about pre-requisites. Carts before/after the horse. Or even the cart and horse above/under in the 4th dimension, intersecting with a wall like a failed Star Trek teleport. Like the sad non-arrival of 24" OLED desktop gaming monitors.

Fortunately, 120Hz becomes mainstream/cheap/free towards the end of the decade, much like how retina-resolution smartphone screens got commoditized. Win-win.

Give it all time!
 
Today it's a challenge, indeed.

But that's what drives progress. Tomorrow, these framerates will be easier, thanks to future GPUs with frame rate amplification technology (F.R.A.T.).
[...]
You sound like an audiophile trying to convince us we all need a $1300 HDMI cable to experience what the engineers intended to be the true 4K experience. No matter how much made-up technobabble (remember, that $1300 HDMI cable has: "Dielectric-Bias System (DBS US Pat # 7,126,055) Significantly Improves Audio Performance") you come up with to justify your stance, the general public isn't going to be able to tell the difference from cheaper mainstream options and won't care.
 
You sound like an audiophile trying to convince us we all need a $1300 HDMI cable to experience what the engineers intended to be the true 4K experience. No matter how much made-up technobabble (remember, that $1300 HDMI cable has: "Dielectric-Bias System (DBS US Pat # 7,126,055) Significantly Improves Audio Performance") you come up with to justify your stance, the general public isn't going to be able to tell the difference from cheaper mainstream options and won't care.
@spongiemaster @NightHawkRMX As the resident refresh rate celebrity: Blur Busters may write as eagerly as that silly HDMI-cable marketing, which often has no basis in science -- but we are 100% real science.

I'm in research papers such as this NVIDIA scientific paper (page 2) and a pursuit camera paper (co-author). More publicly, we write in an engaging, easy Popular Science format to bridge the gap between the mainstream and the dry science papers.

In Motion Contexts, Bigger than 4K vs 8K (as long as GtG near 0)

Many people do not realize how big a difference it can make in certain material, such as virtual reality, where it can mean the difference between nausea and no-nausea. And the theoretical Holodeck Turing Test (can't tell apart real life from VR) also requires retina refresh rates.

We are 100% based on real science. In various situations, the Hz differences we're talking about are bigger than 4K versus 8K, for example. 240Hz vs 360Hz wouldn't be it, but 240Hz vs 1000Hz has already been shown in the lab (to 95% of humans) to be a bigger difference than 4K vs 8K. For now, 8K 1000Hz is a pipe dream.

Pixel response and lower resolutions have long been limitations in revealing Hz benefits. Faster pixel response and higher resolutions amplify the limitations of low Hz: GtG becoming a tiny fraction of a refresh cycle, plus higher resolutions meaning more pixels per inch (for motion blurring or stroboscopics), increases the visible difference between stationary and moving images.

There are multiple thresholds of human detection of display frequencies:

1. Flicker Detection: ~70Hz

Approximately ~70Hz to ~100Hz, varying from human to human. This is when direct detection of flicker stops.

Note: People may still get headaches beyond ~100 Hz even if the flicker is hard to detect -- the threshold is very fuzzy and human-dependent. A few super-sensitive humans may see flicker beyond 100 Hz, while others stop seeing flicker well below 70 Hz. Not everybody's vision is identical, nor are their brains. Peripheral vision is also more sensitive, especially in brighter environments and where the flicker has a sharp squarewave duty cycle. However, 70Hz-100Hz is an approximate range.

2. Single-Frame Identification: ~300Hz

The famous "Fighter Pilot study" everybody cites as a human vision limitation -- about 1/250sec - 1/300sec for a one-frame flash of an object. This assumes non-brightness compensated (it doesn't compensate for the Talbout-Plateau Theorem -- need to double brightness to make a half-length flash equally visible).

However, games aren't always single-frame. Everything is in motion, and there are other effects to consider, such as the ones below.

3. Latency Sync Effects: Beyond 1000 Hz

In augmented reality situations, where you need sync between the real world and the virtual one, virtual graphics will lag behind real objects. A higher refresh rate helps solve this. The same applies to sync between a physical finger and a virtual on-screen image. Here's a Microsoft Research video of 1000Hz.


4. Stroboscopic Artifacts: Beyond 1000 Hz

This is the common mouse-arrow effect, as seen at testufo.com/mousearrow, and is also visible in games, as explained at blurbusters.com/stroboscopics. A mouse cursor moving one screen width per second on an 8K display would require 7680Hz in order to appear as a continuous blur without any stepping (see the quick calc at the end of this section). Likewise for bright objects on dark backgrounds during mouse turns in FPS games, for example (see the screenshots in the above link).
[Image: mouse-cursor stepping at 60Hz vs 120Hz vs 240Hz]

This does not just apply to mice, but to any fast motion -- panning, turning, scrolling -- especially when there are both stationary and moving objects on the same screen simultaneously, giving the eyes different targets to fixate on (which makes stroboscopic gaps easier to detect). Many scientific papers exist, such as this one and this one.
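
The quick calc behind the mouse-cursor numbers in this section (the screen widths are assumptions; the rule of thumb is that gapless motion needs the refresh rate to match the motion speed in pixels per second):

```python
# To keep stroboscopic stepping gaps at <= 1 pixel, the refresh rate must be at
# least the motion speed in pixels per second divided by the allowed gap.
def min_hz_for_gapless_motion(speed_px_per_s: float, max_gap_px: float = 1.0) -> float:
    return speed_px_per_s / max_gap_px

for name, width_px in (("1080p", 1920), ("4K", 3840), ("8K", 7680)):
    hz = min_hz_for_gapless_motion(width_px)  # cursor moving one screen width per second
    print(f"{name}: ~{hz:.0f} Hz needed for a 1 screenwidth/sec cursor with no visible stepping")

# 1080p: ~1920 Hz
# 4K: ~3840 Hz
# 8K: ~7680 Hz (the figure quoted above)
```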


5. Persistence blurring: Beyond 1000 Hz

See for yourself at testufo.com/eyetracking. Refresh cycles are displayed statically for the full duration of a refresh cycle.

When your eyes track a moving object, they are in a different position at the beginning and end of a frame's duration, since your eyes are in analog continuous pursuit. At 1000 millimeters per second of eye tracking (along the screen plane), 60Hz means your eyes have moved 1/60th of that during each frame (about 16-17 pixels) -- creating 16-17 pixels of eye-tracking-based motion blur on those static 1/60sec frames.
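
A minimal persistence-blur sketch for the eye-tracking math above, working in pixels per second (the 1000 px/s tracking speed is an assumed round number, not a measurement):

```python
# On sample-and-hold, eye-tracking blur width ~= tracking speed / refresh rate.
def persistence_blur_px(track_speed_px_per_s: float, hz: float) -> float:
    return track_speed_px_per_s / hz

speed = 1000.0  # pixels per second of smooth pursuit (assumption)
for hz in (60, 120, 240, 360, 1000):
    print(f"{hz} Hz: ~{persistence_blur_px(speed, hz):.1f} px of eye-tracking motion blur")

# e.g. 60 Hz -> ~16.7 px, 240 Hz -> ~4.2 px, 1000 Hz -> ~1.0 px
```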

You can see for yourself in this additional motion animation, and in many science papers written over the decades about something called the "Sample And Hold Effect" -- even at places like Microsoft Research.

In photography contexts, a 1/1000sec sports shutter is clearly sharper than a 1/100sec shutter for fast action. There is a diminishing curve of returns too, but the motion-blur equivalence is exactly the same (for GtG=0, using the full MPRT(100%) measurement instead of the industry-standard MPRT 10%->90%).

You can even see for yourself in motion tests such as testufo.com/eyetracking and testufo.com/persistence

On a non-impulsed display (sample-and-hold), doubling frame rate halves motion blur. That's assuming a display that doesn't use flicker, strobing, BFI, phosphor (CRT), or other temporal impulsing method to reduce motion blur. The closer you get to retina refresh rates, the more persistence-based motion blurring disappears.

[Image: motion blur from persistence on sample-and-hold displays]


While 240Hz vs 360Hz is more subtle, the comparison between 240Hz and 1000Hz is much more dramatic.

The jumps from 60Hz->144Hz and from 144Hz->360Hz are more representative comparisons, and that's why ASUS' booth compared 144Hz vs 360Hz. There is a need to jump geometrically up the diminishing curve.

As long as GPU power is available for stratospheric framerates in modern games (it eventually will be), the benefits are there -- and are more easily noticeable than 4K versus 8K in motion contexts, especially if retina Hz (ultra-high Hz) is used concurrently with retina resolution (ultra-high resolution).

Science-minded people who would like to understand better should read the 1000Hz Journey article. Those who don't believe there is any benefit beyond X Hz tend to be mainly the anti-science crowd. It's not as simple as "eyes can only see X Hz", because there are a lot of interacting variables (e.g. higher resolutions amplify the limitations of a given Hz when it comes to noticing blur or stroboscopics).

Yesterday, 1080p cost a fortune. Today, 1080p is free. Many formerly thought retina screens were a silly waste, but now they're in every smartphone. Higher Hz will follow the same path from bleeding edge to mainstream: 120Hz will already be mainstream in a few years (it's coming to the next iPhone and Galaxy).

Even if you hate the silliness of high Hz (much like hating the silliness of 4K 15 years ago), it's coming anyway -- and it is useful to understand why it's useful, why it's coming (high Hz even helps browser scrolling), and why it will be nearly free in a decade or two (much like retina spatial resolution).
 