G-Sync Technology Preview: Quite Literally A Game Changer

Status
Not open for further replies.

wurkfur

Distinguished
Dec 27, 2011
336
1
18,965
I personally have a setup that handles 60+ fps in most games and just leave V-Sync on. For me, 60 fps is perfectly acceptable, and even when I went to my friend's house, where he had a 120 Hz monitor with SLI, I could hardly see any difference.

I applaud the advancement, but I have a perfectly functional 26 inch monitor and don't want to have to buy another one AND a compatible GPU just to stop tearing.

At that point I'm looking at $400 to $600 for a relatively paltry gain. If it comes standard on every monitor, I'll reconsider.
 
Competition, competition. Anybody flaming over who is better, AMD or Nvidia, is clearly missing the point. With Nvidia's G-Sync and AMD's Mantle, we have, for the first time in a while, real market competition in the GPU space. What does that mean for consumers? Lower prices and better products.
 
Guest

Guest
This needs to be less proprietary for it to become a game changer. As it is, requiring a specific GPU and a specific monitor at an additional price premium just isn't compelling and won't reach a wide demographic.

Is it great for those who already happen to fall within the requirements? Sure, but unless Nvidia opens this up or competitors ship similar solutions, I feel like this is doomed to be as niche as LightBoost, PhysX, and, I suspect, Mantle.
 
I'm on page 4, and I can't even contain myself.

Tearing and input lag at 60 Hz on a 2560x1440 or 2560x1600 panel have been the only reasons I won't game on one. G-Sync will get me there.

This is awesome, outside-of-the-box tech.

I do think Nvidia is making a huge mistake by keeping this to themselves, though. This should be implemented in every panel sold and become part of an industry standard for HDTVs, monitors, and other displays! Why not collect a licensing payment on every monitor sold with this tech? Or on every video card that implements it? It just makes sense.
 

rickard

Honorable
Dec 12, 2013
1
0
10,510
Could the Skyrim stuttering at 60 Hz with G-Sync be because the engine operates internally at 64 Hz? All those Bethesda-tech games drop 4 frames every second when vsync'd to 60 Hz, which causes that severe microstutter you see on nearby floors and walls when moving and strafing. The same thing happened in Oblivion, Fallout 3, and New Vegas on PC. You had to use stutter-removal mods in conjunction with the script extenders to actually force the game to operate at 60 Hz and smooth it out with vsync on.
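For what it's worth, here's a quick Python sketch of the arithmetic (my own toy illustration with assumed numbers, nothing from the article): with a 64 Hz internal tick scanned out at a fixed 60 Hz, exactly four engine frames per second never reach the screen, one roughly every 250 ms, which is what shows up as that regular hitching.

    # Toy model: a Bethesda-style engine ticking at 64 Hz shown on a 60 Hz vsync'd
    # display. Each refresh shows the newest finished engine frame; older ones are skipped.
    ENGINE_HZ = 64    # assumed internal tick rate
    DISPLAY_HZ = 60   # fixed vsync refresh rate

    engine_times = [i / ENGINE_HZ for i in range(ENGINE_HZ)]      # one second of engine frames
    display_times = [j / DISPLAY_HZ for j in range(DISPLAY_HZ)]   # one second of refreshes

    shown = set()
    for t in display_times:
        # newest engine frame finished at or before this refresh
        shown.add(max(i for i, ft in enumerate(engine_times) if ft <= t))

    dropped = [i for i in range(ENGINE_HZ) if i not in shown]
    print("frames dropped per second:", len(dropped))   # -> 4
    print("dropped frame indices:", dropped)            # -> [15, 31, 47, 63], a skipped frame every ~250 ms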

You mention it being smooth when set to 144 Hz with G-Sync; is there any way you could cap the display at 64 Hz and try it with G-Sync alone (iPresentinterval=0) to see what happens then? Just wondering if the game is at fault here and whether that specific issue is still there in the latest version of the engine.

Alternatively, I suppose you could load up Fallout 3 or New Vegas instead and see if the G-Sync results match Skyrim.
 

ohim

Distinguished
Feb 10, 2009
1,195
0
19,360
I think this is a game changer only on setups that can't reach 60 fps. Being proprietary to Nvidia is kind of a shot in the foot: just like PhysX, some people care about it, but most don't even acknowledge it. Technically it could run on anything; practically it runs only on Nvidia hardware. And since it's a monitor-side feature, I'd rather see it as monitor-company tech than something a graphics card maker locks down. I'm more interested in Mantle than this, since it promises better multicore CPU performance and better fps on the hardware you already have.

Mantle (if it turns out to be what they say): better CPU performance, better GPU performance, possibly open source at some point, and no need for a new monitor.

G-Sync: good on old hardware that can't reach 60 fps, bad because you need a new monitor. So people who can't afford a better GPU will have to buy a new monitor instead?!
 
Needs to be open. If it's not in a standard, it won't take off. How many people want to rip open a display (likely voiding the warranty) to install a pricey add-on card, and then be strongly limited in what other equipment they can use it with?

Get it standardised and into the DVI/HDMI/DP specs, then it'll take off.

I wonder if you could just add a flag for variable vertical blanking and have the GPU send a 'starting next frame' sequence whenever a frame is rendered.
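Something like this, conceptually; just a back-of-the-napkin Python sketch of the idea with made-up names and timings, not the actual DisplayPort or G-Sync signalling:

    # Toy model of variable vertical blanking: the monitor holds its vblank until the
    # GPU signals "starting next frame", or gives up and repeats the previous frame.
    import random

    PANEL_MAX_VBLANK_MS = 33.3   # assumed panel floor: must self-refresh at roughly 30 Hz

    def gpu_frame_time_ms():
        """Pretend frame times varying between ~7 ms (144 fps) and ~45 ms (22 fps)."""
        return random.uniform(7.0, 45.0)

    for frame in range(8):
        wait = gpu_frame_time_ms()
        if wait <= PANEL_MAX_VBLANK_MS:
            # 'starting next frame' marker arrives during vblank: scan out right away
            print(f"frame {frame}: ready in {wait:.1f} ms -> scanned out immediately, no tear, no wait")
        else:
            # nothing arrived in time: panel repeats the previous frame on its own,
            # then scans out the new frame as soon as the marker shows up
            print(f"frame {frame}: ready in {wait:.1f} ms -> panel repeated last frame first")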

If it's not included by default in monitors, it'll become the next PhysX. And to do that it has to be platform-agnostic.
 

pricetag_geek

Honorable
Nov 29, 2013
180
0
10,710


Oh really? I envy your eyes.
 


Regarding Mantle: what does GPU performance matter on a screen with input lag, or a screen with tearing and choppy, blurry video?

Mantle will not solve this problem. Mantle is supposed to be more of a low-level common API with enhanced GPU performance as a possible advantage. I'm not sure that even compares to what's being discussed here. Maybe I'm way off???

G-Sync will eliminate input lag, tearing, and blur, and as a result add to the overall realism of the gaming experience.
 

Logsdonb

Honorable
Dec 12, 2013
3
0
10,510
This will become more important as we migrate to 4k displays. At that resolution, maintaining very high frame rates will become more difficult. Allowing a better experience at lower frame rates will become more important and more valuable.
 


The monitor might be expensive right now, but it will be a good investment if you decide to go that route. At the very least, you don't upgrade your monitor as often as your GPU; my current monitor has been paired with a GTS 250, a GTX 460, and now GTX 660 SLI. The only downside is that it will lock you into Nvidia GPUs only.
 

Dubski

Honorable
Oct 12, 2013
7
0
10,510
If you're already running a 120/144 Hz monitor, this technology won't be that great or really noticeable. I honestly cannot spot any noticeable tearing or ghosting on my BenQ monitor, and I would question anyone who claims they could. With 60 Hz monitors this will be great, but expect the technology to be extremely expensive (on the level of 120/144 Hz monitors). I think this will be a game changer once it becomes cheaper and becomes a standard; I expect it will initially be just for enthusiasts with money to blow. I'm basing this completely on this review, with these quotes being especially important.

"Now, it's important to understand where G-Sync does and does not yield the most significant impact. There's a good chance you're currently using a screen that operates at 60 Hz. Faster 120 and 144 Hz refresh rates are popular amongst gamers, but Nvidia is (rightly) predicting that its biggest market will be the enthusiasts still stuck at 60 Hz."

"Of course, as you shift over into real-world gaming, the impact is typically less binary. There are shades of "Whoa!" and "That's crazy" on one end of the spectrum and "I think I see the difference" on the other.... In certain cases, just the shift from 60 to 144 Hz is what will hit you as most effective, particularly if you can push those really high frame rates from a high-end graphics subsystem."
 
This is truly a beautiful advancement.

Now all we need is true implementation. Because G-Sync literally requires a hardware component in the monitor, we need a wide range of monitors to implement it, from your everyday cheap LCD panel to the color-accurate IPS panels.

That aside, I know it's too much to hope for, but can Nvidia not lock these nice things down to their own GPUs? C'mon, open it up to AMD and (god forbid) Intel's iGPUs.
 


However, this kind of competition does not lead to lower prices and better products. It can only lead to fanboyism, increased marketing costs (to promote each company's own 'special thing'), and an increasingly polarized market. I'm glad that AMD is willing to open up Mantle to other companies; Nvidia needs to do the same with G-Sync.
 

Kewlx25

Distinguished


Ahh, to be blind.
 