G-Sync Technology Preview: Quite Literally A Game Changer


Lessthannil

Honorable
Oct 14, 2013
468
0
10,860
If NVIDIA made everything open, AMD could basically lay off their whole software team. It's not NVIDIA's responsibility to subsidize other people's software. If AMD doesn't have x and y that NVIDIA does, then AMD needs to come up with their own things.
 

slyu9213

Honorable
Nov 30, 2012
1,054
0
11,660
This is more innovative than what the next-gen consoles will be. If G-Sync worked with AMD, I would get it with the R9 290 I'm planning to purchase. But as it stands, I don't know what I'm going to do.
 


Some would argue that it is only because of a few lawsuit settlements, though; without them it could have been a different story.
 

Lessthannil

Honorable
Oct 14, 2013
468
0
10,860


I was referring more to driver software like ShadowPlay. Even on the topic of G-Sync, NVIDIA may have tuned it for Kepler GPUs so that it works better, or to make it work at all. I want a standard to come out of this, but a standard might perform worse than this, or than something from AMD, because it would have to support multiple architectures.

 

Bondfc11

Honorable
Sep 3, 2013
232
0
10,680


You will void your monitor's warranty by installing the kit - the one-year warranty is only for the G-SYNC PCB.
 


I'm sure the article I read gave the impression that the warranty covered "the work done", which I would expect if I had to send a monitor off for the retrofit.
 

mdrejhon

Distinguished
Feb 12, 2008
71
10
18,645
Hello Tom's Hardware staff...

For the Blur Busters preview of G-SYNC, I created an excellent G-SYNC simulation animation at http://www.testufo.com/stutter#demo=gsync

For best results, close all other windows/apps/tabs and view it in Chrome with nothing else running that could add interference to this web-based simulation of G-SYNC, on your primary monitor (with Aero enabled) -- it supports 144 fps that way. It uses a web-based software interpolation technique that accurately simulates G-SYNC's stutter-free variable frame rate. Make sure you run it in a supported browser (browser requirements are at http://www.testufo.com/browser.html )

If you want, you can embed TestUFO in an iframe by adding the &embed=1 attribute to TestUFO URLs (example -- http://www.testufo.com/stutter#demo=gsync&embed=1 ) -- you have my permission to do that on TomsHardware, just make sure you credit Blur Busters. They're embedded in Blur Busters' own G-SYNC review...

Mark Rejhon
 


I clicked on the site, but I have no idea how this is supposed to relate to G-Sync. All I see is bars running across the screen.
 

mdrejhon

Distinguished
Feb 12, 2008
71
10
18,645


You have to stare at the frame rate counter at the upper left while watching the bar get progressively blurrier/clearer. Make sure you're running a modern, perfectly stutter-free web browser (requirements at www.testufo.com/browser.html ), though, and double-check your browser using the UFO at www.testufo.com -- make sure there's no stutter on the smoothest UFO and that it always shows a green "VALID" at the bottom. Once you've done that, play with the various animations using the selector at www.testufo.com/stutter to compare VSYNC ON, VSYNC OFF, and G-SYNC fluidity.

This article, with embedded animations, brings context: http://www.blurbusters.com/gsync/preview
 


Negative? Where did you get that from?:heink:
 
If this is not an open standard, i.e. you have to buy a monitor that only supports this feature with Nvidia cards, then it is pretty stupid. Monitors are very expensive as it is, and being limited in choice is not consumer-friendly at all.

If, hypothetically, AMD came up with something better along the same lines, it would mean I'd have to change both my GPU and my monitor. This is profiteering by closing technology off at the consumer level.

On the bright side, my 670 would stop being crap in BF4
 


670 or the 6970 in your sig?
 
G-Sync vs Mantle:
G-Sync works with every game ever made. Mantle support arrives soon for BF4, but then it will be years before any significant penetration (it takes years to develop most games, and only a few are working with it so far).

But I can handle 60FPS already:
G-Sync in many cases still works better by minimizing judder, even compared to maintaining a "solid" 60 FPS VSYNC'd experience.

Plus that solid 60FPS experience disappears once more demanding games appear.

Licensing?
NVidia is considering various licensing models; however, they have a huge win here for selling desktop graphics cards, so they won't immediately have it work with AMD cards. NVidia does want it in EVERY future product with a screen (tablets, phones, HDTVs, monitors).

Hardware or software?
It's a hardware component in the screen; on the PC side it's only a driver modification. Does that mean we'll likely see HACKED variants of AMD's drivers? You bet.

PS4 + Sony HDTV?
That would require Sony to license it for both an HDTV and the PS4, build a new HDTV to sell, and update the PS4's drivers. I do think it will happen in about two or three years' time (one design cycle).

Is NVidia really that awesome?
In my opinion it's incredible. It doesn't solve everything, of course, but it does something for visual quality that NO other solution can do on a regular basis; even maintaining a "solid" 60 FPS experience isn't so simple without G-Sync, due to issues like judder and periodic stutter from frame drops when things get really demanding, even on a really high-end machine.
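
To illustrate that judder point with some made-up numbers (a toy Python sketch only, not measured data or anything from NVidia): with a fixed 60 Hz refresh, whenever the next frame takes even slightly longer than ~16.7 ms to render, the current image stays on screen for ~33.3 ms, while a variable-refresh display simply swaps to each frame as soon as it's ready.

import math

# Toy illustration with hypothetical render times (not measured data):
# a fixed 60 Hz V-Sync'd display holds each image for a whole number of
# refresh intervals, while a variable-refresh display holds it only for
# as long as the next frame takes to render.
REFRESH_MS = 1000.0 / 60.0                        # ~16.7 ms per refresh at 60 Hz
render_times_ms = [15.0, 16.0, 17.5, 16.2, 18.0]  # hypothetical per-frame render times

for t in render_times_ms:
    # Previous image stays up until the first refresh after this frame is done
    vsync_hold = math.ceil(t / REFRESH_MS) * REFRESH_MS
    print(f"next frame renders in {t:4.1f} ms -> V-Sync holds the current image "
          f"for {vsync_hold:4.1f} ms, variable refresh for {t:4.1f} ms")

The 17.5 ms and 18.0 ms frames force a 33.3 ms hold under V-Sync while the others get 16.7 ms; that uneven pacing is exactly what your eye picks up as judder.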
 

cypeq

Distinguished
Nov 26, 2009
371
2
18,795
This technology cannot stay vendor-locked... it's extremely beneficial to all gamers.
I'd love to see this in everything above office-grade equipment, and in TVs too... because consoles suffer from the same V-Sync issue, especially now that they struggle to deliver a constant 60 fps (a 30 fps target frame rate is hardly enjoyable).

If Nvidia releases this out of its chains... I'll buy their GPU as my next upgrade just to thank them for sending V-Sync to the history books.
 
And Mantle works with every monitor ever made. And many people have spent a lot more on their monitor(s) than on games.

From my reading, G-Sync only makes a difference when the GPU can't maintain a 60 FPS minimum. The issue it fixes is a frame rate that is quite unstable.

This could be just another addition to standards. But if it requires a totally different frame buffer in the display, it's going to be useless. Chances are you'll need a different PCB for every display series.
 


ASUS DirectCU II GTX 670 2GB; I can post a picture of it and the box if you are interested. My sig hasn't been updated in a while because I have an i5 system running.

 

Traciatim

Distinguished


Of course you can't do what you describe with today's monitors, since they have fixed refresh rates; it requires G-Sync so that the timing is handled on the card. So G-Sync provides exactly what you are asking for. Spreading frames across multiple syncs is a problem with V-Sync only, because the syncs happen at fixed frequencies and the frames do not.



You don't have to wait with G-Sync. Assuming both cards generate the next frame in the same amount of time, the G-Sync screen is told by the card to draw the next frame as soon as it's done, while the V-Sync player (who also has to rely on triple buffering to keep any semblance of smooth gameplay) must wait for the monitor to refresh. On average, that means the V-Sync player will always be about half a refresh behind the G-Sync player (rough numbers in the sketch at the end of this post). Also, any time the V-Sync card can't draw frames faster than the screen's refresh, it causes huge swings in frame time, and your eye picks up the judder, making the motion look jumpy even if the measured frame rate is fairly high.

This is the whole point: by moving control of the sync to the card, it eliminates the fixed-frequency refresh that forces the choice between huge frame display-time disparity and mid-refresh updates (tearing).
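
To put a rough number on the "half a refresh behind" claim above, here's a quick back-of-the-envelope Python simulation (my own toy sketch, assuming a 60 Hz panel and frames finishing at random points within a refresh interval; nothing here comes from NVIDIA's implementation):

import random

# Toy sketch: average extra wait a V-Sync'd frame incurs before it can be
# shown on a fixed 60 Hz refresh, versus a display that refreshes the moment
# the frame is ready. Frame completion times are assumed uniformly
# distributed within each refresh interval (a simplification).
REFRESH_MS = 1000.0 / 60.0   # ~16.7 ms per refresh at 60 Hz
FRAMES = 100_000

total_wait_ms = 0.0
for _ in range(FRAMES):
    finish = random.uniform(0.0, REFRESH_MS)    # when the GPU finishes the frame
    total_wait_ms += REFRESH_MS - finish        # V-Sync: frame waits for the next refresh boundary

print(f"Average extra wait under V-Sync: {total_wait_ms / FRAMES:.2f} ms "
      f"(about half of the {REFRESH_MS:.1f} ms refresh interval)")

It prints roughly 8.3 ms, i.e. about half a 60 Hz refresh, while a variable-refresh display would show the frame immediately instead.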
 

swordrage

Distinguished
Jul 4, 2012
64
1
18,635
Orig. Quote Ohim,

"G-Sync good on old hardware that can`t reach 60 fps, bad since you need a new monitor, so guys who can`t afford a better GPU will have to get a new monitor ?!?!?!"

Spot on. And I think they will cost north of $500. So that's a GTX 780 or an R9 290 with change to spare.
 

Datcu Alexandru

Honorable
Mar 19, 2013
131
0
10,710
@swordrage

This is what I never understood. The tech is good and partially solves a problem that plagues a good chunk of gamers, BUT it solves an issue that is nonexistent for those who can actually afford it.

How is someone who had to buy a 770 instead of a 780 (Ti) because of a low budget supposed to be able to afford a new monitor that is also more expensive than average?

Those who have high-end cards and want all the latest tech because they can afford it won't get it (yet), because it isn't yet available on higher-resolution monitors with better picture quality.

The market for G-Sync in its current state is almost nonexistent, in my opinion.
 

Maegirom

Distinguished
Dec 13, 2013
19
0
18,510
LightBoost and G-Sync can't run together? :__(
My intention was to buy a LightBoost 3D monitor right now, maybe an Asus VG248QE or an Asus VG278HE, but I was waiting because of the G-Sync release news. Now I'm not sure what to do. I want the monitor precisely to use 3D Vision, so G-Sync working only IN 2D MODE is all the same to me...
 
