G-Sync Technology Preview: Quite Literally A Game Changer


all stalked out

Honorable
Jul 3, 2013
I'm almost scared to commit one way or the other. I was about to buy a 21:9 monitor when we heard about G-Sync from the usual suspects, so I delayed the purchase. It's great that we're actually getting all these possibilities, but now I want to buy a card and I want a high-res monitor. So do I wait on G-Sync, or do I just buy one now, leave G-Sync alone for a year or two, and maybe consider an AMD card as well?
 

Bondfc11

Honorable
Sep 3, 2013


Where is this confirmation? I have seen a lot of articles state a number, but nothing from the horse's mouth. Overlord is offering the 248 with the kit for $499 right now, plus a mod service, but I have yet to see an official announcement on ASUS pricing. Please share an official statement - not just an article claiming a number.
 


Ever since the first G-Sync introduction, the talk has been about a $399 price. The Asus VG248QE alone will be around $280, so with G-Sync installed at the factory it should be about $400 (roughly $120 more, though I heard the actual price of the kit alone is around $170-180). There's no official statement yet, but Nvidia did want to bring the price to $399 when native G-Sync starts to roll out, at least for the VG248QE.

We were told in October that the native G-Sync version of the VG248QE would sell for $399, so the modded screen is priced at a premium. Falcon notes that part of the additional cost can be attributed to the labor associated with the modification process. The G-Sync module also replaces some of the monitor's existing electronics, which won't be necessary in the native version.

http://techreport.com/news/25775/modded-g-sync-monitors-available-in-limited-quantities
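For what it's worth, here is the rough arithmetic behind those figures as a quick sanity check. This is only a sketch: the kit price and the implied labor line are estimates pulled from the posts above, not official pricing.

```python
# Rough arithmetic behind the prices quoted in the thread; the kit price and
# the implied labor figure are estimates from the posts, not official numbers.
monitor_alone = 280   # Asus VG248QE by itself (approx. street price)
native_gsync  = 399   # reported target price for the factory G-Sync version
modded_gsync  = 499   # Overlord / Falcon NW price for a modded unit

kit_estimate    = 170                                            # ~$170-180 reported for the kit alone
factory_premium = native_gsync - monitor_alone                   # ~$120 premium for native G-Sync
implied_labor   = modded_gsync - monitor_alone - kit_estimate    # what's left for mod labor/margin

print(factory_premium, implied_labor)   # 119 and 49
```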
 

JUICEhunter

Honorable
Oct 23, 2013
Prices will come down. PC builders like Digital Storm and Falcon NW got G-Sync kits first, so they take an existing Asus VG248QE + G-Sync kit + labor = $499 for those who can't wait for the pre-built ones directly from Asus, which should be $399. Personally I'm waiting for my local Fry's Electronics to stock this so I can physically pick one up, and I'm hoping they add a 27-inch version at launch.
 

Caustic Aspirin

Honorable
Feb 8, 2013
I really just can't fathom the excitement over this. Maybe I'm crazy, but I've never really been able to notice a good deal of input lag from Vsync, and most noticeable tearing for me only really happens just above the 60 fps threshold.

Sure, it'll be nice not to have to worry about tearing and V-sync anymore, but it looks like there's no real plan for this to become a standard. I really just can't imagine spending hundreds of dollars on a fancy new monitor when the real end result is slightly reduced input lag.

I've seen article after article calling this the next big thing, but I don't see how this will even remotely affect anyone who isn't a pro gamer. There is so much excitement behind such an incredibly basic thing that either I'm missing a MASSIVE piece of information that I haven't been able to extrapolate from all of these articles and videos, or Nvidia's been greasing a lot of palms to build hype over something that will shave a few milliseconds off of output lag.
 


It depends on the user. Some people can't tolerate the input lag created by V-sync, and with G-Sync, lower frame rates can feel much smoother. Some people argue that the tech would most benefit mainstream/budget PCs and that the cost defeats that purpose. For now it is expensive, but Nvidia is working on ways to make it cheaper. Even so, it can be a good investment, since the monitor is one part that isn't upgraded as often as other PC components. As I mentioned in my previous post, the only downside is that it locks you into Nvidia hardware; whether Nvidia will license the tech to others is another story altogether. From a developer's point of view, G-Sync can change how they develop their games. Usually, to get smoother gameplay (60 fps), they sacrifice visual quality to gain the performance they need. But if 30-40 fps can feel as smooth as 60 fps, developers can spend the performance they would normally 'give' to frame rate on better graphics instead.
 


It is most likely the fact that you can't really comprehend what this brings without seeing it. The reviewers have seen a ton of monitors and yet still are shocked at the improvement. Many have made it clear that you cannot appreciate G-sync without seeing it.

For myself, when I play a game that is running at 60 FPS with V-sync, I notice the moment it no longer is at 60 FPS. There is immediate noticeable stutter for me. G-sync would stop this. If I run on my 120hz monitor, I often use adaptive v-sync, but I see tearing, though not as bad as 60hz monitors show. I do this because I do notice a latency difference between having V-sync and not.

 

qiplayer

Distinguished
Mar 19, 2011
I think it would be interesting for Nvidia (or for us gamers) to prioritize higher graphic detail in the central part of the screen to keep fps above 60-80, sacrificing all the graphical extras at the borders if needed. It would be far more efficient, would need just a driver update (tell the card to process the next frame instead of making the current one perfect), and no extra hardware.
Sorry, but I don't trust big companies; they don't know consumers, and I personally don't want compromises. (I'll explain later.)
I just switched from three 60 Hz Asus VE278H screens to three 144 Hz VG278HE. The image is way better than at 60 Hz, but there's still quite a bit of ghosting. When you are in multiplayer and need to kill three opponents in one second, you want things to work. In my opinion it is better, but still far from a clear image. And I think it's pathetic that they developed LightBoost for 3D and didn't think of it earlier for gaming. If it weren't for gaming I wouldn't own a PC at all; that's why I own three GTX Titans, dear Nvidia.
About G-Sync: I never even used V-sync playing online, and preferred lower details and an average of 80-110 fps to avoid losing important frames when the action got faster. That phenomenon isn't changed by G-Sync at all.
So what do you do if you're down at 30 fps and need to turn twice, jump, cloak, turn, decloak, shoot, launch a grenade, kick, and take care of the third opponent? You appreciate the clean screen??
You can't do that with fps drops; if fps drop, you're distracted and dead.

This is my opinion about g-sync.
I would think about it differently if we still had good combat flight simulators, but as far as I know we don't. So I play something else and watch the panorama less ;)

If you like fast shooters I highly suggest crysis2 multiplayer ;)
 


I don't think you understand G-Sync if you think you are sacrificing FPS. G-Sync gives the benefits of V-sync without the sacrifices. In the past, V-sync meant your GPU had to wait for the next refresh to stay in sync with the display. Now, when the GPU is finished with an image, the display updates with it immediately. You'll actually get more responsive visuals: when a frame is sent to the screen it is a full, complete image, and you don't have the delays V-sync causes.
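If it helps, here is a minimal sketch of that timing difference, assuming a fixed 60 Hz panel for the V-sync case; the function names and numbers are purely illustrative, not real driver behavior.

```python
import math

# Minimal sketch of the presentation difference described above,
# assuming a 60 Hz panel for the V-sync case.
REFRESH_INTERVAL_MS = 1000 / 60

def added_wait_vsync(render_time_ms):
    """With V-sync, a finished frame is held until the next fixed refresh tick."""
    ticks = math.ceil(render_time_ms / REFRESH_INTERVAL_MS)
    return ticks * REFRESH_INTERVAL_MS - render_time_ms

def added_wait_gsync(render_time_ms):
    """With G-Sync, the display refreshes as soon as the GPU finishes the frame."""
    return 0.0

# A 20 ms frame (50 fps) sits ~13 ms under V-sync but not at all under G-Sync.
print(round(added_wait_vsync(20.0), 1), added_wait_gsync(20.0))
```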

As far as Lightboost goes, you may not be aware, but you can turn Lightboost on without 3D using a hack here: http://www.blurbusters.com/zero-motion-blur/lightboost/

These new G-Sync monitors are going to offer the Lightboost effect (under a new name), without the need for a hack or 3D.

The idea of only focusing rendering detail on the center of the frame may be doable sometime in the future, but I doubt it is something that can be done by drivers; it's probably more of a developer thing. However, the Xbox One has a new feature with a similar concept to what you are wanting. If the game can't keep pace with your monitor's refresh rate, it can lower the rendering resolution to increase FPS, and that image gets upscaled afterwards. So there are people thinking about tech of this nature.
http://www.youtube.com/watch?v=mbMv0NQwaMY
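Something like this, in very rough sketch form; the thresholds and scale steps are made up just to show the idea of dynamic resolution scaling, not taken from any actual engine.

```python
# Illustrative sketch of dynamic resolution scaling: if the last frame missed
# its time budget, render the next one at a lower resolution and upscale.
# Thresholds and step sizes are invented for the example.
TARGET_FRAME_MS = 1000 / 60
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def adjust_render_scale(scale, last_frame_ms):
    if last_frame_ms > TARGET_FRAME_MS * 1.05:     # missed the budget
        return max(MIN_SCALE, scale - 0.10)        # drop render resolution
    if last_frame_ms < TARGET_FRAME_MS * 0.80:     # plenty of headroom
        return min(MAX_SCALE, scale + 0.05)        # creep back up
    return scale

scale = 1.0
for frame_ms in (15.0, 22.0, 24.0, 16.0, 14.0):    # fake frame times
    scale = adjust_render_scale(scale, frame_ms)
    print(f"{frame_ms} ms frame -> render scale {scale:.2f}")
```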
 

wjw2000

Distinguished
Mar 15, 2011
I have been so frustrated with TVs and other displays not accepting a true 120 Hz input. I want to play more games in 3D and avoid the headaches associated with low-refresh-rate 3D rendering. Give me 60 Hz per eye whenever you can, and let's get past these lame 60 Hz input displays. Variable or fixed at 120 Hz, anything is better than this 30-per-eye 3D input garbage on TVs. I am very excited and really look forward to this!
 

hixbot

Distinguished
Oct 29, 2007
This doesn't fix the persistence problem with LCD displays. Only strobing can do that.
Strobing is noticeable below approximately 85 Hz.
G-Sync may help stuttering at low frame rates, but you're still stuck with poor motion blur from pixel persistence.

This isn't really a game changer for those that push 100+ fps with a strobing backlight. I'm not sure why they'd even work on combining G-Sync with strobing: you can't strobe at low fps, and G-Sync has no benefit at high fps.
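Rough numbers on that point, for anyone curious. These are simple back-of-the-envelope approximations, not measurements.

```python
# Back-of-the-envelope persistence math: on a sample-and-hold LCD, perceived
# motion blur scales with how long each frame stays lit. Approximation only.
def blur_px(persistence_ms, speed_px_per_s):
    return speed_px_per_s * persistence_ms / 1000.0

for hz in (60, 85, 144):
    hold_ms = 1000.0 / hz                     # full-persistence hold time
    print(f"{hz} Hz: ~{blur_px(hold_ms, 1000):.0f} px of blur at 1000 px/s")

# A strobed backlight flashing for ~1-2 ms cuts persistence regardless of
# refresh rate, which is why strobing, not variable refresh, attacks this blur.
print(f"strobed: ~{blur_px(1.5, 1000):.0f} px of blur at 1000 px/s")
```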

 

squirrelboy

Honorable
May 3, 2013

The BenQ XL2420T is $300 (€220) in the US, but €340 in the Netherlands. That's $470!
 

etonbears

Distinguished
Dec 18, 2013
I was always surprised that the limitations of electron gun displays seemed to carry over into new display technologies. It just seemed lazy.

But I don't really think a proprietary hardware solution is a great idea.

DisplayPort/Thunderbolt use a data packet protocol that was designed to be extended to implement features like this. A small modification to the DisplayPort protocol would allow GPUs and displays to operate in this manner, using the auxiliary command channel.

It may even be possible to implement a protocol change for existing GPUs and displays through driver changes.
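To make that concrete, here is a purely conceptual sketch of the kind of behavior being described: let the source stretch the time between refreshes until the next frame is ready, within the panel's limits. Nothing here reflects the actual DisplayPort spec or Nvidia's implementation; the class and values are invented.

```python
# Conceptual illustration only: a display that accepts a variable frame period,
# clamped to the panel's supported range, instead of a fixed external clock.
class VariableRefreshPanel:
    def __init__(self, min_hz=30, max_hz=144):
        self.min_period_ms = 1000 / max_hz   # shortest frame period the panel accepts
        self.max_period_ms = 1000 / min_hz   # longest it can hold before refreshing anyway

    def scanout_period(self, gpu_frame_time_ms):
        # Refresh when the frame arrives, within the panel's limits.
        return min(max(gpu_frame_time_ms, self.min_period_ms), self.max_period_ms)

panel = VariableRefreshPanel()
for dt in (6.0, 12.5, 21.0, 40.0, 80.0):     # GPU frame times in ms
    print(f"frame ready after {dt} ms -> panel refreshes after {panel.scanout_period(dt):.1f} ms")
```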

Of course, if you want to use obsolescent connectors like DVI that are not packet-based, then a hardware solution would probably be needed; but those connectors are being phased out.

Personally, I'll wait until this gets put into a VESA standard.
 


If you read through the article, you'd know that this was done through Displayport, but it requires special monitor hardware to be capable of this.
 

qiplayer

Distinguished
Mar 19, 2011

Hi, I already play with LightBoost. The hack isn't even needed if you have the 3D glasses: you can set it up so the screen goes into 3D mode automatically, then disable 3D, and LightBoost stays on.
About G-Sync: I'd say if you play fast games you don't want 50, 40, 30 or any fps other than the max. I never experienced big tearing playing above 60 fps, I mean 80-110 fps on a 60 Hz screen, so I would never need V-sync or G-Sync. But what would be nice is a setting to avoid lower fps, sacrificing quality for quantity of frames.
The way GPUs work is stupid if you game. Who cares about ultra detail if fps ranges from 80 to 15??? And it's probably still like this because Apple never entered the gaming market with an "iGame" device.
Something that works as it should.

Also, 3D is far from good: in Crysis 2 the pointer isn't pointing at enemies, and Medal of Honor has faulty shaders. In a competitive market no company would survive with things only half working. But maybe there isn't enough competition in this field, and even worse, they don't know what customers need.
I see lots of players are happy about G-Sync. Good, Nvidia scored a goal. After 15 years of graphics cards... I still think the work must be done in drivers so that ultra details aren't rendered when there is another priority.

Thanks for your reply, I appreciated it :)
 


As far as the hack goes, it would still be the better method, as having stereoscopic drivers loaded does cause a 10-20% loss in performance, even if 3D is not activated in game.

Those losses in FPS, from 80 to 15, are almost always a CPU problem. GPU performance issues are rarely so extreme; sometimes bugs cause this. Even if there were Mac gaming, it is the devs, just as on Windows, who determine gaming quality. The platform will set your limits, but ultimately the devs are the ones that cause good or bad gaming experiences.

As far as 3D goes, when done well, it is awesome; when done poorly, it sucks. A few things: turn off the in-game gun sights and use the Nvidia 3D Vision laser sights instead. They will then aim accurately. This goes for all first-person games. TriDef also has a laser sight that works for 3D. Other shader issues are simply a matter of games not being designed for 3D, but if you do have 3D Vision, you can go to this site for fixes for many games:
http://helixmod.blogspot.com/2013/07/game-list-full.html

That site has changed 3D gaming from ok to great.
 

rdc85

Honorable


I'm not familiar with the tax system in Europe, but in my country it goes like this:

Let's say the screen plus shipping to the shop (only those with an import permit for special items can bring it in) comes to a CnF (cost and freight) price of 180 - the base taxable price.
+ 21% tax = 217.80 (price + tax)

This is the base price for the product in the store; the store then marks it up again to make a profit, let's say 20% (sometimes 30%) for high-end/new-tech items.
(Some shops aren't so greedy, but most are, since this stuff doesn't sell quickly.)

You get:
217.80 + 20% = 261.36 (the price on the shop display), plus shipping to your home if you don't buy in the shop.
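Written out as a quick calculation; the 21% tax and 20% margin are just the example rates from this post, not universal figures.

```python
# The markup chain from the example above; rates are illustrative, not universal.
cnf_price  = 180.0                 # cost-and-freight price landed at the shop
with_tax   = cnf_price * 1.21      # + 21% import tax  -> 217.80
shop_price = with_tax * 1.20       # + 20% shop margin -> 261.36
print(round(with_tax, 2), round(shop_price, 2))
```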

This is why computer stuff is so pricey in my country :sad
Even though it's manufactured in China, which is closer to us than the USA, the US price is always much lower than ours (envy).

You can buy it online yourself, but you need to be prepared to face the customs officers...
(Yeah, my friends once got blackmailed by them.)



 

somebodyspecial

Honorable
Sep 20, 2012


Let me know when AMD's Mantle is open. They've said it COULD run on other hardware, but let's be real here: open at this point means AMD allowing NV to make AMD GPUs... LOL. How open is that? I don't see that happening either. If it were open, wouldn't it work with all AMD GPUs? You have to be running GCN or bust (making it just like needing an NV 650+ to use G-Sync), so NV would have to make GCN-capable GPUs, as in toss out Kepler, Maxwell, etc. That isn't open. Also, we have no idea what AMD would charge for this OPEN Mantle. Making G-Sync open at this point would mean AMD making Kepler cards. We do not know there is any other way to do this, so claiming NV could even make it OPEN might be ridiculous, as it may require things only Kepler currently does. We can say the same about Mantle needing GCN. Clearly it even requires special hardware just to work with a Kepler card. If they have to make a specific monitor card for each one, it's even less easy to migrate to another vendor like AMD. We don't know about that yet either.

You first have to prove you can get something running on your OWN stuff before bothering with others and being open. I doubt they are even at a point where they can say someone else can run it. It can't be that easy or they would already have it running on 100 monitors instead of ONE launch monitor right? It wouldn't have needed a special NV card to work with the monitor if you could just roll this crap out willy-nilly right? It may never work with AMD without AMD making NV based GPU cards, and that will likely never happen. It would take AMD a good few years to put out a NV based chip and I'd venture to guess if they are that desperate they'd be bankrupt before fielding an NV based chip (they are near bankrupt now).

With BF4 being the first Mantle game and it having so many problems already, I don't think NV will ever consider using Mantle, and they can bleed for ages watching it first. I'm guessing Jen-Hsun would rather bleed a billion or two before using anything AMD. With $2.7B in cash you can take the time to respond with your own R&D/tech that matches Mantle rather than adopt it (nothing stopping that; NV certainly has the funding to match any AMD innovation). If Mantle becomes important before they have an answer, they can simply lower prices: price at break-even until you have your own tech, and those lower prices will cut AMD's margin to nothing, or to losses, in the meantime. So NV's options are: adopt your enemy's tech and legitimize it, or break even for 1-3 years while matching their tech, keeping AMD's margins at ZERO or in losses, killing their tech and keeping them broke. I'll take door #2 every single time, and so will NV and anyone else. You don't adopt your enemy's stuff if you have all the cash and no debt (the exact opposite of them). Instead of adopting their stuff, you just stall until you catch them, then go back to killing them and raising prices (see Intel... LOL - note Haswell has gone up $20 across the board pretty much).

The first four Tegras were NV bleeding money until they could field their own ARM SoC CPU and pair it with their own GPU from the desktop and a modem; until now Qualcomm has had that advantage over everyone else, basically owning the phone market because of it (but that will be over shortly). The first REAL Tegra is Tegra 6. It will be the first one to make money too, though T5 might make some first (it's only missing the IN-HOUSE CPU part, but the modem was the most important piece for now, I think, as it brings more phone sales - we'll see how T4i does there) :) The first four were just delay tactics until this point of bringing it all together. Shield was just about prepping some great games in time for T5/T6 and getting gamepad/streaming working better. Next Xmas I would be running from AMD/Intel stock, as they both have many people entering their world of CPUs (T6 and all its enemies). TSMC just said 20nm is entering volume production and 16nm is coming before the end of 2014, ahead of schedule (16nm is entering risk production right now). The real party happens then, as SoCs will be better than the Xbox 360/PS3/WiiU and 1080p consoles will be a tough sell by then.
http://www.digitimes.com/news/a20131212PD211.html
Apparently they are 1Q ahead.

GT6 on PS3 shows old console's power envelopes still have enough power to make a great game for TV (even though car prices are absolutely stupid, the game itself looks pretty impressive). Until consoles hit 10mil+ sold nobody will concentrate on them. Next Gen will get crappy ports just like PC's until they sell 10mil units or more (except for Sony/MS titles, and a few EA ones I guess). It's smarter to aim at 100mil each of xbox360/ps3 than aim at 2-3mil of each next gen hoping one day they reach 100mil each. Like it or not, most will aim at last gen consoles for another year or more (and the rest will aim at mobile or PC - both 360/ps3 have ~14% aiming at them, while next gen both have less-see GDC 2013 surveys). My theory is, by the time next gen sales are enough to shoot at (next xmas when they have 10mil+ each?), there will already be a tablet/phone in your hand that does a job at least as good as xbox360/ps3 with next xmas android games. At that point a $400/500 console (with $60-70 games, and a VERY small catalog for both xbox1/ps4 even next xmas) is a tough sell vs what you already have in your hand with $0-20 games and a massive android library+PC gpu streaming to TV (NV will roll this out to any tegra device, it's a no brainer) and nothing hiding behind PAY WALL (like netflix etc all behind XboxLIVE $50 yearly subs! You shouldn't have to pay MS/Sony $50-60 to see netflix). The best experience will be a combo of your PC+mobile (mobile/android games out of the house, PC+streaming gpu+android inside house).

By next xmas consoles will be lucky to have 10-12mil each in the market. Very small compared to the numbers of 20nm products that will be sold by next xmas and shortly after as 20nm ramps fully everywhere. I don't see how consoles survive this + 1/2 dozen consoles coming (mojo, shield, gamestick, gamepop, wikipad, ouya etc etc). It took ONE day to sell 1 million of each new console, but another 20 days to sell the 2nd million of each. If the trend continues you'll be lucky to hit 3mil by xmas and need a few months after that to hit 4mil and I'll be shocked to see it not slow further all next year just like wiiu did (xmas/black friday don't come every month of the year). Wiiu had no new consoles coming after it (alone for a whole year basically), where xbox1/ps4 have a 1/2 dozen+ 20nm tablets, phones, PC GPU streaming, steamboxes etc along with no games catalog for another year or more. There is a tough road ahead for them. New models have better hardware obviously (both blow away wiiu/ps3/xbox360), but no games will be aiming at them for a while so extra power is no help. Black Friday showed last gen still sells very well even today.
 

etonbears

Distinguished
Dec 18, 2013


Oddly enough, I read both the article and the Displayport specs. Did you?

If you are defining a proprietary extension to DisplayPort, you need to be able to ensure that the display understands it and can respond to it; hence Nvidia NEEDS to offer the G-Sync retrofit hardware, or persuade manufacturers to use it.

If you are drafting an update to the Displayport standard, display manufacturers will implement any hardware changes themselves in order to be compatible.

I know which I prefer.

I do not know whether some of the existing displays are capable of updating the screen at variable refresh rates; but if a screen supports Displayport, which does not use an external clock, then it is quite possible that screen can be driven at variable refresh rates. Whether there is any route to make it so through firmware or driver changes is another matter.

Regardless, I do not think that it is anything but a retrograde step to start inserting proprietary hardware into displays, particularly for something like this that is relatively trivial to implement, and should long ago have been in the standards.
 

Adroid

Distinguished
I really want to understand what impact G-Sync will have on multiplayer gaming. Sure, the 30-60 FPS at ultra settings may be "smooth" in single player, but what impact is this going to have playing BF4 multiplayer?
I am skeptical that 30-60 fps in multiplayer will equate to a smooth and lag-free gaming experience.
 




With G-Sync you won't have to live with tearing, and you still get high FPS and low latency. While 30-60 FPS is vastly improved, 60-144 FPS is also improved.
 


Yes, they decide on performance over quality and deal with tearing. Some people decide the tearing is too bad and deal with the input latency and delays of V-Sync in order to not have their screen look like it was a puzzle put together with a hammer.

With G-Sync you don't have to make the decision, you just get the best of everything.
 

pepe2907

Distinguished
Aug 24, 2010
@bystander
What I believe is that it was scientifically proven (quite a long time ago) that people are unable to see what is drawn in a picture shown for up to 1/24 s between other frames.
Meaning: if you are able to notice a frame with such a problem, it is presented to you for longer than the time needed to notice it. Meaning what you see is stuttering mid-frame (and I don't know of a particular reason stuttering should happen only on full frames).
Meaning the problem you have is with stuttering, and this won't help much with that (it may even make it worse), but at least you'll be able to enjoy a perfect frame (although for a bit longer) when it happens (and then it will jump more).
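For reference, here is the frame-time arithmetic behind that 1/24 s figure; it is nothing more than reciprocals of common refresh rates, shown against the threshold cited above.

```python
# How long a single frame stays on screen at common refresh rates, versus the
# ~1/24 s threshold mentioned above. Plain arithmetic, nothing measured.
threshold_ms = 1000 / 24                      # ~41.7 ms
for hz in (30, 60, 85, 120, 144):
    frame_ms = 1000 / hz
    longer = "longer" if frame_ms > threshold_ms else "shorter"
    print(f"{hz:3d} Hz: {frame_ms:5.1f} ms per frame ({longer} than {threshold_ms:.1f} ms)")
```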
I should also mention that I have a few years of experience in real-time visual presentation and simulation of events, mostly for safety and hazard management and training (at least part of it provable).
And also: do I believe in advertising? Yes, I do. Do I believe in the accuracy of everything that's being advertised? No, I don't. But I also believe there are people who do believe it - I usually call those people idiots (:sarcasm), but that's a personal opinion.
Ah, and I can't help mentioning that I remember watching Windows Vista being mass-advertised (and "tested") before release (and the same again with the new magic Windows 8 Metro UI) - at the time I was even banned from a "tech" site (it's ExtremeTech) for opposing the advertised opinion.
;)
 