G-Sync Technology Preview: Quite Literally A Game Changer


swordrage

Distinguished
Jul 4, 2012
@Datcu alexandru

Honestly, I can't see ghosting or tearing as much as I see those colour bands that appear in very finely graded parts of a picture (such as the sky) on a TN panel. I am not a competitive gamer myself, so I don't understand the importance of a millisecond. Rather than this, I would prefer a TN panel with better colours or an IPS with better refresh rates. But I can't ask Nvidia, as they don't make panels.

One more thing: is there a chance we can completely do away with the refresh rate? That is, once a frame is displayed, it remains on the screen, static, until the next one is displayed. After all, it started this way because of the standard electronic circuits available in the time of CRT monitors, but for LCDs they could have started differently.
 


You make a good point. I wouldn't get a monitor with G-sync until it's available in 2560x1080 resolution or higher.
 


It has been confirmed that the Asus VG248QE with G-Sync installed will cost $400; it is nowhere near the $500 and above you mentioned. Also, a monitor is one part that does not get upgraded as often as a GPU. It might be a bit expensive, but you will use it for a very long time.
 


You don't have to go out and buy a new tech as soon as it's available, but having the tech available will enable anyone who was already going to buy a monitor to go out and get one with this new feature.

It may cost an extra $100, according to posts about the upcoming VG248QE release, and once you spend that extra money, you are good to go for several GPU upgrades, assuming you are like most people who hold onto monitors a lot longer than GPUs.

If you want to jump on board without actually needing a new monitor, then you can, but you don't have to do anything. You already have a monitor.
 

swordrage

Distinguished
Jul 4, 2012
I need a clarification. When G-Sync is enabled, does the previous frame remain static until the next frame is displayed, or is it displayed multiple times to make it appear static?
 


Assuming your FPS is 30 or higher, every frame rendered is immediately sent to the screen and left there until a new frame is ready and sent (no refreshes). If your FPS drops below 30, it will behave like a 30 Hz monitor with V-sync on or off (I'm not sure which).
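
To put that behaviour in rough pseudocode (just a sketch of the idea in Python, not Nvidia's actual implementation; the frame_ready, grab_frame, and scan_out callbacks are made-up placeholders):

```python
import time

MIN_REFRESH_HZ = 30                 # assumed panel floor for this sketch
MAX_HOLD = 1.0 / MIN_REFRESH_HZ     # longest a single frame is left on screen

def gsync_like_loop(frame_ready, grab_frame, scan_out):
    """Show each new frame the instant it is ready; only repeat the last
    frame if the GPU is slower than the panel's minimum refresh."""
    last_frame = None
    shown_at = 0.0
    while True:
        if frame_ready():                       # GPU just finished a frame
            last_frame = grab_frame()
            scan_out(last_frame)                # sent immediately, no fixed tick
            shown_at = time.monotonic()
        elif last_frame is not None and time.monotonic() - shown_at > MAX_HOLD:
            scan_out(last_frame)                # below ~30 FPS: behaves like a 30 Hz panel
            shown_at = time.monotonic()
```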
 

Djibrille

Distinguished
Jan 31, 2007
I am impressed with the technology, but it would really be great if there were a way to buy the G-Sync module separately and install it yourself in a wide range of monitors. I, for example, have an AMD GPU and a 60 Hz IPS monitor. There's no way I'm replacing ~700 euros' worth of hardware, no matter how much better G-Sync is (I'm reasonably happy with my current setup except for the occasional blurring and tearing in games; I find panning motion to be the most annoying). They need to make a way for us IPS users to add a G-Sync module to our monitors, maybe using some kind of breakout box (if that's at all possible). In that case I might be tempted to switch to an Nvidia GPU in the future, so it would still be good for business.
 


Really? Wow, so all of us who have more than one are in trouble then, I take it? :lol:
 

The3monitors

Honorable
Dec 9, 2013
I can't wait until I have a dual 60" monitor setup that can handle ZBrush/Maya/Photoshop/Illustrator and play any game I want at 16K resolution.
 
"I'm most looking forward to 2560x1440 and IPS. I'm even fine sticking with a 60 Hz refresh if it helps keep cost manageable."

This is how I feel. But would higher-res panels be IPS-only and still include the inherent input lag?
 

ojas

Distinguished
Feb 25, 2011
I wish you had talked about adaptive V-sync when you were discussing the FPS graphs.

Anyway, I'm pretty excited about G-Sync, but yeah, it needs to be standardized.

Also, I recently discovered that most of the lag experienced with V-sync or adaptive V-sync is due to triple buffering; turn that off and the lagginess is reduced.
 

Soulara

Honorable
Dec 13, 2013
Huh! I have no idea why they have to get into such a chaotic circuit design when they could stream the HDMI signal through a fast DAC and then an ADC and achieve an ideal frame blend and an optimal refresh rate. These days, many people drive toward the future in reverse gear.
 

cubebomb

Distinguished
Jul 6, 2008
You write about it, and my brain still cannot grasp what you are saying about G-Sync. I need to see and experience this before I throw my money at it. Currently, gaming at 1440p and 100 Hz with 80-100 FPS is smooth enough, but the tears sometimes annoy me. Other than that, they need to increase resolutions. 1080p... I cannot go back to that, even for G-Sync.
 


You clearly don't understand Dynamic V-sync. Just like adaptive V-sync, tearing occurs when you don't reach your refresh rate. Both of those techs turn off V-sync when you don't reach your refresh rate, and tearing occurs any time you do not have V-sync on. It may not be as noticeable, but it does happen.
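
In other words, the rule those techs follow is roughly this (a made-up illustration in Python, not any driver's actual code):

```python
def vsync_should_be_on(current_fps: float, refresh_hz: float) -> bool:
    """Adaptive/dynamic V-sync in a nutshell: V-sync only stays on while
    the GPU keeps up with the panel; below that it drops out, so tearing
    can show up again."""
    return current_fps >= refresh_hz

print(vsync_should_be_on(144, 144))  # True  -> V-sync on, no tearing
print(vsync_should_be_on(90, 144))   # False -> V-sync off, tearing possible
```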
 

rdc85

Honorable


You will need to count taxes in the receiving (buyer's) country; this needs to be included in the price.

For example, in my country the tax is about 15-23% (of total cost) for computer equipment (depending on the goods), and it can go up to 30% if you have bad luck and get a greedy customs officer...
 

Jeff Gilleese

Honorable
Dec 14, 2013
This is nothing more than a gimmick. A properly functioning frame buffer would do the same thing: repeat the last completed frame to the signal output if the next frame is not completed before the next scan interval. Unless you want to reintroduce screen flicker by not displaying anything while the display panel waits for an updated frame, there is no other way of displaying video.
 

Traciatim

Distinguished


This is exactly what happens now, and it causes judder: on a 60 Hz screen, if you are running at 50 FPS, every few frames one of them has to be displayed for 33 ms instead of 16.6 ms. You don't notice this (or tearing) as much on 120 or 144 Hz monitors, because the time each frame is displayed and the difference between frames are much smaller, but it is still there.

If the monitor could be driven by the video card to refresh on demand rather than at a fixed frequency, it pretty much solves all the problems that cause tearing, judder, and delays in seeing the content on the screen... Hey, what do you know? G-Sync.
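
If you want to see the numbers, here's a quick back-of-the-envelope simulation (mine, not from the article) of 50 FPS on a fixed 60 Hz panel:

```python
REFRESH_HZ = 60
FPS = 50
tick = 1000.0 / REFRESH_HZ          # 16.67 ms per panel update
frame_interval = 1000.0 / FPS       # 20 ms between rendered frames

hold_times = []                     # how long each rendered frame stays on screen
next_frame_ready = 0.0
for i in range(12):                 # simulate ~0.2 s of scanout
    now = i * tick
    if now >= next_frame_ready:     # a new frame is waiting: show it this tick
        hold_times.append(tick)
        next_frame_ready += frame_interval
    else:                           # nothing new yet: the old frame repeats
        hold_times[-1] += tick

print([round(t, 1) for t in hold_times])
# -> a mix of ~16.7 ms and ~33.3 ms holds: that uneven cadence is the judder
```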


 


You seem quite confused. LCDs don't flicker. They don't require refreshes to keep an image on the screen, though the color will shift if the panel is not refreshed at least 15 times a second. The current tech is designed around CRTs, which are pretty much extinct. G-Sync prevents stutter/judder while removing screen tearing.

I'd try to explain it, but obviously you aren't very technically inclined, or you'd have understood the article. So I'll try to make this simple: if this were a gimmick, why are all these sites reviewing the product finding so much better results than in the past?
 

Jeff Gilleese

Honorable
Dec 14, 2013
If your GPU is topping out at 50 FPS in a game, then it's going to fluctuate wildly under that. So when you move your look cursor around, the FPS will suddenly drop from 50 to something like 20 FPS, causing the stutter you are referring to. There are only three ways to deal with that: 1. Render the frame as it is at the time of refresh, causing screen tearing. 2. Repeat the previous frame until an updated frame is complete, causing stuttering. 3. Display each frame for a fixed amount of time, causing screen flicker. LCD monitors have been able to display various frequencies for years; it's just that standards have been developed, like PAL at 50 Hz and ATSC at 60 Hz. Most LCD monitor specs will say "frequency range 55-75 Hz".
 

Jeff Gilleese

Honorable
Dec 14, 2013
Bitcoin is also talked about a lot, but that doesn't make it any less fabricated. Remember those PhysX cards? That turned out to be little more than SLI, but with one card limited to only running physics calculations. The idea didn't take off because if you could afford a second card, you would be better off using it in a normal SLI configuration. And Nvidia's other idea, SHIELD: why would I buy a $300 Android device to play games when I can buy an Android phone that can play the same games for about the same price?
 


You are not getting it. LCD screens are solid state. THEY DO NOT FLICKER. If you remove the refresh, they will not flicker. The only thing that can happen is that if the refresh rate drops below 15, the color may fluctuate a bit.

G-Sync removes the whole idea of having to repeat frames. It just updates the image when a new frame is ready. The old method was created to deal with CRTs, which flickered. LCDs do not have this problem, but because it had been done the way CRTs required for so long, no one ever changed that behavior until now.
 

Jeff Gilleese

Honorable
Dec 14, 2013


I know they don't flicker; my point is how they don't. All Nvidia is doing is creating a new standard that allows frames to be transmitted to the monitor at a dynamic frequency rather than a fixed one. This is somewhat pointless, as it doesn't prevent the GPU itself from failing to generate frame updates fast enough not to be perceived by the human eye. This is akin to buying high-priced wax for your rusty car: most people can't tell the difference between expensive and cheap car wax, but neither one is going to fix the rust. You would be better off saving your money on the wax and getting a new paint job.
 

Jeff Gilleese

Honorable
Dec 14, 2013
Sorry, I forgot to mention that LCD TVs do use screen flicker to mask motion blur.
http://m.cnet.com/news/what-is-refresh-rate/57524894

They have been doing this for years, but it's not desirable for PC monitors or gaming.

Now, if you want to talk about the panel's refresh rate: that can be much higher than the display frequency. LCD monitor manufacturers sync the panel's refresh rate to a multiple of the display frequency for better image quality.
 


Just read all the reviews of it. Clearly it makes a massive difference in gaming, at least in first-person gaming. Everyone who has it has commented on never going back again. It is not something small. It is not a gimmick. It is the way many people have wanted their monitors to behave for years, ever since LCDs were created.

The huge difference between G-Sync and V-sync is that when things aren't perfectly smooth at the max refresh rate, G-Sync keeps all the frames smooth, while V-sync causes stuttering and also adds latency, even when you are at the max refresh rate.
 