AMD FreeSync Versus Nvidia G-Sync: Readers Choose

Status
Not open for further replies.

NethJC

Reputable
Aug 7, 2015
20 FPS is still 50ms of lag just from a frame generation point of view. If your screen makes it look good, you still have the 50ms lag.
Sure, it may look smooth, but you still have the same lag as a mid-range consumer high-speed internet connection, coming just from your video card.
You are welcome to think that spending $100-200 more on a monitor that puts lipstick on a pig is a good thing, but I personally don't.
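The arithmetic behind that 50ms figure is just the frame interval, which no display technology can shorten; a minimal sketch (function name is illustrative):

```python
def frame_interval_ms(fps: float) -> float:
    """Time between consecutive frames at a given average frame rate."""
    return 1000.0 / fps

# A GPU rendering 20 FPS delivers a new frame only every 50 ms;
# a smoother-looking presentation does not shorten that interval.
print(frame_interval_ms(20))   # 50.0
print(frame_interval_ms(144))  # ~6.9
```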
 

paulbatzing

Reputable
Apr 11, 2014
NethJC said:
20 FPS is still 50ms of lag just from a frame generation point of view. If your screen makes it look good, you still have the 50ms lag.
Sure, it may look smooth, but you still have the same lag as a mid-range consumer high-speed internet connection, coming just from your video card.
You are welcome to think that spending $100-200 more on a monitor that puts lipstick on a pig is a good thing, but I personally don't.
Well, if the alternative is spending $500 a year on graphics cards or running on low graphics settings, I think it is a nicer option. I fully understand what G-sync does, and I like how it looks. It doesn't make the lag worse (which triple-buffered V-sync obviously does), as you put it, and that is exactly what I want. Anyway, the games I play run at frame rates between the low 30s (Arkham Knight) and the high 100s and above (FIFA, for example), and my screen has me covered for both scenarios. You might not want it, but I do. As I said before, if you have a stable 144 FPS, good news: you don't need G-sync. Huzzah!

And I get that you like CRTs, I really like them too. I just don't have a good one anymore, and I don't have the space, so that is not really an option. And bringing up CRTs in an article about the one technology that can never be implemented on CRTs because of physics constraints is just a bit strange.

Just a comment: I resent you calling a 780 and a nice processor a "swine". It might not be the newest of the new, but it's good enough for me. And I don't get your "window dressing" argument. Computer graphics are purely about tricking our eyes and brains into believing that a series of pictures, made of artificially generated glowing points, is a representation of a "real world" of some kind. Computer graphics are ALWAYS about perceived quality, as that is what they are for.
 

salgado18

Distinguished
Feb 12, 2007
Old black and white movies have achieved a quality standard that current color movies have not, even to this day. They have a much better image than today's color technologies. And yet no one wants to go back to black and white movies, because a pink pig is nicer than a gray pig.

We get it, you have one hell of a monitor, but it is old, uses a lot of energy, wastes a lot of desk space, and weighs five times as much. The industry is trying to evolve, go somewhere, you know. You could run a Beetle for hundreds of thousands of miles and still fix it with a straw and some bubble gum, but people want the better experience that new cars can give, despite their needing specialized maintenance and care.


Then buy a FreeSync monitor, it's, um... free :)
 

salgado18

Distinguished
Feb 12, 2007
Maybe the sample size skewed the numbers a bit. Also, the fact that Nvidia has better marketing and more efficient cards (important to a lot of people) does not imply brand preference. I'd recommend a 750 Ti to many people because of its efficiency, but my PC will only have AMD cards.
 

loki1944

Honorable
Oct 31, 2013


It makes no difference if the FPS average is 30, 50, 80, 144, whatever. Whether as a single GPU or in CrossFire, my 290Xs are smoother in ESO on a 59Hz 1440p monitor than my 780 Tis in SLI, or even a single 780 Ti running at a 94 FPS average. SLI stuttering is actually worse with G-Sync on than without, so far, in the games I've played.
 

jdwii

Splendid


What you are forgetting is that if a game has V-sync on at 60Hz and the frame rate drops, it skips or stutters; all those problems go away with FreeSync (until the frame rate leaves the monitor's supported range) or G-sync. I play at 40-50 FPS a lot and it feels smooth and doesn't tear.
 

jdwii

Splendid


Again, this is a logical fallacy, an exaggeration. What happens when the game is running at 40 FPS? That is something my card often drops to, and even a 980 Ti does.
 

jdwii

Splendid


I'd recommend a card from either company, whichever makes the most compelling product, unless you are tied to a certain feature. Really, that's the only thing I dislike about G-sync.
 

paulbatzing

Reputable
Apr 11, 2014


loki1944 said:
It makes no difference if the FPS average is 30, 50, 80, 144, whatever. Whether as a single GPU or in CrossFire, my 290Xs are smoother in ESO on a 59Hz 1440p monitor than my 780 Tis in SLI, or even a single 780 Ti running at a 94 FPS average. SLI stuttering is actually worse with G-Sync on than without, so far, in the games I've played.
I am not going to argue about which graphics card is best; that is not what this is about. But running SLI/CFX with *sync is asking for trouble at the moment.

As I said, *sync does one thing and one thing only: it refreshes the screen as soon as the picture is ready in the card. Obviously this only makes a difference when you are producing FPS in ranges where V-sync would drop you to half the frame rate (e.g. 59 FPS giving a 30Hz refresh for one frame). You don't see the difference? Don't buy into it. But many of us (including me) like it and use it. A person who buys two 780 Tis and two 290Xs was never the target group for this tech anyway.
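That half-rate drop can be sketched numerically; a toy model, assuming classic double-buffered V-sync on a fixed 60Hz panel (function names are illustrative):

```python
import math

def displayed_fps_vsync(render_fps: float, refresh_hz: float = 60.0) -> float:
    """Effective on-screen rate with double-buffered V-sync: each frame
    is held for a whole number of refresh intervals, so a frame that
    misses one refresh boundary waits for the next."""
    frame_ms = 1000.0 / render_fps
    refresh_ms = 1000.0 / refresh_hz
    intervals = math.ceil(frame_ms / refresh_ms)  # refreshes per frame
    return refresh_hz / intervals

def displayed_fps_vrr(render_fps: float, lo: float = 30.0, hi: float = 144.0) -> float:
    """With G-Sync/FreeSync the panel refreshes when the frame is ready,
    clamped to the panel's supported variable-refresh range."""
    return min(max(render_fps, lo), hi)

print(displayed_fps_vsync(59))  # 30.0: one missed refresh halves the rate
print(displayed_fps_vrr(59))    # 59.0: the panel simply refreshes at 59 Hz
```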
 

loki1944

Honorable
Oct 31, 2013


The only difference I see is worse performance with G-Sync on in the majority of games; SLI without G-Sync works much better and gives a smoother experience overall, as far as I can tell. SLI issues are few without G-Sync, but turn it on and, whammo, stutterfest. As for who the "target group" is, as far as I know you aren't Nvidia, so when they say this: "eliminating screen tearing and minimizing display stutter and input lag. The result: scenes appear instantly, objects look sharper, and gameplay is super smooth, giving you a stunning visual experience and a serious competitive edge," I hardly think they are implying "if you have SLI you shouldn't get this".
 

Xorak

Honorable
Jun 7, 2013
I own an MG279Q (and a 290X), and I think it's a fantastic monitor. I also think it's ridiculous that they left V-Sync off! There is no benefit to pushing more frames than the monitor can display, regardless of which variable refresh technology you use.

In my experience, the drop below 35 FPS is pretty harsh, and that is the real reason G-Sync has an advantage. (I have not used G-Sync, but that benefit is obvious.) However, anything above 45-50 FPS is buttery smooth, and 80-90 FPS is absurdly smooth. I'm almost glad this monitor has a 90Hz cap, because it seems like a waste of power and heat to tell the video card to keep pushing harder after that.

If they had used V-Sync on both systems, and tuned the settings to give a few sub-35 FPS drops in the harder games, they would have had a much more valid real-world test. As it is, I feel they did FreeSync a real disservice. Owning the MG279Q, I can say two things: 1) variable refresh is the real deal, and 2) understanding the limitations of this monitor, I would not spend the extra $200 on a different monitor for a better experience. I would rather put that towards a faster graphics card if I were buying both at the same time.
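What happens below the panel's variable-refresh floor is the crux of the complaint above. One remedy is frame multiplication (the approach the G-Sync module uses below the panel minimum, which AMD later shipped as Low Framerate Compensation); a sketch, using the 35-90Hz range described for the MG279Q, with an illustrative function name:

```python
import math

def vrr_refresh_hz(fps: float, lo: float = 35.0, hi: float = 90.0) -> float:
    """Physical refresh chosen for a given frame rate on a VRR panel
    with frame-multiplication support: below the panel minimum, each
    frame is repeated n times so the refresh stays inside [lo, hi]."""
    if fps >= lo:
        return min(fps, hi)
    n = math.ceil(lo / fps)   # repeats needed to reach the panel floor
    return fps * n            # e.g. 20 FPS shown at 40 Hz, each frame twice

print(vrr_refresh_hz(20))    # 40.0: each frame displayed twice
print(vrr_refresh_hz(60))    # 60.0: inside the native range
print(vrr_refresh_hz(120))   # 90.0: capped at the panel maximum
```

Without this trick, a FreeSync panel of that era simply fell back to fixed-refresh behavior below 35 FPS, which is why the drop felt harsh.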
 

Xorak

Honorable
Jun 7, 2013
I should have also pointed out that it's awesome that Tom's and everyone involved put on this event, instead of just whining about the one thing I disagreed with. That's what I hate about the internet. Thanks Tom's!
 


You would want to "turn off" G-Sync above 70 or so fps and use ULMB anyway.

http://www.tftcentral.co.uk/articles/variable_refresh.htm


It should be noted that the real benefits of G-sync really come into play when viewing lower frame rate content, around 45 - 60fps typically delivers the best results compared with Vsync on/off. At consistently higher frame rates as you get nearer to 144 fps the benefits of G-sync are not as great, but still apparent. There will be a gradual transition period for each user where the benefits of using G-sync decrease, and it may instead be better to use the ULMB feature included, which is not available when using G-sync. Higher end gaming machines might be able to push out higher frame rates more consistently and so you might find less benefit in using G-sync. The ULMB could then help in another very important area, helping to reduce the perceived motion blur caused by LCD displays. It's nice to have both G-sync and ULMB available to choose from certainly on these G-sync enabled displays. Very recently NVIDIA has added the option to choose how frequencies outside of the supported range are handled. Previously it would revert to Vsync on behaviour, but the user now has the choice for Vsync on or off.

It should be noted that the real benefits of variable refresh rate technologies really come into play when viewing lower frame rate content, around 40 - 75fps typically delivers the best results compared with Vsync on/off. At consistently higher frame rates as you get nearer to 144 fps the benefits of FreeSync (and G-sync) are not as great, but still apparent. There will be a gradual transition period for each user where the benefits of using FreeSync decrease, and it may instead be better to use a Blur Reduction feature if it is provided. On FreeSync screens this is not an integrated feature however, so would need to be provided separately by the display manufacturer.
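The quoted guidance boils down to a threshold rule; a rough sketch (the crossover point is illustrative and varies per user, as the article says):

```python
def pick_display_mode(sustained_fps: float, panel_max_hz: float = 144.0) -> str:
    """Rule of thumb from the quoted article: variable refresh pays off
    most at lower frame rates; near the panel maximum, blur reduction
    (ULMB) may be the better trade, since on G-sync panels the two
    features are mutually exclusive."""
    if sustained_fps >= 0.7 * panel_max_hz:  # ~100 FPS on a 144Hz panel
        return "ULMB"
    return "G-Sync"

print(pick_display_mode(45))   # G-Sync
print(pick_display_mode(120))  # ULMB
```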
 

jdwii

Splendid

That's fine and all, but what about people with one card? I mean, how often do two very high-end cards even drop frames? Just because it has issues with SLI enabled doesn't mean G-sync or FreeSync sucks.
 

jdwii

Splendid


I was so disappointed when I found that out. The whole reason I wanted G-sync is so I can play my games at a low FPS sometimes. In the future, heck, probably in 1.5 years, my card won't be able to push high FPS, and I'm already on 1440p.
 

paulbatzing

Reputable
Apr 11, 2014
@loki1944:
To be blunt: I find it baffling that you can spend so much money on a technology, as you apparently did, without knowing how the technology works and what it is for. Both Tom's and AnandTech have written that it doesn't work well with SLI at the moment.

Just as a side note: if you don't see a difference between G-sync on and G-sync off at a 140 FPS average frame rate, that means you can't see a difference between 144Hz and 72Hz screen refresh rates. In that case you are probably best off with a much cheaper 60Hz screen anyway.
 

loki1944

Honorable
Oct 31, 2013


I do see a difference: stuttering. And G-sync on with a single card is inferior, performance-wise, to G-sync off with SLI enabled for the vast majority of games. Also, this is what Nvidia said: "Q: How does NVIDIA G-SYNC work with SLI?
A: The NVIDIA GPU connected to the display manages G-SYNC. SLI GPU setups work seamlessly with G-SYNC displays."

This is what AnandTech said: "If you have a beefy SLI rig, you could see frame rates of well over 100 FPS without ever having to turn off V-SYNC."

and this: "what G-Sync will allow you to do is to crank up quality levels even more without significantly reducing the smoothness of your experience. I feel like G-Sync will be of even more importance with higher resolution displays where it's a lot harder to maintain 60 fps."

This is what Tom's said, "the same pacing technology that keeps frames displayed consistently with V-sync off in SLI is what you need for G-Sync to function properly. There's no additional work that needs to be done. Those frames are displayed from the "master" GPU, which also controls G-Sync."

You can be baffled all you want; it doesn't overly bother me, but there are problems between G-sync and SLI, despite how Nvidia and others have presented it.
 

rantoc

Distinguished
Dec 17, 2009
You must not have a top end system that is properly tuned (which is why you need gsync in the first place)
Always nice to pay $$ for technology which is merely compensating for a poor system design in the first place ;)
Assumptions based on what? Assuming things without any clue is immature, to say the least. As for the topic: it doesn't matter how much a system can handle or how well optimized it is when the software running on it is poorly coded, like that Batman game mentioned in the previous post, the one the publisher even removed... In titles like that, G-sync has a clear benefit, and if you had bothered reading before spewing assumed nonsense, you'd have seen it also pointed out that at high FPS (100ish+) the tech didn't matter much.
 

rantoc

Distinguished
Dec 17, 2009
Both of these technologies are useless. How about we start producing more high-refresh panels? It's as simple as that. This whole adaptive sync garbage is exactly that: garbage.
I still have a Sony CPD-G500 which is nearing 20 years old, and it still kicks the crap out of every LCD/LED panel I have seen.
These variable sync monitors excel at lower frame rates. Almost everyone uses a single-GPU setup, and even a GTX 980 Ti will dip below 60 FPS in some games.

On a regular monitor, let's say a game runs at 60 FPS, and then a big fight scene happens, lowering the frame rate to 57 FPS. With V-sync, the monitor drops to 30 FPS instead of 57 FPS, since it effectively has only two modes: 30Hz or 60Hz.

On a G-Sync or FreeSync monitor, it drops the refresh rate to 57Hz, which lets you get 57 FPS.

Your argument is all about very high frame rates. Even with a very high 144Hz rate, the monitor still needs to be able to do 143Hz, 142Hz, 141Hz and so on, and it can't do those lower refresh rates without G-Sync or FreeSync.

You can buy an additional GPU to try to keep it at 144Hz or 144 FPS, but the money is more wisely spent on a G-Sync or FreeSync monitor: it uses less power, costs less money, and produces less heat.
You can explain the tech as much as you like; some heads are just too thick to process the information and see the real benefits. I was an early adopter of the ROG Swift and have had a hard time playing on anything without G-sync since (heck, even my 4K screen collects dust).
 
It's like many other GFX technologies (PhysX, ULMB)... once you get used to it, when it's gone, you miss it.

On the AMD side, I wanted to address something I said above: apparently there is a monitor with both FreeSync and Blur Reduction, the BenQ XL2730. The Blur Reduction has some limitations and there's still the FreeSync bug issue to deal with, but if those can be resolved soon, it will make FreeSync more attractive.

http://www.tftcentral.co.uk/reviews/content/benq_xl2730z.htm#gaming_summary



 

jdwii

Splendid
I think the major issue with FreeSync is that monitor manufacturers don't care as much, since it only works on AMD cards, and AMD controls something like 30% of the video card market compared to Nvidia. If Intel were to adopt FreeSync, Nvidia might be in more trouble.
 

pezonator

Distinguished
Dec 13, 2011
Thanks for the write-up. I'm currently using a 290X with a Dell 1440p screen, and I'm at the point where tearing is driving me up the wall. I also want the option of 3D, and the increased range of G-sync seems worth it. The upcoming Asus 144Hz/IPS/G-sync monitor is looking mighty tempting.

And to all you people having a go at Tom's for not doing a good job: at least be appreciative that you have some data.
 
