AMD FreeSync Versus Nvidia G-Sync: Readers Choose


omgBlur

Reputable
Aug 16, 2014


I have read about the issues with G-Sync on SLI and multiple monitors. I'm only running a single ROG Swift on a GTX 980, and the only time I've encountered stuttering is when the fps dips below 30. It stutters worse than it would on a normal monitor, but generally that only happens in AAA titles when I have things like AA or all shadow settings on max.
 

Math Geek

Titan
Ambassador
I know it is hard to quantify, but a question asking something like "if you had a preference, how much better was the experience? a little bit, a bunch, massively..." would have helped.

It is possible that some of the participants chose one as better but in reality thought the difference was very little, or possibly massively better. It is clear G-Sync was the preferred tech that day, but I wonder how much better folks thought one was over the other. For me to spend the extra cash on G-Sync, it would have to be A LOT better than FreeSync to justify the cost.

Just wondering. Overall, great idea and decent execution. No one will ever be truly happy with the results, but that is not why we do science. The answer is important whether you like it or not, and it seems we have a start to getting that answer :)
 

FritzEiv

Honorable
Dec 9, 2013
A few elements to consider.

1.) Community members didn't know what they were playing on until the entire day was over and all the voting had taken place. I've added a note in the article to clarify that for anyone making a different assumption.

2.) The question we asked players was "do you think you know which machine was which," with the operative word being "think." When we asked people after the event, some automatically thought the machine they liked best was the one with Nvidia's technology, not because they had picked up on anything. One person noticed some heat. In other words, they didn't really know. We disabled all of the mechanisms through which they could have discovered it; we had 4 people or more watching to make sure nobody did any detective work, and we hid all of the equipment.

3.) On the thoughts about caveats and "wacky" variables (although I think "weak excuses" is excessive): there is a practical reality at play here. If we'd somehow artificially kept all gameplay within the ranges of the technologies at play, we're fairly certain the results would have been even closer. But many people have to make a decision, and the variables matter. Are you going to play only games that operate at frame rates within the range? Are you only going to buy a monitor whose scaler matches the majority of your gameplay? Are you going to turn off the VRR technology on a 144 Hz IPS panel, especially when playing games that easily run at high frame rates? Are you going to turn v-sync on or off? Are you willing to buy a TN panel that extends the upper range of FreeSync but squeezes the lower range? Are you willing to pay the premium for G-Sync? Are you willing to wait until both technologies are on similar footing in all of these variables (and good luck with that)? These and others are questions people have to answer ONLY because these are the real choices in front of us TODAY (and, by the way, there are other variables emerging as both technologies evolve). Choosing to change other variables would also have led to (equally fair) critique. We chose scenarios that we thought represented the way a majority of people game. That was a variable we did not want to compromise on. And we hope that you've got enough information, including about all of the variables, to help you start making some decisions. We wish there were a happy Disney ending, but is there ever in real life?

4.) Note that the price difference between the monitors has changed since we put this story to bed.

5.) Oh, and we're not done. Filippo, for example, has been doing quite a bit of research on FreeSync, has his hands on a FreeSync monitor, and will be providing even more data for you to consider.
 

Vlad Rose

Reputable
Apr 7, 2014


That's the exact point I'm trying to make, and the one I'm getting tons of downvotes for.
 

Vistouf

Honorable
Jan 2, 2013
1) Your pictures exclusively show male gamers.
2) Did you have any female gamers in your sample? If so, how many?
3) You did not factor gender into your analysis. Do you assume gender to be irrelevant? If so, on what do you base this assumption?
 

beshonk

Distinguished
May 26, 2011
1) Your pictures exclusively show male gamers.
2) Did you have any female gamers in your sample? If so, how many?
3) You did not factor gender into your analysis. Do you assume gender to be irrelevant? If so, on what do you base this assumption?

There were four women, I think, one of whom was my wife. We found out at the very end that she chose the same technology I did. She also won one of the raffle prizes =D You have to realize that most likely fewer women signed up, so, yes, fewer women ended up there.

This test was well done, albeit not perfect. The only change I would have made was going with TN panels with a wider VRR range for AMD. Also, the BenQ issue mentioned above may have been a factor.

For some reason, the folks who thought they knew which machine was which assumed the better experience was with Nvidia, even though there was no way they could have known. They must have been among the Nvidia fan group (of which I was a part, because of their Shield ecosystem and drivers). My wife's systems are built with AMD, though. She just upgraded her card with the one she won (an R9 390 from MSI! Thanks, MSIUSA!).
 


What does male vs. female gaming have to do with the cost of oranges in Wisconsin in this test? Are males better than females in making these determinations? Are females better than males in making these determinations? Why do you consider gender in gaming at all relevant, unless you think one sex is superior and the other inferior in that aspect?

I think both are equally capable in terms of gaming experience and skill, and focusing on the sex factor is irrelevant. Why not also complain about not being inclusive of variations in race, religion, sexual orientation, college degree vs. high school only, income and wealth, nationality, favorite color, smoker vs. non-smoker, height, weight, hair color, political orientation, etc.? Why not bring up every single character trait in a sample to keep things "equal" in perspective? Why worry only about sex?

 


The reverse is actually true. As I said above, the 970 overclocked is actually slightly **faster** than the 390X overclocked @ 1440p, just as the 780 was faster than the 290X.

https://www.youtube.com/watch?v=djvZaHHU4I8 (see 8:40 mark)

(chart: perfrel_2560.gif, TechPowerUp relative performance summary at 2560x1440)


Reference 390X scores 98%
Reference 970 scores 91%

http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/33.html
The MSI 390X gets the typical teeny R9 7.1% fps increase when OC'd (90.3 / 84.3)

http://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/30.html
The MSI 970 gets a typical Nvidia 17.1% fps increase when OC'd (133.5 / 114.0)

390X = 98% x 1.071 = 104.96
970 = 91% x 1.171 = 106.56

I wouldn't actually call that a win... that's so close as to call it even. No doubt some games will favor one and others will favor the other. But another way to look at it is that the $100 cost difference between the cards pays for G-Sync, and with ULMB and 144 Hz monitors, that is certainly a plus. And that, again, is a glaring hole in this review, as the impact of ULMB was not addressed.
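If you want to sanity-check that arithmetic yourself, here is a quick sketch in plain Python. It only re-runs the two multiplications above, using the baseline scores and stock/OC fps figures quoted from the linked TechPowerUp reviews; nothing here is official benchmark tooling.

```python
# Sanity check of the overclocking math above, using the TechPowerUp numbers quoted in this post.
cards = {
    "R9 390X (MSI)": {"baseline": 98.0, "stock_fps": 84.3, "oc_fps": 90.3},
    "GTX 970 (MSI)": {"baseline": 91.0, "stock_fps": 114.0, "oc_fps": 133.5},
}

for name, d in cards.items():
    # OC gain as a ratio, rounded to three decimals to mirror the figures above (1.071 and 1.171)
    oc_gain = round(d["oc_fps"] / d["stock_fps"], 3)
    effective = d["baseline"] * oc_gain  # baseline relative score scaled by the OC gain
    print(f"{name}: +{(oc_gain - 1) * 100:.1f}% OC -> effective score {effective:.2f}")

# Prints:
# R9 390X (MSI): +7.1% OC -> effective score 104.96
# GTX 970 (MSI): +17.1% OC -> effective score 106.56
```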


 
1) Your pictures exclusively show male gamers.
2) Did you have any female gamers in your sample? If so, how many?
3) You did not factor gender into your analysis. Do you assume gender to be irrelevant? If so, on what do you base this assumption?
Does it matter whether the players are male or not? A distinction based on gender is completely irrelevant, especially considering the small sample size. Take that feminazi BS out of here. This is about G-Sync vs. FreeSync, not male vs. female preference/difference.

In any case, the test is interesting. I'll have to read it later to determine what I really think of it. The people commenting have raised some valuable concerns.
 

husker

Distinguished
Oct 2, 2009
I find it puzzling that the decision was made to go with IPS panels instead of TN at the cost of a level playing field. Sure, IPS is prettier, but your results are less informative because the IPS panel wouldn't support FreeSync above 90 fps. This is especially puzzling because one of your stated goals was to remove artificial bottlenecks, and then you introduced one so that 48 testers could see a prettier picture while thousands of your readers get less accurate results.
 

Vlad Rose

Reputable
Apr 7, 2014
I know this may be a silly question, but is it possible to have a monitor that supports both FreeSync and G-Sync? If it isn't, I guess it doesn't really matter which is better, since if you have an AMD video card you can only use FreeSync, and if you have an Nvidia card you can only use G-Sync. The card itself would be the deciding factor in which technology to use.

I do know that the hope is that one of the two technologies wins and both cards end up supporting it, although the best doesn't always win (VHS vs. Betamax, for example).
 


FreeSync works practically through a firmware update and is an open standard, but G-Sync needs specific hardware to work. G-Sync is the limiting factor since it's proprietary and costs more. So theoretically, any G-Sync monitor that has a DisplayPort input could run FreeSync, but obviously Nvidia wouldn't want that. And even then, why would anyone pay an additional $200 to have both? Only people who want G-Sync specifically are going to get such a monitor, so there's no reason to put FreeSync on it. That $200 can be spent on something else.
 

Vlad Rose

Reputable
Apr 7, 2014


I'd definitely pay a little extra on a monitor now rather than have to buy a new monitor if I switch between AMD and Nvidia when upgrading the video card. The $200 seems steep, though, just to add dual support.
 

NethJC

Reputable
Aug 7, 2015
For all the hot air from biased individuals replying to my initial comment, I didn't have to blow-dry my hair.

Now, I would like to point out that I was enjoying 144 fps on 144 Hz, v-synced, nearly 20 years ago. I don't care about variable refresh, or any other gimmick. I want true fluidity, not additional technology to make up for a lack of GPU horsepower.
You guys can go enjoy your pseudo-smoothness, but know that it's exactly that: pseudo-smoothness.
 
I played on the Battlefield 4 and Crysis 3 test rigs. I tried to create tearing and move the rigs out of the lower VRR ranges without much success. On my first machine, the exhaust of the PC was right on my left knee and hot enough for me to change my posture. That rig turned out to be the 390X rig (only those of us who stayed until the very end found out which machines were which). I think more people noticed the excessive heat coming from one rig but just didn't vocalize it.

If all VRR and other things had been equal in this testing, which they weren't, the heat output alone from the 390X would steer me toward the GTX 970. The GTX 970 rig had a very noticeable sharpness to it, even under heavy action sequences in Crysis 3.

I felt Battlefield 4 (which I have more experience with) didn't or couldn't separate itself visually from the AMD rig by comparison. The differences were easier to see and recreate in Crysis 3 on my test rigs.

I didn't know which rigs were which until the reveal at the end of the day. I had my suspicions, and I certainly could tell one system was outperforming the other, heavily, at least in one of the test games.

My wife was also there. She could perceive a difference between the two machines but was of the opinion that it wasn't worth paying the premium for. She played the Borderlands/Witcher test rigs.
 


There are some small biological differences in vision between men and women. For example, women are on average able to perceive more colors than men. So it's a valid point of consideration, in my opinion.
 

George Phillips

Reputable
Jun 17, 2015
The event, test, and analysis are really good and show readers some very valuable information without bias. It's much more personal and involves real people. This is so far the very best article and the most useful test on FreeSync and G-Sync technologies.

On the other hand, considering that there are more Nvidia fans, the overall result is pretty even between the two technologies. Either of these technologies should be included on all monitors one day.
 


Indeed. The correlation between the number of Nvidia/AMD fans and the preference for G-Sync/FreeSync is uncanny.
 

NethJC

Reputable
Aug 7, 2015


Have you ever experienced a solid 144 Hz / 144 fps v-synced setup? I have, for years, in high-level competitive gaming. I would even argue that it was a significant factor in my success.
I will have a look at G-Sync, but I am extremely skeptical.
And as for maintaining constant fps at that speed: anyone with experience such as mine will, without a second thought, lower quality settings to maintain the fluidity.
Properly tuned setups at 144 Hz/fps with v-sync are still the absolute best graphics I have ever experienced.

There is no way you will convince me that a variable sync/framerate setup, capped at 60-120 Hz/fps in the best of cases, is better than my past experience. It's simply technically impossible.

If you haven't experienced what I have, then you have, quite literally, not seen the light (frequency).


 
Well, I just picked up an AOC G2460PG 144 Hz 24-inch monitor for $349, and I put its G-Sync to a little test; it did look noticeably smoother than my other monitors.

Basically, this article makes me glad I went with a G-Sync monitor.
 
I would call this more of a FAIL, since there are ZERO control units, i.e., monitors that are NOT G-Sync or FreeSync. That would probably change some of the results. For example, using the best TN or a 12-bit IPS panel.
 

MasterMace

Distinguished
Oct 12, 2010
So, things to note: when put to a blind test, 9 out of 60 Tom's readers will be able to tell which brand they are using. Nvidia's technology was overwhelmingly preferred, but not at the price being offered. Battlefield 4 doesn't show a difference between the two.

That last part I'm fine with, as I've stopped buying EA.
 

none12345

Distinguished
Apr 27, 2013
My biggest problem here is the lack of a control. You really needed to have monitors with FreeSync and G-Sync turned off.

What I'd really be interested in is the subjective experience of either G-Sync or FreeSync versus those control setups.

Aside from that, interesting read.
 


I think everyone in attendance agreed that a control would have been nice. Did the original Pepsi Challenge have a control?

The reason I ask is that I think it would be obvious to any informed tester, or anyone who knows what adaptive sync technology is, to perform a couple of quick, twitch-like FPS moves and tear the screen to crap (v-sync off). You would only need a few seconds to identify the control machine, making a control rig more effective in a double-blind test.
 