AMD FreeSync Versus Nvidia G-Sync: Readers Choose


JVC8bal

Distinguished
Jan 17, 2012
4
0
18,510


You clearly know nothing about statistics, sampling, and surveys. I created an account just to respond to this to ensure other mental-midgets out there aren't affected by this idiocy.

First, there's nothing random about this... it's a survey. Generating random variables on a normal distribution will _always_ average to 50%... that's the definition, and it has abso-f'n-lutely nothing to do with anything here. Everyone here is dumber for having read your "impressive"-sounding experiment. If we sampled 12, 48, or even 1,000 people on whether the sky is blue or red and every one of them said blue, you'd be claiming that 100% result is invalid... that's like saying a spreadsheet that adds 1+1 a thousand times and always gets 2 proves the answer is suspect... jeebus.

Second, even if a survey somehow magically followed a random distribution, 48 samples is plenty. In fact, 14 has enough statistical power if the standard deviation is not significant. But this is just an idiotic red herring. If Tom's Hardware did in fact eliminate bias (which they did by anonymizing the hardware), then whether it was 12, 28, 48, or 1,000 - the results would have been the same: G-Sync would have come out ahead.
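
For what it's worth, the "48 is plenty" claim is easy to check with an exact binomial test. A minimal sketch, using only Python's standard library and plugging in the 29-vs-10 split among decided testers that gets quoted later in this thread:

```python
from math import comb

def binom_two_sided_p(k, n, p=0.5):
    """Exact two-sided binomial test: the probability of a split at least
    as lopsided as k out of n under the null hypothesis of proportion p."""
    probs = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    observed = probs[k]
    # Sum every outcome whose probability is no higher than the observed one
    return sum(pr for pr in probs if pr <= observed + 1e-12)

# 29 of the 39 testers who expressed a preference picked G-Sync
print(binom_two_sided_p(29, 39))   # ~0.003
```

A p-value around 0.003 means a split that lopsided would almost never come from testers flipping coins, even at this sample size.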

Everyone, please down-vote this thread.
 

cpm1984

Honorable
Jan 27, 2014
7
0
10,510
What's all this nonsense about G-sync not working well with SLI? I run SLI 970's + G-sync and it seems to be working just fine. Some games stutter but those stutters happen with SLI disabled as well. But most games just play silky smooth.
 
Well, for one, it's about knowing when to use G-Sync and when to use ULMB.

30 - 70 fps is fine for G-Sync.
If you have an SLI rig capable of 60+ fps, then USE THE APPROPRIATE TECHNOLOGY! Turn off G-Sync and use ULMB.

http://www.tftcentral.co.uk/articles/variable_refresh.htm

It should be noted that the real benefits of variable refresh rate technologies really come into play when viewing lower frame rate content, around 40 - 75fps typically delivers the best results compared with Vsync on/off. At consistently higher frame rates as you get nearer to 144 fps the benefits of FreeSync (and G-sync) are not as great, but still apparent. There will be a gradual transition period for each user where the benefits of using FreeSync decrease, and it may instead be better to use a Blur Reduction feature if it is provided. On FreeSync screens this is not an integrated feature however, so would need to be provided separately by the display manufacturer.
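
As a rough illustration of that rule of thumb, here is a toy Python sketch; the fps cut-offs are just the ballpark figures from the quote above, not hard limits, and the function name is my own invention:

```python
def pick_display_mode(typical_fps, has_ulmb=True):
    """Toy rule of thumb based on the TFT Central guidance quoted above:
    variable refresh helps most at lower frame rates; at consistently
    high frame rates a blur-reduction mode (ULMB) may be the better trade."""
    if typical_fps <= 75:
        return "G-Sync/FreeSync (variable refresh sweet spot, roughly 40 - 75 fps)"
    if has_ulmb:
        return "Consider ULMB / blur reduction (VRR gains shrink at high, steady fps)"
    return "Stick with variable refresh (no blur-reduction mode available)"

print(pick_display_mode(65))    # variable refresh sweet spot
print(pick_display_mode(120))   # blur reduction worth trying
```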

 

cpm1984

Honorable
Jan 27, 2014
7
0
10,510
Yes, I'm familiar with ULMB, TFTcentral, blurbusters, etc. What does that have to do with my statement that SLI and G-sync work together as well as can be expected?

In any event, I've tried ULMB and even at 100+ fps I still prefer G-sync. ULMB will always have tearing (unless you turn on v-sync, but that would add unwanted latency). I'm sure other people may have different preferences - it all depends on your relative sensitivity to tearing, blur, and lag.

ULMB looks amazing in a demo with constant-speed scrolling like they have at blurbusters (the ufo test). In games the improvement is not nearly as apparent, probably because on-screen motion is highly accelerated and rarely at constant speed.
 
I was agreeing with you. I have been hearing about alleged SLI problems for years... other than not having a profile for a week after a game comes out, I have not experienced any of them. The 970 SLI rig works just fine at 1440p / 144 Hz.

I have only seen it on the Acer Predator G-Sync (144 Hz / IPS) model though, and I'm in line with the TFT Central folks in preferring ULMB.
 

cpm1984

Honorable
Jan 27, 2014
7
0
10,510


Yep, same Acer Predator monitor here as well! Really loving it.
 
The Predator was the first IPS panel I had seen that I could recommend for gaming ... well, still the only one so far. When someone asks for a 4K build or a 3-screen build I always make them look at the Predator first. The FreeSync model just doesn't compare ... not just because of the FreeSync bug (which wasn't even mentioned in this article, which was weird) but because of the panel and no ULMB. I'm guessing with ULMB and a quality IPS panel, the relative cost between this and the G-Sync model wouldn't have been big. Wonder if that was Acer's marketing strategy ... get rave reviews on the G-Sync model and then have people assume they were getting the same thing with the FreeSync one.
 

nicuola

Reputable
Aug 10, 2015
2
0
4,510
This test means nothing. It should have been done with a statistician, the same way that a motherboard review is done by somebody knowledgeable about motherboards.

48 samples is too low. If you generate 48 samples randomly, most results will look like this one instead of 50% / 50%.

Start Excel, generate 1,000 samples at random, and it will come close to 50% for each choice (example: 55% / 45%). But if you generate only 48, most trials will be heavily skewed toward one alternative or the other.

In other words, this test is no different from a random result. It means nothing. You could repeat exactly the same experiment and get the opposite results.
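
The Excel experiment described above is easy to reproduce; here is a quick sketch in plain Python simulating coin-flip voters at n = 48 versus n = 1,000 (whether the spread at 48 counts as "extremely biased" is exactly the point under debate):

```python
import random

def survey_spread(n_voters, trials=10_000):
    """Simulate `trials` surveys in which every voter flips a fair coin,
    and report how far the winning share typically strays from 50%."""
    deviations = []
    for _ in range(trials):
        votes = sum(random.random() < 0.5 for _ in range(n_voters))
        deviations.append(abs(votes / n_voters - 0.5) * 100)
    return sum(deviations) / trials, max(deviations)

for n in (48, 1000):
    typical, worst = survey_spread(n)
    print(f"n={n}: typical deviation ~{typical:.1f} points, worst seen {worst:.1f}")
```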

Worse, if 10 out of 50 players (20%) know which hardware is which, then it gets even more biased.

Of course, you cannot collect 1000 monitors, but you can cycle the players, and get more players.

I agree. Plus genre, age, hours per week, etc. I don't understand how a company in financial trouble such as AMD could have agreed to participate.
 


Because not everyone agreed the difference is worth the price premium. AMD should indeed be able to grab some market here.

This was a Pepsi challenge to gauge users' ability to perceive a difference between the two technologies. I believe, as an attendee, that every effort was made by Tom's to make it as fair as possible. Vendor product selection for the challenge could have been better on AMD's side (not Tom's fault).

There was a clear winner here, but at what price does that win lose its weight? G-Sync has an added cost, and even after seeing the difference in the challenge, it still won't be worth it for some people.
 


I think right now that cost difference is being exaggerated.

The G-Sync Acer Predator XB270HU is $750
The FreeSync Acer Predator XG270HU is $500

So are we to assume that G-Sync costs $250?

But wait, the G-Sync model has an IPS panel and it also has ULMB ... so what does it cost apples and apples? How much for the quality IPS panel in the XB versus the TN in the XG? How much for a comparable blur reduction mode? The BenQ has a blur reduction mode, but it has several issues ... essentially it only works fully at 120 Hz; 60 and 100 Hz are problematic, and 144 Hz is only supported over DisplayPort.

http://www.tftcentral.co.uk/reviews/content/benq_xl2730z.htm#blur_reduction

I really want to see two otherwise identical monitors with the same panels go head to head ... give us a chance to see how G-Sync's ULMB fares against manufacturer-specific blur-reduction techniques paired with FreeSync, and get a better idea of what G-Sync actually costs.

OTOH, if we want to do "apples and apples" costs, we'd have to compare with comparably performing overclocked (bawlz to the wall) GFX cards and we'd have to compare any relevant PSU cost differences.

 
It might be exaggerated. It is certainly volatile. I don't know the cost of the G-Sync premium; historically speaking, I believe it is generally thought to be around $200. I've been watching/pricing VRR monitors since I attended the event and they are only going up... =(

I was hoping to make a VRR Monitor purchase this summer. I'm closing on a house, instead. Maybe next year, I can get a monitor.
 
Purchasing the standalone G-Sync module was $200 including shipping. Must be waaaay cheaper having it as part of the monitor; when it was released, it was anticipated to ultimately cost $50. If it's $200... I can't see getting that IPS panel for just $50 more than the TN panel that would have been in the FreeSync version of the Predator. My guess is AMD wouldn't allow a FreeSync Predator with IPS and blur reduction for that very reason.
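
To make the arithmetic in the last few posts explicit, here is a back-of-the-envelope sketch; the list prices and the $200 module figure are simply the numbers quoted in this thread, not verified costs:

```python
# Rough decomposition of the price gap using the figures quoted above
gsync_model = 750     # Acer Predator XB270HU (G-Sync, IPS, ULMB)
freesync_model = 500  # FreeSync version (TN, no ULMB)
gsync_module = 200    # quoted retail price of the standalone G-Sync kit

price_gap = gsync_model - freesync_model             # 250
left_for_panel_and_ulmb = price_gap - gsync_module   # 50

print(f"Total gap: ${price_gap}, leaving ${left_for_panel_and_ulmb} "
      "to cover the IPS-vs-TN panel difference plus blur reduction")
```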
 

wtfxxxgp

Honorable
Nov 14, 2012
173
0
10,680
Comparing different monitors with different specs and different prices doesn't tell you anything about the technology. This is just a comparison of different monitors.

Unfortunately I have to agree with you on this point. This was always going to be a relatively valid point of contention. I think THW did try to keep everything within comparable ranges, but the reality is that unless you can have a monitor that supports both technologies (kinda surprised there isn't one), you will never have the means to do an absolutely fair comparison, double-blind or not.

I have to commend THW and the other role-players though - I appreciated the attempt. I can't put a lot of stock in the end results, other than to say that the technologies both work well.

I think it was very cheeky of NVidia to say "don't worry about using our more top tier GPU when an O/C'd 970 will do the job against this competition". I feel it shouldn't have been allowed, but then again I don't know what all the other considerations were.

A few people commented on the fact that there were 10 people who felt they knew which systems were which, and their point of view is that those people already had a preference, which calls the results into question. I slightly disagree with this view, but I do think that those people should have been reported on separately in every aspect, starting with their confessed affiliations. The reason I slightly disagree is that their guesses were not confirmed or denied on the spot - they still only found out after conducting the tests.

I appreciate the effort and I wish I could attend something like this. Fine-tune this method for future use.
 


1. One problem is we have no idea whether either of the two FreeSync monitors was "fixed". Did the BenQ have the required firmware update done (it requires factory servicing)? Or was it shipped after the problem was corrected? The fix was June 1st, so I don't know how long it takes for fixed units to make it into the channel.

2. I saw no mention in the article of which driver version was used (pre or post bug fix). The fix came out June 1st and the article is August 7th, so I would **assume** the fixed driver was used, but I don't know how long it took to compile the data, write the article, and get it posted. So, yes, I assume so ... but, as with the above, it should have been addressed.

Read about both here ... scroll down to the big yellow box.

BenQ XL2730Z Review by Simon Baker, 24 April 2015 (Updated 1 June 2015)
http://www.tftcentral.co.uk/reviews/content/benq_xl2730z.htm#freesync

3. I would like to hear your reasoning for why the 970 should have been disallowed. Using a 980 Ti would have put FreeSync at a distinct disadvantage ... this was not a card performance comparison but a technology comparison.

All results at 1440p

Let's first establish the relative strengths of the reference cards:
http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/30.html

Reference 390x - 98%
Reference 980 Ti - 127%
Reference 970 - 91%

So even out of the box ... the gap between the 980 Ti and the 390X creates an extremely unfair disadvantage. Now let's look at overclocked results.

http://www.techpowerup.com/reviews/MSI/R9_390X_Gaming/33.html
The MSI 390x gets a 7.1% boost from overclocking (90.3 / 84.3) as compared with the reference 390x

http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_G1_Gaming/33.html
The Gigabyte 980Ti gets a 31.4% boost from overclocking (134.8 / 102.6) as compared with the reference 980 Ti

http://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/30.html
The MSI 970 gets a 17.1% boost from overclocking (133.5 / 114.0) as compared with the reference 970

So, adjusting the original comparison for overclocking ....

390X (OC'd): 98 x 1.071 = 105.0
980 Ti (OC'd): 127 x 1.314 = 166.9
970 (OC'd): 91 x 1.171 = 106.6

Clearly, with the reference 980 Ti being 30% faster (127 / 98) than the out-of-the-box reference 390X, and 59% faster with both cards as overclocked non-reference models, that would not be a fair comparison.

The reference 390X is only 7.7% faster than the reference 970, and with non-reference cards, the way most of us use them (overclocked), the 970 comes out 1.5% faster than the 390X .... The 970 vs 390X therefore would be the closest "apples and apples" comparison.
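
Here is the same adjustment in code form, a small sketch that just re-runs the arithmetic above using the TechPowerUp relative-performance figures quoted in this post:

```python
# 1440p relative performance of the reference cards (TechPowerUp summary
# pages linked above) and the overclocking gains measured for the specific
# non-reference boards quoted in this post.
reference_perf = {"R9 390X": 98.0, "GTX 980 Ti": 127.0, "GTX 970": 91.0}
oc_gain = {"R9 390X": 1.071, "GTX 980 Ti": 1.314, "GTX 970": 1.171}

adjusted = {card: perf * oc_gain[card] for card, perf in reference_perf.items()}
for card, score in adjusted.items():
    print(f"{card} (OC'd): {score:.1f}")   # 105.0 / 166.9 / 106.6

print(f"980 Ti vs 390X, both OC'd: {adjusted['GTX 980 Ti'] / adjusted['R9 390X'] - 1:+.1%}")
print(f"970 vs 390X, both OC'd:    {adjusted['GTX 970'] / adjusted['R9 390X'] - 1:+.1%}")
```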

 


Since AMD & Nvidia were actively involved in the hardware selection process, I think one can assume the hardware chosen and the drivers used were both checked for current firmware and driver versions. The main issue I have is that the sampling size is too small to get an accurate measure of FreeSync vs G-Sync. It was interesting for what it was: a small snapshot of users. Not enough to truly draw any final conclusions, but an interesting experiment nonetheless.

It would be interesting to find out the total number of FreeSync and G-Sync monitors sold and how they are split. Since G-Sync has been around longer, I suspect it will be in Nvidia's favor. A year or two down the road it would be interesting to see if that would change.
 


1. Hardware / Drivers - I agree that they would have checked, but it's beside the point. They may have been checked for current drivers the day the test was performed; the point is, we don't know when that was.

I saw no indication that THG was even aware of the FreeSync bug at all. If it's ever been mentioned on THG at any time, I haven't seen it. As for the hardware issue, the time from when a monitor comes off the line, is shipped across the world, goes through customs, makes it to distributors and winds up in a consumer's hands could be months. If it was a model previously on hand, the unit would have to be shipped back to the factory or at least a BenQ service center, and that also takes some time. I don't know that any of this had been done.

Look here, for example, and we see a "How we test" section that provides this info.

http://www.tomshardware.com/reviews/sapphire-amd-radeon-r9-fury-tri-x-overclocked,4216.html

GeForce GTX 980 Ti: Nvidia 352.90 Beta Driver
All GeForce Cards in Grand Theft Auto V and The Witcher 3: Wild Hunt: Nvidia 352.90 Beta Driver
GeForce GTX Titan X, 980, and 780 Ti in all other games: Nvidia 347.25 Beta Driver
Radeon R9 290 X: AMD Catalyst 15.5 Beta
Radeon R9 Fury X and Fury: AMD Catalyst 15.15

2. Sampling Size - As for the sampling size, I disagree. Letting the users choose "neither" kinda messes up the statistics, as it's harder to work with three answers, so let's ignore them. We have 29 for G-Sync and 10 for FreeSync, so ... 74% chose G-Sync.

Going here, let's see how many people we need for the results to be accurate to within a given margin at 90% confidence:

http://www.custominsight.com/articles/random-sample-calculator.asp

I put in a population of 2,000,000 gamers with an acceptable accuracy of +/- 10% and we come up with 68 people. So with 68 people, the 74% favoring Nvidia (among those that showed a preference) would mean we are 90% sure the whole-population result would fall between 64% and 84%. Doing trial and error, I put in +/- 11.9% (say 12%) and came up with 48 participants ...

So we can be 90% sure that the whole-population answer would be between 62% and 86% (margin based on all 48 participants).

Now if we throw out the undecideds (leaving 39 respondents), after some more trial and error, we get 13.2% (say 13%).

So we can be 90% sure that the whole-population answer would be between 61% and 87% (margin based on the 39 who expressed a preference).
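
The sample-size calculator is just applying the standard margin-of-error formula for a proportion. A quick sketch, assuming the conservative p = 0.5 and z of about 1.645 for 90% confidence (the finite-population correction is negligible against a population of 2,000,000):

```python
from math import sqrt

def margin_of_error(n, z=1.645, p=0.5):
    """Worst-case (p = 0.5) margin of error for a sample proportion."""
    return z * sqrt(p * (1 - p) / n)

for n in (68, 48, 39):
    print(f"n={n}: +/- {margin_of_error(n) * 100:.1f} points")
# n=68: ~10.0, n=48: ~11.9, n=39: ~13.2, matching the figures above
```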

Looking at the logistics of doing 8 rounds of testing and giving each of the 6 participants per round enough time to form a judgment, it becomes very time- and resource-intensive to use large sample sizes.

3. Not apples and apples - The issue here is that the G-Sync hardware you pay for comes with blur reduction (ULMB) and FreeSync does not. So the notion that G-Sync comes with a price premium and FreeSync does not is false. Since FreeSync doesn't come with blur reduction, it has to be added by the monitor manufacturer, and there most certainly is a cost associated with that.

Some monitors do have this added by the manufacturer and some do not. Since there is little advantage to G-Sync / FreeSync much above 60 fps, if one is seeing 85 - 100+ fps, with G-Sync you can switch to ULMB and improve the experience; with FreeSync alone you cannot. If I were to pick one fault in the methodology to correct in Round 2, including a motion blur mode would be it.



 
Maybe people want to attribute more to this challenge than it can provide. Both rigs were dialed into the VRR range as best the respective AMD and Nvidia people could manage for the games under test. They were there all day and stood right behind us as we tested, observing our methods to uncover the masked technology. I don't recall anyone speaking of the FreeSync bug you are referencing, so I can't say one way or the other whether it had been addressed prior to test day.
 

madman83

Reputable
Jul 24, 2015
3
0
4,510
Tearing only occurs if your FPS is ABOVE the refresh rate. With a 144 Hz monitor there will be no tearing until you hit over 144 fps. How could these technologies help when you drop below 30 fps? You still drop below 30 fps, and on a 144 Hz monitor these techs (based on how the technology works) provide no benefit. I would rather run 144 Hz / 30 fps than 30 Hz / 30 fps. Having a high refresh rate doesn't provide a disadvantage. Just cap your frames to never exceed 144 fps. I wouldn't pay any premium for either tech. Higher hertz will always have a higher perceived smoothness. We just need 500+ Hz plasma desktop panels. I don't care about power consumption. My 500+ Hz TV sucked a ton of juice.
 
I would have liked to have seen a larger sample size and to have the competing technologies side by side for an easier comparison. Each rig should be set up in such a way that users can't tell whether it's FreeSync or G-Sync. The hardware should be reviewed by AMD and Nvidia and the setups mutually agreed and as equal as possible. Oh, and someone tell AMD about the bug, since they don't (or didn't) know about it. If a user guesses which is which before he runs the comparison, his results shouldn't be counted. Since it is a visual comparison we are talking about, being able to compare them side by side makes it easier to choose the better setup than playing one rig for 20 minutes, taking a break, and then playing the other. This should also move the comparison along faster so more people can be included in the sampling.
 


This is not true. Tearing will happen in any case where your frame rate does not match your refresh rate. It's much more difficult to see on 144 Hz monitors because each refresh is only 6.94 ms rather than 16.67 ms. This is why V-Sync exists, but the trade-off is that if you miss a refresh, your next frame is delayed for an entire refresh cycle, so you see stuttering (because the last frame, in the 60 Hz example, is shown for 33.34 ms). The variable sync techs fix both of these problems by only drawing the frame when it's ready and driving the monitor refresh, rather than always refreshing on fixed cycles.

I just wish they would come up with a way to fix the low-refresh-rate problem, which I would think would be easy to do by just redisplaying the last frame at the shortest refresh interval the panel supports until the next frame is ready ... but that might introduce stutter too. It would be nice if the VRR range went down to about 20 Hz, or at least 24, so that you could watch 24 fps video at its native refresh rate (though you could do it at 48 Hz if a video player supported VRR ... come on, VLC!).
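
For the timing numbers above, and the frame-repeating idea in that last paragraph, here is a rough sketch; it is my own illustration of the concept, not how any shipping G-Sync or FreeSync implementation actually works:

```python
def refresh_interval_ms(hz):
    """Time between refreshes at a given refresh rate."""
    return 1000.0 / hz

print(f"{refresh_interval_ms(144):.2f} ms at 144 Hz")   # ~6.94 ms
print(f"{refresh_interval_ms(60):.2f} ms at 60 Hz")     # ~16.67 ms

def repeated_refresh(content_fps, panel_min_hz=30, panel_max_hz=144):
    """If the source frame rate falls below the panel's minimum VRR rate,
    redraw each frame an integer number of times so the actual refresh
    rate stays inside the supported range (the 'frame repeating' idea)."""
    multiplier = 1
    while content_fps * multiplier < panel_min_hz:
        multiplier += 1
    return min(content_fps * multiplier, panel_max_hz), multiplier

print(repeated_refresh(24))   # (48, 2): 24 fps video, each frame drawn twice
print(repeated_refresh(20))   # (40, 2)
```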

 
We had 2 different games to play on two different machines @ 7 minutes per game, which is already just over 30 minutes per tester when you factor in keyboard, mouse & headphone wipe-downs in between switching stations. 7 minutes was more than enough time to experience/test.
 
 

GObonzo

Distinguished
Apr 5, 2011
972
0
19,160
48 samples is too low. If you generate 48 samples randomly, most results will look like this one instead of 50% / 50%.
Start Excel, generate 1,000 samples at random, and it will come close to 50% for each choice (example: 55% / 45%). But if you generate only 48, most trials will be heavily skewed toward one alternative or the other.

I'd really like to see someone who understands data do this as a real test.
Just set up 4 or 8 systems with identical configurations. Get AMD and Nvidia GPUs that are on par with each other, running at the same speeds. Have half IPS and half TN displays, all running at THE SAME variable refresh rates, with FreeSync, G-Sync, and V-Sync on all of them: 2x AMD TN, 2x AMD IPS / 2x Nvidia TN, 2x Nvidia IPS.
Over a period of time, have ~200 random users blindly test each setup on each type of display, IPS/TN (and remember, this is not about which GPU or company you prefer, but which display tech works better), and have them reply by answering a series of questions about which display was smoother and why.
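
A sketch of what that crossed design would look like on paper; the GPU and panel labels are placeholders for whatever matched hardware actually got picked:

```python
import random
from itertools import product

# The 2 x 2 design proposed above: GPU vendor crossed with panel type,
# two copies of each station so testers can rotate through in parallel.
gpus = ["AMD (FreeSync)", "Nvidia (G-Sync)"]
panels = ["IPS", "TN"]
stations = [f"{gpu} / {panel} #{copy}"
            for gpu, panel in product(gpus, panels) for copy in (1, 2)]

def blind_order(tester_id):
    """Give each of the ~200 testers every station in a random, unlabeled order."""
    rng = random.Random(tester_id)   # reproducible shuffle per tester
    order = stations[:]
    rng.shuffle(order)
    return order

print(len(stations), "stations")     # 8
print(blind_order(tester_id=17))
```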

This Tom's test just sounds like another uneven match-up with uneven results, like most company comparisons.
 