AMD FreeSync Versus Nvidia G-Sync: Readers Choose

So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30% bias makes this article irrelevant as a result.

Since the user had no idea whether he was using an AMD- or Nvidia-based system, this argument is invalid. If you like Coke more than Pepsi and you do a blind taste test, the test isn't biased.
 
Guest
I also attended. (I'm the guy in the nVidia hockey shirt in the pictures)

To my eyes, Crysis 3 showed a marked improvement using FreeSync (Was I the only one?). This "improvement" may have simply been performance in terms of frames per second. We found out at the end of the day that both monitors were capable of 144 Hz.

I could not tell which system was which. I could not tell the difference between G-Sync and FreeSync in Battlefield 4. The overall experience simply seemed smooth throughout. My point of reference is my rig at home which is an i7-2600K @ 4.48 GHz + a pair of GeForce 580s in SLI, capable of running Battlefield 4 at ultra at my monitor's refresh rate of 60 Hz.

Of course I was in the first test group that did not switch computers until after the second game was played.

Nonetheless, going from G-Sync to FreeSync in Crysis 3 felt like going from 30 fps to 60 or 80 fps right off the bat (i.e. the moment I moved the mouse around).
 

cmi86

Distinguished
So roughly 75% of the reviewers are green and green wins roughly 75%... hmm, I guess that has nothing to do with anything. Totally objective, checks out to me. It's Tom's; I knew who won before I even clicked on the article.
 

jdwii

Splendid


It scores better because it is better. When one display technology can't even work at a certain frequency, it's a POS. I like G-Sync because it works at any frame rate, not just within some weird range. It wins because it's better, simple as that. If FreeSync were better, it would win.
 

jdwii

Splendid


Some can't notice it; I easily can: no tearing and no lag between 40 and 144 fps.
 

jdwii

Splendid


Actually, my Acer IPS monitor is 144 Hz.
 

jdwii

Splendid


Actually, G-Sync offers a buffer, so its monitors aren't limited the way FreeSync ones are; they work at any frame rate.
 

jdwii

Splendid


So having a monitor run at 144 fps and then at 70 fps without tearing is now a gimmick? WTF is wrong with people here?
 

cinergy

Distinguished
Not surprised by the result, since Tom's has a long track record of tinkering with results to favor Nvidia cards, be it through handpicking of benchmarked games, game/benchmark settings, or plain audience preference (which in this case seems to be mostly Nvidia fanboys).

Both technologies are equal in their capabilities, so the results should also be equal when equal settings, software, and hardware are used. Only AMD's solution is cheaper, since it's based on standards, whereas Nvidia (as always) is using its own proprietary, closed software and hardware. Closed technologies eventually lose when cheaper, equally capable "free" solutions gain ground.
 

cinergy

Distinguished
So roughly 75% of the reviewers are green and green wins roughly 75%... hmm, I guess that has nothing to do with anything. Totally objective, checks out to me. It's Tom's; I knew who won before I even clicked on the article.

Same. I've been a casual reader of this site for more than ten years, and I can tell you there has always been a flavor of bias, a general attitude of preferring Nvidia. ATI used to be a Canadian company, whereas Nvidia has always been American, and things like that weigh a ton on US-based sites. When I read my local sites, all of a sudden ATI/AMD hardware performs much better and the tests are set up in a truly objective way.
 

somebodyspecial

Honorable


No surprise, when the "better" version has been G-Sync for all of FreeSync's life. If I were sitting in front of the machines and found, say, 5 out of 10 were way better than the others, I would have said those 5 were probably Nvidia. That isn't bias, just a judgment based on play that day and on previous reading of dozens of reviews that say you CAN tell G-Sync is better. Not recognizing that would be like saying the dozens of reviews I read were all LYING. Based on reviews of both technologies across all the tech sites, I would have been shocked to be wrong.

Raise your hand if you expected AMD to win based on dozens of reviews of both techs around the web? I see no hands, or maybe one in the back row? :) Does that make the point easily understood? FreeSync is good enough for many people, especially budget-constrained ones, but nobody expected a landslide for AMD here.

I went into this article wondering just how good AMD has gotten over the last six months, and whether it is close enough to change what I'm PROBABLY going to do (G-Sync, if the budget allows and AMD doesn't blow out NV on perf at the die shrinks for both). It isn't close enough yet, but if AMD wins handily after the shrink, they could still get my money (heat in AZ plays a big role for me, too). If perf at the shrink is a tie, I'll pay for the better tech if at all financially possible, since I'll probably live with this monitor for 7-8 years. Barring a super-hot NV card, that is (I'm trying to lower the temps in my PC room).
 
Guest
1) Your pictures exclusively show male gamers.
2) Did you have any female gamers in your sample? If so, how many?
3) You did not factor gender into your analysis. Do you assume gender to be irrelevant? If so, on what do you base this assumption?

Yes! Female gaming war! Lawyer up, Tom's.

Do you game? Can you enumerate the model numbers and specifications of a few cards from any recent generation? Does the impact of these technologies interest you in any way?

Personally, I'm not surprised Nvidia seems to come out on top. Neither technology seems that good, but a dedicated ASIC will usually give better results. Neither technology is that relevant anyway: 4K gaming isn't a reality right now, and 1080p can be rendered well enough.
 

somebodyspecial

Honorable


No hardware site says FreeSync is as good as G-Sync. You can argue it's cheaper, but you can't argue it's better or equal. AMD themselves said they were going to try to influence panel-part choices more in the future (an admission that they have issues when they DON'T do this). That isn't bias, that is reality. Even in this article, Tom's says it has no idea why the vendor picked a limited part (again validating AMD's comment about parts). Again, reality. Basically, you wouldn't be happy unless AMD were picking the games? NV owns 78% of the discrete market (as of last quarter; judging by NV's revenue beat yesterday, probably well above 80% now), and it isn't all fanboys. Maybe AMD needs to build better products. I bought Intel this year, but not at all because I like them; it's about PERF. I hope Zen changes that for my dad's build, but if not, I'll tell him Intel too. Neither of us is an Intel fan.
 

paulbatzing

Reputable
Both of these technologies are useless. How about we start producing more high-refresh panels? It's as simple as that. This whole adaptive-sync garbage is exactly that: garbage.
I still have a Sony CPD-G500 that is nearing 20 years old, and it still kicks the crap out of every LCD/LED panel I have seen.

G-Sync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with G-Sync on. I'm willing to bet FreeSync is the same way.

At what kind of frame rate? In my experience G-Sync doesn't make much of a difference in perceived quality above 70 fps or so, except in cases where the frame rate drops by more than 10 fps and comes back up shortly after. If you have a solid 100 fps or above, G-Sync isn't for you.

When it comes to stutter, G-Sync doesn't remove stutter that comes from the game engine (like in Arkham Knight). But when a 60 Hz screen stutters because you are hovering between 29 and 31 fps (meaning the displayed rate suddenly snaps down to a fraction of 60), G-Sync improves the perceived picture. SLI micro-stuttering will not get better, and compared to some VSync modes it will be worse, but it does not introduce the lag that frame buffering does.

Just increase your settings until you get ~50-60 fps in games and you will see the difference. If not, why run in G-Sync mode at all? Much better to use ULMB in that case...
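It's worth making that "hovering around 30 fps" effect concrete. Below is a minimal sketch (illustrative numbers and hypothetical panel limits; real drivers and monitors, with triple buffering or low-framerate compensation, behave differently) of why double-buffered VSync on a fixed 60 Hz panel snaps so hard: a finished frame has to wait for the next refresh tick, so the displayed rate quantizes to a divisor of 60, while an adaptive-sync panel simply refreshes when the frame is ready, within its supported range.

```python
import math

REFRESH_HZ = 60.0
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per scanout

def displayed_fps_vsync(render_fps):
    """Effective on-screen rate when every frame must wait for a fixed tick."""
    frame_time_ms = 1000.0 / render_fps
    # Each frame occupies a whole number of refresh intervals.
    ticks = math.ceil(frame_time_ms / REFRESH_INTERVAL_MS)
    return REFRESH_HZ / ticks

def displayed_fps_adaptive(render_fps, panel_min=30.0, panel_max=144.0):
    """With G-Sync/FreeSync the panel refreshes when the frame is ready,
    as long as the rate stays inside the panel's supported window
    (panel_min/panel_max are hypothetical values for this sketch)."""
    return min(max(render_fps, panel_min), panel_max)

for fps in (31, 29, 45, 59):
    print(f"render {fps} fps -> VSync shows {displayed_fps_vsync(fps):.0f} fps, "
          f"adaptive sync shows {displayed_fps_adaptive(fps):.0f} fps")
```

Running it shows the jarring part: 31 rendered fps displays at 30, but 29 rendered fps drops all the way to 20, which is exactly the back-and-forth jump the post describes.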
 

NethJC

Reputable


The best analogy would be regular versus diet cola. You are convinced diet cola tastes as good as the real thing, while everyone else thinks the opposite. The fact is, they taste different.
I'd rather get the real deal than some lightened version of it.
Simple as that. Do you understand now?
 

NethJC

Reputable


"perceived picture" - Exactly.
This technology is a 'fooling' technology, and plays to the perception rather than fully fleshing out fps/hz.
While an interesting innovation, with maturing to do, it still doesnt improve what is already the best.
It provides a new tiered cost in the high mid range, but that's already putting it close to the range where it is no longer needed (top range machines/monitors)
The entire flatscreen industry has been a severe retrograde to display technologies. They are only NOW (15-20 years later?) waking up and saying 'oh wait, we can make higher refresh rates....'.
So it is ANNOYING when BS technology like this comes out and confuses the crap out of those who do not know better after literally decades of a stagnant industry.
CRT's and graphics with VSync at the end of the CRT era was a quality that is still unsurpassed (IMHO, though the increasing refresh rates on the market as of late is an indication this may soon change, if not for faux technology like this stuff.)
 

rantoc

Distinguished
It seems, with these hardware combinations and even through the mess of problems with the tests, that Nvidia's G-Sync provides an obviously better user experience. It's really unfortunate about the cost and other drawbacks (like full-screen only).

You can run G-Sync in windowed mode now; it was added in the drivers recently.

As for those of you who complain that G-Sync/FreeSync don't matter: I can tell you they matter more the lower the FPS. Using my Swift at 120+ fps, their effect is barely noticeable, but when a game stutters or runs at lower fps (crappy Batman port, anyone?), the difference is night and day.

Normally it's a choice: either avoid tearing (impossible on a normal monitor with VSync off) or avoid input latency (impossible on a normal monitor with VSync on). G-Sync and FreeSync offer both: no tearing (much like VSync on with a regular display) and no additional input latency (equivalent to VSync off on a regular display). Simply said, the best of both worlds! If that isn't good, well, go and cry somewhere else!
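A rough model can illustrate that "best of both worlds" claim. This sketch (illustrative only, not measured driver behavior) estimates how long a finished frame waits before scanout begins under each mode, on a hypothetical 60 Hz fixed-refresh panel:

```python
REFRESH_INTERVAL_MS = 1000.0 / 60  # ~16.7 ms between fixed refresh ticks

def wait_before_scanout_ms(frame_done_at_ms, mode):
    if mode == "vsync_off":
        # The frame is swapped in immediately: zero added latency, but
        # mid-refresh swaps are exactly what causes tearing.
        return 0.0
    if mode == "vsync_on":
        # The frame must wait for the next fixed tick: no tearing,
        # but added input latency.
        return -frame_done_at_ms % REFRESH_INTERVAL_MS
    if mode == "adaptive":
        # G-Sync/FreeSync: the panel starts a refresh when the frame is
        # ready (inside its supported range): no tearing, no added wait.
        return 0.0
    raise ValueError(mode)

for t in (5.0, 20.0, 30.0):
    print(f"frame ready at {t:4.1f} ms: "
          f"vsync_on waits {wait_before_scanout_ms(t, 'vsync_on'):4.1f} ms, "
          f"adaptive waits {wait_before_scanout_ms(t, 'adaptive'):.1f} ms")
```

The worst case under VSync is a wait of nearly a full refresh interval per frame, which is the extra input lag adaptive sync removes.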
 

NethJC

Reputable
You must not have a top-end system that is properly tuned (which is why you need G-Sync in the first place).
Always nice to pay $$ for technology that merely compensates for a poor system design ;)

 

paulbatzing

Reputable


"perceived picture" - Exactly.
This technology is a 'fooling' technology, and plays to the perception rather than fully fleshing out fps/hz.
While an interesting innovation, with maturing to do, it still doesnt improve what is already the best.
It provides a new tiered cost in the high mid range, but that's already putting it close to the range where it is no longer needed (top range machines/monitors)
The entire flatscreen industry has been a severe retrograde to display technologies. They are only NOW (15-20 years later?) waking up and saying 'oh wait, we can make higher refresh rates....'.
So it is ANNOYING when BS technology like this comes out and confuses the crap out of those who do not know better after literally decades of a stagnant industry.
CRT's and graphics with VSync at the end of the CRT era was a quality that is still unsurpassed (IMHO, though the increasing refresh rates on the market as of late is an indication this may soon change, if not for faux technology like this stuff.)
Sure, perceived quality, just like VSync, ULMB, AA... the list goes on. All gaming hardware is about perceived quality. If you just care about fps, run all games on low with VSync on and be done with it.

It might not be for you, but *sync has a real advantage over fixed-refresh screens: you get the experience of VSync without the inherent lag. If you go above the monitor's refresh rate, the technology is obviously identical to fixed-refresh mode; not a single person who knows the technology denies that. It was created with the specific purpose of removing the problems VSync causes when fps falls below the refresh rate. You might not need it, but some people (like me) only buy a new graphics card every 4-5 years, and when the fps of new games drops, it is a nice feature to have. You don't think you need it? Fine, don't buy it! But at least try to understand what it is meant to do!
 

jdwii

Splendid


And how on earth is a 144 Hz IPS monitor with G-Sync not the real deal?
 

NethJC

Reputable
There is lag in the fact that your machine isn't pushing the frames. All you can claim is that your screen isn't making it any worse, and is adapting its refresh to make the output look as good and smooth as possible.
It's not as if *sync generates or interpolates frames in any way; you are still only getting X graphical updates a second, they are just presented more nicely.
There is inherent LAG built into the fact that the FPS is low, and having *sync makes zero difference to that.
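A tiny sketch (illustrative numbers only) makes that last point concrete: adaptive sync re-times the frames the GPU already rendered, it never adds any, so the information delay inherent to a low render rate stays put.

```python
render_fps = 40                        # what the GPU actually produces
frame_interval_ms = 1000 / render_fps  # a new image only every 25 ms

# Adaptive sync at its best shows each of those 40 frames evenly paced,
# but it cannot show more than 40 distinct images per second. The ~25 ms
# gap between new pieces of visual information is set by the render rate,
# not by the display, so no sync mode can shrink it without interpolation.
print(f"At {render_fps} fps, fresh information arrives every "
      f"{frame_interval_ms:.0f} ms in ANY sync mode.")
```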
 