AMD FreeSync Versus Nvidia G-Sync: Readers Choose


NethJC

Reputable
Aug 7, 2015
Both of these technologies are useless. How about we start producing more high-refresh panels? It's as simple as that. This whole adaptive-sync garbage is exactly that: garbage.
I still have a Sony CPD-G500 which is nearing 20 years old, and it still kicks the crap out of every LCD/LED panel I have seen.
 

Vlad Rose

Reputable
Apr 7, 2014
So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30-point bias makes this article irrelevant as a result.
 

jkrui01

Honorable
Jun 4, 2012
As always on Tom's, Nvidia and Intel win. Why bother making these stupid tests? Just review Nvidia and Intel hardware only, or better still, just post the pics and a "buy now" link on the page.
 
Both of these technologies are useless. How about we start producing more high-refresh panels? It's as simple as that. This whole adaptive-sync garbage is exactly that: garbage.
I still have a Sony CPD-G500 which is nearing 20 years old, and it still kicks the crap out of every LCD/LED panel I have seen.

Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way.
 
It seems that with these hardware combinations, and even through the mess of problems with the tests, Nvidia's G-Sync provides an obviously better user experience. It's really unfortunate about the cost and other drawbacks (like being full-screen only).

I would really like to see this tech applied to larger panels like TVs. It would be interesting, considering that films shot at 24, 29.97, 30, 48, and 60 FPS from different sources could all be displayed at their native frame rate with no judder and without frame-pacing tools (like 120 Hz frame interpolation) that change the image.
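To make the judder point concrete, here is a rough sketch (my own illustration, not from the article) of what happens when 24 FPS film is mapped onto a fixed 60 Hz grid versus a panel that could simply refresh at 24 Hz:

```python
# Rough sketch: 24 fps content on a fixed 60 Hz panel is forced into an uneven
# 3:2 pulldown cadence (alternating 2- and 3-refresh holds), which is the judder
# being described. A variable-refresh panel could hold every frame for 1/24 s.
FILM_FPS = 24
PANEL_HZ = 60

# Refresh cycle on which each of the first 7 film frames can be flipped when
# the panel only updates on its own 60 Hz grid.
starts = [(i * PANEL_HZ) // FILM_FPS for i in range(7)]
cadence = [b - a for a, b in zip(starts, starts[1:])]  # refreshes held per film frame

print("Fixed 60 Hz cadence (refreshes per frame):", cadence)            # [2, 3, 2, 3, 2, 3]
print("On-screen frame durations at 60 Hz (ms):",
      [round(c * 1000 / PANEL_HZ, 1) for c in cadence])                 # alternating 33.3 / 50.0
print("Frame duration if the panel tracked 24 Hz (ms):", round(1000 / FILM_FPS, 1))  # 41.7
```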
 
"Our community members in attendance now know what they were playing on. "
That's when you lost my interest.

It is proven that if you give a person this information they will be affected by it, and that unfortunately defeats the whole purpose of using the subjective opinions of test subjects to evaluate real-life performance rather than scientific measurements (such as frames per second).

Too bad, since the article seemed to be very interesting.
 

omgBlur

Reputable
Aug 16, 2014
Both of these technologies are useless. How about we start producing more high-refresh panels? It's as simple as that. This whole adaptive-sync garbage is exactly that: garbage.
I still have a Sony CPD-G500 which is nearing 20 years old, and it still kicks the crap out of every LCD/LED panel I have seen.

Gsync is way overhyped; I can't even tell the difference with my ROG Swift; games definitely do not feel any smoother with Gsync on. I'm willing to bet Freesync is the same way.

Speak for yourself; I invested in the ROG Swift and noticed the difference. This technology allows me to push higher graphics settings on a single card @ 1440p, and while the FPS bounces around worse than a Mexican jumping bean, it all looks buttery smooth. Playing AAA games, I'm usually in the 80s, but when the game gets going with explosions and particle effects, it drops to as low as 40. No stuttering, no screen tearing.
 
So, 38% of your test group are Nvidia fanboys and 8% are AMD fanboys. Wouldn't you think the results would be skewed as to which technology is better? G-Sync may very well be better, but that 30-point bias makes this article irrelevant as a result.
But it was a blind test. The rigs were obscured. It sounds like at least one participant guessed the AMD rig based on heat or fan noise, but otherwise they could judge only based on what they saw on screen.
 

Vlad Rose

Reputable
Apr 7, 2014


Actually, it was 9: "Right off the bat, we found it interesting that 10 of 48 respondents believed they knew which system was which. Of those 10, nine were correct, though for a variety of reasons." That's around 20%.
 

molo9000

Distinguished
Aug 14, 2010
Comparing different monitors with different specs and different prices doesn't tell you anything about the technology. This is just a comparison of different monitors.
 

jasonelmore

Distinguished
Aug 10, 2008
Both of these technologies are useless. How about we start producing more high-refresh panels? It's as simple as that. This whole adaptive-sync garbage is exactly that: garbage.
I still have a Sony CPD-G500 which is nearing 20 years old, and it still kicks the crap out of every LCD/LED panel I have seen.

These variable-sync monitors excel at lower frame rates. Almost everyone uses a single-GPU setup, and even a GTX 980 Ti will dip below 60 FPS in some games.

On a regular monitor with v-sync, let's say a game runs at 60 FPS and then a big fight scene happens, lowering the frame rate to 57 FPS. The display will drop down to 30 FPS instead of 57 FPS, because a fixed 60 Hz monitor with v-sync effectively only has the steps 60 and 30 to work with.

On a G-Sync or FreeSync monitor, the refresh rate drops to 57 Hz, which lets you see 57 FPS.

Your argument is all about very high frame rates. Even with a very fast 144 Hz panel, it still needs to be able to do 143 Hz, 142 Hz, 141 Hz, and so on. It can't do those lower refresh rates without G-Sync or FreeSync.

You can buy an additional GPU to try to keep it at 144 Hz / 144 FPS, but the money is more wisely spent on a G-Sync or FreeSync monitor: it uses less power, costs less, and produces less heat.
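A rough sketch of that behavior (my own illustration; it assumes classic double-buffered v-sync, where a fixed-refresh display can only show whole divisors of its refresh rate):

```python
# Compare the frame rate actually shown on a fixed 60 Hz monitor with v-sync
# against a variable-refresh (G-Sync/FreeSync-style) display, for a given GPU
# render rate. Numbers are illustrative only.

def vsync_effective_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    """With double-buffered v-sync, the display falls back to refresh/2, refresh/3, ..."""
    divisor = 1
    while refresh_hz / divisor > render_fps:
        divisor += 1
    return refresh_hz / divisor

def adaptive_sync_fps(render_fps: float, min_hz: float = 30.0, max_hz: float = 144.0) -> float:
    """A variable-refresh display simply tracks the render rate inside its supported range."""
    return min(max(render_fps, min_hz), max_hz)

for fps in (60, 57, 45, 35):
    print(f"GPU renders {fps} FPS -> fixed 60 Hz + v-sync shows "
          f"{vsync_effective_fps(fps):.0f} FPS, adaptive sync shows {adaptive_sync_fps(fps):.0f} FPS")
```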
 

Achoo22

Distinguished
Aug 23, 2011
With your wacky variables, and subsequent weak excuses, explanations and conclusions, this is not your best work.
Sadly, I must completely agree. Please consult a statistician when setting up future experiments.
 

cegasaturn

Distinguished
May 8, 2009
As an event attendee, it's great to be able to read the results! Any chance you can tell us which set of stations ran Battlefield at different quality settings?
 
As a follow-up, I'd like all the frame pacing data put under the microscope. Do we see particular game engines / developers stand out? Do later versions of the engines improve or worsen the situation?

Is it a software problem that hardware is trying to solve? Can it be solved in software with better development? Will the appearance of a hardware solution encourage less effort to be put into the issue on the software side?
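For what it's worth, a sketch of what that kind of analysis could look like (purely hypothetical; the frame times below are made up, and real data would come from a capture tool such as FCAT or PresentMon):

```python
# Summarize how evenly frames are delivered from a list of frame times in ms.
import statistics

def frame_pacing_report(frame_times_ms):
    median = statistics.median(frame_times_ms)
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    worst = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
    return {
        "avg_fps": round(1000.0 / statistics.mean(frame_times_ms), 1),
        "median_ms": round(median, 1),
        "99th_percentile_ms": round(worst, 1),
        "mean_frame_to_frame_delta_ms": round(statistics.mean(deltas), 2),
        "spikes_over_2x_median": sum(1 for t in frame_times_ms if t > 2 * median),
    }

# Made-up example: mostly ~16.7 ms frames with a few 40 ms hitches.
sample = [16.7] * 95 + [40.0] * 5
print(frame_pacing_report(sample))
```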
 
1. First off, it was interesting that Nvidia matched the 970 against the 390X, recognizing that the difference in overclocking headroom between the two cards makes out-of-the-box performance comparisons futile.

2. One thing that should have been made clear is whether the BenQ XL2730Z was factory updated, a fresh-off-the-line model with the firmware update, or the original with broken FreeSync.

http://www.tftcentral.co.uk/reviews/benq_xl2730z.htm

"From a monitor point of view the use of FreeSync creates a problem at the moment on the XL2730Z at the moment. The issue is that the AMA setting does nothing when you connect the screen over DisplayPort to a FreeSync system. This applies whether you are actually using FreeSync or not, you don't even need to have the option ticked in the graphics card settings for the problem to occur. As a result, the setting appears to be in the off state, and changing it to High or Premium in the menu makes no difference to real-World response times or performance. As a result, response times are fairly slow at ~8.5ms G2G and there is a more noticeable blur to the moving image. See the more detailed response time tests in the previous sections for more information, but needless to say this is not the optimum AMA (response time) setting on this screen. For some reason, the combination of FreeSync support and this display disables the AMA function.

Having spoken to BenQ about it the issue is a known bug which apparently currently affects all FreeSync monitors. The AMD FreeSync command disturbs the response time (AMA) function, causing it to switch off. It's something which will require an update from AMD to their driver behaviour, which they are currently working on."

This was fixed as of June 1st, but many monitors had already been purchased or were still in the channel when this occurred.

3. While G-Sync / FreeSync come in handy in the 30-70 FPS range, the key for me is what happens when you have 80 or 100+ FPS ... at that point you would turn off G-Sync and use ULMB, but how does ULMB compare with FreeSync?
 


Exact opposite of what Gsync does for me. Using the same settings as on my 144Hz non-Gsync monitor, the difference is not noticeable; also, Gsync does not stop stuttering. In fact, with SLI I have MORE stuttering with Gsync on than off. And I have 3 ROG Swifts; I tried each one just in case one was bad, but it's the same deal on every one of them.
 

salgado18

Distinguished
Feb 12, 2007
1. First off, it was interesting that Nvidia matched the 970 against the 390X, recognizing that the difference in overclocking headroom between the two cards makes out-of-the-box performance comparisons futile.
No, that was a smart play by Nvidia. Their games would run a bit slower and would stay within the refresh range longer. The AMD card would overshoot sometimes, and it has trouble dealing with frame rates higher than the refresh rate. That's not "being nice", that's strategy.
 
1. First off, it was interesting that Nvidia matched the 970 against the 390X, recognizing that the difference in overclocking headroom between the two cards makes out-of-the-box performance comparisons futile.
No, that was a smart play by Nvidia. Their games would run a bit slower and would stay within the refresh range longer. The AMD card would overshoot sometimes, and it has trouble dealing with frame rates higher than the refresh rate. That's not "being nice", that's strategy.

And it also pits an effective 3.5 GB of VRAM against 8 GB, as well as compensating for the price difference between G-Sync and FreeSync monitors.
 

marraco

Distinguished
Jan 6, 2007
This test means nothing. It should have been done with a statistician, the same way a motherboard review is done by somebody knowledgeable about motherboards.

48 samples is too few. If you generate 48 samples randomly, most results will look like this instead of 50%/50%.

Start Excel and generate 1,000 samples at random, and it will come close to 50% for each choice (for example, 55%/45%). But if you generate only 48, many trials will be heavily skewed in favor of one alternative or the other.

In other words, this test is no different from a random result. It means nothing. You could repeat exactly the same experiment and get the opposite result.

Worse, if 10 out of 48 players (about 20%) know what the real hardware is, it gets even more biased.

Of course, you cannot collect 1,000 monitors, but you can cycle the players through and recruit more of them.
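To illustrate the small-sample point (a rough sketch of my own, assuming voters with no real preference behave like fair coin flips):

```python
# Simulate voters who genuinely have no preference (a fair coin flip) and see
# how far a 48-person poll tends to drift from the true 50/50 split compared
# with a 1,000-person poll.
import random

def average_drift(n_voters: int, n_trials: int = 10_000) -> float:
    """Average absolute deviation from 50% across many simulated polls."""
    total_dev = 0.0
    for _ in range(n_trials):
        votes_a = sum(random.random() < 0.5 for _ in range(n_voters))
        total_dev += abs(votes_a / n_voters - 0.5)
    return total_dev / n_trials

for n in (48, 1000):
    print(f"{n} voters: average drift from 50/50 is about {average_drift(n) * 100:.1f} points")
```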
 