AMD CPU speculation... and expert conjecture


Ranth

Honorable
May 3, 2012
144
0
10,680


How come? Isn't the cost advantage based on not having the G-sync module? Or did I misinterpret that?
 


The problem with AMD's implementation is that as FPS rises above 60, the old LCD ghosting problem starts to return, since you're no longer tied to the 16.67ms refresh period. You can get around that with better panels with lower pixel response times, but that raises costs by about $40 or so, roughly the same as the cost of the G-Sync module. And right now, G-Sync is the better TECHNOLOGY, due to how it handles lower FPS. So if the two technologies end up with about the same end-user cost, and 80% of the market uses NVIDIA rather than AMD, which one do you think is going to win out?
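
To put rough numbers on that (purely illustrative; the 5 ms GtG figure below is just an assumed value for a mid-range panel, not a measurement), here's the basic arithmetic of how the refresh period shrinks as FPS climbs and the panel's response time becomes a bigger slice of each frame:

```python
# Rough arithmetic: the refresh (frame) period shrinks as FPS rises, so a
# panel's pixel response time eats a larger fraction of each frame and
# ghosting gets easier to notice.
RESPONSE_MS = 5.0  # assumed GtG response time of a mid-range panel (illustrative)

for fps in (60, 75, 100, 120, 144):
    period_ms = 1000.0 / fps  # time between refreshes at this rate
    share = 100.0 * RESPONSE_MS / period_ms
    print(f"{fps:>3} FPS -> {period_ms:5.2f} ms per frame, "
          f"{share:3.0f}% of it spent mid-transition")
```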
 

Ranth

Honorable
May 3, 2012
144
0
10,680


But isn't Nvidia affected by ghosting as well? That's my point: if you have two equivalent monitors, as in the same panel, but one has G-Sync and the other FreeSync, the G-Sync monitor will come at a premium while the FreeSync one won't. Or does G-Sync do magic with the LCD?
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780
Buying a $500 monitor for a graphics card that plays games at under 30 FPS. How is this relevant?

Also, all LCDs ghost: http://www.tftcentral.co.uk/reviews/asus_rog_swift_pg278q.htm That's the same monitor PCPer says doesn't have ghosting problems, having, well, ghosting problems.

FreeSync and G-Sync aren't here to stop ghosting. You'll need a plasma or something else for that. PCPer dropped the ball by stating that G-Sync has no ghosting while FreeSync does. All LCDs have some ghosting; that's why your screen turns into a blurry, unreadable mess when you scroll, yet on a CRT it stays readable.
 
From what I can gather, Gamerk's interpretation is that AMD *included* the price of lower-response panels in its claimed price advantage? If that is correct, Gamerk has a point that AMD inflated the price difference.

In regards to ghosting itself: I have a Samsung 120Hz with an advertised 1ms GtG response time, and I see *no* ghosting at all. It's quite nice indeed. On the other hand, my older ViewSonic 60Hz with an advertised 2ms GtG had little to no ghosting either. I'm sure if I tried hard enough I could see it, but that would be like trying to prove the rubber on my tires doesn't last long by doing burnouts or drifting.

Cheers!
 


From PCPer:

The question now is: why is this happening and does it have anything to do with G-Sync or FreeSync? NVIDIA has stated on a few occasions that there is more that goes into a VRR monitor than simply integrated vBlank extensions and has pointed to instances like this as an example of why. Modern monitors are often tuned to a specific refresh rate – 144 Hz, 120 Hz, 60 Hz, etc. – and the power delivery to pixels is built to reduce ghosting and image defects. But in a situation where the refresh rate can literally be ANY rate, as we get with VRR displays, the LCD will very often be in these non-tuned refresh rates. NVIDIA claims its G-Sync module is tuned for each display to prevent ghosting by changing the amount of voltage going to pixels at different refresh rates, allowing pixels to untwist and retwist at different rates.

So in short, when you aren't running at the "preferred" refresh rate, the NVIDIA G-Sync module is dynamically doing voltage adjustments to affect how quickly the pixels refresh. Since AMD isn't doing this, their implementation, in theory, is going to be a LOT more sensitive to this problem than NVIDIA's is.

So now we need an EE to tell us how much voltage REALLY matters when it comes to pixel response time, because I'm not going to claim to be an expert in this area. But it's one that bears watching.
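
For what it's worth, here's a toy sketch of the idea PCPer describes NVIDIA claiming: overdrive strength tuned per refresh rate and interpolated for the arbitrary rates a VRR display lands on. Every number in the table below is invented purely for illustration; nobody outside NVIDIA knows the real values.

```python
# Toy version of refresh-rate-dependent overdrive: a gain tuned at a few fixed
# refresh rates, interpolated for whatever "non-tuned" rate VRR produces.
TUNED = {40: 0.45, 60: 0.60, 120: 0.90, 144: 1.00}  # refresh Hz -> overdrive gain (made up)

def overdrive_gain(refresh_hz: float) -> float:
    """Linearly interpolate a gain for any refresh rate between the tuned points."""
    rates = sorted(TUNED)
    if refresh_hz <= rates[0]:
        return TUNED[rates[0]]
    if refresh_hz >= rates[-1]:
        return TUNED[rates[-1]]
    for lo, hi in zip(rates, rates[1:]):
        if lo <= refresh_hz <= hi:
            t = (refresh_hz - lo) / (hi - lo)
            return TUNED[lo] + t * (TUNED[hi] - TUNED[lo])

print(overdrive_gain(97.3))  # a "non-tuned" rate a VRR display might land on
```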
 

Ranth

Honorable
May 3, 2012
144
0
10,680


Thanks for clearing that up.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


It means that AMD is dropping support for certain instructions that aren't popular among developers and aren't used by software. It says nothing about modules vs. cores. At best, it means that efficiency is a key design goal for Zen and the engineers are eliminating waste from Zen.
 

con635

Honorable
Oct 3, 2013
644
0
11,010

It certainly doesn't cost the monitor manufacturer the same, and does that 80% count APUs? I assume FreeSync works with APUs? There will be a lot more FreeSync than G-Sync monitors, IMO, but let's wait and see.

 
To the question of which standard will win... FreeSync (or specifically Adaptive-Sync, as per the DisplayPort standard).

What will decide that? Support in Intel's IGP. They have the majority share of PC graphics, after all. AMD's solution works on any Adaptive-Sync display and thus wins by default.

Nvidia will undoubtedly add driver support for Adaptive-Sync when they realise G-Sync is dead.

This happens time and again: wider adoption beats better tech.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790

And the second part

BioShock Infinite Is The Latest Game Showing Why Linux Gamers Choose NVIDIA


Another fiasco from AMD.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780
I want a second opinion on FreeSync besides PCPer. They were the ones who made a huge noise about frame times, and did so while not properly disclosing that they were using tools provided by Nvidia. They have a nasty history of making claims without properly disclosing who they are working with or what tools they are using.

Not to mention not all LCD panels are the same, which is why you end up seeing some panels go for a lot less. A prime example of this is the Korean 1440p IPS.

There are a lot of factors in LCD ghosting beyond what is pushing pixels to the monitor. On another note, AMD did have an LCD overdriving feature built into their drivers long ago, called Overdrive (not to be confused with the new overclocking tools). It was abandoned because, while it did reduce ghosting, it caused display artifacts and all sorts of visual anomalies. Not to mention that changing voltage causes problems with inverse ghosting and input lag. Between the frame doubling, the voltage changes, and V-Sync, it seems like G-Sync would be suffering from some input lag issues, but I don't see those addressed at all.
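
As a rough illustration of why too much overdrive produces inverse ghosting, here's a toy first-order pixel model; all constants are invented and real panel behaviour is far more complicated:

```python
# Toy pixel model: overdrive aims past the real target so the pixel gets moving
# faster, but too strong an overdrive overshoots the target and then settles
# back down -- the inverse ghosting mentioned above.
def step(level, drive, k=0.7):
    # one refresh interval of a sluggish pixel chasing the drive level
    return level + k * (drive - level)

def transition(start, target, overdrive=0.0, frames=6):
    levels, level = [], start
    for i in range(frames):
        # overdrive only on the first frame: aim past the target to speed it up
        drive = target + overdrive * (target - start) if i == 0 else target
        level = step(level, drive)
        levels.append(round(level))
    return levels

print(transition(0, 200, overdrive=0.0))  # no overdrive: slow, clean approach to 200
print(transition(0, 200, overdrive=1.0))  # overdriven: shoots to ~280, then settles back
```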

You can see input lag from V-Sync in G-Sync here: http://www.blurbusters.com/gsync/preview2/ Note the cases where the frame limiter is set above 144Hz/FPS and the input lag is very bad. It's pretty awful that you aren't allowed to turn off V-Sync with G-Sync. Hopefully it's something that gets remedied. Most gamers prefer tearing over input lag, in my experience. Not having the option to choose is not good.

CS:GO is forcing G-Sync to kick in V-Sync and massively hurting input lag. I would assume that since V-Sync and G-Sync's frame doubling work similarly, you'd also see those problems at lower frame rates. Regardless, for a website that was so obsessed with frame latency before, they seem oddly unconcerned with display latency under G-Sync.
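
Quick back-of-the-envelope on that blurbusters case (the render rate and queue depth below are assumptions, not measurements): with V-Sync forced on above the display's maximum refresh, finished frames wait for the next refresh slot, so every queued frame adds roughly one refresh interval of latency.

```python
# With V-Sync forced on above the display's max refresh, finished frames wait
# for the next refresh slot; each queued frame adds ~one refresh interval.
MAX_REFRESH_HZ = 144
RENDER_FPS = 300          # e.g. an uncapped esports title (assumed)
QUEUED_FRAMES = 2         # assumed pre-rendered frame queue depth

refresh_ms = 1000.0 / MAX_REFRESH_HZ   # ~6.94 ms per refresh slot
render_ms = 1000.0 / RENDER_FPS
print(f"~{QUEUED_FRAMES * refresh_ms:.1f} ms of queueing latency "
      f"on top of the {render_ms:.1f} ms render time")
```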
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


http://wccftech.com/amd-freesync-nvidia-gsync-verdict/

G-Sync ==> better quality
Freesync ==> cheaper
 

Ranth

Honorable
May 3, 2012
144
0
10,680


From the link

FreeSync Pros :
– Easier to integrate into a wider range of monitors due to lack of any additional hardware.
– Significantly less expensive than G-Sync, no licensing fees.
– Enables all the usual monitor features and display outputs.
– Gives you the option of V-Sync on or Off.

FreeSync Cons :
– Doesn’t work below the minimum refresh rate, amplifying the tearing and input-lag issues at low FPS.
– Currently limited to six graphics cards and six APUs.

G-Sync Pros :
– Offers a better experience than FreeSync below the minimum refresh rate of the monitor.
– Compatible with a wider range of graphics cards.

G-Sync Cons :
– Requires dedicated hardware in the monitor and demands licensing fees.
– Limits monitor features, sound and display output options.
– Measurably more expensive than FreeSync.
– Currently doesn’t give you the option to disable V-Sync above the maximum refresh rate of the monitor.

I just quoted all the points since they're... well, points. I don't think "G-Sync better | FreeSync cheaper" covers all of them.
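
On the "doesn't work below the minimum refresh rate" point, the frame-doubling approach those reviews attribute to G-Sync boils down to re-showing the last frame enough times that every refresh stays inside the panel's range. A simplified sketch of that arithmetic (the 40-144 Hz window is just an assumed example, not either vendor's actual algorithm):

```python
# Simplified "frame doubling" sketch: repeat the last frame so every individual
# refresh interval stays inside the panel's supported range, even when the game
# falls below the panel's minimum refresh rate.
def refresh_plan(frame_interval_ms, min_hz=40, max_hz=144):
    max_interval = 1000.0 / min_hz   # longest the panel can hold one frame
    min_interval = 1000.0 / max_hz   # shortest interval the panel supports
    repeats = 1
    while frame_interval_ms / repeats > max_interval:
        repeats += 1                 # show the same frame one more time
    interval = max(frame_interval_ms / repeats, min_interval)
    return repeats, round(interval, 2)

print(refresh_plan(1000 / 60))  # 60 FPS -> (1, 16.67): no doubling needed
print(refresh_plan(1000 / 25))  # 25 FPS -> (2, 20.0): each frame shown twice
```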
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


I was responding to the discussion here, which was about cost and about refresh rates.
 


FreeSync DOESN'T support every display, only those with built-in support, same as G-Sync.
 


I'm kinda assuming they did, but let's be real: when was the last time ghosting was a serious problem? I kinda doubt the first FreeSync monitors have ghosting issues @ 60Hz...
 

con635

Honorable
Oct 3, 2013
644
0
11,010
The ghosting happens with and without FreeSync; apparently simply adjusting the contrast gets rid of it. It's set very high from the factory on the BenQ monitor.
 

jdwii

Splendid
Wow, just now hearing about the minimum frame rate thing. Some of these monitors are claiming 40-50Hz or so, and LOTS of games can drop down to 40-50 FPS; it's like we can never win.

Maybe AMD can make it so FreeSync turns off below the minimum FPS the monitor supports?

Anyway, no problem for me; I'm probably going to get that Acer G-Sync monitor, IPS, 1440p and 144Hz.
 


FreeSync supports *any* monitor that conforms to the new DisplayPort standard.

Therefore, moving forward it *will support* any modern DisplayPort screen, as the *only requirement* for FreeSync is that the screen supports the open DisplayPort Adaptive-Sync capability present in the latest DisplayPort standard.

So whilst I agree it doesn't support all current DisplayPort screens, it's going to become the standard. I'm pretty confident of that. Also, as the standard in question is an open one, I fully expect Intel to add it to their driver; IGPs tend to run at lower frame rates, after all, so they stand to gain a lot in playability from this, IMO.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


This is why I don't like PCPer. They benchmark two different products with a ton of variance between them, find one thing that's different, and then wave their hands around like that one thing is all that matters and nothing else does.

The whole test is invalid. If it were the exact same panel and monitor and the only difference was G-Sync or FreeSync, we'd have something against FreeSync. But that's simply not the case. It's akin to running one CPU benchmark with faster memory, Linux, and a better graphics card, and another with slow memory, an old virus-encrusted Windows install, and a worse graphics card, and going "lol, the Windows machine is so much slower at everything, what's wrong with the slow computer!?"

Both examples have way too many variables that aren't being properly accounted for, and certain variables are being over-emphasized. Which is why I don't trust PCPer. They are clearly biased and have a history of being so, from not disclosing partnerships, to not disclosing the tools used, to ignoring results and flaws in their testing to reach conclusions that benefit Nvidia while not benefiting AMD.
 