Samsung, AMD Announce First FreeSync Monitors, Coming March 2015

Status
Not open for further replies.

aztec_scribe

Distinguished
Jan 22, 2010
115
0
18,710


Yeah exactly.
 


AMD did mention that there will be companies making scalers for Adaptive Sync, but so far they haven't said whether customers will be able to buy such a scaler themselves to retrofit their current monitor with Adaptive Sync capability.
 

Haravikk

Distinguished
Sep 14, 2013
317
0
18,790
I'm glad to hear someone is finally moving to add this, as I definitely think that Adaptive Sync is the way to go; it's a technology that already (sort of) exists, so it makes no sense to go with something proprietary and expensive like Nvidia's G-Sync. In fact this is something that really annoys me about Nvidia, namely their habit of always opting for proprietary technologies when something is clearly better as an open standard. Though of course AMD's Mantle and HSA are open only in name until someone else bothers to adopt them too, but at least they seem to be committed to doing it that way.

For those asking; adaptive sync is an optional feature of DisplayPort 1.2a, so monitors aren't required to implement it (though later standards may require it). DisplayPort 1.3 includes improvements to allow for faster refresh rates at higher resolutions, but also includes the 1.2a specifications and optional components, so a DisplayPort 1.3 monitor implementing Adaptive Sync should be able to run at (up to) any supported refresh rate. The problem really is that without cards using the feature, there's no reason for monitor makers to add it and vice-versa, so hopefully this move will help it to get off the ground.

If it does, then G-Sync stands to fall by the wayside unless it can demonstrate some kind of clear advantage, but I think that's for the best, as no-one wants to buy a GPU that limits you in your choice of monitors.

Also, I really hope that it takes off, because we desperately need to get away from pushing frame rates as high as possible just to overcome monitor flaws; if your monitor only does 30 or 60 Hz then syncing should mean there's nothing to be gained from a higher frame rate, so we can just reduce the load on the GPU for quieter, cooler running with no loss in visuals.
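
To illustrate why capping makes sense under adaptive sync, here's a toy frame-pacing loop (purely my own sketch, not anyone's driver code; the 40-144 Hz window and the renderFrame()/presentFrame() calls are made-up placeholders):

```cpp
// Toy frame-pacing loop under adaptive sync (illustrative only).
// Assumptions: the panel advertises a 40-144 Hz variable refresh window, and
// renderFrame()/presentFrame() are hypothetical stand-ins for real API calls.
#include <chrono>
#include <thread>

static void renderFrame()  { /* draw the scene (stub) */ }
static void presentFrame() { /* hand the finished frame to the display (stub) */ }

int main() {
    using clock = std::chrono::steady_clock;
    using namespace std::chrono;

    const auto minInterval = duration_cast<nanoseconds>(duration<double>(1.0 / 144.0)); // panel max rate (assumed)
    const auto maxInterval = duration_cast<nanoseconds>(duration<double>(1.0 / 40.0));  // panel min rate (assumed)

    auto last = clock::now();
    for (;;) {
        renderFrame();

        // If we finished faster than the panel can refresh, just sleep: capping
        // here saves GPU power with zero visual cost, because the display
        // refreshes when the frame is presented, not on a fixed clock.
        auto elapsed = clock::now() - last;
        if (elapsed < minInterval)
            std::this_thread::sleep_for(minInterval - elapsed);

        // If we took longer than maxInterval, the panel would simply re-scan
        // the previous frame on its own; no tearing, no need to chase FPS.
        presentFrame();
        last = clock::now();
    }
}
```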
 

geogan

Distinguished
Jan 13, 2010
57
2
18,535
Isn't Dell the biggest name in PC monitors? Why aren't any of their new monitors (like the new 21:9 1440p model) able to do this? I.e. why isn't AMD talking to Dell?
 


Maybe Dell themselves are not interested in it? I've never heard of Dell making a G-Sync monitor either. Or maybe they want to see customer adoption of this tech first before investing further into it.
 

dovah-chan

Honorable
Dell is just a monitor manufacturer, but Samsung is an OEM for panels. AMD will have a bigger impact by teaming up with someone like Samsung (or LG or NEC), as that will cause a much bigger ripple and faster adoption across the market.

Also there is something that has been on a lot of people's minds lately regarding this AMD and Nvidia struggle. Why must we be forced to choose our GPU based on features that each side has instead of performance? I hate having to recommend one product over the other (especially in the same price bracket) just because it has proprietary method x and the other one doesn't. To me, that's not how a person should be forced to choose their card.

There are five criteria a card should be judged on and nothing else: price, performance, noise, aesthetics, and quality. I just wish they could get along and compete, but still work together in some aspects to ensure that they deliver the best product possible for their consumers instead of alienating us to one side with such despicable marketing tactics.
 

InvalidError

Titan
Moderator

Most of the displays Dell ships go to businesses. Adaptive Sync has very little use for CAD and office work, so Dell has little to no reason to bother pushing Adaptive Sync any faster than their upstream providers bother including it as standard in their products, except maybe for their Alienware brand.
 

soldier45

Reputable
Oct 29, 2014
21
0
4,510
Stopped caring at 23 and 28 inches. Where are the 30+ inchers with this and 4K? Some of us game a lot higher than 1080p and have for several years now.
 

Ninjawithagun

Distinguished
Aug 28, 2007
747
16
19,165
"It competes directly with Nvidia's G-Sync, but whereas Green Team's version is proprietary and adds between $100 and $200 to the cost of the monitor, FreeSync uses an optional protocol in the DisplayPort 1.2a specification in the form of the open Adaptive-Sync standard, which is free for anyone to implement."

This is very misleading. The fact is that FreeSync monitors WILL COST AS MUCH as Nvidia G-Sync monitors. Why? Simple. It's not DisplayPort 1.2a that makes the cost of the monitors higher, it's the special scaler in the monitor that is required in order to communicate full duplex with the graphics card. It's articles like this that mislead so many potential buyers. Case in point: no prices are given for the new FreeSync monitors. Now you know why!
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


This is incorrect. Nvidia asked scaler makers to MAKE their chips capable of this, and they said no. So Nvidia did the R&D and made it themselves. This is stated in many articles covering the two technologies, and even comes directly from Nvidia when explaining why they did it (they even had their doubts about AMD getting scaler makers on board, since they admitted they failed at this part). Nvidia didn't do this because there was another way and they just wanted to screw users; they did it because there WAS NO OTHER WAY, and they wanted their users to have the tech. I'm sure they want to make money on it (or at least recoup the R&D spent), but they didn't set out to do that in the beginning. They were merely calling on scaler makers/monitor makers to help resolve a long-standing problem in gaming (three problems, actually).

Of course, once NV's solution was in the market, scaler makers had to either add something like it or risk losing tons of sales a few years later as NV starts selling scalers to everyone under the sun (assuming it catches on and nobody else does anything to resolve the issues G-Sync tackles). You really should be thanking NV for forcing scaler makers to make a move this time, when AMD asked for the same thing. I really hope they at least get the R&D back, as they did drive the market forward here; and considering it came out on a SINGLE monitor and they are slowly adding models, it's clearly a pretty tough set of problems to tackle the way NV did it. It remains to be seen whether AMD's way is simply easier, or a shortcut that's just good enough for some people but not a true solution.

You could call them evil if the scaler makers had been willing to help but NV had said, "nah, we just want to charge people money, so screw you guys." But that wasn't how it went down. NV seems to have to re-work some aspects of this tech for each monitor, so I'm really starting to wonder how good AMD's blanket solution will be. We'd be seeing them rolling out G-Sync everywhere on tons of models if it were a simple fix to get this truly done RIGHT. Something tells me AMD's will be good enough (for some, anyway), and NV will still have a reason for a premium charge here (even after they get the R&D back). Having said that, we won't know squat until we see the first monitor tested with AMD's fix.

This really isn't about video playback; it's about gaming for the majority of us. The basic implementation means nothing to me. I want to know if it fixes GAMES, and not one at a time as drivers get updates. I don't want to wait for 300 AMD game updates to get it working (I really hope this won't be the case, and that it just works).
 

Bondfc11

Honorable
Sep 3, 2013
232
0
10,680
Dude, you are so wrong. Nvidia has never said they asked for Adaptive Sync to be implemented by other OEMs - NEVER. Nvidia has said many times that they did what AMD is doing and found it did not perform the way Nvidia wanted it to across the board. Further, Nvidia will not be implementing Adaptive Sync in their drivers, since they believe it is inferior to the G-Sync ASIC module - and it may very well be - we will have to wait and see.

By the way, OEMs still must create boards for the G-Sync modules, so saying scaler companies won't do the work is incorrect. Actually, the whole comment on scaler companies is incorrect. The AD PCBs are designed and made in-house at all the major OEMs - only the ICs come from the three main scaler component manufacturers.
 

childofthekorn

Honorable
Jan 31, 2013
359
0
10,780


If I remember correctly, this is pretty much what was said about AMD Mantle, yet you were one of Mantle's detractors. Why is Nvidia credited with disrupting the market in this regard, yet with modernizing APIs (lower CPU overhead, etc.) AMD is considered to be playing piggyback?
 

childofthekorn

Honorable
Jan 31, 2013
359
0
10,780


My thought is that the 1.2a/1.3 standards are relatively new. The monitor in question was probably in development/mass production while the standards were being finalized. I'm unsure when it will happen, but unless Dell has some sort of contract with Nvidia, I can only assume they will carry 1.2a/1.3 in near-future releases.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


Perhaps because OpenGL already had low draw-call overhead, as NV demonstrated on stage. OpenGL had most of this capability years ago, but nobody knew how to use it until they talked about it on stage. Perhaps because NV/MS had already been working on DX12 for a few years before it was announced?

https://www.youtube.com/watch?v=-bCeNzgiJ8I&html5=1
How to generate a "crap ton" more draw calls in OpenGL (Feb 2014, and it was available long before this). John Carmack also said on stage that OpenGL already had similar abilities via extensions if you wanted to use them, confirming what Cass Everitt showed. I'd have to find the NV talk with Carmack on stage, but Carmack said it on stage and on Twitter, as discussed here:
https://www.youtube.com/watch?v=CdyxWuXQOvQ&html5=1

Carmack article mentioning his tweets etc:
http://vr-zone.com/articles/john-carmack-mantle-became-interesting-dual-console-wins/61108.html
"Later on during the talk Carmack and the other panelists — Epic Games’ Tim Sweeney and DICE’s Johan Andersson — were asked if they thought other developers should follow AMD’s cue and create their own APIs. All agreed it was a bad idea."

I wasn't alone in saying it wasn't first or good. Those are THREE huge names saying the same thing, and they make the games, right? They were saying that in 2013. What evidence do you have showing OpenGL could NOT do this first? What evidence is there that Carmack, Cass Everitt, John McDonald, etc. are all wrong? Is there any evidence that even shows Mantle work pre-dated DX12 work, which NV said was going on with MS for at least two years before they hit the stage with it?

http://www.dsogaming.com/news/john-carmack-nvidias-opengl-extensions-rival-mantle-8gb-on-consoles-not-a-significant-change/
Another fairly well-known mod guy (Durante of DSfix fame, AKA Peter Thoman) shows Carmack was right - and note that the data is from 2011. So OpenGL could do this in 2011 at least, correct? When was Mantle done? Cass showed some phenomenal performance gains with just a few lines of code changed in one of his examples in the first video.
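
To give a concrete flavor of what those talks demonstrate, here's a rough sketch of my own (not code from the videos; it assumes a GL 4.3+ context, GLEW for the function loaders, and every mesh already packed into one shared VAO/index buffer): thousands of logical draw calls collapse into a single glMultiDrawElementsIndirect submission, which is exactly where the driver-overhead savings come from.

```cpp
// Rough sketch of the "many draws, one API call" approach (not from the talks).
// Assumes: OpenGL 4.3+ context, GLEW loaded, and a VAO whose vertex/index
// buffers already contain every mesh in the scene.
#include <GL/glew.h>
#include <vector>

struct DrawElementsIndirectCommand {
    GLuint count;         // number of indices for this mesh
    GLuint instanceCount; // usually 1
    GLuint firstIndex;    // offset into the shared index buffer
    GLuint baseVertex;    // offset into the shared vertex buffer
    GLuint baseInstance;  // lets the shader fetch per-draw data (transforms, materials)
};

// Submit every command in 'cmds' with a single driver call.
void drawSceneIndirect(GLuint vao, GLuint indirectBuffer,
                       const std::vector<DrawElementsIndirectCommand>& cmds)
{
    glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirectBuffer);
    glBufferData(GL_DRAW_INDIRECT_BUFFER,
                 cmds.size() * sizeof(DrawElementsIndirectCommand),
                 cmds.data(), GL_DYNAMIC_DRAW);

    glBindVertexArray(vao);
    // One call replaces cmds.size() individual glDrawElements* calls, so the
    // per-draw CPU/driver cost nearly disappears.
    glMultiDrawElementsIndirect(GL_TRIANGLES, GL_UNSIGNED_INT,
                                nullptr,                 // commands come from the bound buffer
                                (GLsizei)cmds.size(),
                                0);                      // tightly packed commands
}
```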

While you MAY be able to argue Mantle started before DX12, that is not at all the case for OpenGL, which had this for years already. Consider me still a detractor, and I stand by my original comments that OpenGL was first - or call Carmack etc. and tell them to quit saying/proving it. ;)
 

InvalidError

Titan
Moderator

AMD's AMA about FreeSync just started and the word is that many scalers in displays already on the market are firmware-upgradable to Adaptive Sync.

http://www.tomshardware.com/forum/id-2407018/official-amd-radeon-representative.html#14788173
 

airborn824

Honorable
Mar 3, 2013
226
0
10,690
Wait, the UD590 is already available. Is this a firmware upgrade? I just received one that I have not opened, so I would like to know, or I will return it. I want FreeSync for my R9 290 PCS+.
 

childofthekorn

Honorable
Jan 31, 2013
359
0
10,780


Highly recommend sending an email to the brand's tech support and asking.
 

Ninjawithagun

Distinguished
Aug 28, 2007
747
16
19,165


You are dead wrong in so many areas. Nothing was said about Nvidia asking OEMs to co-develop Adaptive Sync. Nvidia wanted them to co-develop an adaptive-sync technology BEFORE the Adaptive-Sync specification was released as part of DisplayPort 1.2a. NOT ONE OEM creates anything G-Sync. Nvidia manufactures the modules in Hong Kong and then sells the G-Sync modules to third-party manufacturers for incorporation into their monitors. Asus does NOT manufacture G-Sync modules; they buy them from Nvidia and then modify them as necessary to make them compatible with their monitors (e.g. the ROG Swift PG278Q). AOC, BenQ, and LG all buy their G-Sync modules from Nvidia.

Also, FreeSync is more software-based than hardware-based, meaning that it will exhibit more lag than G-Sync. Preliminary reports hint that G-Sync in fact has a lower response time when the video card directs the monitor's G-Sync module to draw the next frame. Because FreeSync is primarily software-based, it has to contend with the OS's virtual layers to communicate with the monitor. I guess we'll find out soon enough whether either is better than the other, or if FreeSync even works at all...
 

InvalidError

Titan
Moderator

FreeSync is just AMD's name for their GPU/driver-specific software/hardware support, while Adaptive-Sync proper is entirely between the GPU's display engine and the display itself. I do not remember AMD saying software was actually involved in the frame-by-frame management, and from a technical point of view there is absolutely no reason for software to be involved beyond negotiating minimum and maximum frame timings/rates with the display. There may be additional driver parameters to smooth things out further, but those are unrelated to Adaptive-Sync itself - Nvidia likely has a bunch of internal parameters they tweak when operating in G-Sync mode as well that have nothing to do with what happens between the GPU and display.
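
To give a rough idea of what "negotiating minimum and maximum rates" amounts to: the display advertises its limits and the driver reads them. A minimal sketch of that idea follows (assuming a raw 128-byte base EDID block and the classic 0xFD range-limits descriptor; real drivers also consult DisplayID and vendor-specific blocks, and this is not AMD's or Nvidia's actual code):

```cpp
// Minimal sketch: pull the advertised vertical refresh range out of a base
// EDID block. This is only the classic 0xFD range-limits descriptor; actual
// FreeSync/G-Sync negotiation involves more than this.
#include <cstddef>
#include <cstdint>
#include <optional>

struct RefreshRange { int minHz; int maxHz; };

std::optional<RefreshRange> readRefreshRange(const uint8_t (&edid)[128])
{
    // The base block holds four 18-byte descriptors at fixed offsets.
    for (size_t off : {54u, 72u, 90u, 108u}) {
        const uint8_t* d = edid + off;
        // A display range limits descriptor starts with 00 00 00 FD.
        if (d[0] == 0 && d[1] == 0 && d[2] == 0 && d[3] == 0xFD) {
            return RefreshRange{ d[5], d[6] };  // min / max vertical rate in Hz
        }
    }
    return std::nullopt;  // the display didn't advertise a range here
}
```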
 
We will know more when there are reviews. I just hope we don't need to wait for the monitors to reach mass production before we see any reviews; hopefully AMD can provide these monitors to review sites ahead of the actual launch so they can review FreeSync and make a direct comparison with G-Sync. I'm interested in AMD's claim that FreeSync is superior to G-Sync.
 