AMD Fires Back at G-Sync With Non-Proprietary Alternative


hannibal

Distinguished
The idea behind this is that it is based on a VESA standard, so most graphics cards and monitors should support it as long as they are new enough and follow the open standard. I am sure that Nvidia could write the best drivers to support this feature on their graphics cards; I am just not sure they are keen on doing it, since it would compete directly with their G-Sync products.

This is all handled by drivers, so everyone (Intel, Nvidia, AMD, or anyone else) can use this feature. And because it is done in drivers, there will be different implementations and different outcomes. All in all, this is a step in the right direction: an open standard that anyone can use if they want to. That is why I like DisplayPort, and why I am not (personally) interested in all those proprietary Apple connectors. In the long run, open standards are cheaper for customers.

We also have to remember that there is some hefty hardware in G-Sync, so I am quite sure it is better in some situations. But the interesting part is what could be done if some frame-buffering hardware were put directly on the GPU or graphics card, with FreeSync used for the rest. If that turned out to be as good as G-Sync, it would be a victory for all customers!
 
I would say that given G-Sync requires third-party support from monitor manufacturers, it's pretty much DOA now. Monitor manufacturers can just implement variable VBLANK and offer the equivalent of G-Sync to ALL their customers (that is, users of Nvidia, AMD, Intel, or any other GPU) at no extra cost. Why bother fragmenting/complicating your production line with proprietary hardware that not all your customers have a use for?
 

Djentleman

Distinguished
Jul 25, 2011
1,045
0
19,410
What nvidia had to say about this: http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

"He first said, of course, that he was excited to see his competitor taking an interest in dynamic refresh rates and thinking that the technology could offer benefits for gamers. In his view, AMD interest was validation of Nvidia's work in this area.

However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel.

As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand. That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction."

 


razor512

Distinguished
Jun 16, 2007
2,134
71
19,890


If the screen can dynamically adjust its refresh rate, then it will be an easy replacement for it. But if the screen stays at a fixed refresh rate, then the technology will essentially be running at its native refresh rate; instead of rendering an incomplete frame, it will simply pull up the latest complete frame on each refresh cycle, thus increasing input lag while getting rid of tearing.

They need to come out with a new display standard that follows the design of cheaper, dumber electronics, where the processing hardware in the device controls the screen refresh entirely, meaning the device can update once per second if it wants to. The technology to avoid fixed refresh rates has been around for a long time (since the early '90s). They just need a new standard that brings it to monitors, where the display can tell the system its maximum update rate so the video card knows where to cap the frame rate; beyond that, the video card would have full control of the display refresh.
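The handshake described above can be sketched in a few lines. This is a hypothetical model, not any real driver or monitor API: the `Display` and `Gpu` names and their methods are purely illustrative. The display advertises only a maximum refresh rate; the GPU caps frame delivery at that rate but is otherwise free to refresh the panel whenever it has a complete frame, even once per second.

```python
class Display:
    """Toy panel: advertises a max refresh rate, accepts refreshes at any pace
    slower than that. There is deliberately no minimum rate."""
    def __init__(self, max_refresh_hz):
        self.max_refresh_hz = max_refresh_hz
        self.last_refresh = None

    def refresh(self, timestamp, frame):
        min_interval = 1.0 / self.max_refresh_hz
        if self.last_refresh is not None and timestamp - self.last_refresh < min_interval:
            raise ValueError("frame pushed faster than the panel's max rate")
        self.last_refresh = timestamp
        return frame


class Gpu:
    """Toy GPU: reads the panel's advertised limit once, then fully controls
    refresh timing, delaying a frame only if it would exceed the max rate."""
    def __init__(self, display):
        self.display = display
        self.min_interval = 1.0 / display.max_refresh_hz
        self.next_allowed = 0.0

    def present(self, timestamp, frame):
        t = max(timestamp, self.next_allowed)   # cap at the panel's max rate
        self.next_allowed = t + self.min_interval
        return t, self.display.refresh(t, frame)


gpu = Gpu(Display(max_refresh_hz=144))
shown_at, _ = gpu.present(0.000, "frame 0")   # shown immediately
shown_at2, _ = gpu.present(0.001, "frame 1")  # delayed to ~6.94 ms (1/144 s)
```

The key design point razor512 is arguing for is that the only negotiation needed is the maximum rate; everything else moves to the GPU side, which is roughly how eDP panel self-refresh already behaves in laptops.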
 

razor512

Distinguished
Jun 16, 2007
2,134
71
19,890


That is what G-Sync tries to avoid.

With G-Sync, frames are not buffered, because the display refresh rate is matched to the video card's frame rate: as soon as the video card finishes rendering a frame, it is pushed to the monitor.

It works like that now too, but because the monitor runs at a fixed refresh rate, you end up with situations where a frame is pushed to the monitor in the middle of a refresh cycle. This causes the display to partially render one frame and partially render another.

With a standard display, you only get an error-free image when the frame rate is an exact multiple or divisor of the refresh rate; e.g., if the monitor refreshes at 60 Hz, then frame rates of 240, 120, 60, 30, 15, and so on will be in sync.

At 120 FPS the display simply shows every other frame; at 60 FPS it shows every frame; at 30 FPS it shows each frame for two refresh cycles. In those cases, the frame timing is the same as it would be with G-Sync.

The only thing G-Sync tries to fix is allowing frame rates in between those. For example, suppose you drop from 60 FPS to 58 FPS: your only options with vsync are to either drop to 30 FPS (thus killing most of your performance) or allow some anomalies on the display due to partially displayed frames (adaptive vsync).

G-Sync fixes that by allowing the display to refresh at an arbitrary rate, e.g., 58 Hz if the game drops to 58 FPS, so there is no frame-rate snapping and no partial frames.
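The 60-to-30 FPS snap described above is easy to reproduce with a small simulation. This is a rough sketch of double-buffered vsync timing under stated assumptions (the GPU stalls until the next refresh tick before starting the next frame), not a model of any real driver:

```python
import math

REFRESH_HZ = 60
TICK = 1.0 / REFRESH_HZ      # fixed refresh interval, ~16.67 ms
RENDER_TIME = 1.0 / 58       # GPU needs ~17.24 ms per frame (58 FPS capable)


def simulate_vsync(n_frames):
    """Double-buffered vsync: each frame waits for the next fixed refresh
    tick before it can be shown, and the GPU stalls until the swap."""
    t, shown = 0.0, []
    for _ in range(n_frames):
        t += RENDER_TIME                        # finish rendering the frame
        t = math.ceil(t / TICK - 1e-9) * TICK   # stall until the next tick
        shown.append(t)
    return shown


def simulate_variable_refresh(n_frames):
    """G-Sync/FreeSync-style: the panel refreshes whenever a frame is done."""
    t, shown = 0.0, []
    for _ in range(n_frames):
        t += RENDER_TIME
        shown.append(t)                         # displayed immediately
    return shown


v = simulate_vsync(10)
g = simulate_variable_refresh(10)

# Effective rate = 9 intervals / elapsed time between first and last frame.
print(f"vsync:            {9 / (v[-1] - v[0]):.1f} FPS")
print(f"variable refresh: {9 / (g[-1] - g[0]):.1f} FPS")
```

Because 17.24 ms is just longer than one 16.67 ms tick, every vsynced frame misses its tick and waits for the next one, so each frame occupies two refresh cycles and the effective rate snaps from 58 to 30 FPS, while the variable-refresh path stays at 58 FPS.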
 

ddelrio

Distinguished
Jun 26, 2009
17
5
18,515
They did demo the technology. The demo wasn't very flashy, but that actually makes it even funnier. http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014
 

chumly

Distinguished
Nov 2, 2010
647
0
19,010
It just came out that this only works, the way AMD is doing it, over DisplayPort or embedded DisplayPort (eDP) in laptops, which is why they demoed it on laptops. Which means that if you don't have a monitor with DisplayPort 1.3, this isn't an option for you.
 

Djentleman

Distinguished
Jul 25, 2011
1,045
0
19,410


Exactly.
I think DP 1.3 isn't possible on a desktop monitor just yet, either.
 

And yet they waited until Nvidia brought out G-Sync before mentioning this! :lol:


Let's see it working on a desktop PC first. :whistle:
 

redeemer

Distinguished




No reason to believe it will not work on a desktop; it seems Nvidia prefers to keep this (existing) technology proprietary.

http://community.amd.com/community/amd-blogs/amd-gaming/blog/2014/01/08/doing-the-work-for-everyone
http://www.pcper.com/reviews/Graphics-Cards/AMD-Variable-Refresh-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync
 


I'd have to see it rather than blindly believing it.
 

redeemer

Distinguished




It's funny: when it comes to Nvidia paper launches and hype, you're all over it, no proof needed; when it comes to AMD, why can't you feel the same way? DisplayPort 1.3 will make FreeSync a reality at no extra charge, with no proprietary nonsense. Imagine if AMD had kept GDDR5 to themselves.

Instead of hating on AMD as usual be optimistic.
 

Really? Now that's something I'd like to see proof of, or is this the same as me supposedly saying that I love G-Sync? Something else you failed to provide proof of, I might add.
 

redeemer

Distinguished




I remember you being much more receptive to the idea of G-Sync. AMD already presented a proof of concept at CES. Of course, they showed the use of eDP and its abilities; DP 1.3 is supposed to have the same refresh abilities as eDP, with no extra hardware needed, unlike G-Sync.
 


Please remind me by finding and quoting those posts then because I don't recall making or deleting them.
 

redeemer

Distinguished




So you're not Nvidia-biased, is what you're telling me?
 


Is that your way of dodging my previous request? Wishing to see a side-by-side comparison, or something that could be used in such a comparison, is somehow showing bias now, is it?
 

redeemer

Distinguished




The proof you desire will come out when AMD deems it. You haven't said one bad thing about G-Sync since it was announced, or anything negative about Nvidia since I have been on these boards.

http://www.tomshardware.com/answers/id-1802729/bad-amd-driver-support-issues.html

You quoted "a fan of things that work." Come on, man!

 


Why would I say "bad things" about something that has had no negative press and that I have yet to see for myself? And what is wrong with being a fan of things that work?
 
So it's already been a few days since the windmill demo. When can AMD show us FreeSync running real games? Of course, no monitor that supports FreeSync exists yet, but can't they show FreeSync running on the same laptop they used for the windmill demo?
 
