Nvidia To Support G-Sync Alternative Adaptive-Sync? Nvidia Says No

I have the ASUS ROG Swift PG278Q with G-Sync and am totally loving it. That said, I hope Nvidia eventually embraces FreeSync if it's just as good as G-Sync, because monopolies are not good for us customers at all.
 
Just one more bloody standard for the sake of proprietary marketing gimmicks. Standards that need to be compatible across a broad range of devices should always be open - it's far more sensible to make the technology easy for consumers and OEMs to adopt, since everyone knows the hardware will support it.

My next card will be AMD, I guess. Goddamnit, NVIDIA. And the industry in general. This stupid propensity to force a proprietary route and split consumers angers me greatly.
 
Hmmm... All sources say that the Adaptive-Sync and non-Adaptive-Sync versions of a monitor cost the same to produce... It may be more expensive in the beginning just because "it is new." Adaptive-Sync is just a minor update to the scalers that are already in the monitors. It does not require more complex hardware, which is why it will be cheaper than G-Sync. How good it will be remains to be seen, but G-Sync definitely won't be able to compete with it on price... No extra cost is not the same as extra cost due to extra hardware.
And Nvidia can make a compatible driver for it, if they want to.
 


This was misreported from the beginning and was never true. AOC, Acer, etc. already have G-Sync models out or announced - more will be here soon.

People don't realize the time it takes to make a new monitor from scratch - especially using tech that isn't "in-house," meaning having to wait on Nvidia to tune the module for the panel the OEM wants to use. Then the OEM must go through all the normal steps to bring that panel to market. Takes a bit of time.

 


Not entirely correct. Async is in the scaler IC - but it cannot simply be dropped onto existing base boards. There is a whole redesign required from any OEM that wants an Async-enabled system. That costs R&D time and money - money that must be recouped. Expect a premium for some time after Async-enabled monitors hit shelves. In addition, there is no saying at this point how much those new chips will cost: a few bucks over? Twenty over per chip? We will see later this year.

On a side note: People need to start getting Freesync and Async straight - both terms are getting used interchangeably and it's muddying the discussion.

Async is silicon and will be available (the chip at least) at the end of this year. After that you are looking at a 9-12 month window for an OEM to bring a product to market (assuming everything goes to plan during testing, certification, etc.)

Freesync is vaporware - I mean software. This is the magic unicorn created by AMD to challenge Gsync. To date there has not been one single review of a monitor running Freesync and doing it as well or better than Gsync.

In my opinion, we will see Async before Freesync - and honestly I am doubting if Freesync will ever become reality once Async-enabled monitors hit production late 2015 early 2016.
 
@Ninjawithagun
"...although given that the Adaptive-Sync standard isn't expected to cost all that much..."

NOT! Where did the author get his degree in journalism? Cap'n Crunch University?? Adaptive-Sync is definitely going to cost as much as, if not more than, G-Sync. "Why?" you might ask? Simple. Adaptive-Sync requires very similar technology to what G-Sync uses (and Nvidia knows this!). The scaler in an Adaptive-Sync monitor is not ordinary and requires special hardware that allows full-duplex communication with the graphics card, as well as the ability for the display itself to redraw the frame when directed to do so by the graphics card. Needless to say, AMD is full of it if they think they can keep lying to the tech community about their BS vapor FreeSync. Not buying it, sorry! All out of interest in buying swampland in Florida, too ;-) Give Nvidia credit where credit is due. They invented a dynamic refresh rate technology and it works!! Stop hating, people. Stop thinking like a caveman just because you decided to buy an AMD card and are now pissed off because AMD lied about FreeSync and you are stuck with no dynamic refresh rate technology. Try refocusing your hate on AMD for not only lying to you, but also for not being innovative enough to compete with a REAL dynamic refresh rate technology of their own. NUFF SAID!

OK, really, shut up, you're ignorant. nVidia didn't invent anything. Forms of adaptive refresh rate have been in our hands for years in mobile equipment. Through eDP, which is an extension of the DisplayPort standard/protocol, adaptive sync was already doable. It was actually added as a power-saving feature, and it let the display switch refresh rates seamlessly. The first tech demos of FreeSync used just THAT.
http://en.wikipedia.org/wiki/DisplayPort#eDP

DisplayPort already has bidirectional half-duplex communication. Full bidirectional duplex is useless and a waste of energy and hardware in this case.

nVidia's G-Sync is more expensive because it is add-on hardware bolted onto existing designs. They use an FPGA, not even a custom ASIC, to implement their tech, along with some memory, a board and all the supporting components. That's why it's sold to you either as an upgrade kit for select models or at a hefty premium on other models with it built in. And guess how this smorgasbord drives your panel? Through eDP.

AMD's approach was to develop a standard similar to eDP and submit it for adoption by VESA, which was done. The DisplayPort standard is currently 1.2, but VESA has officially released a new 1.2a with Adaptive-Sync, and 1.3 is coming with it too. Evolving the standard this way means that adaptive sync is simply slipped into the specs manufacturers have to implement when they design their next scaler unit and build it with, say, DP1.3 support. It does require some extra transistors, driving up unit cost a bit. Many manufacturers have announced they are designing new scalers with support for it. What makes FreeSync cost much less than G-Sync is the simple fact that there is NO additional hardware; just support from the scaler unit that drives your display.
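For anyone having trouble picturing the difference, here's a minimal sketch in Python (purely illustrative - not any vendor's scaler firmware or driver code) contrasting a fixed-refresh panel, which holds frames until its next tick, with an adaptive-sync panel that redraws the moment the GPU says a frame is ready:

```python
# Illustrative sketch only - not real scaler firmware or any vendor's code.

def fixed_refresh_display(frame_times_ms, refresh_hz=60):
    """Frames only appear on the next fixed tick; a late frame waits (stutter/lag)."""
    tick = 1000.0 / refresh_hz
    shown_at, now = [], 0.0
    for ft in frame_times_ms:
        now += ft                                    # GPU finishes rendering at 'now'
        shown_at.append(((now // tick) + 1) * tick)  # held until the next refresh
    return shown_at

def adaptive_sync_display(frame_times_ms, max_hz=144):
    """Panel redraws as soon as a frame is ready, just never faster than max_hz.
    (Below the panel's minimum rate it would repeat the last frame on its own;
    that case is left out here for brevity.)"""
    min_interval = 1000.0 / max_hz
    shown_at, now, last_refresh = [], 0.0, 0.0
    for ft in frame_times_ms:
        now += ft
        t = max(now, last_refresh + min_interval)    # show it the moment it's ready
        shown_at.append(t)
        last_refresh = t
    return shown_at

frames = [14.0, 22.0, 18.5, 30.0]        # variable per-frame render times in ms
print(fixed_refresh_display(frames))     # frames wait for the next 16.7 ms tick
print(adaptive_sync_display(frames))     # frames appear as soon as they're done
```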

nVidia chose to quickly milk their fans as usual, and AMD chose to take the long way home, pushing for wider adoption of otherwise already existing tech. The only good thing about G-Sync is that it allows enthusiasts who have already invested significant money in monitor upgrades to upgrade them with this (truly welcome) tech advance.

By the way, OF COURSE nVidia will support FreeSync; they already DO by supporting eDP. If they say they don't, then they've really just locked it down. That makes THEM the ones lying to YOU!

I'll stay with my AMD cards, despite the impressive Maxwell wave.
 


Thanks for clarifying this...

If it's true that implementing the new VESA standard requires new hardware, even if it's not dedicated hardware like G-Sync, they'll at minimum need to tweak the firmware a bit...
I think the royalty fee is what makes the price different between the two kinds of monitor...

Now what I truly hope for is that the new tech arrives at an affordable price...



 
Even after reading all this, comments included, I still feel that Nvidia makes good graphics cards.

Keep in mind, every business is in the business of making money. You can't fault them for making a new technology and trying to get a good return on the R&D costs. That's just silly.

I find their graphics cards have been less hassle than the AMD cards I've used. They're powerful, and they get the job done, with fewer random bugs and generally lower power consumption.

I'm not going to hate a graphics card manufacturer because the bleeding-edge monitors that use their new tech are more expensive.

You guys can do what you want. I'm not going to hate them for this.
 

Just to be fair here, all of the projects listed were either already open-source before Apple had anything to do with them (giving Apple no choice but to make its contributions open-source as well) or are only partially open-source, making the code not really usable or interchangeable. Apple rarely contributes to the open-source community out of the goodness of its heart...

As for nVidia - dick move.
 
@Zanny You can try. Nouveau exists because of reverse engineering the GPU architecture. It's honestly a miracle that it actually works, and works fairly well. So yes, you can try. You aren't going to get very far though, as the Nouveau team has found out.
 
nVidia supporting their own tech instead of getting behind an as-yet-unproven technology makes them dicks?

There *are* required hardware changes for Adaptive Sync to be implemented in new monitors. It is *not* as simple as soldering a DP1.2a port onto the monitor hardware and calling it "Adaptive Sync ready." I'm sure Nvidia will change their stance as monitors that *actually* implement Adaptive Sync scalers and ICs become more commonplace, because Nvidia likes money. But until then, they're going to completely back their own, already-proven market solution, because it makes no sense to release driver support for a standard that isn't even on the market yet and that monitor manufacturers are not required to implement. All the standard says is that DP1.2a *will* support it. It doesn't require or guarantee that the hardware behind the port will.

It's like having a motherboard with no USB 3.0 headers but a case with USB 3.0 ports. Sure, you have the feature, but no hardware driving it, because there's no requirement that you have it. Nvidia is hardly at fault for not supporting something that nobody is compelled to use. At least with G-Sync monitors, if the monitor manufacturer buys into it, it's there. There's nothing requiring monitor manufacturers to support Adaptive Sync.
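To put the same point another way, here's a rough Python sketch (the Monitor class and its field names are invented for illustration; no real driver exposes exactly this): having a DP1.2a connector tells a driver nothing by itself, and the monitor's scaler also has to advertise a usable variable refresh range before anything can be enabled.

```python
# Hypothetical sketch - the Monitor class and field names are made up for
# illustration; this is not Nvidia's or AMD's actual driver logic.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Monitor:
    dp_version: str                # e.g. "1.2a" - what the connector supports
    vrr_min_hz: Optional[int]      # variable refresh range the scaler reports,
    vrr_max_hz: Optional[int]      # or None if the scaler has no such support

def can_enable_adaptive_sync(mon: Monitor) -> bool:
    # The port alone is not enough: DP1.2a merely *allows* the feature,
    # like a case with USB 3.0 ports plugged into a board with no header.
    if mon.dp_version not in ("1.2a", "1.3"):       # crude version check
        return False
    # The scaler behind the port must actually report a usable refresh range.
    return (mon.vrr_min_hz is not None
            and mon.vrr_max_hz is not None
            and mon.vrr_min_hz < mon.vrr_max_hz)

print(can_enable_adaptive_sync(Monitor("1.2a", None, None)))  # False: port only
print(can_enable_adaptive_sync(Monitor("1.2a", 40, 144)))     # True: real range
```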
 
So... which is it? The title says no, the first paragraph says yes, and the rest of the article argues for maybe.

In any case, Nvidia would look childish to not support it at this point. I really don't care how it's accomplished, but my next monitor WILL have some sort of adaptive refresh technology, and I'm not paying over $500 for it, and it can't be TN-based. My $130 tablet is IPS, my current monitor is IPS, my smartphone is AMOLED. I'm NOT going back to TN. I don't need 144Hz because I don't run my games anywhere near that framerate. Give me a 24" IPS 1200p or 1080p with adaptive refresh from 24-96Hz (without neglecting contrast, even backlighting, etc...) for under $400, and I'll be a happy camper. I don't see that happening with G-Sync (cost, TN needed for higher refresh rates, etc), but I'd love you to prove me wrong, Nvidia.
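Just to show the arithmetic behind that 24-96 Hz wish (a quick back-of-the-envelope Python sketch; the window is my hypothetical, not taken from any spec):

```python
# Back-of-the-envelope check of a hypothetical 24-96 Hz adaptive refresh window:
# any framerate inside the window can be shown the instant each frame is ready.

def frame_time_window_ms(min_hz, max_hz):
    return 1000.0 / max_hz, 1000.0 / min_hz      # (fastest, slowest) frame time

fastest, slowest = frame_time_window_ms(24, 96)
print(f"covered frame times: {fastest:.1f} ms to {slowest:.1f} ms")  # ~10.4-41.7 ms

for fps in (30, 45, 60, 90, 120):
    inside = 24 <= fps <= 96
    print(f"{fps} fps ({1000.0 / fps:.1f} ms/frame): "
          f"{'inside' if inside else 'outside'} the 24-96 Hz window")
```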
 


Oh really? What source would that be? You mean the "make it up because it sounds good in this forum" source? Just the facts please!
 