Report: Nvidia G-Sync Exclusive to Asus Until Q3 2014

Status
Not open for further replies.


But I'm sure they can license it, and AMD can then make compatible hardware (it may take another generation of cards, but it can be done, no different from AMD64 or ARM IP - you can take theirs or ROLL YOUR OWN like Apple, Qualcomm, NV's Denver/Boulder, etc.). If they don't license it, it could be years before AMD gets anything like it, and it may cost monitor makers (mobile etc.) more to support two totally different techs. Even with a license, though, AMD would need a hardware revision to support it, so it's at least a year away while NV builds momentum. This will cost AMD market share unless they can come up with a card so much faster than NV's (no, not the 290X - it doesn't cut it, and the party's over next week with the 780 Ti) that G-Sync isn't worth it for some.

Clearly NV has been working on this since the 600-series designs, or it wouldn't already work on the 650 Ti and up - and going that low, it's not about performance, it's about tech inside the chip. I'm thinking this means AMD is years away if no license happens. It took FCAT to make them realize they had a problem. NV seemed to already know, worked it out as best they could (they created FCAT), found out drivers have limits, and moved to a $100 card to resolve it. See how far out that puts AMD?

Until the market gets more lopsided, I believe PhysX and Mantle will only be supported when NV/AMD subsidizes it for headlines. If one side gets 90% of the market (which G-Sync could cause if AMD gets blocked and can't come up with an answer for years - NV doesn't have to give it out), then you'd see people writing for stuff like PhysX (or whatever the owner of that 90% pushes). But yes, I'm sure G-Sync cost NV some R&D. It currently only works with ONE monitor and requires a card to do it. More R&D will be needed for something that works on all the others (they're surely trying to avoid 100 different cards for different models), and I don't think AMD has the funding to pull this off for a while. I only hope NV charges reasonable fees - sure, they need to recoup their money, but I hope they don't completely gouge AMD, since they get to license to all the other screens/products that want it.
 
Well, I'm not sure I can believe this, as during the Montreal event Nvidia announced quite a few partners. Asus was merely the first one to announce a product ready for purchase in the first quarter of next year.
 


What do you mean, "ultimately a limited number of games"? Devs don't have to write specifically for this, though they eventually can, to take advantage of the freedom from worrying about fps and in turn use any extra power your GPU has.
Watch the vid of the devs talking about it (it's only 2 mins):
http://blogs.nvidia.com/blog/2013/10/18/montreal/
Carmack, Andersson, and Sweeney all say the same thing: it takes what is already there and makes it better - buttery smooth. They're not recoding games to work here.
https://www.youtube.com/watch?list=UUHuiy8bXnmK5nisYHUd1J5g&v=BZS8Bbyf1to&feature=player_embedded
You don't have to change your games. There's a direct link to the vid.

http://www.geforce.com/hardware/technology/g-sync/faq
"Q: Does NVIDIA G-SYNC work for all games?
A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver."

I'm still waiting for a list of PROBLEM games, and whether the issues are permanent or temporary and fixable via driver updates, but generally speaking all games should work without changes (barring the occasional old game with issues, I guess). Not sure what's to blame in those cases - the drivers, or something in the game itself.

For anyone interested in a lot more gsync info and quotes from devs etc (including stuff not discussed on stage, backlight strobe mode etc etc):
http://www.blurbusters.com/
"John Carmack (@ID_AA_Carmack) tweeted:
“@GuerillaDawg the didn’t talk about it, but this includes an improved lightboost driver, but it is currently a choice — gsync or flashed.”
 
It would be nice if you could add the hardware yourself.

For example, I bought a VG248QE 144 Hz 24" monitor, so if I could open the back of the monitor and somehow add the hardware, that would be cool.
 
@Steveymoo, for performance reasons, if you have v-sync on you probably want triple buffering, meaning you are always at least 25 ms behind. Assuming you want v-sync to avoid tearing, switching to G-Sync solves the tearing problem, stops the judder on unsynced frames, and reduces your input lag by about a third... What's not to like?
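The latency argument above can be sketched with back-of-the-envelope numbers. This is a deliberately simplified model, not a measurement: it assumes each queued buffer adds one full refresh interval of delay and ignores render time and panel response, so the exact figures (like the "25 ms" and "33%" quoted) will vary with the real pipeline.

```python
# Toy latency model: v-sync with triple buffering vs. a variable-refresh
# (G-Sync-style) display. Latency here means time from frame completion
# to the start of scanout; assumptions are illustrative only.

def refresh_interval_ms(hz):
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / hz

def vsync_triple_buffer_latency_ms(hz, queued_frames=2):
    # Triple buffering can hold up to two finished frames ahead of the
    # one being scanned out, so a new frame may wait behind both.
    return queued_frames * refresh_interval_ms(hz)

def variable_refresh_latency_ms(hz):
    # With variable refresh the panel starts drawing as soon as the GPU
    # finishes a frame, so queueing delay largely disappears.
    return refresh_interval_ms(hz)

if __name__ == "__main__":
    for hz in (60, 144):
        vs = vsync_triple_buffer_latency_ms(hz)
        vr = variable_refresh_latency_ms(hz)
        print(f"{hz} Hz: triple-buffered v-sync ~{vs:.1f} ms, "
              f"variable refresh ~{vr:.1f} ms")
```

At 60 Hz this toy model gives roughly 33 ms of queueing delay under triple-buffered v-sync versus about 17 ms with variable refresh, which is in the same ballpark as the poster's "at least 25 ms behind" claim.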
It's proprietary to both the GPU and the monitor...
 


But it solves the problem. You might as well get used to proprietary stuff. If I paid to solve a problem everyone hates (insert company name), I wouldn't be giving it to the enemy for free... NEVER... LOL. Or at least not until I couldn't figure out how to milk it alone... ROFL. There, corrected myself.
 


"Q: When will I be able to purchase this?

A: The NVIDIA G-SYNC Do-it-yourself kits for the ASUS VG248QE monitor will be available for purchase later this year. We will have more information to come on how and when to get G-SYNC enabled monitors in the future."

"Q: Can I install G-SYNC modules for my current monitor?

A: For gaming enthusiasts, NVIDIA has made available a do-it-yourself monitor modification kit for an ASUS VG248QE monitor. The mod takes about 20 minutes. More details of the kit will be posted. "

Just wait a bit, you can DIY soon.
I suppose you can watch here:
http://www.geforce.com/hardware/technology/g-sync/faq
Or their blog.
http://blogs.nvidia.com

I'm sure tech sites will cover this the second it gets released anyway, but just in case... :) I don't think you open the monitor, but I could be wrong. Sounds like a dongle-type deal, but maybe not. It can't be that tough, or it would be a customer-support nightmare for anyone considered a NEWB :)

 
But solves the problem.
That's yet to be seen.

There are many open standards developed by companies. Unfortunately, both Nvidia and AMD resort to proprietary gimmicks.

Realistically when things are proprietary, the consumer gets screwed.
 


Yet to be seen by you. Everyone who has seen it says it's awesome - Sweeney, Andersson, Carmack, Rein, and every tech site out there. They all claim it's a total game changer. It is NOT a gimmick in this case. It works, and it doesn't require devs to go back and recode everything, or even do extra work to get it into their future games. It's a monitor/driver thing, not a "devs please write more code" thing (like Mantle). Many great things start proprietary but become licensed, which is what I hope happens here so we get it everywhere (mobile, built into TVs, etc.), so gaming is smooth everywhere.
 
Nobody cares about Carmack's opinion anymore. He has completely gone off the deep end. This is the same guy who said Unreal Engine 4 or whatever could only be fully utilized by a GTX 680 or the Xbox 360 or PS3 consoles. Give me an effing break. I can respect a guy like Gabe Newell sticking up for what he believes in and putting his checkbook to work to make it happen - for example, porting all the popular Valve games to OpenGL. I can't respect Carmack for selling his name to the highest bidder.

The point is, as AMD continues to gain market share by providing better price/performance gaming GPUs, G-Sync is not going to become a game changer, because budget-savvy users aren't going to spend more money upgrading their monitors and accept less performance from their graphics card, or spend more money for the same performance. It's an ADD-ON like Eyefinity or PhysX (and that's assuming it does make games appear smoother, which I'm still skeptical about) - not a defining feature.

As far as I'm concerned, right now it's a marketing ploy. These developers need to be concerned with pushing the envelope and forcing hardware vendors to advance monitor and GPU technologies, not squeezing every last ms of frame time out of 1080p gameplay. We're over it - it's old technology, and it's time to move on to something better.
 

Chances are not all ASUS monitors will include it.

Off topic: I am using an ASUS monitor now and it has a very sharp picture - better than most others I have used.
 
This is just another PhysX-style proprietary offering from Nvidia that will ultimately fail when monitor makers realize that people are not going to rush out and buy them, as Hapkido has stated! Money is the bottom line for these companies, and if it doesn't make enough, it will be scrapped. Only a few hardcore Nvidia fans will waste their money on a closed tech... Don't get me wrong - if they actually licensed it out, that could be a good thing, but it's Nvidia; they want to squeeze the competition out and rip us all off if they can. As for Mantle, if anyone actually paid attention, it is an open standard, meaning that Nvidia could use it if they so chose.
 
Haha, it's insane to me how many are just poo-pooing this. You clearly don't understand the technology or what it means.
 
Hmm. I have to wonder if the people who are poo-pooing this even understand the technology. If you don't think this is a major game changer, you simply don't understand the core concept of lag vs screen tearing, or how GPUs and monitors work.

(sorry for double post, first one wasn't showing for some reason)
 
The Asus Z87 / Intel i7-4770 / GeForce GTX 770 SC 4GB rig I just built was based around the upcoming G-Sync. Within the next couple of weeks I'll be buying three Asus VG248QE monitors to replace my existing three LCD monitors. Then, once Nvidia releases the G-Sync adapters for VG248QE owners to install on their own, I'll get those too. The new G-Sync-ready Asus VG248QE monitors won't be available for another 3-6 months. As for the rumor that Asus is the only company getting the new technology: that is false. Asus only happens to be first because the VG248QE was the first monitor G-Sync was installed on, and kits already exist. My understanding is that within 10 months, at least 5 different companies will have G-Sync-ready VRR (variable refresh rate) monitors. For now, if you intend to be among the first adopters of this new technology, you must outfit your rig with the latest Nvidia video card(s), and the only monitor that can be used is the Asus VG248QE 144 Hz 24" LED/LCD - but you have to install the card yourself, or take the monitor to a qualified person to have the G-Sync card installed. After Q1 next year, the card will come pre-installed in that monitor. Visit Nvidia and follow G-Sync developments and release dates for do-it-yourselfers. I know I will.
 


You are correct. I'd add that once it's inside the monitor out of the box, costs will come down, because you will no longer be replacing existing hardware with an external card. The card replaces chips that currently sit inside the monitor; a monitor company won't need to buy the chips the card replaces, so the premium should drop to $50-75 instead of the current $100 for the card. This should improve further over time, as the R&D to make it work with each monitor should go faster now that they know what they are doing - unlike when they had to develop everything from a blank slate with Asus. Back then, all they knew was which problems had to be solved; now they know that, and how they fixed them. Each new model will be easier than the Asus was.
 


Umm, something wrong with you if you like crap graphics :)

Agreed on proprietary, but still recognize that it sometimes makes others up their game toward a standard. Glide etc. made MS put out far better versions of DirectX (still proprietary, I guess... LOL, but it got everyone on the same thing, at least back then). If someone doesn't pave the way, we never get anything. RIM had enterprise email all to themselves; then Apple came along (and a few others) with Exchange for mobile, and RIM and its wallet-drenching pricing got screwed 😉 RIM's great stuff caused the world to catch up (which killed RIM... LOL). See how that works? Qualcomm has been doing the same with modems, but we're now seeing the world catch up, and just this Wednesday their margins went to crap, from 60+ down to 54%. So roughly a 10% drop due to competition getting cheap modems into Asia. Watch them drop further as this spreads everywhere, with everyone supporting multimode (well, all modes for all countries and types).

So it's not really that bad to me. Without people forking over dough for R&D on new stuff, we never get the world eventually making it a commodity item. Note that NV licensed PhysX; it's in basically all the engines and all the consoles, though that doesn't seem to have made it much more popular - it's only in ~60 games (being included in Unreal Engine etc. doesn't mean it gets used, just like Mantle). I predict the same or worse for Mantle, as it only works on a VERY small subset of cards from the smaller company by market share (AMD only has ~35% of discrete, and only a portion of those will be R7/R9 for years). AMD will pay every time it gets used, and devs can't charge more for optimizing for this little niche.
 

Oh well. There are a lot of things wrong with me, but that seems to be the case with a lot of people on the internet.

Aliasing shimmer is much more annoying to me. Perhaps the fact that I don't play FPSes has something to do with it.

Agreed on proprietary, but still recognize that it sometimes makes others up their game for a standard. [...]
Agreed to some extent. But it would be better if there was a standard created by monitor manufacturers. That would pretty much force it to work with both brands.

Perhaps even a new DVI standard.
 