Report: Nvidia G-Sync Exclusive to Asus Until Q3 2014

Actually, it's the other way around. As with many technologies such as PhysX, Mantle, etc., many customers have been reluctant to adopt these "exclusive" technologies. Asus is the only one to get on board with G-Sync; the other competitors in the monitor market want to wait and see if this technology really makes a significant change to the gaming experience. The market is tight, and no company would want to raise the price of its monitors for G-Sync unless it delivers as promised. My guess: it will fade away, just like 3D monitors, PhysX, etc.
 
Personally I don't have a problem with this as I prefer my Asus screens. On the other hand, if you want a new technology to be adopted into the tech world, you don't want to make it exclusive...
 
Great, if people dig this idea, there's a bigger likelihood that the companies left behind (everyone besides Nvidia and Asus) might develop an alternative open standard.

Death to proprietary.
 


You're talking about Nvidia.
 
There is an incorrect statement in this article:

"G-Sync is a technology that fixes screen tearing in Kepler-based games."

The author probably meant to say "Kepler-based video cards". There is no such thing as a Kepler-based game. Kepler is the architecture of Nvidia's current line of video cards and has nothing to do with the games themselves. This isn't something that will have to be programmed into each game; it will work with all games, as it will be built into the video card itself.

As it is, the article makes it seem like only certain games will support this, but in reality the games won't have to support it and it will regulate frame rate regardless of what you're doing.

G-Sync will work with all games on Kepler video cards (GTX 600 and 700 series), and probably future cards as well.
 
@Steveymoo, for performance reasons, if you have v-sync on you probably want triple buffering, meaning you are always at least 25 ms behind. Assuming you want v-sync to avoid tearing, switching to G-Sync solves the tearing problem, stops the judder on unsynced frames, and reduces your input lag by 33%... What's not to like?
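To put rough numbers on that (these are my own back-of-the-envelope assumptions, not Nvidia's figures), here is roughly what the queueing looks like at 60 Hz:

```python
# Back-of-the-envelope only; the real numbers depend on the game, driver
# queue depth, and refresh rate (all assumptions here are mine).
refresh_hz = 60
frame_ms = 1000 / refresh_hz                     # ~16.7 ms per refresh at 60 Hz

# Triple-buffered v-sync: a finished frame can sit behind one or two queued
# frames before the panel scans it out.
vsync_lag_ms = [n * frame_ms for n in (1, 2)]    # ~17-33 ms of added queueing

# With a variable refresh rate the panel refreshes when the frame is ready,
# so roughly one refresh interval of waiting disappears.
gsync_lag_ms = [max(0.0, lag - frame_ms) for lag in vsync_lag_ms]

print(f"v-sync queueing: {vsync_lag_ms[0]:.0f}-{vsync_lag_ms[1]:.0f} ms")
print(f"g-sync queueing: {gsync_lag_ms[0]:.0f}-{gsync_lag_ms[1]:.0f} ms")
```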
 
Changing my GPU every 2 years, OK. I can use Mantle and TrueAudio, or PhysX, if developers adopt them.
Changing my 3 ASUS screens that I just bought, which are not compatible with G-Sync: not the same goddamn ballpark. People stick with their screens. This is going to be a hard adoption to force on the market, no matter how cool the technology is.
 
I agree with the last part of the article and most of the first comment. Besides being tired of tech that ties you to specific hardware (be it Nvidia or AMD), I'm still doubtful about the price they will charge for G-Sync monitors.
P.S. Dedicated PhysX hardware was an unsuccessful experiment, but the acquisition by Nvidia meant both lots of money for Ageia and widespread adoption.
 


With a card able to hold 120 fps you will not get page tearing, but as soon as it drops, you will fall all the way to 60 fps without triple buffering.

The idea with G-Sync is that your monitor can adjust its refresh to match the video card (even as the frame rate changes) instead of the card trying to match the monitor (something that does not always go over well). This allows the timing to match ALL the time, even if you are at an oddball frame rate like 50 or 90 (since even with a 60 fps frame cap, a 60 Hz screen can still tear, because the card and screen do not refresh the image at the same time).
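A toy way to see the difference (the frame times below are made up, purely to illustrate the point): a steady 50 fps game on a fixed 60 Hz panel lands on uneven 16.7/33.3 ms refresh boundaries, while a variable-refresh panel shows it with even 20 ms gaps.

```python
import math

# Toy illustration only: a steady 50 fps game on a fixed 60 Hz panel vs a
# variable-refresh panel. Frame times are invented for the example.
REFRESH_MS = 1000 / 60                  # fixed panel refreshes every ~16.7 ms
render_ms = [20] * 6                    # frames finish every 20 ms (50 fps)

t = 0.0
fixed, variable = [], []
for r in render_ms:
    t += r                                             # frame is ready at time t
    next_tick = math.ceil(t / REFRESH_MS) * REFRESH_MS
    fixed.append(next_tick)                            # v-sync waits for the tick
    variable.append(t)                                 # g-sync scans out immediately

print("fixed 60 Hz:", [round(x, 1) for x in fixed])     # uneven on-screen gaps
print("variable   :", [round(x, 1) for x in variable])  # even 20 ms gaps
```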

I think this is GREAT, but we need an AMD and Nvidia solution.
 
As nifty an idea as this may be, it is fraught with potential pitfalls. They may get away with exclusivity for a brief time because this is a technology that will be adopted slowly anyway. Restricting use to one manufacturer helps identify intrinsic flaws, rather than troubleshooting a bunch of different flavors. But it's a bit of a tall hill to climb asking people to pair their GPU with specific monitors. The opportunity for mass confusion among consumers is fairly high. Furthermore, the market for this is already fairly niche (many may like the end results, but how many will be willing to accept the hassle and expense of achieving those ends for what is ultimately a limited number of games). Nvidia will have a tough time making a robust profit from this for a while. Their potential profit may come in licensing this out to AMD, Intel, and monitor manufacturers, so that this benefit becomes more universal and a simpler choice for consumers.
 


What does input lag have to do with refresh rate?
 
I already like ASUS monitors, so it wouldn't bother me much. However, it should be expanded to a few others.

For the people saying that V-Sync will fix their problems, you need to read up on what G-Sync does before posting.
 
G-Sync is so much more than a "nifty" idea for FPS gamers who get the visual tearing and stuttering but currently just have to suck it up. It's the sort of technology that is so beneficial that it *needs* to be rapidly adopted industry wide. A v-sync timed to the graphics card's output... I mean, it makes you wonder why it wasn't a ubiquitous component in monitor manufacturing twenty years ago.
 
To everyone saying Nvidia's proprietary software should be open: it can't be. Their cards are designed specifically for their software. Hardware/software harmony. PhysX might be an exception, but ShadowPlay, G-Sync, TXAA, HBAO+, game streaming (Shield, Grid and TV), etc. use specific Nvidia architectures to support them. AMD cards don't have the required hardware on board to use these things. Just like how Nvidia can't use AMD's audio software, because Kepler cards don't have the processor.

One other thing to consider is how much more money Nvidia spends on its software department than AMD does. AMD would quit researching anything because they'd rely on Nvidia's advances and make none of their own.

P.S. PhysX isn't "dying", but it's still limited. More games are adopting it as time goes on.
 


Tearing and stuttering are two completely separate problems at opposite ends of the spectrum.

G-Sync won't affect screen tearing any more than v-sync already does, with the possible exception that it also reduces input lag. In that sense, it's a less terrible v-sync, but using the game's engine or third-party software to limit the framerate is already a good solution for screen tearing, and it doesn't require proprietary technology and a new monitor.
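For what it's worth, a frame cap is nothing magic. Here's a rough sketch of what an in-engine or third-party limiter does (the names and numbers are mine, not any particular tool's):

```python
import time

# Rough sketch of a frame-rate cap; real limiters (in-engine or external
# tools) are more careful about timing, but the idea is the same.
TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def run_capped(render_one_frame, frames=300):
    """Render frames, sleeping away whatever is left of each frame's budget."""
    for _ in range(frames):
        start = time.perf_counter()
        render_one_frame()                          # the game's actual work
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)      # don't outrun the display
```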

And let's be clear, people without really nice setups aren't getting any screen tearing in cutting-edge games at high resolution -- screen tearing comes from getting TOO MANY fps. It's a problem when you play something like Portal 2 with good hardware at 1080p, not when you play Battlefield at 1600p with midrange hardware.

Stuttering is most commonly noticed with multi-GPU setups. It's because the frames aren't delivered at consistent intervals, and humans are really good at noticing changes like this. A solution to stuttering is lowering your settings to raise fps above your monitor's refresh rate. "Uh, I get a lot of microstutter in BF3 at ultra settings with my multi-GPU setup and my minimum frame rate is 30 fps." Lower your settings to raise your fps, dummy -- problem solved.
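To show why average fps alone hides this, here's a quick comparison (the frame times are invented for the example):

```python
from statistics import mean, pstdev

# Both runs below average ~60 fps, but the frame-to-frame pacing (what you
# actually perceive) is very different. Numbers are made up for illustration.
smooth  = [16.7] * 8
stutter = [10, 23, 11, 22, 10, 24, 11, 22.6]      # same average, uneven pacing

for name, times in (("smooth ", smooth), ("stutter", stutter)):
    print(f"{name}: ~{1000 / mean(times):.0f} fps average, "
          f"frame-time spread {pstdev(times):.1f} ms")
```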

G-Sync is one of those things I'll have to see to believe. I really doubt a dynamic monitor refresh rate will keep our eyes and brains from detecting drastic frame time differences. Furthermore, I'm not going to pay a premium on the video card AND a new monitor for G-Sync... I'd be better off putting that money towards a better GPU.
 


I think you have it backwards :)

Anyhow, I can believe this, as they may have worked on the tech with Asus specifically for a few years. That module may be made for their monitors specifically, and it may take time to get it prepped for others. Under a year for development on the rest, plus troubleshooting, bug fixes, etc., seems reasonable. Or ASUS paid them. Again, not NV's fault; they are a business, and if they don't think anyone else (AMD) can do it in a year and someone offers them millions for exclusivity for 10 months or something, they'll take the cash. It gives them time to get everyone else on board with QUALITY products, and ensures cash, while ensuring the first tested units are all perfectly working and won't receive some crap review due to malfunctions.

You want to put your best foot forward when putting out a brand new tech that is a huge game changer (by all accounts; no review site said it was NOT good, and all said they hope it becomes ubiquitous). You don't want half-a$$ed stuff getting reviewed. Like I said, it may have taken 2 years of working with Asus to figure out all the bugs. They had to make a friggen $100 card for this feature, so it's not some simple GPU fix. I'm sure they are capable of doing it faster on other systems now that they know what they're doing, but it may take a year to integrate it for everyone, or they don't want to make 100 different cards for every model out there, etc. Who knows why, but there could easily be some VERY good reasoning behind the rest taking some time to get it integrated.

Also, AMD doesn't get it then either, right? Mantle doesn't work with anything but a VERY small set of AMD cards, with no backward compatibility even for their own stuff... So if NV doesn't get it, AMD gets it even less. NV owns 65% of the market; Mantle works with a small slice of the other 35%.

From their page:
G-SYNC features require an NVIDIA GeForce GTX650Ti BOOST GPU or higher
GTX TITAN
GTX 780
GTX 770
GTX 760
GTX 690
GTX 680
GTX 670
GTX 660 Ti
GTX 660
GTX 650 Ti Boost
Display:

G-SYNC DIY modification kit requires an ASUS VG248QE monitor.

Just from that you can guess I'm right. It doesn't work with anything right now but one monitor, which is what I think they built and tested this tech with. Makes total sense to me. The rest will take time to test and I think they want to get rid of the external crap and integrate the module before allowing others on board just for customer simplicity.
 
So while Mantle/TrueAudio work with NOTHING old, at least with G-Sync a lot of people just need a monitor (as you can see from the list above, a 650 Ti Boost or better works fine), and a lot of us are planning 1440p monitor purchases in the future anyway, so for a lot of people this works fine. By the time I want it, I should have dozens to choose from next Christmas.
 


http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Death-Refresh-Rate/Potential-Benefits-New-Interface
"We can also eliminate the horizontal tearing of games that occurs when the refresh rate of the monitor does not match the frame rate produced by the graphics card. By only sending complete frames to the monitor and having the panel refresh at that time, you could maximize frame rate without distracting visual anomalies. If you are able to run your game at 40 FPS then your panel will display 40 FPS. If you can run the game at 160 FPS then you can display 160 FPS."

HE bolded it in the article, not me. :)
"G-Sync changes that by allowing a monitor to display refresh rates other than 60 Hz, 120 Hz or 144 Hz, etc. without the horizontal tearing normally associated with turning off Vsync. "

He seems pretty clear here, TWICE, and he discussed this crap at length with NV and devs.

"Using a monitor with a variable refresh rates allows the game to display 55 FPS on the panel at 55 Hz without any horizontal tearing. It can also display 133 FPS at 133 Hz without tearing. Anything below the 144 Hz maximum refresh rate of this monitor will be running at full speed without the tearing associated with the lack of vertical sync."

Again, VERY CLEAR. WITHOUT ANY TEARING. Everyone who has seen it says THEY believe, and claims it has to be seen!... LOL.

Your responses are "AMD is OK, it's OK to look like crap, lower your settings"... blah blah. The point is we don't want to say it's ever OK to look like crap, or to change settings, or to put up with ANY junk in our image. This solves it all at once. It's better tech, get over it. It also allows devs to use the extra power to AMP up graphics when your GPU can pump out far more frames than needed. All the devs on stage loved this idea, as it frees them to do whatever they want on the fly.
"It is one thing to know that NVIDIA and the media are impressed by a technology, but when you get the top three game developers on stage at once to express their interest, that sells a lot. John Carmack, Johan Andersson and Tim Sweeney stood up with NVIDIA CEO Jen-Hsun Huang all raving about the benefits that G-Sync will bring to PC gaming. Mark Rein was standing next to me during a demonstration and was clearly excited about the potential for developers to increase visual quality without worrying about hitting a 60 FPS cap 100% of the time."

You do what you want. I only have a G-Sync monitor in my future; nothing else is acceptable 😉 You keep jacking your settings around; I prefer allowing NV to do it on the fly and FIX it for me for good, while giving devs freedom to do what they want with my extra GPU power. I didn't see anyone on stage bragging about Mantle :) They all pretty much dogged it, here and elsewhere and while on stage. Let's be clear: you apparently will put up with things the majority of us would like to be rid of :) Even if I'd just bought a card that didn't support G-Sync, unless my monitor just died, I'd wait for a G-Sync monitor for my next purchase (hoping AMD licenses it, or comes up with something compatible, or I'd just go NV by default for the next card).

In a stock fight (money and stocks, I mean), I'd bet on the guy who has the tech everyone WANTS, not the one with the tech nobody really NEEDS (die shrinks will get more perf for years to come, all the way to 5 nm or so, with no extra DEV work on games). Mantle doesn't change the world; it just speeds up a few select cards, and I'd assume every card they make from the next version on, though they've left it off a lot of cards already this time... WHY? Whatever, it's a failed idea, as it costs devs more programming for no extra return in money (you can't charge more for Mantle games).

G-Sync changes the world, and in ways we all want to see happen, including devs, and it makes their coding job easier (freedom from things like 60 fps caps on consoles, etc.). Die shrinks, better perf, drivers, etc. don't fix what G-Sync fixes. It's a hardware solution, or you deal with the problem forever. This is basically NV admitting it can't be done in drivers alone. Good luck to AMD funding research to resolve it their own way; I hope they just license it (and hopefully NV is open to that for mobile and everything else). Considering it's only working with ONE monitor currently, it's clear it took NV some work to get this done (R&D: how long did it take working with ASUS on ONE monitor, and how fast can they roll it out to others?). How fast could AMD do this alone, now that they either have to match it or license it? I vote license.
 