hapkido :
Fierce Guppy :
G-Sync is so much more than a "nifty" idea for FPS gamers who get the visual tearing and stuttering but currently just have to suck it up. It's the sort of technology that is so beneficial that it *needs* to be rapidly adopted industry-wide. A v-sync timed to the graphics card's output... I mean, it makes you wonder why it wasn't a ubiquitous component in monitor manufacturing twenty years ago.
Tearing and stuttering are two completely separate problems, at opposite ends of the spectrum from each other.
Gsync won't affect screen tearing any more than vsync already does, with the possible exception that it also reduces input lag. In that sense, it's a less terrible vsync, but using the game's engine or third-party software to limit framerate is already a good solution for screen tearing and doesn't require proprietary technology and a new monitor.
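Roughly, a limiter is just this (a minimal sketch; the `render_frame` callback and the 60 FPS target are placeholders I made up, and real limiters busy-wait instead of sleeping for precision):

```python
import time

TARGET_FPS = 60                  # cap at (or just under) the panel's refresh rate
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allotted to each frame

def run_capped(render_frame, frames=600):
    """Render frames, sleeping off whatever is left of each frame's budget."""
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                                # the game's actual work
        spare = FRAME_BUDGET - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)  # real limiters busy-wait here; sleep() is coarse

run_capped(lambda: None, frames=60)  # dummy workload: loop takes ~1 second
```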
And let's be clear, people without really nice setups aren't getting any screen tearing in cutting-edge games at high resolution -- screen tearing comes from getting TOO MANY fps. It's a problem when you play something like Portal 2 with good hardware at 1080p, not when you play Battlefield at 1600p with midrange hardware.
Stuttering is most commonly noticed with multi-GPU setups. It happens because frames aren't delivered at consistent intervals, and humans are really good at noticing changes like that. A solution to stuttering is lowering your settings to raise fps above your monitor's refresh rate. "Uh, I get a lot of microstutter in BF3 at ultra settings with my multi-GPU setup and my minimum frame rate is 30 fps." Lower your settings to raise your fps, dummy -- problem solved.
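To see why inconsistent frame delivery is so noticeable even at a good average, here's a toy sketch with made-up frame times: both runs average roughly 60 FPS, but the second alternates fast and slow frames the way badly paced multi-GPU rendering can:

```python
import statistics

def stutter_report(frame_times_ms):
    """Summarize frame pacing: average FPS can look fine while
    frame-to-frame deltas are all over the place."""
    avg = statistics.mean(frame_times_ms)
    return {
        "avg_fps": round(1000.0 / avg, 1),
        "worst_frame_ms": max(frame_times_ms),
        "jitter_ms": round(statistics.stdev(frame_times_ms), 1),  # high = microstutter
    }

even   = [16.7] * 10       # smooth ~60 FPS
uneven = [8.0, 25.4] * 5   # same ~60 FPS average, alternating fast/slow frames
print(stutter_report(even))
print(stutter_report(uneven))
```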
Gsync is one of those things I'll have to see to believe. I really doubt a dynamic monitor refresh rate will keep our eyes and brains from detecting drastic frame time differences. Furthermore, I'm not going to pay a premium on the video card AND a new monitor for G-Sync... I'd be better off putting that money towards a better GPU.
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-Death-Refresh-Rate/Potential-Benefits-New-Interface
"We can also
eliminate the horizontal tearing of games that occurs when the refresh rate of the monitor does not match the frame rate produced by the graphics card. By only sending complete frames to the monitor and having the panel refresh at that time, you could maximize frame rate without distracting visual anomalies. If you are able to run your game at 40 FPS then your panel will display 40 FPS. If you can run the game at 160 FPS then you can display 160 FPS."
HE bolded it in the article, not me.
:)
"G-Sync changes that by allowing a monitor to display refresh rates other than 60 Hz, 120 Hz or 144 Hz, etc. without the horizontal tearing normally associated with turning off Vsync. "
He seems pretty clear here TWICE, and he discussed this crap at length with NV and devs.
"Using a monitor with a variable refresh rates allows the game to display 55 FPS on the panel at 55 Hz without any horizontal tearing. It can also display 133 FPS at 133 Hz without tearing. Anything below the 144 Hz maximum refresh rate of this monitor will be running at full speed without the tearing associated with the lack of vertical sync."
Again VERY CLEAR.
WITHOUT ANY TEARING. Everyone who has seen it says THEY believe it and claims it has to be seen!...LOL.
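If you want it concrete, here's a toy model of what he's describing (my own simplified sketch, not anything from the article or NV: vsync off, one buffer swap per frame, and a "tear" counted whenever a swap doesn't line up with a refresh tick):

```python
def count_tears(frame_ms, refresh_hz=None, duration_ms=1000.0):
    """Count buffer swaps that land mid-scanout (each one is a visible tear).
    refresh_hz=None models a variable-refresh (G-Sync-style) panel that
    starts a refresh only when a complete frame arrives."""
    if refresh_hz is None:
        return 0  # panel waits for the frame: no mid-scanout swaps, ever
    refresh_ms = 1000.0 / refresh_hz
    tears, t = 0, frame_ms
    while t < duration_ms:
        if t % refresh_ms > 1e-6:  # swap didn't align with a refresh tick
            tears += 1
        t += frame_ms
    return tears

# The article's example: 55 FPS on a 144 Hz fixed-refresh panel with vsync
# off, versus the same 55 FPS on a variable-refresh panel.
print(count_tears(1000 / 55, refresh_hz=144))  # tears nearly every frame
print(count_tears(1000 / 55))                  # 0
```

The point of the toy: at 55 FPS on a fixed-refresh panel almost every swap lands mid-scanout, while a panel that refreshes on frame delivery can't tear that way by construction.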
Your responses are: AMD is OK, it's OK to look like crap, lower your settings...blah blah... The point is we don't want to say it's OK at any time to look like crap, or to change settings, or to put up with ANY junk in our image. This solves it all at once. It's better tech, get over it. It also allows devs to use the extra power to AMP up graphics when your GPU can pump out far more frames than needed. All the devs on stage loved this idea, as it frees them to do whatever they want on the fly.
"It is one thing to know that NVIDIA and the media are impressed by a technology, but when you get the top three game developers on stage at once to express their interest, that sells a lot. John Carmack, Johan Andersson and Tim Sweeney stood up with NVIDIA CEO Jen-Hsun Huang all raving about the benefits that G-Sync will bring to PC gaming. Mark Rein was standing next to me during a demonstration and was clearly excited about the potential for developers to increase visual quality without worrying about hitting a 60 FPS cap 100% of the time."
You do what you want. I only have a G-Sync monitor in my future; nothing else is acceptable.
😉 You keep jacking your settings around; I prefer allowing NV to do it on the fly and FIX it for me for good, while giving devs freedom to do what they want with my extra GPU power. I didn't see anyone on stage bragging about Mantle
:)
They all pretty much dogged it, here and elsewhere and while on stage. Let's be clear: you apparently will put up with things the majority of us would like to be rid of
:)
Even if I'd just bought a card that didn't support G-Sync, unless my monitor just died, I'd wait for a G-Sync monitor for my next purchase (hoping AMD licenses it or comes up with a compatible deal, or I'd just go NV by default for the next card).
In a stock fight (money, stocks I mean), I'd bet on the guy who has the tech everyone WANTS, not the one nobody really NEEDS (die shrinks will get more perf for years to come, all the way to 5nm or so, with no extra DEV work on games). Mantle doesn't change the world; it just speeds up a few select cards, and I'd assume every card they make from the next version on, though they've left it off a lot of cards already this time...WHY? Whatever, it's a failed idea, as it costs devs more programming for no extra return in money (they can't charge more for Mantle games).
G-Sync changes the world, and in ways we all want to see happen, devs included; it makes their coding job easier (freedom from things like 60 FPS caps on consoles, etc.). Die shrinks, better perf, drivers, etc. don't fix what G-Sync fixes. It's a hardware solution, or you deal with the problem forever. This is basically NV admitting it can't be done in drivers alone. Good luck to AMD funding research to resolve it their own way; I hope they just license it (and hopefully NV is open to that, for mobile and everything else). Considering it's only working with ONE monitor currently, it's clear NV put some work into getting this done (R&D: how long did it take working with ASUS on ONE monitor, and how fast can they roll it out to others?). How fast could AMD do this alone, now that they either have to match it or license it? I vote license.