G-Sync Technology Preview: Quite Literally A Game Changer


JUICEhunter

Honorable
Oct 23, 2013
1,391
0
11,960
I can see why the first model is a 1080p 144Hz panel: Nvidia can advertise 30-144Hz depending on the performance of your GPU. People with faster cards get the joy of high refresh rates, while people with an entry-level 650 Ti Boost still get a smoother experience than they would without G-Sync. We'd all like a 1440p/1600p G-Sync screen and would buy even a 60Hz panel to enjoy the higher resolution, but it would be less exciting for Nvidia to say "this screen's refresh rate goes from 30 to 60Hz".
 

Jeff Gilleese

Honorable
Dec 14, 2013
14
0
10,510


G-Sync doesn't smooth out inconsistent FPS caused by an underpowered computer. It can only dynamically lower the display frequency to prevent frames from being dropped. So this technology is most effective when the frame rate is below 60Hz and consistent, like a video recorded at a lower frame rate. The thing is, there are already solutions for that, called pulldown and interpolation. Movies are filmed at 24fps, and to make that fit a 30fps (60-field) display they do a thing called 3:2 pulldown.
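For anyone who hasn't seen pulldown before, here's a rough sketch of the 3:2 cadence (my own toy example, nothing from the article): each 24fps film frame is held for three fields, then the next for two, so 4 film frames get stretched across 10 fields, i.e. 30 interlaced frames per second.

```python
# Rough sketch of 3:2 pulldown: 4 film frames (24 fps) become 10 video
# fields (60 fields/s, i.e. 30 interlaced frames/s) by holding film
# frames for 3 fields and 2 fields in alternation.
def pulldown_32(film_frames):
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 3 if i % 2 == 0 else 2   # the 3-2-3-2 cadence
        fields.extend([frame] * repeat)
    return fields

film = ["A", "B", "C", "D"]               # 4 film frames
print(pulldown_32(film))                  # 10 fields -> 5 interlaced frames
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```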

Most likely, what most reviewers are bragging about is actually a combination of the quality, power, and condition of the computers they ran the demo on. Most stuttering is caused by too many background programs running at the same time as your game.

It's something like what's happening in this video: even though some of these people already have the exact same phone, theirs have more programs running and the memory has become fragmented.
http://m.cnet.com/news/kimmel-fools-people-into-believing-iphone-4s-is-iphone-5/57512267

If your computer really runs your game at a minimum above 60fps, then there should not be any stuttering or input lag with V-sync on.
 


Stop making yourself sound stupid and move on.
 

Nossy

Distinguished
Apr 5, 2005
216
0
18,680
Love my 144Hz display.

TN panel? Doesn't bother me in a gaming monitor. Since when were "very" accurate colors and wide viewing angles a big deal for PC gamers?

Proprietary technology for Nvidia GPUs only? Sure, they spent money on R&D; it's a money-making business. If you come out with something first, you should enjoy the $$$. If you want a HEMI engine, buy a Dodge. If you want VTEC, buy a Honda.

When I bought the ASUS VG248QE, I didn't think much of it and was skeptical at best. If I didn't like it, I could always return it. Turns out it made a huge difference, and I kept it.
 

MrAMD

Honorable
Aug 27, 2013
65
0
10,660
Wild article. This G-Sync shouldn't just let the refresh rate follow whatever FPS your video card is putting out; that would be all over the place, with noticeable flicker as it jumps from high to low, plus lag and eye strain. It should keep a stable FPS no matter how much the GPU's output fluctuates with what the game throws at it. It could also cause a heat issue, with both ends fighting over the frame rate. Then again, it could be a driver issue.

I have noticed more tearing on my GTX 780 Ti than on my GTX 480. It may have something to do with shader speed, or it could be a memory issue not keeping a perfectly stable timing. The GTX 480 is just plain slow, but it has better picture quality and less tearing. Give it a test.

I remember the days of my 120Hz ViewSonic CRT: it was better in speed and picture quality than my 60Hz LCD, which took long, hard years to get used to. I could hook that CRT up right now and see the difference all over again with this video card. Back in the day, the 9800 GTX was an awesome card on a 120Hz CRT versus a 60Hz panel, lol, huge improvement in games.
 


The only reason the 480 has less tearing is that it produces fewer FPS. You get exactly one tear per frame with V-sync off, except for the few frames that land during a vertical blanking period, which is not often. A 780 Ti produces a lot more FPS and, in turn, a lot more tears.
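If you want to convince yourself of the "one tear per frame" point, here's a toy simulation (the 5% blanking window and the FPS figures are my own assumptions, not measurements): with V-sync off, every buffer swap that lands outside the blanking interval shows up as a tear, so more FPS simply means more tears.

```python
# Toy model: with V-sync off, a swap that lands while the panel is
# scanning out produces a visible tear; only swaps that happen to fall
# in the short vertical-blanking window do not.
import random

VBLANK_FRACTION = 0.05          # assume ~5% of each refresh cycle is blanking

def count_tears(fps, seconds=10, seed=0):
    random.seed(seed)
    frames = int(fps * seconds)
    tears = 0
    for _ in range(frames):
        phase = random.random()           # where in the cycle the swap lands
        if phase > VBLANK_FRACTION:       # outside vblank -> visible tear
            tears += 1
    return frames, tears

for fps in (60, 144, 300):      # roughly a GTX 480 vs 780 Ti comparison
    frames, tears = count_tears(fps)
    print(f"{fps:3d} fps: {frames} frames, ~{tears} tears in 10 s")
```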
 

A lot of confusion here...

To be CLEAR, G-Sync is not a gimmick. There are several issues that plague PC gamers, and G-Sync is the ONLY solution that will solve them all at THE SAME TIME. These issues are:

1) Screen Tearing
2) Lag
3) *Stutter and judder

*Some stutter is still unavoidable due to poor coding.

If you enable VSYNC to avoid screen tearing, then you introduce some LAG due to the need to buffer the GPU output to match the monitor's refresh cycle (G-Sync just displays the new screen very quickly after it is generated).

JUDDER can exist even when synced at 60FPS (though not always). Frame times can differ even when FRAPS displays 60FPS. Ever wonder why 60FPS doesn't always look really smooth? Judder may be the reason.
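Here's a quick illustration of that frame-time point, with made-up numbers: both traces below average 60FPS, but only one of them is evenly paced.

```python
# Hypothetical frame-time traces: both average ~60 fps, but the second
# one alternates between fast and slow frames and looks far less smooth.
even   = [16.7] * 60                      # ms per frame, perfectly paced
uneven = [8.0, 25.4] * 30                 # same average, uneven pacing

def summarize(name, frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    spread = max(frame_times_ms) - min(frame_times_ms)
    print(f"{name}: {avg_fps:.1f} fps average, "
          f"{spread:.1f} ms gap between fastest and slowest frame")

summarize("even  ", even)
summarize("uneven", uneven)
# A frame counter reports ~60 fps for both, yet only the first is smooth.
```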

PRICING:
I've seen some strange arguments about this. Seriously, everybody is different in how much they are willing to pay for a great gaming experience. You may have just bought a new monitor, may not be able to afford one, or may not care (or understand), but plenty of others will be willing to buy one.

If the best gaming experience means little to you, then you aren't the target audience, in which case maybe an Xbox 360 is a better choice anyway.

The price starts at a $150 premium (i.e. $400 for a normally $250 monitor), and Nvidia wants to reduce this premium as quickly as possible, so it might be down to $50 within two years.

If we consider that the TARGET AUDIENCE easily spends $1,000 to $3,000 on a system, it's really not that much extra.

LICENSING:
It's a proprietary solution, so it must be licensed. It requires a SOFTWARE solution on the GPU side, and a HARDWARE solution on the monitor/HDTV side.

Thus, as I said before, it would be possible for a company like SONY to build a new HDTV with G-Sync, license it for the PS4, and have the PS4 running much, much SMOOTHER on their HDTVs.

Sony would sell more HDTVs and PS4s if they licensed G-Sync for both. That would be a WEIRD scenario, though, with an AMD-based console working much better because of an Nvidia solution.



 

Jeff Gilleese

Honorable
Dec 14, 2013
14
0
10,510
Assuming, purely for the sake of argument, that this is not a gimmick: why do laptops, tablets, and smartphones use old-fashioned V-sync and frame buffering rather than something like the method G-Sync uses? They have their display panels hardwired to the graphics chip.
 


Because displays have used V-sync and standard refresh rates for nearly a hundred years; they chose not to reinvent the wheel. Varying the refresh rate on the fly is also not all that easy to achieve, which is why it requires special hardware.

This is also only useful for gaming. Video is captured and displayed at defined intervals, much like a monitor refreshes. Gaming, on the other hand, creates frames on the fly, not at a fixed rate.
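To put that difference in concrete terms, here's a toy comparison (the render times are invented): video frames land on a fixed grid, while game frames land wherever the GPU happens to finish.

```python
# Video: frames arrive on a fixed 24 fps grid.
# Game: frames finish whenever the GPU is done, at irregular intervals.
video_frame_times = [i / 24.0 for i in range(8)]            # fixed grid, seconds
game_frame_times = []
t = 0.0
for render_ms in (14, 31, 22, 9, 45, 17, 28, 12):           # variable render cost
    t += render_ms / 1000.0
    game_frame_times.append(t)

print("video:", [f"{x * 1000:6.1f} ms" for x in video_frame_times])
print("game: ", [f"{x * 1000:6.1f} ms" for x in game_frame_times])
# On a fixed 60 Hz refresh the video timestamps map cleanly, while the
# game timestamps force repeated or skipped frames; G-Sync instead
# refreshes the panel when the frame is actually ready.
```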
 

GalibTheGamer

Distinguished
Mar 26, 2013
27
0
18,540


How about using V-sync with triple buffering instead of Dynamic V-sync or G-sync?
 

Jeff Gilleese

Honorable
Dec 14, 2013
14
0
10,510

All the more reason it's curious that Nvidia, the maker of mobile graphics chips and its own SoC for tablets and smartphones, did not first introduce this technology in hardware with built-in displays.

It seems pretty stupid to release a tech that requires a special display, matched to a certain graphics chip, and only ones that are top of the line AND sold only as stand-alone products. If this really were cost-effective, they would release it as a prominent feature of all their new SoCs and mobile graphics chips.

Most likely the truth is that it is not cost-effective, just like dedicated PhysX cards failed to be.

Nvidia is starting to remind me of Sony 10 years ago, boasting about the Cell processor, Blu-ray, and not too long ago 3D. All fell far short of their lofty promises. Blu-ray is mildly successful only because movie studios refuse to release licenses to stream/download their content. Sony finally got smart and went back to basics with the PS4. It might be too little, too late for them, though, with Steam Machines and ARM-powered devices gaining so quickly.
 
Whether or not this becomes popular doesn't change the fact that it works and does something important for gaming. Because it is only for gaming, it may never become mainstream.

And how on earth do you expect anything new to arrive if you consider any new way of doing things a gimmick? Nvidia does not make smartphones or tablets; they only make the CPU/GPU inside them. Just because a display is built into the unit does not mean it doesn't adhere to the same standards as monitors. And when it comes to tablets and phones, you don't have room for the added circuitry, you have tighter power constraints, and it all has to conform to the OS that runs them. They are hardly the place to experiment with new tech like this. They are also not the gaming platforms this tech is designed to address.

There is always a first time for everything. This is the first variable-refresh monitor. If it is even half as good as every review says, it will succeed to some degree and likely pave the way to a new standard in monitors, though I doubt it will ever become part of TVs and handheld devices.
 

Jeff Gilleese

Honorable
Dec 14, 2013
14
0
10,510
Umm... smartphones are THE place for new technology. Did you know there are games designed to use unique effects available only on Tegra SoCs? That's why there are two versions of certain games like Killzone and Dead Trigger, made just to take advantage of Tegra's unique features.

The Moto X has a special chip called the X8 and software running what it calls "Active Display" on its AMOLED display.
 

Jeff Gilleese

Honorable
Dec 14, 2013
14
0
10,510


That is what Android has done since 4.1. But it uses a double buffer until a triple buffer is needed.

http://www.androidpolice.com/2012/07/12/getting-to-know-android-4-1-part-3-project-butter-how-it-works-and-what-it-added/
 


Clearly you are a waste of time. You just want it to fail because it's Nvidia, I assume, or you're too ignorant to understand it. Probably both.
 

Jeff Gilleese

Honorable
Dec 14, 2013
14
0
10,510
This is a message board. Of course it's a waste of time. It's supposed to be entertaining and informative. I thought I was bragging about Nvidia in my last post. I have a Tegra 3 tablet and I like it quite a bit. I bought my cousin's old Droid X2 for my son to play with, and it works fine, aside from the old version of Android on it, that is. Also, my laptop is an Intel Core 2 Duo with an Nvidia 8400 GS, and my old desktop had a passively cooled Nvidia 7600 GT that I thought worked great.

So yeah, I'm not exactly anti-Nvidia. I'm anti-Nvidia wasting time and resources on failures.
 


Well then, you just don't understand this tech or the existing tech. You seem to think you do, but you fail to grasp the differences. You might try rereading the article, or rereading about the tech you think you know. You are somehow failing to understand what it does and why all these sites are so excited about it.

And it is quite ridiculous to consider it a gimmick just because it didn't show up on all those other devices first. EVERYTHING ever invented had to start somewhere. Nvidia chose the PC as the introduction for this tech, and there are a number of reasons they may have done so. Smartphones are not really the target audience, nor are they an easy place to experiment with this type of tech anyway.
 

lowstandards

Distinguished
Feb 10, 2004
90
0
18,630
I may be in a small group when it comes to LOVING V-sync, but I do! I've always used it, as tearing is sooo distracting to me. However, I don't see the "stuttering" or jumpiness that is supposed to be a side effect of V-sync. I keep my system high-end and use the D3DOverrider utility to force V-sync, and with the latest titles I'm running anywhere from 42-60fps with settings maxed, and I really don't know what this stuttering is. I enjoy a smooth gaming experience with V-sync... always have.
 


Any time you are between 30 and 60 FPS, there is mild stuttering. You may not be distracted by it, and it is not severe, but if you put 60 FPS with V-sync next to 40 FPS with V-sync, it is clear which is smoother. G-Sync will be as good as or better than 60 FPS with V-sync even at 40-ish FPS, or at least close to it.
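Here's a rough model of why that is (assumed numbers, not measurements): at around 40 FPS on a 60Hz panel, V-sync has to hold each frame until the next refresh boundary, so on-screen frame times alternate between one and two refreshes instead of a steady 25ms.

```python
# Compare when frames appear on screen: V-sync waits for the next 60 Hz
# refresh boundary, while a variable-refresh display shows each frame as
# soon as it is ready.
import math

REFRESH = 1000.0 / 60.0     # 16.7 ms per refresh
RENDER = 25.0               # 40 fps -> a new frame every 25 ms

ready = [RENDER * i for i in range(1, 9)]                        # frame done
vsync_shown = [math.ceil(t / REFRESH) * REFRESH for t in ready]  # next refresh
gsync_shown = ready                                              # shown when ready

def deltas(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("V-sync frame-to-frame:", deltas(vsync_shown), "ms")   # mix of 16.7 / 33.3
print("G-Sync frame-to-frame:", deltas(gsync_shown), "ms")   # steady 25.0
```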
 

Henry W

Honorable
Oct 5, 2013
3
0
10,510
Ohim, your logic is completely flawed, my friend. Producing frames over 60fps still produces screen tearing. Turn on V-sync and you have input lag, which is not always such a big deal, but in games like Battlefield 4 it is a huge deal. I toyed around with the settings for months. On the one hand, BF4 was beautiful with V-sync on, but the input lag was terrible; on the other hand, with it off, the screen tearing was distracting and induced a large amount of eye strain. I found that capping my frame rate at 72fps solved much of the tearing, but of course it's still there. Once this technology is out I will adopt it, unless AMD has something in the works. This will be a very interesting year for GPU tech; I look forward to it.
 

pepe2907

Distinguished
Aug 24, 2010
643
0
19,010
I am curious how somebody could notice such an inconsistency in an image shown for 1/60th of a second, yet isn't able to see a completely different image shown for twice as long, and is perfectly able to perceive pixels jumping between frames as smooth movement.
 


We are used to seeing constant motion with no interruptions. Why are you surprised that we would notice interruptions in motion when the system has hiccups? Not only do you end up with frames repeating inconsistently, but it causes the action to get out of sync with what was meant to be displayed.

I suppose you don't believe any of the reviewers about how much of a difference it makes? :sarcastic:
 

squirrelboy

Honorable
May 3, 2013
89
0
10,640

VAT here is 21%. €180 × 1.21 ≈ €218.
From €180 in the USA to €300 here is roughly a 67% increase. Almost SEVENTY percent! It's like they make them in Taiwan, ship them to the US, and then ship them to Europe from there. That's the only thing that could explain the ridiculous price.
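For what it's worth, here is that arithmetic spelled out, using the prices quoted above:

```python
# Quick check of the pricing math quoted in this thread.
us_price = 180.0            # EUR equivalent of the US price
vat = 0.21                  # local VAT rate
eu_price = 300.0            # actual local price

print(f"US price with VAT added: {us_price * (1 + vat):.0f} EUR")          # ~218
print(f"Actual markup vs US:     {(eu_price / us_price - 1) * 100:.0f}%")  # ~67%
```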
 

All I know is I'm told that if you live in Europe, you buy BenQ; if you're in the US, you buy Asus.
 