G-Sync question: normal 144Hz vs G-Sync 144Hz (if you have a G-Sync monitor, please answer this)

Status
Not open for further replies.

Gartin

Reputable
Oct 5, 2014
2
0
4,510
Ok, so I'm building my first PC soon. I'm coming from console gaming, where there isn't really any screen tearing. G-Sync of course removes any screen tearing, but I've heard that a non-G-Sync 144Hz monitor makes G-Sync unnoticeable. So my question is: is there a difference between a normal 144Hz monitor and a 144Hz G-Sync one when it comes to screen tearing? If you have a G-Sync monitor, do you notice a difference in tearing with G-Sync on vs. off? Thanks for any responses.
 
I can't speak to first-hand experience with G-Sync vs. without at 144Hz, but I can answer the rest of your questions. Without G-Sync or V-sync, you get tearing. At 144Hz, tearing is less noticeable, but it is not eliminated; it's still there. If you turn on V-sync, then whenever your FPS is not pegged at your refresh rate, you get some stutter. G-Sync fixes the tearing without adding stutter at any FPS from 30 to 144 (it caps at your refresh rate).

The next problem is that with a 60Hz monitor and V-sync, when you reach your refresh rate, you get latency issues. While that may not bother a console gamer using a controller, it is quite bothersome with a mouse, where feedback is far more sensitive. You can set an FPS cap below the refresh rate, but then that adds stutter, unless you are using G-Sync.

Anyway, G-Sync removes the need to compromise, though tearing at 144Hz isn't terrible, at least not like it is on a 60Hz monitor. G-Sync is better.
 
Screen tearing occurs when your game's FPS is faster than your display can handle. Typical displays can do 60Hz; if your game can do more, then screen tearing will occur. Modern games are very taxing on the GPU and won't reach 144 FPS for the near future unless you have multiple GPUs. Since you have one GPU, any typical 144Hz display is good. Don't buy a 144Hz G-Sync display; it is unnecessary and a waste of money.
 
^ Incorrect.

Screen tearing happens any time V-sync or G-Sync is not used while gaming. The monitor not keeping up with the FPS has nothing to do with tearing. Tearing is simply a matter of the monitor and GPU not being in sync. V-sync (vertical synchronization) and G-Sync are both methods of syncing the two; without one of them, tearing occurs regardless of how low or high your FPS is.
 

Automatiic

Reputable
Oct 24, 2014
162
0
4,710
First of all, welcome to the better side of gaming (GO PC GO PC!).

I'll do my best to make this as straightforward as possible.

"Tearing" in PC games occurs when your GPU is putting out frames faster OR slower than your monitor's refresh rate. If you have a 144hz monitor, that means your monitor is refreshing the image on the screen 144 times per second. If your GPU is only putting out 100FPS, that means your monitor is refreshing nearly 1.5 times for every frame sent form the GPU. This causes the image to tear because they are not synced so you will see a portion of one frame sent form the GPU and a portion of another frame from the GPU on the screen at the same time. This same phenomenon occurs when you have a slower monitor refresh rate and your GPU is sending frames faster than the monitor refreshes - just backwards.

No GPU will ever output the same framerate throughout an entire game session, because as you move around or more things happen in the game, your FPS rises and falls with how hard the current scene is to render. Meanwhile, your monitor's refresh rate stays constant.

What G-Sync does is this: it gives the GPU control over when your monitor refreshes, so your monitor's refresh rate becomes dynamic instead of static. As your GPU finishes a frame, it is sent to your monitor, and when the frame is received, the monitor refreshes the on-screen image. So, regardless of how fast or slow your GPU renders (as long as it stays above 30 FPS), the result on screen is buttery smooth with no tearing, because 1 frame rendered = 1 image refresh on the monitor. Simply put, G-Sync doesn't allow an overlap between the monitor's refresh and the frames sent from the GPU, because the monitor only refreshes as it receives a frame.
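The fixed-vs-dynamic refresh difference can be shown with a toy model (my own simplification, purely illustrative): with a fixed scanout, some GPU frames get sampled by two refreshes and some by only one, while a 1:1 GPU-driven refresh shows every frame exactly once.

```python
def ceil_div(a, b):
    # Integer ceiling division, exact (no floating-point edge cases).
    return -(-a // b)

def refresh_counts(refresh_hz, gpu_fps, n_frames):
    """For each GPU frame, count fixed-rate refresh ticks while it is the newest frame."""
    return [
        ceil_div((i + 1) * refresh_hz, gpu_fps) - ceil_div(i * refresh_hz, gpu_fps)
        for i in range(n_frames)
    ]

print(refresh_counts(144, 100, 10))  # [2, 1, 2, 1, 2, 1, 2, 1, 1, 2] - uneven pacing
print(refresh_counts(144, 144, 10))  # all 1s: the "G-Sync-like" 1 frame = 1 refresh case
```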

So to answer your question: YES, there is a difference between a 144Hz monitor with and without G-Sync. If your monitor does not have G-Sync and your GPU is unable to produce 144 FPS, you will notice tearing. It won't be as pronounced on a 144Hz monitor as on a 60Hz monitor, because each image stays on the screen for a shorter amount of time, but it will still be there.

I hope this helps to clear things up for you!
 
Solution
While more correct, it isn't even a matter of being different from the refresh rate. If you use an FPS cap and sync it exactly with your refresh rate, even if the two run at exactly the same speed, you'll most likely have a tear, and that tear will sit fixed in place.

Wiki's definition is spot on: http://en.wikipedia.org/wiki/Screen_tearing

Screen tearing is a visual artifact in video display where a display device shows information from two or more frames in a single screen draw.[1]

The artifact occurs when the video feed to the device isn't in sync with the display's refresh. This can be due to non-matching refresh rates—in which case the tear line moves as the phase difference changes (with speed proportional to difference of frame rates). It can also occur simply from lack of sync between two equal frame rates, in which case the tear line is at a fixed location that corresponds to the phase difference. During video motion, screen tearing creates a torn look as edges of objects (such as a wall or a tree) fail to line up.

Tearing can occur with most common display technologies and video cards, and is most noticeable in horizontally-moving visuals, such as in slow camera pans in a movie, or classic side-scrolling video games.

Screen tearing is less noticeable when more than two frames finish rendering during the same refresh interval, since this means the screen has several narrower tears instead of a single wider one.
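The fixed-vs-moving tear line in the quoted definition can be modeled in one line (a toy model under my own simplifying assumptions; `phase0` is an arbitrary starting offset, and time and position are in fractions of a screen):

```python
# Fractional screen position of the tear line over time. With equal
# rates the phase difference is constant (fixed tear line); with
# mismatched rates it drifts at a speed proportional to the difference
# of the two rates, wrapping around the screen.
def tear_phase(t, refresh_hz, fps, phase0=0.25):
    return (phase0 + t * (fps - refresh_hz)) % 1.0

print(tear_phase(0.0, 60, 60))   # 0.25 - stays put when rates match
print(tear_phase(10.0, 60, 60))  # 0.25 - still there
print(tear_phase(0.5, 60, 61))   # 0.75 - drifted half a screen in 0.5 s
```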

Note: this is a very frustrating topic, as there are a lot of incorrect myths about FPS and refresh rates. I'm glad no one mentioned triple buffering. That has nothing to do with tearing either.
 

jake_larkee

Distinguished
Jul 1, 2011
164
0
18,690


Somebody please correct this foolishness as I am too lazy to do so.
 

SaberEdge

Honorable
May 25, 2012
1
0
10,510
blacksheep123 is flat-out wrong. You will always get screen tearing if you are not using V-sync. I don't understand how anybody could fail to notice it, since it is really obvious. But there are always some unperceptive people who say "I don't notice that" about any visual artifact.

Gsync makes a world of difference. It's the biggest difference in how our games are displayed that has come along in a very long time. It makes all those variable framerates between 30fps and 144hz actually enjoyable and smooth, instead of having to suffer through screen tearing, stutter and lag.
 

Wally_2

Reputable
Nov 12, 2015
1
0
4,510


^ Incorrect.

FreeSync is implemented in the graphics card and is not a physical module inside the monitor.
 

Diego1909

Commendable
Jun 4, 2016
1
0
1,510
FreeSync is part of the DisplayPort 1.2 standard; every DP 1.2(+) monitor has FreeSync. G-Sync is just another attempt by Nvidia, with some monitor manufacturers, to empty your wallet. See the horrible comments on Asus and Acer G-Sync products on Amazon: ridiculously overpriced TN monitors with unacceptable colors and backlight bleeding. Wait another year and Nvidia will support FreeSync, and you'll get proper displays for under $500.
 


FreeSync is the software AMD uses to sync the GPU to the monitor. Adaptive-Sync is the spec in DisplayPort. G-Sync was created before Adaptive-Sync ever existed. AMD GPUs also have to have specific hardware for Adaptive-Sync to work, which Nvidia doesn't appear to have at the moment.
 

Drummer1976

Distinguished
Feb 13, 2009
22
0
18,510
Hello!

Tearing always happens where there is no adaptive sync. Stutter happens with V-sync. Nothing is like G-Sync.

I can tell you from experience: I have been gaming on PC for longer than many have had their first console. Over this time I have used many combinations of monitors, TVs, and cards. I can tell you, without a doubt, that if you have a card that can produce up to 144 FPS (in whatever resolution you choose, mainly 1080p or 1440p now) and you have a 144Hz G-Sync monitor (1080p really good up to 24", 1440p nice at 27" and up if available), you will have an absolutely beautiful gaming experience. The bright, lightning-fast, synchronized refresh rate needs to be seen to be appreciated. G-Sync is still superior to FreeSync for technical reasons that I am not going to get into; it is expensive and commands a premium for a reason. And I am an AMD fanboy, so... there you go. I paired a 780 Ti with a 24" 144Hz 1ms TN 1080p monitor and fired up Overwatch at max settings (be sure ALL settings are correct, in both the Nvidia panel and in game!) and it was second to NONE. No tearing at all, no stutter at all; synced frame rates ranged from 70-135 in perfect harmony. I am very affected by tearing and stutter, and FPS gaming is the genre most affected by them.

TL;DR: G-Sync at 144Hz, with a card that can handle it at the native resolution of the monitor, is awesome. Period. And TN is still great.
 

Sami 1999

Reputable
Apr 8, 2015
13
0
4,520
I suppose if your G-Sync/FreeSync monitor is 144Hz and your GPU is putting out more than 144 FPS, then you still get input lag. It's not much of a problem though, because you can limit your frame rate to 144 FPS via Nvidia Inspector.

Anyway, G-Sync and FreeSync > any other monitor.

By the way, if you drop below 30 FPS, G-Sync will still work. For example, if your FPS drops to 15, G-Sync will turn the refresh rate to 30Hz and just double up the frames, which works fine.

Also note: G-Sync at 40 FPS won't make it look magically smoother. You just won't get stutters and tears, which is great, but you can still feel the choppiness of 40 FPS.

Tip: in case you have a 120Hz monitor and your average FPS stays around 40-50 with a minimum of 40, set a custom refresh rate of 80Hz and enable "Adaptive V-sync (half refresh)" via your GPU driver. It works better. I have a 75Hz monitor, and for games that suffer severe frame drops I just cap the FPS at 37 and set the refresh rate to 74Hz.
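That half-refresh tip boils down to simple arithmetic (numbers from the post; "Adaptive V-sync (half refresh)" is a driver setting, not anything you call from code):

```python
# Pick a frame cap your GPU can actually sustain, then run the monitor
# at exactly twice that cap so every frame is displayed for two
# refreshes - even pacing, no tearing with v-sync.
def half_refresh_plan(sustainable_fps):
    fps_cap = sustainable_fps
    custom_refresh_hz = 2 * fps_cap
    return fps_cap, custom_refresh_hz

print(half_refresh_plan(40))  # (40, 80) - the 120Hz monitor example
print(half_refresh_plan(37))  # (37, 74) - the 75Hz monitor example
```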

Consoles work this way: they are locked at 30 FPS and display at 60Hz so that tearing doesn't occur most of the time.
 

Abmario

Reputable
Oct 1, 2014
525
1
5,360



G-Sync / FreeSync (root: sync) deal with synchronizing the monitor and the graphics card.

Running a game below 30 FPS or at 15 FPS means either the graphics card is not able to run the game at playable FPS on a 1080p, 2K, or 4K panel (FPS also varies depending on the game's graphic effects), or the game itself is broken or capped to 30 FPS (or cinematic). Adaptive sync should remove the stuttering at 15 FPS as far as a 30Hz monitor (if one existed) or a 60Hz monitor goes, but sync has nothing to do with adding FPS on top of 15 FPS.
 


This is a technical issue with the way LCDs work. Below 30-40Hz, they flicker. As a result, they are required to refresh at a minimum of 30-40Hz (it varies by monitor). FreeSync currently has no backup plan and simply turns itself off when your FPS drops below that threshold. G-Sync started off turning itself off too, but has since started doubling up the refreshes for FPS that drop below 30 (maybe 40?).
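A sketch of that doubling behavior (my own toy model of the idea; the 30Hz floor is an assumption that, as noted above, varies by panel):

```python
import math

# Below the panel's minimum refresh, repeat each frame enough times to
# keep the physical refresh rate at or above that minimum.
def doubled_refresh_hz(fps, panel_min_hz=30):
    if fps >= panel_min_hz:
        return fps                        # panel can track the GPU directly
    repeats = math.ceil(panel_min_hz / fps)
    return fps * repeats                  # e.g. 15 fps -> 2 repeats -> 30 Hz

print(doubled_refresh_hz(15))  # 30
print(doubled_refresh_hz(12))  # 36 (each frame shown three times)
```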
 

VelimirSaban

Commendable
Sep 2, 2016
2
0
1,510


Hey, thank you for your detailed answer, but one quick thing to clear things up even more :) - should I always use 144Hz in game settings when possible, or is it all the same at 60 or more for the best experience? If I set it to 60Hz, is G-Sync still working at full potential or not?

Thanks!

Vel.

 

Aaron_83

Commendable
Sep 18, 2016
1
0
1,510


 

SGTRock82nd

Commendable
Nov 22, 2016
1
0
1,510
I have both the ASUS VG248QE 144Hz and the PG248Q 144Hz with G-Sync. I was playing on an ASUS G20 with an Nvidia GTX 960. I just bought the one with G-Sync and have been testing it out to see if my extra $150 was worth it. I've tested it with BF4, the last 3 CODs, Gears of War 4, Fallout 3, Crysis 3, the new Wolfenstein, Far Cry 4, and a ton of other current and older FPS games.

I have not noticed a huge difference with my eyes or the movement of my mouse. If I look hard for tearing, I can see the difference. However, most reviews show footage slowed down and zoomed in to show the effects of G-Sync; this is not how we play games. I maxed out most settings and was getting between 40-110 FPS on both monitors, and did not "feel" any game-changing smoothness with G-Sync.

The most noticeable smoothness was in Shadow Warrior: I was getting 144 FPS, and while spinning, things appeared to be more in focus. I feel that this alone does not justify the price tag of G-Sync, though.

This was just my experience with my aging computer and by no means is any kind of technical review of these products.

 
Feb 19, 2020
7
0
10
Listen bro, I've been playing online video games since Quake 2, as in against other people. Back when a lot of these younger "tech" snobs pushing G-Sync on you were either in the womb or still on those dirty consoles, I was hitting mid-air rail shots in Quake at 100 frames per second (the CRT I had pushed 100Hz at the time) on the very first TNT card (the first mass-produced OpenGL graphics card for the AGP slot).

I currently have a G-Sync monitor: 165Hz, 1ms response time.
I never use G-Sync. Know why? All it manages to do is make your monitor heat up faster.

It's worthless, man; don't let these chodes tell you any different. They are the same type of guy who says they prefer the taste of hops from Europe vs. hops from Budweiser.

Hops taste horrible; beer tastes horrible, always has. It's worse warm and not much better cold. These are what we call posers. Esports gamers for the most part don't use G-Sync because, well, it's pointless.

Know why? Because we have been capable of capping our frames to within 3 FPS of our monitors for, oh I dunno, 20 years with console commands.

The advantage of DisplayPort is allowing these junky flat screens to run over 120Hz. It's not because it has !!!! G-Sync !!!!.

You are not going to see ANY TEARING unless (A) you run a 60Hz monitor and are pushing 75-200 frames per second. Cap them; it's not difficult.

The idea is to cap your frames just under the refresh of your monitor.
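That frame-cap idea is trivial to express (the 3 FPS margin is the poster's own number from above; the budget is just 1000 ms divided by the cap):

```python
# Cap a few FPS under the refresh rate and see the per-frame time budget.
def frame_cap(refresh_hz, margin_fps=3):
    cap = refresh_hz - margin_fps
    frame_budget_ms = 1000.0 / cap
    return cap, round(frame_budget_ms, 2)

print(frame_cap(144))  # (141, 7.09) - about 7 ms to render each frame
print(frame_cap(60))   # (57, 17.54)
```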

If the setup you have isn't capable of pushing within 5 frames or so of your monitor's max refresh rate, what do you do? Enable G-Sync (which, yes, does add input lag... funny you guys are so sensitive to tearing but don't notice the input lag? You know, the issue that actually matters?).

In any case, if you can't push 144 frames per second, you have multiple options: (A) if you can't hit 144Hz anyway, DOWNCLOCK YOUR MONITOR; set a custom resolution that your setup can actually push.

If you can afford a G-Sync monitor, I imagine you can just build a machine capable of pushing the frames you are looking to push, so this should NOT be a problem.
You can also lower your graphical settings. No one cares about 144 frames per second playing RPGs. A lot of snobs talk about ultra detail but in the same breath talk about using $80 gaming mice and $400 144-165-240Hz monitors for those leet frames. These are the people you must ALWAYS IGNORE.

No competitive shooter player in their right mind gives two $hits about ultra detail. Why? Because undergrowth = bad (cover for the enemy). Ultra lighting? = BAD; glare causes blind spots, especially if you're fighting jerkwads that fly, say in BFV. Shadows? Sure, those can be useful, but in general almost NO AAA game anyone plays is even optimized at full ultra settings. We all play on low/high combinations, with the most sweaty try-hards playing with EVERYTHING set to low or off for the best visibility, lol.

So yeah, don't listen to these idiots. If you can find a high-refresh monitor that doesn't have FreeSync or G-Sync, go for it, man; high-FPS gaming IS great, always has been. If saving the extra money allows you to purchase a real case that you can actually stick 7-9 fans in, or multiple rads for cooling blocks, etc., that's what you should spend the money on, so you can actually cool the components you are going to purchase.

Don't build a console. Avoid mid-towers; for god's sake, don't mention mini-ITX among men. Get a big case. Get a real power supply that can provide consistent, clean, low-jitter/noise power to your components. Keep your RAM and processor cold; tighten those timings. A 6ms frame time is possible even at 120Hz (odd, but I managed it yesterday). Latency matters. Input lag = bad. Packet loss = worse. "Tearing" = a unicorn, unless you are a moron and it never occurred to you to skip spending the extra 80 bucks, put it toward something that matters, and just cap your frames or adjust resolution or graphical settings.
 