Asus Announces G-Sync Enhanced VG248QE Monitor

Status
Not open for further replies.

mr grim

Distinguished
Aug 20, 2012
18
0
18,510
It's not that bad for 1080p and a 144 Hz refresh rate. I would love to use this monitor instead of relying on v-sync on my 60 Hz monitor, but I'll be waiting until my next GPU upgrade; sadly, my GTX 580 doesn't support G-Sync. Maybe by then the price will have come down a little.
 

warezme

Distinguished
Dec 18, 2006
2,450
56
19,890
G-Sync doesn't make sense on a 1920x1080 panel; maybe on a higher-resolution panel like 2560x1440. A fast card will do more for your frame rates than a G-Sync monitor. If your card is fast, it can usually push more frames than your monitor is set to at 60 Hz, and you are not likely to notice any tearing or sync issues. Even at 120 Hz with a pair of fast cards it is rarely an issue.
 

mr grim

Distinguished
Aug 20, 2012
18
0
18,510
@warezme, you obviously have no idea what you're talking about. 1920x1080 is perfect for G-Sync, as is any resolution. G-Sync is the only way to get the most from your monitor and GPU, because they work together perfectly: no need to force v-sync and get 60 fps that stutters and drops to 30 fps when the game dips below 60, and no screen tearing when v-sync is turned off. Personally, I have always used v-sync because I hate screen tearing, and my monitor is only 60 Hz anyway, so it won't display more than 60 fps. That means I am not getting the best performance possible either way without G-Sync.
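The stutter described above comes from double-buffered v-sync quantizing every frame to whole refresh intervals: miss one 16.7 ms deadline and the frame waits for the next boundary, halving the effective rate. A minimal sketch of that behavior (plain Python; the render times are made-up illustrations, not measurements):

```python
import math

# Sketch: double-buffered v-sync on a 60 Hz display shows each frame
# only at the next vertical refresh boundary after it finishes.
REFRESH_HZ = 60
INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh

def vsync_present_time(render_ms: float) -> float:
    """Time until the frame is shown: rounded UP to a whole refresh."""
    return math.ceil(render_ms / INTERVAL_MS) * INTERVAL_MS

# A frame rendered in 15 ms still waits for the 16.67 ms boundary,
# but one rendered in 18 ms slips to the second refresh (~33.3 ms),
# i.e. an instantaneous drop from 60 fps to 30 fps.
for render_ms in (15.0, 18.0):
    shown = vsync_present_time(render_ms)
    print(f"{render_ms} ms render -> shown at {shown:.1f} ms "
          f"({1000.0 / shown:.0f} fps effective)")
```

A variable-refresh scheme like G-Sync removes the rounding step entirely, which is why frame rates between 30 and 60 fps look smooth instead of snapping to 30.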
 
The idea of G-Sync is great; the fact that it's proprietary is not. I wish Nvidia would make it an open standard so Intel and AMD could update their GPUs to use it. They make the chip to sell to monitor vendors; you would think that would be enough profit without making it a closed system. Hell, I suspect Nvidia would make more money by opening up the GPU side and hence pulling in more monitor vendors.
 

bochica

Distinguished
Oct 7, 2009
146
0
18,680
G-Sync will eventually be available for the others, but Nvidia has the opportunity to get its share of the pie first. It will be just like PhysX support.

@booga, what the hell are you smoking? ASUS, Samsung, NEC, and Viewsonic are all up in the top 5.
 

boogalooelectric

Distinguished
Jul 1, 2009
266
0
18,860
@bochica,

I am smoking ASUS's craptastic monitors. I HAD ONE, and it was total junk; that's personal testimony, not stats read off a website.

The thing only lasted a year. It was grainy, the colors were off, and the build was so cheap that once, when I had to turn it off while moving furniture and then turned it back on, the switch panel got stuck and forced the on-screen controls to cycle continuously.

Yeah, they make GREAT stuff /sarcasm off

Replaced it with a Viewsonic and have never had an issue.
 

joebob2000

Distinguished
Sep 20, 2006
788
0
18,980
The big advantage of G-Sync is that the screen can update the instant the next frame is ready to go. By the time a 60 Hz frame is done drawing, the data used to draw it is already up to 16 ms old, which means up to 16 ms of lag is introduced before the response time of the LCD panel even comes into the equation. If the frame can be shown the instant it is done drawing, the GPU and panel speed become the only limits (aside from network latency if the game happens to be played on a remote server).
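That latency argument can be sketched numerically: with a fixed refresh, a finished frame waits for the next scan-out boundary, while a variable refresh shows it immediately. A rough Python illustration (the finish times are random assumptions, not profiler data):

```python
import random

REFRESH_HZ = 60
INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between fixed scan-outs

def fixed_refresh_wait(finish_ms: float) -> float:
    """Extra delay from finishing a frame until the next fixed
    60 Hz scan-out boundary (zero if it lands exactly on one)."""
    return (INTERVAL_MS - finish_ms % INTERVAL_MS) % INTERVAL_MS

random.seed(1)
finish_times = [random.uniform(0, 100) for _ in range(10000)]
avg_wait = sum(fixed_refresh_wait(t) for t in finish_times) / len(finish_times)

# Frames finishing at random moments wait ~8.3 ms on average (half an
# interval) and up to ~16.7 ms worst case; a variable-refresh panel
# would wait approximately zero.
print(f"average added latency at fixed 60 Hz: {avg_wait:.1f} ms")
```

The averaging step is only there to show the half-interval expectation; the worst case (a frame finishing just after a refresh) is what players actually notice.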
 

Serpent of Heaven

Honorable
Oct 10, 2013
13
0
10,510
Samsung could easily come up with its own niche product that serves as a monitor refresh regulator or frame buffer, just like the G-Sync tech. The G-Sync chip is a leftover Tegra 4 with three Hynix memory modules... All G-Sync really does is regulate the refresh rate, and possibly store a frame or two from the primary card. The problem is that it could introduce lag itself if it stores frames. If a middleman tells the monitor that frames A, B, C, D, E, etc. from the graphics card are coming somewhere between 12.6 ms and 16.68 ms apart, it could easily send each frame to the screen at the correct time for each individual frame. Nvidia isn't being bright here. If it only focuses on Nvidia users, it only addresses the money pie that supports its own base, and maybe not even 100% of that base. If Samsung made a similar gimmick and sold it to both camps, that would increase revenue; Samsung would make more money than Nvidia, because it could sell to the Nvidia consumers (who don't equal 100%) plus the AMD base... Nvidia will stop making it proprietary when other monitor manufacturers start doing the same thing. Samsung, Panasonic, Sony, Hansen, the Japanese monitor companies, Dell, LG: these companies aren't part of the Asus-Nvidia G-Sync clique... Samsung could easily use its own processors, from its own fabs, to dominate this little new niche in the monitor market.
 

bochica

Distinguished
Oct 7, 2009
146
0
18,680


So you either had a lemon, a knock-off, or you abused your monitor. You had ONE monitor out of hundreds of thousands of other ASUS monitors out there that somehow had the quality of a $50 TV. I'm not reading stats either, aside from global rankings. I've used ASUS monitors (various models) both at work and at home (used Samsungs before that). We even have Planar, Acer, Viewsonic, HP, Samsung, Dell, and other monitors here at work. Everyone would rather have an ASUS, Viewsonic, or Acer over the others. ASUS's monitors are pretty far up there in quality. If it only lasted a year, you should have submitted it for warranty work (ASUS covers monitors for 3 years).

Also, most LCD monitors don't constantly cycle through inputs if a button gets stuck. About 75% of the monitors I have used will indicate that the button is LOCKED if it's held down. The buttons are part of the frame/bezel, not the panel. Panels are made by companies such as LG and Samsung, and most monitor manufacturers buy panels from them to build their monitors.
 

rk0629

Distinguished
Apr 11, 2008
5
0
18,510
User comments on this article prove one thing: NVIDIA isn't doing a very good job of clearly explaining what G-Sync is or why it's important.

Attention, general public: search YouTube for the LinusTechTips video in which Linus interviews Tim Sweeney and John Carmack at an NVIDIA press event where G-Sync was shown to developers.
 

Duckhunt

Honorable
Sep 22, 2012
339
0
10,810
I have ASUS and Yamakasi (perfect pixel) and Hyundai and HP and others. None have been lemons. The monitor that really gets me excited is the Eizo FORIS FG2421 23.5" 240 Hz Turbo gaming monitor. I can sit with this baby for hours, and I don't get the eye strain I used to have. I almost bought a 144 Hz monitor but found this instead.
 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
@boogalooelectric Any successful company has some bad products. I bought an ASUS motherboard back in 2002, and the thing used to have issues detecting hard drives whenever the drive configuration changed. It took me some years to trust ASUS again. When I finally bought another ASUS board in 2012, I had no issues with it whatsoever.
 

Serpent of Heaven

Honorable
Oct 10, 2013
13
0
10,510


No offense, but AMD really doesn't need to do anything. One, they have frame-pacing software, and in January 2014 they will have it for Surround. Two, from what I have read about Nvidia G-Sync, you're going to be capped on frame rate. Say, for example, you play a game like COD and you get over 243 FPS 60% of the time and drop to 143 for the remainder; those are your highs and lows. With G-Sync, and the fact that your monitor is only 144 Hz or 60 Hz, you are going to be locked between 30 Hz at the bottom and 144 Hz (or 60 Hz) at the top. It could probably go below that. You won't see that 243 FPS, because G-Sync squeezes your frame-rate range; it's being shrunk for more consistent frames. Third, in 2014 to 2015 you're going to start seeing monitors that support 240 Hz, or frame times of less than 5.0 ms. There are already true 240 Hz TVs on the market, and 480 Hz TVs are starting to appear. So if your computer is spitting out 169 FPS, you won't have a single dropped frame, runt frame, screen tear, ghosting, etc... You can't watch every 5.0 ms window to see or not see a frame.

G-Sync will be that little-known Nvidia niche that came and went in a short time. Either that, or Nvidia will continue to refine G-Sync, but it won't be necessary, because the real issue is the static refresh rate of your monitor not keeping up with the FPS performance of your graphics card.
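The cap described above is easy to model: a variable-refresh panel can only show frames as fast as its maximum refresh, so rendered rates above that ceiling are clamped. A toy sketch (plain Python; the FPS figures echo the post's example, not benchmarks, and the 30 Hz floor is an assumed minimum):

```python
# Toy model: a variable-refresh panel shows frames no faster than its
# max refresh and no slower than its min (below the min, G-Sync-style
# schemes repeat the previous frame to stay in range).
def displayed_fps(rendered_fps: float,
                  min_hz: float = 30.0, max_hz: float = 144.0) -> float:
    """Clamp the rendered rate to the panel's refresh window."""
    return max(min_hz, min(rendered_fps, max_hz))

# 243 FPS rendered on a 144 Hz panel is shown at 144 Hz; the extra
# frames are never scanned out. 143 FPS passes through unchanged.
print(displayed_fps(243.0))  # -> 144.0
print(displayed_fps(143.0))  # -> 143.0
```

Whether that clamp is a flaw or the whole point is the argument of the thread: frames above the panel's ceiling were never displayable on a fixed-refresh monitor either.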
 
