AMD Fires Back at G-Sync With Non-Proprietary Alternative

Status
Not open for further replies.

nilfisktun

Distinguished
Aug 19, 2010
14
0
18,510
But if there are no changes to the screen, the refresh rate will still be fixed at 60, 120, or 144 Hz. So how does this work? What's the new tech here? If frames are still tied to the screen's fixed refresh, that's exactly the same as v-sync... wtf?
 

Jeff Gilleese

Honorable
Dec 14, 2013
14
0
10,510
I had a long argument with someone on the G-Sync article. No matter what, you are going to be keeping an image on the screen as a buffered frame, unless you show a blank screen in place of a buffered frame, which would cause screen flicker. G-Sync is a gimmick. Any monitor manufacturer could make a panel controller that does the same thing without needing some expensive add-on chip. Nvidia seems desperate to control a market, even if they have to fabricate one. As I said last time, Nvidia is starting to remind me of Sony eight years ago: arrogant and grasping at straws.
 

ohim

Distinguished
Feb 10, 2009
1,195
0
19,360
AMD has always been an innovative company, yet they still struggle financially for various reasons. Good for them for offering this for free with no extra hardware in the monitor. I understand innovation and the right to patent things to protect your investment in R&D, but some things should not be proprietary (a la Apple's rounded-corner rectangles).
 

houldendub

Distinguished
Dec 19, 2011
470
0
18,960
Lol @ everybody hailing AMD as the next messiah here. They say it's going to be freely available, but much like TressFX and Mantle, no doubt this is for GCN chips only, which, shock and horror, are only available on AMD cards. At least Nvidia goes out and shows this stuff off to people. Has anyone seen any Mantle-powered footage of BF4 yet? Is frame pacing available to DX9 games yet? Or Eyefinity setups?

They're not exactly the pinnacle of innovation either; I mean, right after Nvidia announced and then released GeForce Experience, suddenly AMD slapped its name on a third-party tool that does much the same (albeit, let's be honest, worse) thing. Then, after G-Sync, suddenly AMD's here with this technology that's apparently ready in drivers, but has absolutely no demo, no proper press announcement, and no confirmation of how it might work. They haven't even sorted out frame pacing; now that the discussion around it has seemingly died down, so has development on getting it properly working, it seems.

C'mon AMD, do something magical that Nvidia hasn't done yet. Go wild and do something like, hell, I dunno, dual-GPU versions of every card, or a triple-GPU card? Or here's a wild one: actually get one of your proprietary technologies working in more than one game, yeah? Such lack of imagination.
 
So AMD says they don't know why Nvidia needs extra hardware for the G-Sync stuff, since it can be done entirely in software. Alright AMD, can you show us this FreeSync running real games? So far they only have that windmill demo. When can we expect AMD to show something actually working, at the very least?
 
VBLANK and variable VBLANK need to be supported by the monitor's scaler unit. It has mostly taken off as a power-saving feature in the VESA specification, since it reduces the number of times you have to send current to the panel to refresh the LCD pixels. The liquid crystals degrade over a very short half-life, which is why panels still need to be refreshed and the hold period can't be too long. That's also why G-Sync disables below 30 fps: the frame times become longer than the period it takes for the colours to degrade.

AMD saw that variable VBLANK can essentially do the same thing as Nvidia's G-Sync hardware, and AMD has supported the VESA standard for a good couple of generations; Nvidia tends to trail when it comes to adopting new standards but loves to make its own proprietary ones. The monitor does have to support variable VBLANK, but if additional or different circuitry is required, it is obviously cheap enough, since it's being implemented as a standard and already shows up in cheap notebooks.
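That hold-time limit can be sketched as a toy model: the panel shows a new frame the moment it arrives, but if no frame arrives within the maximum hold time, it must re-scan the previous frame. All numbers here are invented for illustration; real panels and the VESA spec differ.

```python
# Toy model of a variable-VBLANK panel with a maximum hold time.
# MAX_HOLD_MS is illustrative, not from any real panel spec.

MAX_HOLD_MS = 33.3  # panel must be rescanned at least ~30 times/sec


def schedule_refreshes(frame_times_ms):
    """Given GPU frame-completion times, return (time, is_repeat) refresh events.

    A new frame triggers a scan-out immediately (variable VBLANK).
    If no frame arrives within MAX_HOLD_MS, the panel re-scans the
    previous frame so the liquid crystals don't visibly decay.
    """
    events = []
    last_refresh = 0.0
    for t in frame_times_ms:
        # Insert forced self-refreshes while waiting for the next frame.
        while t - last_refresh > MAX_HOLD_MS:
            last_refresh += MAX_HOLD_MS
            events.append((last_refresh, True))   # repeated frame
        events.append((t, False))                 # new frame shown at once
        last_refresh = t
    return events


# A 25 fps stretch (40 ms gaps) forces repeated scans between new frames.
print(schedule_refreshes([0.0, 40.0, 80.0]))
```

In this sketch the 40 ms gaps exceed the hold limit, so the panel repeats the old frame between new ones; above roughly 30 fps no repeats would be needed, which matches why G-Sync reportedly stops varying below that floor.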
 
As far as I can tell (and understand, I'm not a tech expert), any gain in fluidity comes at the cost of increased image latency/lag.
This seems rather logical and obvious: with v-sync, the added input lag isn't from waiting for the entire image/frame to load, but from the system having to notice that and then decide to display the image.

This might seem like little to most people, but I've been an avid amateur StarCraft: Brood War player for a long time, and even though I can barely tell that my laptop monitor displays an image faster than my TV (and bear in mind this is when I have them next to each other and actually try to notice a difference), when I play the game at many actions per second, the TV feels terribly slow.

I know that in Single player games this does not happen often, but in fast paced games (for example in car racing) this can clearly be a deal breaker.

I'd rather AMD and Nvidia work on better AA systems, considering that SweetFX is basically better than any technology out there right now, and it's produced by some dude (and I assume not only him), while AMD and Nvidia have a decent number of tech professionals who could improve it even further.

I do like AMD more for making the technology free compared to Nvidia, but in the end, performance to price is my golden rule, unless a technology is actually worth the extra dollars/euros.
 

Kewlx25

Distinguished


While the time it takes to scan out a frame is the same, the timing of when the frame starts is variable. Current monitors are synchronous: the refresh starts on a fixed schedule whether or not a new frame is ready. With variable refresh, if your GPU doesn't have a new frame ready, it simply doesn't push one; the monitor holds the current image and waits until the next frame is done. This only matters when the FPS drops below the refresh rate.
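A minimal sketch of that difference, assuming a hypothetical 60 Hz panel (nothing here is from either vendor's actual implementation):

```python
# Illustrative comparison: when does a frame that finishes rendering
# actually start scanning out? All times in ms; numbers are made up.

import math

REFRESH_MS = 16.7  # fixed 60 Hz refresh interval


def vsync_display_time(render_done_ms):
    """Fixed-refresh v-sync: wait for the next refresh boundary."""
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS


def variable_refresh_display_time(render_done_ms):
    """Variable refresh: scan-out starts as soon as the frame is ready."""
    return render_done_ms


for done in (5.0, 20.0, 40.0):
    print(done, vsync_display_time(done), variable_refresh_display_time(done))
```

In the sketch, a frame finishing at 20 ms waits until the 33.4 ms boundary under fixed-refresh v-sync, but scans out immediately with variable refresh; that waiting is exactly the extra latency that appears when FPS drops below the refresh rate.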

 

thegreatjombi

Honorable
Jan 8, 2014
2
0
10,510
Does anyone know if this deals with input lag? From what I understand, the extra G-Sync hardware was handling the input lag portion of the equation too. I was looking forward to G-Sync because of the increase in video quality without having to deal with the massive amount of input lag added when using v-sync. The AMD solution seems like a hardware v-sync with no consideration for input lag. Is that true?
 

houldendub

Distinguished
Dec 19, 2011
470
0
18,960


Well it'd be nice to hear something from AMD that wasn't a childish "lol ours is better trust us!" first.

While Nvidia is crap with stuff like this, locking it down to certain platforms, no doubt AMD's implementation will be just as bad.

For instance, can anyone at all here name a monitor that supports VBLANK?
 

eklipz330

Distinguished
Jul 7, 2008
3,033
19
20,795


"Using two Toshiba Satellite Click notebooks purchased at retail, without any hardware modifications, AMD demonstrated variable refresh rate technology. According to AMD, there's been a push to bring variable refresh rate display panels to mobile for a while now in hopes of reducing power consumption. There's apparently already a VESA standard for controlling VBLANK intervals. The GPU's display engine needs to support it, as do the panel and display hardware itself. If all of the components support this spec, however, then you can get what appears to be the equivalent of G-Sync without any extra hardware.
In the case of the Toshiba Satellite Click, the panel already supports variable VBLANK. AMD's display engines have supported variable VBLANK for a couple of generations, and that extends all the way down to APUs."


The key words are that they were purchased at retail. I'm assuming there are plenty of products that already support this; it's just about time manufacturers start tagging it on their monitors, like "Haswell ready".
 

houldendub

Distinguished
Dec 19, 2011
470
0
18,960
Laptop screens work differently than normal monitors. Normal monitors have scaler ASICs in them, whereas laptop monitors have a much more direct connection with the laptop's internals, with weird connectivity methods like LVDS or Embedded DisplayPort, meaning yeah, some laptops may have the correct functionality, but very few at most, and zero normal desktop monitors have the functionality, which is why Nvidia made the G-sync module in the first place. Did you not think it weird that a graphics powerhouse showed off their fancy new tech on low budget, low powered laptops?

When asked if a scaler ASIC with variable refresh capability exists, Tom Petersen from Nvidia at CES had this to say: "We would know."
 

xenol

Distinguished
Jun 18, 2008
216
0
18,680
While AMD has been "innovative" as people say, the problem is they don't seem to push it very well. Or perhaps they push the right thing at the wrong time. Maybe this is the right thing to push at the right time.

However, AMD still has a problem. It's targeting a small portion of people that actually care. NVIDIA's been venturing out to larger markets and making them care. Sure a lot of what they do is proprietary, but the point is, they made people care. And that's how you do business.
 

warezme

Distinguished
Dec 18, 2006
2,450
56
19,890
I like Nvidia for my high-end gaming cards because it doesn't matter if I get one high-end card or two; I can depend on the hardware and drivers working well together fairly painlessly. I have also enjoyed added stuff like PhysX and 3D gaming as options. That said, I don't support G-Sync as a physical add-on. It always seemed to me there was a way to do something similar in software, but I had no proof. I also don't support G-Sync because the main benefit, as I see it, is for low-end GPUs that can't push high framerates. Someone with a fast GPU and CPU and two or three high-end monitors isn't really interested in G-Sync. It should have been pushed to the Xbox crowd.
 