Nvidia’s FleX Technology Used In Killing Floor 2



It is not Nvidia's job to make its products work with AMD's GPUs, nor is it AMD's job to make its products work with Nvidia's GPUs. AMD doesn't sit and code TressFX or Mantle to make sure they work right on Nvidia's GPUs.

It has long been that way with GPUs. I remember when 3dfx was king and had multiple technologies the others did not, which only worked on a 3dfx Voodoo GPU.
 


When I buy an Nvidia card or an AMD card, I'm not paying them to make proprietary software that won't support my Windows/DirectX system. They should not be on the game-engine side of things, paying or enticing game developers to use code that won't work on their competitor's hardware. You mentioned 3dfx from ages ago; I'm talking about the Windows 7 operating system, which supports DirectX. I buy games that support Windows and DirectX, not Nvidia or AMD. They are both evil companies that just want your money, but AMD has been way more open. Nvidia wants to turn the PC into their console, and I don't want them to. You want a console? Go buy one.
 


The problem is not that it's not Nvidia's job to make its software work on AMD's hardware, but rather that Nvidia actively prevents its products from working on AMD's hardware with licensing restrictions. When AMD releases a new technology, it's open, so anyone can take it and make improvements; AMD doesn't actively code for competitors' hardware, but at least AMD gives them the opportunity to improve things themselves. When AMD released TressFX, it sucked on Nvidia's GPUs, but Nvidia was able to make optimizations and the performance hit was reduced. When Nvidia released a similar technology, HairWorks, it sucked on both Nvidia and AMD, but disproportionately worse on the compute monster that is GCN, which is nonsensical unless it was intentionally designed to perform poorly on the competition's hardware (which isn't unheard of). And due to licensing restrictions, neither AMD nor game devs can work on improving performance on AMD hardware.

Also, Nvidia's refusal to adopt open standards screws over the consumer as well. I wanted variable refresh rate, and I drank the G-Sync Kool-Aid, but I need multiple inputs and I need a reasonably priced monitor, not something that costs more than my entire computer. AMD introduced FreeSync, which uses open standards, so there is no reason for Nvidia not to implement it, especially with FreeSync monitors shipping faster, cheaper, and with more variety than G-Sync monitors. However, they are not, which means as an Nvidia customer, I'm screwed out of variable refresh rates because reasons.

With these practices, NVidia is hurting and stifling the industry.
 

As for your complaints about G-Sync you're leaving out an important point.

FreeSync fails horribly at any frame rate outside the panel's supported refresh range. FreeSync can only match the refresh rate to the frame rate within that range. If the video card puts out a higher or lower frame rate than the monitor supports, FreeSync turns off, which means you face the exact same problems you have now.
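In rough pseudocode, and assuming a hypothetical 30-144Hz panel (example numbers only, not any real spec), the behavior being described looks something like this:

```python
# Sketch of the claim above: adaptive sync only tracks the frame rate
# inside the panel's supported window; outside it, the monitor falls
# back to a fixed refresh. Example numbers, not a real panel spec.
PANEL_MIN_HZ = 30
PANEL_MAX_HZ = 144

def effective_refresh(fps):
    """Return (refresh rate the panel runs at, adaptive sync active?)."""
    if PANEL_MIN_HZ <= fps <= PANEL_MAX_HZ:
        return fps, True              # refresh follows the frame rate
    return PANEL_MAX_HZ, False        # sync off, back to fixed refresh

print(effective_refresh(90))   # (90, True)
print(effective_refresh(20))   # (144, False)
```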

So if you're playing a game on a slow computer where you get super low frame rates, then FreeSync is OK... at least until you hit a lull in the action and your FPS jumps, or a spike in the action and your FPS plummets. Likewise, if you're gaming on a fast computer the problem is even worse, because you'll output FPS above the panel's refresh rate even more often.

Nvidia's G-Sync solution doesn't suffer from this fundamental flaw, which is why it's a superior technology and why they developed an in-house solution that requires an assist from the GPU. Now, if you want to talk about pricing: you can have a FreeSync solution that avoids this flaw if you buy a FreeSync panel with a high enough refresh rate. That's a bit of a problem, because there aren't any 240Hz or higher FreeSync panels, and even if they become available you're going to be paying more than you would for a G-Sync module.

FreeSync is just a stopgap by AMD until they develop a true competitor to G-Sync, and FreeSync is only free because it wasn't developed by AMD; it simply utilizes existing video standards. When AMD develops their own solution it's going to require the same overhead as G-Sync, and maybe at that point there will either be two dedicated modules in gaming panels, or Nvidia and AMD will converge on a common module technology.

At the end of the day this will be just like Crossfire and SLI. Originally motherboard chipsets could only handle one or the other (funny, that sounds like panel modules). You bought a motherboard for either Crossfire OR SLI which locked you into a specific GPU type (funny, that sounds like Free/G-Sync panels). However, as the technology matured motherboard chipsets evolved in order to handle both multi-GPU specs. In the future you'll be buying an 'Adaptive Sync' panel that supports both AMD and nVidia GPUs.
 
AMD stall/pause code included at no cost.

That's interesting... do you currently own a specific AMD GPU that you're using to play KF2? I'm playing on a 280X backed by a heavily overclocked i5 2500K, and I have no pausing or stalling at all with the graphics cranked all the way up. KF2 is sensitive to lag, though, so I wonder if you're confusing GPU issues with your internet connection...
 
...FreeSync fails horribly at any frame rate outside the panel's supported refresh range. FreeSync can only match the refresh rate to the frame rate within that range. If the video card puts out a higher or lower frame rate than the monitor supports, FreeSync turns off, which means you face the exact same problems you have now...

OK. VESA adaptive sync (around since 2009) does not fail horribly. The Adaptive-Sync protocol is in the DisplayPort standard, and Nvidia has said they will not support it. It would be easy for them to do, but you can guess why they won't. There is already a 30-144Hz VESA adaptive sync display, and VESA adaptive sync supports down to something like 9Hz. Yes, if your game drops below 30 FPS on this new display... wait, why are you playing games at 20-ish FPS? Cleaning up some horizontal tearing is not going to make that a fun experience. G-Sync tries to help here by passing the same frame over again, like V-Sync already does. With VESA adaptive sync you can use it with or without V-Sync turned on, so if you go below 30Hz, V-Sync kicks in and your 10-20 FPS slideshow has no horizontal tearing. For the high end, or going over the display's refresh rate, again you can choose to use V-Sync or not, and most new cards have frame limiting, so the card isn't generating 300 FPS on your 60, 120, or 144Hz display and wasting electricity. Besides, horizontal tearing is not normally an issue if you're consistently above your display's refresh rate.
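As a rough sketch of that out-of-range handling (again assuming an example 30-144Hz window, with V-Sync and a frame cap treated as plain user options, nothing from the DisplayPort spec):

```python
# Rough sketch of the fallbacks described above: adaptive sync inside
# the panel window, an optional V-Sync fallback below it, and a frame
# cap above it so the card isn't pushing 300 FPS for nothing.
# The 30-144 window is an example, not a spec value.
PANEL_MIN_HZ, PANEL_MAX_HZ = 30, 144

def handle_frame_rate(fps, vsync_enabled=True, cap_enabled=True):
    if fps > PANEL_MAX_HZ:
        return "frame limiter holds output at the panel max" if cap_enabled \
            else "fixed refresh, tearing possible"
    if fps >= PANEL_MIN_HZ:
        return "adaptive sync: refresh tracks the frame rate"
    return "V-Sync fallback, no tearing" if vsync_enabled \
        else "fixed refresh, tearing possible"

for test_fps in (300, 90, 20):
    print(test_fps, "->", handle_frame_rate(test_fps))
```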

So again, thanks to Nvidia, you have to pick the video card you want and then worry about whether the display you want will support adaptive sync.
 

I think he was talking about the initial FreeSync monitors that skimped out / were half-baked, but he attributes that to the standard rather than to companies cutting corners.
 

I think you have a lack of understanding when it comes to Vsync and adaptive sync technologies.

First off, Vsync works at a SINGLE frequency. So in your example of a 30-144Hz FreeSync panel, FreeSync would 'fail' or be turned off for anything below 30 or anything above 144. If you have Vsync on, then your monitor attempts to refresh at a SINGLE frequency just like old monitors... it still can't handle variable frequencies. If it could, then we wouldn't need adaptive sync technology. So yes, with FreeSync you have a fail condition for any FPS that doesn't fall inside the FreeSync panel's refresh range. Furthermore, anyone who seriously games NEVER uses Vsync, because part of Vsync is delaying/buffering frames, which obviously leads to display lag and missed frames, the prime evil for gaming panels.

So if you're seriously gaming then FreeSync is a hard fail outside the panel range. If you're a casual gamer then FreeSync is a soft fail and just turns into a standard panel outside the FreeSync range.

You also have a lack of knowledge about tearing as well. Tearing will ALWAYS happen if you're not running Vsync and your FPS is higher than the refresh rate of the panel. It's easy to understand: if you're at 100 FPS and your panel is at 60Hz, then in one panel refresh you draw one full frame and are 67% of the way through drawing the second when your monitor refreshes for a new screen (aka 'tearing' the screen, because you didn't finish a draw). You won't see this on static screens because the image isn't changing, but it's still tearing, and you will notice it in any kind of fast-paced game. The whole point of Vsync is essentially to buffer frames so that only complete frames are drawn, at no more than the max refresh rate. This mostly works fine, except that the buffering obviously introduces delay, which causes lag for a gamer.
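The 100 FPS on 60Hz arithmetic checks out; a quick worked version, nothing more:

```python
# Worked version of the 100 FPS on a 60 Hz panel example above.
def frames_per_refresh(fps, refresh_hz):
    """How many frames the GPU produces in one refresh interval."""
    return fps / refresh_hz

progress = frames_per_refresh(100, 60)      # ~1.67 frames per refresh
whole, partial = divmod(progress, 1.0)
print(f"{int(whole)} full frame plus {partial:.0%} of the next")
# -> "1 full frame plus 67% of the next", so the scan-out catches a torn frame
```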

FreeSync and G-Sync essentially operate the same inside the panel's 'hardware band'. However, FreeSync panels turn into a 'dumb panel' outside that range. G-Sync, on the other hand, can perform additional refresh-rate tricks, like driving the panel at integer multiples of the GPU's frame rate, much like TV refresh rates are designed as even multiples of the refresh rates of all the potential connected devices (which is why they don't stutter/tear). The other shortfall is that FreeSync will always be chained to the VESA standard, which means that even if AMD has the technological capability, they're locked into the DisplayPort 1.2a refresh range until VESA decides to make a change, which can quite literally take years. If you don't believe me, just ask HDMI and DP fans how long a standard sits around before it's officially approved. In the case of G-Sync, Nvidia can iterate the hardware as fast as they want.

So yes, it's a format war. People who want the maximum GPU power already use nVidia hardware even though AMD can be more cost effective. Do you really think those people are going to balk at an extra $50-$100 for a G-Sync panel that will last a decade when they're spending that premium annually on one or more GPUs?
 



" I'm not paying them to make proprietary software"

News flash: yes, you are. You want one company to do all the R&D and then give you a choice not to pay them for it? lol. This isn't a DirectX thing, it's a proprietary firmware feature on the video card.
 

...I think you have a lack of understanding when it comes to Vsync and adaptive sync technologies...

Instead of calling out others for not researching, why don't you stop being a hypocrite and actually do what you advise?

http://www.techspot.com/review/978-amd-freesync/

FreeSync beats G-Sync in every way and is the only tech that can actually make it to every monitor. G-Sync will never make it to budget builds due to the cost, and that alone is enough to give FreeSync a huge advantage (aside from its obvious technical ones).
 


I don't see that as a problem with FreeSync, but rather a problem with the panel. Any display technology is going to fail when operating outside of the panel's parameters, including G-Sync. Just because my computer is able to pump out 500 FPS doesn't mean the panel is going to keep up, G-Sync or otherwise.

So if you're playing a game on a slow computer where you get super low frame rates, then FreeSync is OK... at least until you hit a lull in the action and your FPS jumps, or a spike in the action and your FPS plummets. Likewise, if you're gaming on a fast computer the problem is even worse, because you'll output FPS above the panel's refresh rate even more often.

Nvidia's G-Sync solution doesn't suffer from this fundamental flaw, which is why it's a superior technology and why they developed an in-house solution that requires an assist from the GPU. Now, if you want to talk about pricing: you can have a FreeSync solution that avoids this flaw if you buy a FreeSync panel with a high enough refresh rate. That's a bit of a problem, because there aren't any 240Hz or higher FreeSync panels, and even if they become available you're going to be paying more than you would for a G-Sync module.

I'm not sure if you know this, but there is such a thing as capping frame rates. I do it all the time so my GPU doesn't cook and noise stays minimal, especially when my frame rate exceeds my refresh rate. I personally haven't played on a 144Hz monitor, but there aren't too many games that run at that frame rate regardless of hardware configuration, much less past it. Not only that, but with high-refresh-rate monitors screen tearing becomes significantly less noticeable, reducing the value of both FreeSync and G-Sync. If you have adaptive sync and are running high frame rates, turn up the visual settings. Frame rates too low? Turn down settings. The bread and butter of adaptive sync lies within the 40-75Hz range. FreeSync can support 9Hz, but the issue there is LCD technology: if an image stays on screen too long without being refreshed, there can be a strobing or fading effect.
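For what it's worth, a frame cap is about as simple as it sounds; here's a minimal sleep-based sketch (the render_frame callback is a made-up stand-in, and real limiters live in the driver or engine):

```python
# Minimal sleep-based frame limiter sketch. Real games use the driver
# or engine limiter; this just shows the idea of idling instead of
# rendering frames the panel will never show.
import time

def run_capped(render_frame, cap_fps=60.0, frames=60):
    frame_budget = 1.0 / cap_fps
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        spare = frame_budget - (time.perf_counter() - start)
        if spare > 0:
            time.sleep(spare)   # idle instead of pushing extra FPS

run_capped(lambda: None)        # ~1 second of do-nothing 'frames' at 60 FPS
```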

FreeSync is just a stopgap by AMD until they develop a true competitor to G-Sync, and FreeSync is only free because it wasn't developed by AMD; it simply utilizes existing video standards. When AMD develops their own solution it's going to require the same overhead as G-Sync, and maybe at that point there will either be two dedicated modules in gaming panels, or Nvidia and AMD will converge on a common module technology.

FreeSync seems to be working just fine. I'm not sure what gave you the impression that AMD is working on another adaptive sync technology, but I doubt it.

At the end of the day this will be just like Crossfire and SLI. Originally motherboard chipsets could only handle one or the other (funny, that sounds like panel modules). You bought a motherboard for either Crossfire OR SLI which locked you into a specific GPU type (funny, that sounds like Free/G-Sync panels). However, as the technology matured motherboard chipsets evolved in order to handle both multi-GPU specs. In the future you'll be buying an 'Adaptive Sync' panel that supports both AMD and nVidia GPUs.
The problem is that G-Sync actually limits feature sets, such as HDMI or DVI inputs, and it spikes manufacturing costs, whereas FreeSync does practically the same thing, doesn't restrict inputs, and doesn't cost more to implement. That convergence could happen if Nvidia weren't so stubborn and greedy: it wouldn't cost them squat to support FreeSync on their GPUs, but they won't, because it would likely kill G-Sync. In fact, with Mobile G-Sync they are relying on eDP standards instead of a separate module to get adaptive sync. I wonder where I heard this before...
 
Why is everyone so against Nvidia for trying to run a business? It is the nature of a business to carve out their own niche that makes them more desirable than a competitor. So how about this: imagine a world without Nvidia? Look good? Thought not.
 
It's very understandable that people, including me, are not so happy about many of Nvidia's decisions. I'd like to buy a new video card based on price/performance, not on brand. If you invest in G-Sync, PhysX, GameWorks, etc., you are bound to buy only Nvidia video cards even if they provide worse price/performance than AMD. Now, this is still a free choice I can make, but it just plain sucks that Nvidia is even refusing to provide support for open standards. At least for now; as with CUDA, they had to adopt OpenCL and DirectX compute as well, because ignoring them was not an option. If FreeSync grows, they can always jump on board.

The problem is that the other way around is just impossible. If AMD wants to do G-Sync, it's impossible; if AMD wants to do PhysX acceleration, not possible. There is no technical reason, just a restriction put there by Nvidia. Nvidia is even so childish that they disabled PhysX hardware acceleration for people running a GeForce next to their AMD GPU, while it worked before.

So it's really simple: yes, Nvidia is allowed to do what they do, and I am free to choose otherwise. But if too many people jump aboard these Nvidia-only technologies, Nvidia gets a monopoly, which is very unhealthy for everyone. Closed tech is not the best option in the long run, especially for the buyers. Many people prefer using open technology like OpenCL/DirectCompute/FreeSync, but Nvidia is blocking any open tech they have a closed alternative for, and so I still have only one possible choice: get AMD graphics cards. In the long run Nvidia will support FreeSync, as G-Sync is too limited and restrictive even if it is the better tech (which I doubt).
 


It's one thing to run a business; it's another to be anticompetitive and anti-consumer. If Nvidia at least let developers optimize their games on all hardware platforms, fine. But intentionally crippling performance on competitors' and even your own older hardware, bribing developers to use your broken APIs, and then contractually forbidding the developer from working with anyone else to optimize their own games hurts the industry and hurts consumers; it's just bad business all around, and that's Nvidia's GameWorks in a nutshell. And their shady practices don't stop at GameWorks either: outright lying about card specs, disabling features that were used as selling points (overclocking on mobile chips), refusing to adopt open standards in order to force their own proprietary implementation but then turning around and using said standards in the mobile market (G-Sync vs FreeSync vs Mobile G-Sync), intentionally crippling performance on your own previous-generation hardware to force your customers to upgrade, etc. Nvidia has been involved in a lot of shady practices lately, and gamers should let Nvidia know how they feel with their wallets, which Nvidia has demonstrated is the only language they understand.
 

Your article is also bogus, as the author doesn't understand how G-Sync works below a panel's refresh range. If a so-called 'tech' article doesn't even understand the technology, how can it be considered a viable resource on the topic? Additionally, your article never points out a SINGLE technical advantage of FreeSync; it only says that it performs like G-Sync. Matching in most areas is not the definition of a superior technology. For crying out loud, Nvidia created a 'FreeSync' solution and upgraded it to G-Sync because it had limitations. Are you seriously going to link a cherry-picked review that you didn't even read?

You say G-Sync will never make it to budget builds, when it is in fact the ONLY adaptive sync technology with a solution below the panel's refresh range. FreeSync will ALWAYS have stuttering or tearing when FPS drops below the panel's refresh range, so in reality G-Sync is the ONLY technology that will remove stuttering or tearing in low-end budget builds. Not to mention, G-Sync works with video cards that are years old, while FreeSync needs brand-new cards.
 


Not every technology fails when operating outside the default panel range. Check out the difference between FreeSync and G-Sync as explained by actual science...
http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ

G-Sync doesn't default to 'dumb panel' mode when FPS drops below the panel range; instead it continues to provide a smooth output at any FPS, all the way down to zero. FreeSync is incapable of this feat because it adheres to the VESA standards. Yes, you can cap FPS, but that's a software solution, not a hardware solution, and FPS caps only help at values greater than the max panel refresh rate anyway, which you've correctly stated isn't the major issue.

I think you've got the wrong solution in tweaking your game, though. The whole point of adaptive sync technology is to run your game at max settings so that it looks as awesome as possible, while tearing and stuttering are limited by a hardware solution, to achieve buttery-smooth, beautiful gaming.

You're also a tad off on the bread-and-butter range. FreeSync's bread and butter is the range of the panel; that 40-75Hz number came from the first FreeSync panel that was released, and newer FreeSync panels are good up to 144Hz. However, below 40Hz there are still flickering issues (strobing/fading) with FreeSync, as you mentioned, which is why the theoretical 9Hz is just that for FreeSync. This sub-40Hz issue is the exact reason Nvidia created G-Sync. Only G-Sync has the technology to get over this LCD hurdle. It does this by recognizing the FPS and then driving the panel at an appropriate multiple within the panel's normal range.

Example: if the FPS drops to 29, G-Sync recognizes this is outside the ideal range and drives the panel at 58Hz (2x), which results in a perfect ratio that eliminates stuttering and tearing. If the FPS further drops to 14, then G-Sync drives the panel at 42Hz (3x), another perfect ratio.
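As a rough sketch of that multiplier selection (the 30-144Hz window here is an assumed example, not anything from Nvidia's spec):

```python
# Sketch of the frame-multiplying idea: pick the smallest integer
# multiplier that lifts the effective refresh back into the panel's
# supported window. Example window, not a G-Sync spec value.
PANEL_MIN_HZ = 30
PANEL_MAX_HZ = 144

def driven_refresh(fps):
    multiplier = 1
    while fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1                    # redraw each frame one more time
    return fps * multiplier, multiplier

print(driven_refresh(29))   # (58, 2)
print(driven_refresh(14))   # (42, 3)
```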

This implementation is always going to be superior to FreeSync, and it's what 99% of people don't understand when they say FreeSync is just as good as G-Sync. FreeSync simply cannot do this; for anything outside its normal range, FreeSync just shuts everything down and you get a normal 'dumb panel', complete with stuttering or tearing.

FreeSync is obviously better than a non-sync panel; however, it IS inferior to G-Sync below 40Hz. If AMD wants sync technology below 40Hz, they'll have to come up with an implementation similar to Nvidia's.

As for the mobile implementation, it already existed on laptops with eDP because performance wasn't an issue and variable refresh rate was implemented as a power-saving feature. In fact, AMD first demoed FreeSync on a laptop panel (CES 2014) because they had to do nothing to make it work. The problem was always with normal LCD panels. Let's also not forget that it was Nvidia who pushed this technology; AMD rushed out 'FreeSync' on that laptop panel at CES as a panicked response because they weren't pushing and researching adaptive sync technology while Nvidia was.

TL;DR: FreeSync is good, G-Sync is better. The only question is whether you want to pay the slight premium for the best hardware, but that's nothing new.
 


I don't know, I think many people will vouch for Tech Spot over your opinion.

Oh, and since you failed to read the whole page before telling me I don't know how to read:
http://www.techspot.com/review/978-amd-freesync/page4.html

Why don't you read that whole page and then come back and tell me how much of a fool you just made of yourself.
 


As a GTX 970 owner, you sir are awesome. Thank you for fighting against bad business practices.
 
 

You can vouch for TechSpot, but at the very least the author doesn't understand how G-Sync works. That's like saying you think A is better than B but you can't tell me anything about B, yet you're still sure A is better 'just because'. In the real world we call that an opinion, not a fact.

I'll quote some of your last page statements...

"The company has delivered on their promises to create a cheaper, more flexible, open standard for variable refresh, which compared to Nvidia's closed G-Sync implementation, makes it the better choice for gamers."
He says FreeSync is better because it's cheaper and open, NOT because it's superior technology. With that philosophy we're all ordering cheap knock-off products from China, right?

"Wherever possible, the display will refresh itself at the instant a frame from the GPU has finished rendering, removing stuttering, tearing and general jank that is present from fixed-refresh solutions."
He says 'wherever possible', which for FreeSync means only the approved panel range, and that is NOT a limit of G-Sync. He doesn't include the fact that G-Sync's 'wherever possible' range goes all the way down to 0 FPS.

"G-Sync's choice to force v-sync on introduces stutter and an extra performance hit below the minimum refresh, which can be resolved on FreeSync by disabling v-sync."
Again, proof that he doesn't understand how G-Sync even works at low refresh rates, because G-Sync doesn't have Vsync on at minimum refresh rates; it uses unique multipliers to drive the panel at perfect ratios. He just assumed that because FreeSync goes to 'dumb panel' mode sub-40Hz, G-Sync does the same. The reason FreeSync has to give you the option of Vsync on/off is that they want you to be able to eliminate stuttering OR tearing at minimum rates, something that doesn't even exist as a problem for G-Sync at those same rates.

This whole article reads like a puff piece, an advertisement for AMD. Now granted, it is a FreeSync review and not a real comparison, but that isn't surprising either. A look at the author's article history shows he primarily covers AMD and smartphones; to top it off, he's a newer blogger on TechSpot, certainly not someone with a long history of hard-hitting, factual, unbiased reviews. I also find it funny that you're almost the same: you won't take the time to understand the proven science ( http://www.pcper.com/reviews/Graphics-Cards/Dissecting-G-Sync-and-FreeSync-How-Technologies-Differ ) and keep saying "because TechSpot!".

Again you're making things up.
"G-Sync works differently, doubling the refresh rate and inserting duplicate frames starting at around the 37 FPS mark. This continues until the game frame rate hits 19/20 FPS where a third frame is inserted and the refresh rate is increased again. The result is the dashed line representing the effective experience of G-Sync."
It clearly states it can handle more than half the minimum panel rate, and if you understood the science you would have also seen that in the oscilloscope graphs.

You only have ONE right answer here, and that is to say you want FreeSync as your adaptive sync solution because it's cheaper. That is your only answer for FreeSync, because it's proven to be an inferior technology. There isn't anything wrong with that; it's the same reason people build gaming computers that don't have Intel chips, because they want to save money. Just please stop waving around a red herring about FreeSync being the better technology.
 
If I have a 144Hz panel and my game drops down to 20 FPS, how is screen tearing even an issue at that point? My understanding is that tearing is a problem when the game's FPS exceeds the panel's refresh rate, not the other way around.
And if you do use Vsync at that point, are you actually waiting on frames when your monitor is refreshing ~7 times for every new frame it displays?
 