Christopher Shaffer
Honorable
While I agree with the top post that there is some misinformation here, your post is just as misinformed. V-Sync works by dropping the frame rate to half the refresh rate the moment the GPU can't keep pace with a full refresh; there's no two ways about it. Adaptive V-Sync (Nvidia) is a little more "adaptive" but still has similar issues.
G-Sync is really the only solution ATM.
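To put numbers on that half-rate behavior: with classic double-buffered V-Sync, a finished frame can only be shown on a vblank, so a frame that misses one refresh waits for the next, and the effective frame rate snaps to 1/2, 1/3, and so on of the refresh rate. Here's a rough sketch of that model in Python (my own simplified illustration, not something from the article):

import math

# Simplified double-buffered V-Sync model: a frame that misses a vblank
# has to wait for the next one, so FPS snaps to refresh/2, refresh/3, ...
def effective_fps(render_ms, refresh_hz=60):
    vblank_ms = 1000.0 / refresh_hz                        # ~16.7 ms at 60 Hz
    intervals = max(1, math.ceil(render_ms / vblank_ms))   # vblanks each frame occupies
    return refresh_hz / intervals

print(effective_fps(15))   # 60.0 -> GPU keeps up, full refresh
print(effective_fps(17))   # 30.0 -> barely misses, drops straight to half
print(effective_fps(34))   # 20.0 -> misses two vblanks, drops to a third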
That said, there is a TON of outdated, incorrect and just plain WRONG information about VRAM. For example, you claim VRAM size doesn't matter, yet point to the (correct) fact that texture size and the number of on-screen textures are limited by VRAM. That alone is enough to tell you that, depending on texture depth, your FPS *and* input lag can be affected by your VRAM. If your VRAM is not large or fast enough to handle the quality setting you've chosen, your FPS and, ultimately, your response time are negatively affected. This is science and math, not opinion.
I will also state that 1GB vs. 2GB vs. 3GB vs. 4GB on GPUs is situational: it depends entirely on the game, resolution, texture depth, etc., and it makes a massive difference in games that require a high FPS (such as BF4) and games that have a lot going on on-screen all the time (like AC4). I was able to hit a solid 100 FPS on a single GTX 770 4GB while an online buddy was only able to hit about 85 FPS with the EXACT same configuration, except that he had the 2GB version of my same card (Gigabyte GTX 770 OC 4GB / 2GB). The difference is that the game will use around 70% of your available VRAM: if you have 2GB, that's about 1.4GB; if you have 4GB, it's obviously twice that. To say that X amount of VRAM is not necessary completely ignores modern game architecture.
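For anyone who wants to sanity-check that arithmetic (the ~70% figure is my own rough observation from monitoring, not a published spec), it's a one-liner:

# Rough working-set estimate using the ~70% observation above (an assumption
# from my own monitoring, not a number from any vendor or game).
def usable_vram_gb(card_vram_gb, usage_ratio=0.70):
    return card_vram_gb * usage_ratio

for size_gb in (1, 2, 3, 4):
    print(f"{size_gb} GB card -> ~{usable_vram_gb(size_gb):.1f} GB the game will actually touch")
# 2 GB -> ~1.4 GB, 4 GB -> ~2.8 GB; only the 4 GB card has headroom for the
# 2.6 GB I see BF4 pull at Ultra (more on that below).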
Once I got about halfway through this article, I checked the date again to make sure it wasn't 2010. I'm still surprised it's not.
Those of you who keep coming in here and saying that 4GB or whatever above 2GB makes no difference are ignoring the obvious: I can monitor a game via Afterburner and verify the exact amount of VRAM used at any time. During busy times in BF4 it regularly uses 2.6GB of VRAM. And yes, I am using SLI, and yes I'm aware that it's still effectively 4GB. And no, Afterburner isn't stupid and somehow gets the math wrong. The game simply has a lot going on under Ultra settings.
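If you don't trust Afterburner's overlay, you can read the same counter straight from the driver. Here's a minimal sketch, assuming NVIDIA's NVML Python bindings (pynvml) are installed; it's just another way to read the number, not what Afterburner itself does under the hood:

import time
import pynvml  # NVIDIA Management Library bindings (pip install pynvml)

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU; use the other index for the second SLI card

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)   # .used / .total are in bytes
        print(f"VRAM used: {mem.used / 1024**3:.2f} GB of {mem.total / 1024**3:.2f} GB")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()

Leave it running during a busy BF4 round and you'll see the same sort of 2.5GB+ readings Afterburner reports.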
The games tested and the comparisons made are virtually irrelevant. Skyrim? Hitman? Give me a break. Skyrim doesn't even use DX11. A comparison of the average VRAM used by Steam users? WTF does that even matter? So I don't need more VRAM because a bunch of other people using Steam won't pony up the cash for it?
That's what's called a logical fallacy: you're treating one statistic as if it determined another when there's no relationship between the two beyond coincidence. It's the equivalent of saying I don't need size 12 shoes because the average shoe size is 10. It really depends on my feet, just like the right amount of VRAM depends on the use case, not on the average user.
The average user can't hit 144FPS @1080p on Ultra settings on a modern game, either.