660 or 760

whiteboy523

Honorable
Jun 23, 2012
I am upgrading from an MSI 6670 and I'm debating between the EVGA GeForce GTX 660 3GB GDDR5 and the GIGABYTE GeForce GTX 760 4GB, but I hear that the 760 can't actually make use of the 4GB and that it's just a gimmick to sell the card. If anyone could answer my question I would greatly appreciate it.
 
Solution
4GB versions of the 760 do exist and work, but for a card at its performance level, the extra 2GB is hardly needed.

For the price of a normal 760 ($250-$260) you can get the AMD equivalent, the 7950, which has 3GB of VRAM.
If you want 4GB of VRAM on the Nvidia side, it will cost upwards of $300.

If you're going to be playing BF4 or any future Frostbite-engine games, they favor AMD cards, and Mantle will likely improve their performance further when it comes out. http://goo.gl/nak5JR

On the other hand, if you play many PhysX-enabled games or do a lot of encoding, you'll want to go with Nvidia.

EzioAs

Distinguished
If you're gaming at 1080p, don't bother with the 4GB version; just get the 2GB GTX 760. If you still need more than 2GB of VRAM (for whatever reason), the other option at that price point is the Radeon HD 7950.
 
Get either the EVGA 600W ($40 after MIR) or the Antec HCG 620M ($50 after MIR).

I would suggest a 7950 with 3GB of RAM; Sapphire has a good one selling for $10 more than the cheapest 760. That said, the cheapest 760 (which is factory overclocked) is slightly faster than a boost-clocked 7950.
 