GTX 670 or HD 7970 on a low-end CPU & RAM


kalash156
Honorable
Jan 29, 2013
I've always upgraded my graphics cards to just "good enough" cards, so I finally decided to get a good one. I'm pretty much decided on either a GTX 670 or a Radeon HD 7970. The thing is, I'm running a Sandy Bridge Core i3 (3.1 GHz) CPU and 8 GB of 1333 MHz RAM. I know there's going to be some bottlenecking as far as frame rate goes, so my question is: how much of a frame rate drop should I expect? And which of the two cards would you recommend (or other cards within the price range)?

I've narrowed it down to:
This Gigabyte GTX 670 Windforce
And this XFX Radeon HD 7970 Black Edition

A few points to note: I don't particularly care about PhysX; I just want to be able to play most games at max settings with a nice frame rate. I will be playing on a single 27-inch 1080p display. I might overclock the card if I need to, but if the performance is there to begin with, I'll leave it stock. I will not be using the card for any professional video/graphics editing or modeling software. I wouldn't mind sacrificing a bit of performance for better cooling (hence my interest in that Windforce card). My budget is around $450. The games I mostly play are: Skyrim (with a few graphics mods), Sleeping Dogs, Saints Row 3, Borderlands 2, Batman: Arkham City, Far Cry 3, Civ 5, and Anno 2070.

Other specs are:
Motherboard: ASRock Z75 Pro3
PSU: Thermaltake Toughpower 675 W
Storage: Samsung 840 Series SSD (system) & 1 TB HDD (media)
Current graphics card: Sapphire Radeon HD 6770 (overclocked to core = 960 MHz; memory = 1330 MHz)

I'll probably upgrade my RAM in the coming weeks, maybe to 8 or 16 GB of 1866 MHz. As for the CPU, I figured I'd wait to see what Intel has in store with their Haswell chips. I don't really feel that I need to upgrade, since all I use my PC for is web browsing and gaming. I know Intel is primarily focused on laptops with Haswell, but I think they said the new line of desktop CPUs will be out by the end of the year. Correct me if I'm wrong on that one.

Any suggestions or recommendations are appreciated :)
 
Solution
CPU:
Yeah, I think the Ivy Bridge situation will be quite attractive; I would imagine the pricing may get a little cheaper, though it didn't with the last generation. The Sandy Bridge CPUs held their ground price-wise because of their overclockability.

PhysX:
Fair enough; I just thought I'd throw it in there since you brought up some major Nvidia-backed titles. In Batman I feel it's more subtle, but it adds to things here and there. With Borderlands 2 you can tell a little more, with guns producing different effects against objects, but your opinion is fair.

You would be bottlenecked more by the fact that you have a dual core versus a quad core. I'm not sure how much FPS would be lost; in games that benefit more from a quad core you could lose quite a...


In Borderlands 2, it does give enemies a slight advantage in some situations. For example, with PhysX (I mean fake PhysX :sarcastic:) on, the game has those flapping blankets and rags everywhere. A lot of the time those rags are close to cover positions, so whenever I try to shoot enemies from behind those rags I can't see where I'm shooting, but the enemies can see me perfectly. This was actually one of the reasons I stopped using PhysX in BL2.

MatildaPersson
Visual quality is subjective. In Arkham City, PhysX does make the game look much better because it "interacts" with objects and scenery already present in the game. In Borderlands 2, aside from those rags, it simply spawns new particles out of thin air when you shoot at things, which to me makes it look tacky rather than "good". Just because a game features some kind of extra visual effect doesn't mean that effect will make the game look any better. Case in point: the grain filter in Grand Theft Auto IV. It's an extra visual effect, but it made the game look like crap. Very subjective, again.
 



Flapping blankets? Really? Like, you actually typed out that paragraph for real?

I don't even...
 

If someone considers "flapping blankets" an issue, they.... I don't even... How can you read that and not laugh?
 
Fake PhysX? Runs like crap?

Keep in mind, the difference between running it on the GPU and the CPU is just alleviating the load from one component to the other. I can run PhysX on high on my 6850 just fine. It just gets really distracting sometimes.
 


Thank you. And I didn't mean to sound like a douche about it; you've actually been helpful. I just didn't like the way BL2 looked with PhysX. I think I'll go with that Gigabyte GHz card I've found.


MatildaPersson
I'm new to these forums, but I see I've found my first troll. Well done! Carry on preaching about PhysX.
 

I pray to God they don't put flapping blankets in any future games. You might not be able to play! Oh the horror.
 
You will find them all the time, Kalash, but I do promise you that there are those on the forums who want to see people get what they are after and see them helped. I'm someone who tries to be unbiased; I learned in a speech class I took in school that the moment you show bias, you lose your debate/argument.
 

No. That's not how PhysX works. It's specifically programmed to function on a GPU architecture. CPUs handle it poorly. It is not simply "alleviating the load from one component to another." Aside from that, Nvidia owns the rights to it and has no intention in the world of making it accessible outside of their hardware.
 
The question in the title is very adequately and pointedly addressed here:

http://www.tomshardware.com/reviews/fx-8350-core-i7-3770k-gaming-bottleneck,3407-9.html

Our benchmark results have long shown that ATI's graphics architectures are more dependent on a strong processor than Nvidia's. As a result, we usually arm our test beds with high-end Intel CPUs when it comes time to benchmark high-end GPUs, sidestepping platform issues that might adversely affect results designed to isolate graphics performance.
 

Not exactly. That article takes into account only two processors, neither of which is the one I'm using, and it doesn't really address my original question. With that question I was hoping to get specifics on just how much it would affect gaming performance, in terms of FPS or other issues (random lag or graphical anomalies), compared to higher-end CPUs; or whether it would actually perform worse than my current setup of a Core i3-2100 and an overclocked HD 6770.
 
dude phsyx is the best thing it acctually increases your fps from 60>120fps and also it acctually makes the game look like 2000x better and it also gives you more maps and guns buy nvidia and u get free maps and candy its teh best 😀
 
I for one will agree that turning some graphical features off helps me play better and enjoy the game more. For example, motion blur is annoying and makes me feel sick; post-processing in some games simply blurs everything out to make textures seem nicer, but it gives me a headache because my eyes feel like they can't focus. Shadows and ambient occlusion make it somewhat harder to see enemies in online FPS games. And when playing at 1080p, more than 2x anti-aliasing makes no visual difference, so why burn extra energy and generate extra heat?

Just how I prefer to roll....
 