7970 vs 670


sitbar

Honorable
Oct 24, 2012
Hi guys, I need help choosing between the 7970 and the 670. I was going to get the 670, but seeing how the latest drivers for the 7970 have really ramped up its speeds in games like BF3, it's more even now. I need help choosing between:

MSI 670 Power Edition

Asus DirectCU II 670 non-TOP (will OC myself)

And the XFX Radeon HD 7970 Double D GHz Edition.
Any help will be appreciated.

Edit: Will be OCing all cards.
 
Arkham City and Borderlands 2 are PhysX games; playing with the feature enabled or not didn't make the gameplay any more immersive in my opinion. 3D is amazing, however, and that's a feature that should be pushed more as far as I'm concerned.
 
Well, your opinion on those games with PhysX is your own. In Borderlands 2 it changes every shot, and the environment changes more than in any other game that uses PhysX. Again, that's just your opinion, and with Batman you're dead wrong as well. 3D, I don't know; that's opinion too, I guess. You won't find many people who care much about 3D until monitors do it passively, without the need for glasses. I don't see it taking off; it's being forced into the market, but that doesn't make it in demand.

And getting back to the thread: as has been stated over and over, PhysX is not worthwhile to him unless he ever plans to pick up the games that use it. It has also been stated that Skyrim performs better, with mods and in general, on an AMD solution. And stuttering can happen on any card in these games, because most games don't have the demands these newer titles are producing.

For what he's doing, he will be happier with a 7970. Both are good cards, but for him it's the best choice. Leave opinions about visuals at the door.
 
A lot of speculation has been raised about that Tech Report article. It has appeared in other threads; it was pretty much the first of its kind.

Several people have noted that the AMD test bed was using a beta driver while the Nvidia test bed used an officially released driver, so questions were raised about the validity of the results.

I would like to see more tests conducted using officially released drivers for both cards. The findings may well hold merit, but the testing methodology definitely could have been better.
 
What would be cool, rwayne, and I'm not sure it's possible for a site to do, is a bench list of every card going back at least two generations (I think that would be the limit). Each time there's a driver update (release drivers only), they'd re-bench every card: the stock cards plus the top three or so non-stock cards.
 
:lol: +1
 
Yeah, the Vapor-X is expensive. How about the third card, the other Sapphire card? Couldn't I just OC it myself? And does the Vapor-X justify its much higher price over the other Sapphire card? I heard that the Dual-X (the other Sapphire card) is actually one step down from the Vapor-X, so that could help with OCing.
 

Well, I'm not gonna be overvolting the cards, so that's not a problem for me. Do you think the Vapor-X is still gonna be worth it if I don't overvolt it? Or should I just get the 7970 OC and OC it myself by adding a bit of voltage? How far do you think it can go? Will it perform just as well as a Vapor-X?
Sorry for all these questions :/
 

Ah dang, it is quite expensive :cry:
But I guess if it's worth it, I shall save a bit longer, which is hard since I'm in high school and have to work a job to get the money XD
 


What engine is used is irrelevant. You don't tell Gearbox which to use; they decide for themselves. They chose PhysX, and you can't prevent them (or any other developer) from doing the same in the future.



You obviously don't know how adaptive v-sync works. If you have basic v-sync enabled on a card that would otherwise be putting out 55fps, it'll put out 30fps (if a frame can't be rendered in 16ms for 60Hz, it's held over until the next refresh cycle). 55fps vs 30fps is an 83% performance gain.

EDIT: To clarify, I should have said "your argument for which physics modelling could have been used is irrelevant." I was on my way out, so it was a fast post.
 
So what you're saying is that basic v-sync, the way it works, produces a more abrupt framerate change than adaptive v-sync does.

Either way, though, it kind of gets away from the topic at hand, and for this guy's situation he has it figured out. I see what Sam's doing; he's trying to push aside the fanboyism that the AMD folks on the forums show, but it won't help this thread, from either side.
 
Ah, I didn't realise the guy had already chosen 🙂 I normally read all posts, but there's been a ton of new posts since I was last on here and I was just about to go out, so fast response. It's worth understanding adaptive v-sync though, because it's definitely not just a gimmick (the classic response of the AMD fanboy: if it's an NV feature AMD lacks, then it's a gimmick 🙂). I'll admit that 55 vs 30 is a worst-case scenario for basic v-sync: at 45fps the GeForce would only have a 50% lead, and at 35fps, a 16% lead. But it's well worth having over a 30fps limit.

Anyway, to explain: v-sync (as in basic v-sync) is a technique used to manage framerates, so that if your framerate exceeds your monitor's refresh rate (typically 60Hz), you won't experience screen tearing (Google screen tearing images and you'll see what I mean). All cards support basic v-sync, including Radeons. The drawback is that if your card drops below 60fps, even just slightly, the framerate will be significantly reduced by v-sync.

To sync with the monitor's refresh rate, a frame either has to be rendered in 16ms or less (for 60fps) or it's held over to the next refresh cycle. One frame over two refresh cycles (on a 60Hz monitor) results in 30fps. So even if your card would be capable of 50-55fps, you drop to 30fps! V-sync is really bad for performance if your GPU can't stay above 60fps, but it's needed if you sometimes go over 60fps and so need to prevent tearing.
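The "held over to the next refresh cycle" behaviour can be sketched in a few lines of Python. This is a conceptual model, not driver code; `vsynced_fps` is a made-up helper name, and a 60Hz display is assumed:

```python
# Sketch: how basic v-sync quantizes framerate on a 60Hz monitor.
# A frame must be ready within one ~16.67ms refresh interval; if it
# isn't, it waits for the next refresh, so the effective framerate
# snaps down to 60/2 = 30, 60/3 = 20, and so on.

import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh cycle

def vsynced_fps(raw_fps: float) -> float:
    """Effective framerate once basic v-sync rounds each frame's
    render time up to a whole number of refresh intervals."""
    frame_ms = 1000.0 / raw_fps
    cycles = math.ceil(frame_ms / REFRESH_MS)  # refresh cycles per frame
    return REFRESH_HZ / cycles

for raw in (75, 55, 45, 35, 25):
    print(f"{raw} fps raw -> {vsynced_fps(raw):.0f} fps with v-sync")
```

Running it reproduces the numbers in this thread: 55, 45, and 35fps all collapse to 30fps, while anything at or above 60fps is capped at 60.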

Adaptive v-sync delivers smooth, fluid performance, since it won't restrict your framerate the way basic v-sync does. With adaptive v-sync, when you drop below 60fps, v-sync simply switches itself off, so you get 55fps instead of 30fps. As soon as you reach 60fps again, adaptive v-sync switches itself back on. It sounds like a very simple solution, but it must be hard to build into the drivers, because AMD still hasn't done it (and it took Nvidia years as well).
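The toggle described above boils down to one per-frame decision. Here's a minimal sketch of that logic (again, a conceptual model, not NVIDIA's actual driver code; `present_mode` is a hypothetical name):

```python
# Sketch: the per-frame decision adaptive v-sync makes on a 60Hz display.
# At or above the refresh rate, sync to the display to avoid tearing;
# below it, skip the sync so a frame isn't held for an extra refresh cycle.

REFRESH_HZ = 60

def present_mode(current_fps: float) -> str:
    """Choose whether to wait for the vertical blank before presenting."""
    if current_fps >= REFRESH_HZ:
        return "vsync-on"   # cap at 60fps, no tearing
    return "vsync-off"      # deliver e.g. 55fps instead of dropping to 30

print(present_mode(70))  # vsync-on
print(present_mode(55))  # vsync-off
```

The trade-off is that below 60fps you can get some tearing again, but you keep the card's real framerate instead of the halved one.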

You can read about it in detail at http://hardocp.com/article/2012/04/16/nvidia_adaptive_vsync_technology_review

"With Adaptive VSync turned on, the feeling of the game felt smoother compared to regular VSync turned on. The performance felt much like the game felt with VSync turned off. This is the kind of technology we like to see which has improved the innate nature of the gameplay experience. If all you need is 60 FPS in a game for it to be playable, then why not just go ahead and cap the game there so it doesn't exceed your refresh rate. Then, if the game has to fall below that, allow the game to perform at its real-time actual framerate, and Adaptive VSync allows that. It really is the best of all worlds, with no drawbacks. We didn't find any negatives to using Adaptive VSync, and we tried it out in a handful of games."
 


How many of the existing PhysX games did we know would use PhysX well before their release? You may be right; in fact, there may never be a single new game from now on that uses PhysX. But there might be a few, or there might be a whole load. I can't see the future, though, and neither can you.

For me, it's really important that I'm playing the game the way the developer intended it to be played (isn't that why it's important to us to play games at max settings?). If the developer implemented PhysX for the physics modelling (and there's no option to get the same effects via Havok or an in-house implementation), then PhysX is the way they intended it to be seen.

I played Borderlands 2 and the existing DLCs on a Radeon, but I'll be waiting until I get a GeForce to play the remaining DLCs, so I can play them the way Gearbox intended and see them at their best.