Best Graphics Cards For The Money: January 2012 (Archive)


Yeah, but then the 660 Ti is kinda redundant with the faster and cheaper 760 in the picture. The real competition is between the 7950 and the 760. The 760 has the performance edge at stock; the 7950 may more or less catch up to it overclocked. But at the end of the day it comes down to pricing. The 760 gave AMD a shiner with its low launch price. Now AMD has turned the tables by making the 7950 $50 cheaper (plus the game bundle).

At least for me, the difference between $200 and $250 is bigger than the difference in performance, features, power consumption, etc. between the 7950 and the 760.
 
Looks like in the performance-per-dollar part Don forgot to add the performance-per-dollar chart.
It is here:
[attached chart]

Which clearly makes the 7790 the value winner.

So after the 7790 we start to lose value on our purchase, but the question stands: are the performance gains worth it? From a statistical standpoint, yes, but it's up to you what quality (resolution) you want to game at.
The chart shows performance gained versus value lost, represented as the blue bars.
It clearly shows that the 7970 is not the point where diminishing returns kick in... it's actually the sinkhole point, but that's just statistics.
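For anyone who wants to sanity-check the idea at home, performance per dollar is easy to compute from price/FPS pairs. A minimal sketch; the prices and FPS figures below are made-up placeholders, not the chart's actual data:

```python
# Sketch: rank cards by FPS per dollar to find the value winner.
# All price/FPS numbers here are hypothetical placeholders.
cards = [
    ("7750", 100, 30.0),
    ("7790", 130, 45.0),
    ("7850", 170, 52.0),
    ("7950", 240, 68.0),
    ("7970", 380, 78.0),
]

for name, price, fps in cards:
    print(f"{name}: {fps / price:.3f} FPS per dollar")

best = max(cards, key=lambda c: c[2] / c[1])
print(f"Value winner: {best[0]}")
```

With numbers shaped like these, the ratio peaks in the lower mid-range and falls off steeply at the top, which is exactly the diminishing-returns (and eventual sinkhole) shape described above.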

 




i got my 660Ti in December 2012.
right now i would definitely pick the 760.
the problem with electronics is that something newer, better, and cheaper comes along almost as soon as you buy.
still pretty happy with the 660Ti.
mainly got it for Folding@home
http://folding.stanford.edu/
a distributed computing project to help medical research into cancer, Alzheimer's, etc.
it's also one of the hardest stress testers there is.

my six-month upgrade plan is to get a 2700K or 4770K and a second 660Ti for SLI.
that should hold me for quite a while.
 
We all need to be completely honest here and admit that the 7950 is a faster card than the 760. I'm not saying the 760 is slow or talking it down, but the 7950 is faster, case closed.
 
so multiple AnandTech and Tom's Hardware benches, the majority showing the 760 beating the 7950, are wrong?

I love blanket statements with no links to any kind of proof.

you'll notice that, as a long-time THG member, when I make a statement I post a link to a review or bench backing up my facts.

that is how it is properly done here on THG.
 


I've got some benchmarks to back up the claim that an OC 7950 beats an OC GTX 760. At stock speeds they are pretty similar in performance and pretty much trade blows in certain games, and even then by only a few FPS.

Overclocked benchmarks: http://www.youtube.com/watch?v=dmGWyAyO9mc
 


and did you bother to look at the spreadsheet and see how pathetic the 760 "overclock" was?
100 more on the core clock, to 1080, when most any 760 will hit 1140-1200 (not including boost), against a 7950 at 1100 on the core, close to its max of 1150 unless you pump in 1.25+ volts for ~1200.
sorry, Linus failed at that OC attempt; he throttled it like a choked chicken.
 


Oh okay, then here is a link to a different benchmark with the GTX 760 at a 1280MHz core clock vs. the 7950 at 1150MHz. As you can see, the 7950 still beats the GTX 760 by a bit.

Link: http://www.hardocp.com/article/2013/07/02/msi_n760_tf_2gd5oc_gtx_760_overclocking_review/5
 

I think the sensible thing here is to ignore the flamebait.
 


much better! and i am familiar with that article; it is a good one imo.
"The MSI N760 OC, at stock settings, technically beat out the Radeon HD 7950 Boost in raw performance. The gameplay experience was the same. However, when we overclocked both, the Radeon HD 7950 turned around and provided the technically higher raw performance. However, the gameplay experience was still the same."
sounds to me like they are saying that, stock or OC, both are equal when sitting in front of a game. but i'll add that Tomb Raider and Hitman: Absolution would be 7950 games.

 


this is just fundamentally wrong. an overclocked Tahiti LE catches up to a stock 7950 and even surpasses it a little, but it does not compare to an equally overclocked 7950. it just has less processing power:

1536 stream processors compared to 1792,
and a 256-bit memory interface compared to 384-bit.

sure, these cards compete at 1080p, but not above that.

 
@Cleeve: A couple of months ago we spoke about the Intel HD 4000 being on par with a Radeon 6400M-series card on the hierarchy. I completely forgot about testing it, but did so last weekend.

Hardware:
Common spec:
8GB DDR3
128GB SSD

HP ProBook 6560b
i5-2410M (2.3GHz) & Radeon HD 6470M

HP ProBook 6570b
i5-3210M (2.5GHz) & Intel HD 4000

Software: War Thunder open beta. Benchmark Pacific @ 1024 with low preset and no sound.

Results:
6560b with Radeon: Avg. FPS 41.1, Min. FPS 23.5, Score 4086
6570b with Intel: Avg. FPS 41.3, Min. FPS 26.9, Score 4106

The Intel setup likely scores a bit higher due to the 200MHz higher CPU clock.
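Putting the two quoted scores side by side shows just how small the gap is; a quick calculation using the numbers above:

```python
# Relative difference between the two War Thunder scores quoted above.
radeon_score = 4086  # 6560b with Radeon HD 6470M
intel_score = 4106   # 6570b with Intel HD 4000

diff_pct = (intel_score - radeon_score) / radeon_score * 100
print(f"Intel HD 4000 leads by {diff_pct:.2f}%")  # about half a percent
```

A lead of well under one percent is within run-to-run noise, which supports the "really is on par" conclusion.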

Doing a test with FurMark, I noticed no change in performance with or without Prime95 running in the background.

In short, the hierarchy speaks the truth. It really is on par.

That being said, we've got issues with commercial software and the Intel drivers. Stuff like Planit Edgecam and Autodesk Inventor doesn't work properly. Seems they've hired ATI driver programmers at Intel now.
 

I want to add that AMD cards have more settings to enhance video quality, though they use more power during video playback to do it. http://www.tomshardware.com/reviews/hqv-2-radeon-geforce,2844-10.html
 


We will just ignore that, since drivers addressed that issue AGES ago.
 


I guess AMD hasn't found a way to provide the highest video quality without making their cards more power hungry yet. You can see here http://www.techpowerup.com/reviews/MSI/GTX_760_HAWK/24.html that AMD cards use more power during Blu-ray playback than NVidia's cards, and the same thing happens with 720p YouTube videos. The reason is that AMD cards run 720p/1080p video at a 450MHz GPU clock rather than the 300MHz idle clock (it depends on which GPU you're using), whereas NVidia's cards play back video at the idle GPU clock.
For example, both the idle and Blu-ray-playback clocks of this GTX 760 are 135MHz http://www.techpowerup.com/reviews/MSI/GTX_760_HAWK/30.html, whereas the 7870 idles at 300MHz and plays Blu-ray at 450MHz http://www.techpowerup.com/reviews/Powercolor/HD_7870_Devil/30.html
Nobody cares about 10-20W more power consumption, but the truth has to be said.
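To put that 10-20W playback delta in perspective, here is a rough cost estimate. The viewing hours and electricity price are my own assumptions, not figures from the thread:

```python
# Rough annual cost of ~15 W extra draw during video playback.
# Assumptions (not from the thread): 2 hours of playback per day,
# $0.12 per kWh for electricity.
extra_watts = 15
hours_per_day = 2
price_per_kwh = 0.12

kwh_per_year = extra_watts * hours_per_day * 365 / 1000
cost = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year, about ${cost:.2f}")
```

Under these assumptions it works out to roughly a dollar a year, which is why the extra draw is easy to shrug off even if it is real.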
 
I do not understand how the latest Intel integrated graphics (4400, 4600, 5000, 5100, 5200) are not included on the chart, but the newer 7730 from AMD is. Could you please include those iGPUs on the hierarchy chart in the next release? It would be good, since many folks have laptops and use those chips.
 

Kinda difficult to rate laptop GPUs, even more so laptop iGPUs. Besides, this is perhaps not the most obvious article for someone with a laptop to check out, since it's about desktop graphics cards only.
 
It's funny how GPUs are not dropping in price once a new generation is out (at least not as much as they used to 5 years ago).
I'm assuming this is due to the fact that I can still actually play most games with an old 560 Ti, or even an ATI 4850?

I remember when we had the jumps from the NVidia 9800 GT to the GTX 260 and 280. Now that was something.

The reason I point this out is because when I first got my 4850 back in the day, it cost 149 Euros. Then I got a GTX 560 Ti for 196 Euros (roughly the same relative performance for its time, but 50 Euros more), and now a 660 would be 220 Euros and a 760 260 Euros.

Seems like there are no price drops, really.
 

Dude... the 7970 launched at $550. Now you can get it for $300, minus rebates, and with a 3-game bundle thrown in for good measure. If anything has changed, it's that the price drop seems to happen in the middle of the product life cycle instead of closer to the end.
 