AMD Radeon HD 7970 GHz Edition Review: Give Me Back That Crown!

At $0.61 per frame, far from impressed:

Using the common games in Guru3D's 2012 test suite: Hard Reset, COD: MW2, Far Cry 2, Metro 2033, ANNO 2070 (no ANNO 1404 test on the 7970 GHz), BFBC2, BF3, Crysis 2, AvP, and Lost Planet 2. Total fps (summing the fps from each game @ 1920 x 1200) for each single card is tabulated below, along with its cost in dollars per frame (Guru3D lists SLI/CF numbers as well):


Card - Cost - fps - $/Frame

GTX 680 - $500 - 870 - $0.57
680 DCII Cu TOP - $520 - 954 - $0.55
GTX 670 - $400 - 807 - $0.50
670 DCII Cu TOP - $430 - 879 - $0.49
7970 - $480 - 750 - $0.64
7970 DCII Cu - $580 - 795 - $0.73
7970 GHz - $500 - 818 - $0.61

Published data. Wanna argue with it? Go to Guru3D.com.
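
If anyone wants to check the division, here's a minimal Python sketch of the same arithmetic (the prices and fps totals are just the table values above):

[code]
# Cost per frame = card price / total fps summed across the Guru3D suite.
cards = {
    "GTX 680":         (500, 870),
    "680 DCII Cu TOP": (520, 954),
    "GTX 670":         (400, 807),
    "670 DCII Cu TOP": (430, 879),
    "7970":            (480, 750),
    "7970 DCII Cu":    (580, 795),
    "7970 GHz":        (500, 818),
}

for card, (price, fps) in cards.items():
    print(f"{card}: ${price / fps:.2f}/frame")
[/code]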
 
Realize that this was probably a planned release. I'm an AMD/ATI fanboy, but truth be told, AMD knew Nvidia was coming with a card that could beat theirs - and it did. AMD foresaw this and intentionally crippled their card to give Nvidia something to beat, so that AMD could come back, "give an update," and turn the card around into the new crown holder. Good marketing scheme, really. I don't want to see this all the time; tbh, it really shows that AMD is scared of what Nvidia can offer - otherwise they would have come out with this at release and had full faith Nvidia wouldn't beat them. But hey, that's my 2c. What do you think?
 
WTF is this sh1t? The page flips up at the bottom-right corner and the whole thing blurs out every time I visit TH.

Tom's Hardware, stop hijacking my phone.
 

:pfff:
 
Kudos to AMD on this card. It looks like we have the usual AMD vs. Nvidia dogfight these days, and that's the way it should be. Nothing beats good competition. Prices on the AMD cards have already been dropping since the release of the 670, and when the 660 comes out at the end of July / early August, prices should drop even more. All the better for the consumer.
 

OK,
because GTX 560 Tis in SLI battle a GTX 680, not a GTX 580.
The GTX 580 is 2 x GTX 460s in SLI..
 

no...
Two down-clocked GTX 580s make a GTX 590, or it might be 2 x GTX 570s that make a 590.
One or the other..
 

I think it's two down-clocked 580s. I can't remember clearly about the 590, but the 690 is definitely two down-clocked 680s.
 
That's what I thought, thanks for the verification.
I'm more or less disappointed in those who think SLI 460s could equal a GTX 590,
and in how someone else thinks SLI 560 Tis = a GTX 580...
Now that's just absurd...

(No offense to anyone, unless you're offended.. 😛)
I have a GTX 580 and SLI GTX 560 Tis..
 


Well, for some reason I couldn't find your post, but it was the first time you introduced the benchmark to us. If I'm not wrong, you said it was max settings + 1920 x 1080 + FXAA (high, I guess), which are the same settings I used. Regards
 


That clearly shows the GTX 670 is the best bang for the buck, but there's also a huge difference between that and "the best money can buy," which the 7970 and the 680 win.
 
GTX 560 SLI = GTX 580..
That's about right; maybe a little better in gaming at 1080p and below.

The GTX 560 is basically a GTX 460 (1GB, 256-bit) with a faster clock speed and slightly lower power consumption; that's it.
Same number of CUDA cores (336).

It's the 1GB of VRAM that limits SLI 460s/560s at the higher resolutions (see the rough math after this post).
But you already knew that.

edit:
Yes, the GTX 680 is a different level of gaming...
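
Some back-of-the-envelope Python on that VRAM point. The byte sizes are my assumptions (32-bit color plus 32-bit depth/stencil render targets), and this ignores textures, shadow maps, and geometry, which eat far more:

[code]
def render_target_mb(width, height, msaa=1):
    # color (RGBA8, 4 bytes) + depth/stencil (D24S8, 4 bytes) per sample
    return width * height * msaa * (4 + 4) / 1024**2

for w, h in [(1920, 1080), (2560, 1600)]:
    print(f"{w}x{h} @ 4x MSAA: ~{render_target_mb(w, h, msaa=4):.0f} MB in render targets alone")
[/code]

Roughly 63 MB at 1080p vs. 125 MB at 2560 x 1600 before a single texture is loaded, so a 1GB card runs out of headroom much faster at the higher resolution.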
 
When did I say they suck?

Or is it your tiny mind playing tricks?
Or are you so illiterate that you cannot read previous posts?




"So yes, AMD suck yes?

No, not at all, Bulldozer sucks.

End of, Phenom II is still rocking in my Dragon rig and i love it, AMD was my old favourite until we hit the floor hard with BD.

No point saying modding is going to help, when all it does is make it a lesser CPU."

I wouldn't go as far as to say Bulldozer sucks so much as today's software does. nVidia chips suck eggs on Linux; are you going to blame the card, or nVidia's lackluster proprietary driver support? Likewise, thread scheduling in Windows 7 hinders Bulldozer's performance by 10-20%. Addressing this with a fix doesn't make it a lesser CPU; after all, the end result is an INCREASE in performance. When does an increase in performance = a lesser CPU?

If you're a gamer, you can make this CPU really perform by disabling one core per module, since gaming IS NOT a heavily threaded task, and then cranking the clock way past SB/IB. You can address the issue without disabling cores by manually setting process affinity (see the sketch below), but then you probably won't be able to push it to 5GHz. Regardless, Bulldozer performs well in heavily threaded workloads, even with improper thread scheduling (though there is room for improvement with a proper patch that exploits the resource sharing). It's a good chip; you just have to know how to work it. And with that said, any REAL enthusiast would probably pick Bulldozer over a locked SB/IB chip any day, especially given the price. At $170, I see a locked 3.3GHz dual-core i3, a locked 2.9GHz quad-core i5, or a quad-core Bulldozer chip at 5GHz on a less expensive platform.
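
For anyone wanting to try the affinity route, here's a minimal Python sketch using the psutil package. The module pairing is for an FX-8150, and "game.exe" is a hypothetical process name; check your own chip's layout:

[code]
import psutil

# On an FX-8150, logical CPUs pair up as modules: (0,1), (2,3), (4,5), (6,7).
# Pinning a lightly threaded game to one core per module dodges the shared
# front-end contention without disabling cores in the BIOS.
ONE_CORE_PER_MODULE = [0, 2, 4, 6]

for proc in psutil.process_iter():
    try:
        if proc.name() == "game.exe":  # hypothetical game executable
            proc.cpu_affinity(ONE_CORE_PER_MODULE)
            print(f"Pinned PID {proc.pid} to cores {ONE_CORE_PER_MODULE}")
    except psutil.NoSuchProcess:
        pass
[/code]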

I agree that *STOCK* performance on a Bulldozer chip leaves much to be desired if you aren't running a heavily threaded workload, but what enthusiast uses stock settings? Half the fun is milking every last bit of performance without breaking the bank. Furthermore, with Windows 8 supposedly integrating a proper scheduler patch and Piledriver phasing out Bulldozer in the near future, I see a real bargain coming up.
 
Thanks for the fish, but this article has been well and truly trolled. The author should have controlled it rather than letting the blabberings of the Blue and Green FTW crowd, who do their rounds trolling threads with misguided innuendo, stand.


Isn't it also TH's responsibility to ensure the quality of information, rather than letting lunacy prevail and create the myopic view that AMD doesn't release good products, when clearly this is not true? Bias and innuendo need to be stamped out.
 
[citation][nom]esrever[/nom]50 mhz boosts are kinda low imo[/citation]

This is what 50 MHz and the awesome driver team did, 6 months late, for epic failure:

" Perhaps the most ironic data points come from the GeForce GTX 680 and 670, though. The same AMD-supplied, AMD-optimized build of MediaConverter also supports CUDA, *snip!* And the 670
and 680 STOMP THE 7970 INTO THE DIRT.

Are amd employees committing suicide like foxconn's ?
 


Want to compare compute logic?

[image: LuxMark OpenCL benchmark results]


Yeah, talk about getting stomped... The 7950 kills the GTX 690.

Seriously, dude, take your trolling elsewhere. Both GPUs have strengths and weaknesses, and there isn't a clear winner; it depends on the user and their application(s).
 
[citation][nom]jerm1027[/nom]Want to compare compute logic? Yeah, talk about getting stomped... The 7950 kills the GTX 690. Seriously dude, take your trolling elsewhere. Both GPU's have strengths and weaknesses and there isn't a clear winner; it depends on the user and their application(s).[/citation]

Yes, let's compare it. nVidia WINS BIG, because AMD support and drivers SUCK there as well. The example I highlighted should be a LESSON in that for you... somehow you're totally oblivious...


How lost are you? Need some anti-depression meds too? Wait, you don't; you're still living in self-deceived blissful-idiot mode...
 


Umm... links or it didn't happen. :non: Media conversion IS NOT compute. And a 3rd-party (<-- meaning ArcSoft, not AMD) CUDA application (<-- meaning nVidia-optimized) beating an AMD GPU is hardly compelling.

Talking about oblivious, you clearly missed all of page 14. The 680 failed to chalk up a single win against the 7950, much less the 7970, across an entire suite of applications.

I guess this is why they say don't feed the trolls... :pfff:
 
[citation][nom]jerm1027[/nom]Umm... links or it didn't happen. Media conversion IS NOT compute. And a 3rd party (<--meaning Arcsoft, not AMD) CUDA application (<-- meaning nVidia optimised) beating an AMD GPU is hardly compelling. talking about oblivious, you clearly missed all of page 14. The 680 failed to chalk up a single W against the 7950, much less the 7970, in an entire suite of applications.I guess this is why they say don't feed the trolls...[/citation]


LuxMark is a BENCHMARK, not an application.

ArcSoft MediaConverter is a PROGRAM PEOPLE USE.

Get a clue; you WON'T be "computing" with your 7970.

However, nVidia has MASSIVE application support, while AMD has WinZip.

You are CLUELESS.
 
How clueless is the AMD compute fanboy?

"Adobe Photoshop CS5 automatically detects NVIDIA® GeForce® or NVIDIA® Quadro® GPUs to enable these accelerated features."

That's how clueless they are.
 