AMD Radeon HD 7970: Promising Performance, Paper-Launched


mergatroid

Distinguished
May 2, 2008
First, great review. Very informative.

This looks like an awesome piece of hardware. After they improve the drivers and their partners tweak their software with this card in mind, I think its performance will be great. I'm using two HD 6970s in CrossFire right now, and they perform very well. Two HD 7970 cards would be awesome, but they'll have to get the price under $400 before I'd even think about it.

Again, thanks to Tom's for another great video card review. Hope you do it again once the drivers have been optimized.
 

fulle

Distinguished
May 31, 2008
A 70% higher transistor count than at 40nm, but we're seeing only about a 30% performance gain? I'm pretty underwhelmed, to be honest. Overclocking headroom is high, and these are open to custom cooling solutions from the start, from what I've heard, so with that added to the picture it's less of a disappointment, I guess, but the price sucks.
Yeah, I said the price sucks. This is a chip that costs about the same to make as a 6970, but the price went up from $350 to $550? What the hell?
Reasonably, a $450 launch price would have been appropriate; the $550 tag is gouging. It will probably be reflexively cut to $450 when Nvidia releases appropriate competition. I find it pretty annoying, actually.

The table is set for Nvidia to win me back, if this is the best AMD can do at 28nm.
 

kaitheus

Distinguished
Sep 4, 2009
I hate how they do a review before the cards are officially released, without the *proper* drivers, without even a PCI-E 3.0 board to use during tests, and then expect us to believe they're not worth getting and that we should look at *Kepler* as an alternative. I'm sorry, but these cards are a huge step up from all current cards in every way, and now that they've finally decided to use their memory at 384-bit, there's going to be some great competition when Nvidia releases their new set.
So until we see these tested on a PCI-E 3.0 board, I'm not buying the idea that they aren't worth it. However, they're a bit pricey, so now it's a waiting game until the price drops a bit :p lol.
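
On the 384-bit point, the payoff is memory bandwidth. Here's a rough back-of-the-envelope sketch in Python (assuming the commonly quoted 5.5 Gbps effective GDDR5 rate for the reference 7970; the helper name is just illustrative):

def gddr5_bandwidth_gb_per_s(bus_width_bits, effective_gbps_per_pin):
    # bandwidth in GB/s = (bus width in bytes) * effective transfer rate per pin
    return bus_width_bits / 8 * effective_gbps_per_pin

print(gddr5_bandwidth_gb_per_s(384, 5.5))  # 264.0 GB/s, the 7970's rated figure
print(gddr5_bandwidth_gb_per_s(256, 5.5))  # 176.0 GB/s with the same memory on a 256-bit bus

Half again as much bandwidth at the same memory clock is a big part of why the wider bus matters at high resolutions.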
 
[citation][nom]fulle[/nom]70% higher transistor count than at 40nm, but we're seeing about a 30% performance gain? I'm pretty underwhelmed tbph. Overclocking headroom is high, and these are open to custom cooling solutions from the start, from what I've heard, so with that added to the picture it's less of a disappointment, I guess, but the price sucks.Yeah, I said the price sucks. This is a chip that costs about the same to make as a 6970, but the price went up from 350 dollars to 550? What the hell? Reasonably, at launch, a price of 450 dollars would have been appropriate, but the 550 tag is gouging. Probably will be reflex reduced to 450 when Nvidia releases appropriate competition. I find it pretty annoying actually.The table is set for Nvidia to win me back, if this is the best AMD can do at 28nm.[/citation]

Well, as it stands, it pretty much beats the 580, so why wouldn't they price it above a 580? That makes sense for now, until it's out in the wild and they see what consumers are willing to pay, then adjust from there.

[citation][nom]Kaitheus[/nom]I hate how they do a review before the cards are officially released, don't have the *Proper* drivers don't even have a PCI-E 3.0 board to use during tests and then expect us to believe there not worth getting and that we should look at *Kepler* as an Alternative. I'm srry but these cards are a Huge stepup from all current cards in every way, and now that they decided to finally start using there memory at 384-bit there's ganna be some great competition when Nvidia releases there new set.So until we see these tested on a PCI-E 3.0 board I'm not think these aren't worth it, how ever there a bit pricy so now its a waiting game till they drop a bit lol.[/citation]

All your issues were addressed in the *preview*. They called it a preview. They never once pretended it was anything more, stated and re-stated that throughout the article, and promised a real review in the near future. They also speculated that there's no gaming advantage to PCIe 3.0 over PCIe 2.0, even at 8x. They hypothesized there may be some advantage in compute workloads when multiple cards are talking to each other directly.
 

maxinexus

Distinguished
Jan 1, 2007
It kicks the 580 right in the groin. From every angle it is superior. With 3GB of memory it is actually cheaper than the 3GB versions of the 580, and that is sweet. I can't wait to see 7950 previews. Hopefully it will match or be slightly better than the 580. Good job, AMD/ATI.
 

redeye

Distinguished
Apr 29, 2005
Are 2D tests the only thing the 7970 does poorly? 2D? And the space-heater GTX 580 is better?
So, judging from the data points provided on 2D performance, I should get a GTX 580 to run my PowerPoint/spreadsheet/word-processing tasks? Of course not.

Thank you for including the obscure 2D GDI tests that have never appeared in any other review of yours (if they have, please correct me lol...). If you had included them before, you would have had data from far cheaper cards that would have blown the GTX 580 out of the water, or matched it, in 2D tests...
Meaning the conclusion of that section of the article would have been: who cares about 2D?
Furthermore, why would I buy a GTX 580 for fast 2D performance? Matrox video cards are still around, and they are most likely better than the 580 at 2D (anything other than gaming).

Perhaps it's your contract with Nvidia (or subconsciously agreed-to contract...) that you must find at least one area where an Nvidia card blows AMD out of the water. I will be awaiting your 'roses and chocolate' review of the new flagship card from Nvidia... (ironically, you probably won't be reviewing it, so that subconsciously agreed-to contract stays intact...)



 

shin0bi272

Distinguished
Nov 20, 2007
[citation][nom]redeye[/nom]Are 2D tests the only thing that the 7970 does poorly?... 2D?, and the space heater gtx580 is better?so judging from the data points provided regarding 2D performance, i should get a gtx580 to run my powerpoint/spreadsheet/wordprocessing tasks? of course not. thank-you for including the obscure 2d gdi tests that have never appeared in any other review of yours ( if so please correct me lol...) well if you had included them, you would have had data from other far cheaper cards that would have blown the gtx580 out of the water, or matched it) in 2d tests... meaning the conclusion would have been, in the section of the article, that who cares about 2D... further more why would i buy a gtx580 for fast 2D performance?, MATROX video cards are still around and they are most likely better than the 580 in 2D (anything other than gaming).Perhaps, your contract with Nvidia (or subconsciously agreed to contract...) that you must find at least one advantage that a nvidia card blows AMD out of the water. i will be awaiting your 'roses and chocolate' review of the New flagship card from Nvidia... (ironically, you probably won't be reviewing it so that subconsciously agreed to contract stays intact...)[/citation]


The 2D stuff is on their benchmark charts for every card they have tested. But take heart, young padawan: this card still beat the 580, which is Nvidia's current top single-GPU card. So it's not all bad.
 

hannibal

Distinguished
GPU-accelerated 2D would save some electricity and offer smoother scrolling. Not so important, but something that has to be watched. With the 7970, the beta-stage drivers don't cover every aspect yet... I'm not astonished, and nor should you be. It is just a matter of time until those things are also taken care of.
This is a fine card! This does not change that!
 

alidan

Splendid
Aug 5, 2009
This will be on page 9 or 10... so let's mention this.

GPU compute will be working soon. I mean, seriously, look at their APU chips: they can compete with a mid-range card to some extent, and kick the crap out of Intel's offerings.

Now imagine that GPU working on more than just graphics; hopefully you see what I see.
 

lott11

Distinguished
Oct 23, 2009
Oh boy, has someone gotten bashed for playing down ATI.
It is just like when I bashed the reviewer of a $500 Intel rig.
Yes, we all know that if anyone spends $7,000 he is going to get the best score.
That is why most people complain: yes, if you spend $1,000 on a CPU, it is obvious the machine will be faster, and likewise, if you spend less you get less.
But the point is not what you spend, it is what you get for it at that price point!
We read Tom's reviews to get an impartial point of view, or so we think.
That is the reason everyone comes back to read these articles every week, or every day.
Or am I wrong?
We all come to get the most for the least, or at least the best information possible from an impartial report.
And for the most part we do get that, but sometimes it just looks too obvious that we are not.
We all have favorite components; as an engineer I am more partial to some of AMD's architecture for certain applications.
Also to some of Samsung's, Mitsubishi's, and IBM's, among others, but not for desktops.
The point is to be honest and impartial, like a reporter is supposed to be.
We get enough BS from politicians and news edited for prime time by other corporate outlets.
"Yes, it is not ready for prime time, but it is a good video card at this price point, even unfinished."
Just wait for the real review with full benchmarks.
That is what we are expecting: some benchmarks and a clear, unpretentious report.
 

JerryC

Distinguished
Nov 20, 2007
What many of you are forgetting is that Nvidia will be releasing their new gtx 600 cards very soon. A smart person would wait to see how those new cards perform before taking that plunge. Yes, waiting sucks, but how are you going to feel if the new Nvidia cards perform 25% better than these new ATI cards do?
 
[citation][nom]jerryc[/nom]What many of you are forgetting is that Nvidia will be releasing their new gtx 600 cards very soon. A smart person would wait to see how those new cards perform before taking that plunge. Yes, waiting sucks, but how are you going to feel if the new Nvidia cards perform 25% better than these new ATI cards do?[/citation]

Nitpicking is annoying, but come on... ATI is dead; the Radeon brand is now just AMD. Yes, AMD owned ATI Radeons for years, but AMD isn't calling any new Radeon cards ATI anymore.

Besides that, yeah, we really should wait until the next Nvidia series comes out before making any decisions. Honestly, I don't think Nvidia will blow AMD away, but they might be better. Regardless, AMD has come a long way, much farther than I expected.
 

masterofevil22

Distinguished
May 13, 2010
I'm not waiting another six months for Nvidia's cards to come out. Nvidia always brings out their new designs with crazy-bad thermal envelopes and energy consumption, and although they "may" come out faster, it seems to be Nvidia's mentality that speed is all that matters, at the expense of everything else. Not to mention that by the time their cards come out, the Radeon 7000 series will have a very mature and stable driver and feature set, while the new Nvidia cards will be just that: new. They'll have the same growing pains as any new architecture, and I'm not waiting a year for a (possibly) somewhat-faster power guzzler with decent drivers... but that's just me.

If Nvidia drops the price of the GTX 580 to an appropriate level, i.e. ~$300, and the price of the 6970 comes down to ~$225 or so, and so on and so forth... I'll be pretty happy with the GPU landscape for the time being.
 

Nvidia can get away with power-guzzling, inefficient, but high-performing cards for a few reasons.
One of them is Nvidia's very good relationship with game developers. They sponsor games (using a very loose definition of sponsor) and let devs use Nvidia in the ads, e.g. the TWIMTBP titles. More games use Nvidia's graphics tech (FXAA, PhysX, CUDA) too. Secondly, people who call themselves gamers do not really care about power efficiency; I've read many people practically berating lower power consumption in favor of higher fps.
AMD, on the other hand, has a really bad history with driver releases. Once a new game launches, AMD almost makes it their job to release new drivers that mess with the game, and it takes them quite a few releases to get the drivers right. AMD doesn't let users edit CrossFire profiles. They have very little contact with game devs (from what I've seen), and fewer games support AMD's tech (AMD's 3D, MLAA, etc.).
 

masterofevil22

Distinguished
May 13, 2010
I admit that Nvidia is more aggressive about getting game/software devs to use their tech, but both companies have no choice other than to constantly update drivers to keep up with what software/game devs are doing, i.e. putting out new games, patches, etc., and neither company gets it perfect the first time around. I'm not an ATI fanboy by any means; I've had my fair share of Nvidia cards, and 3dfx Voodoo cards before that, believe me. However, now that the economy is what it is and I work hard for my money, I like having my 80 Plus Gold PSU and I like getting more horsepower for fewer watts. To say that no "true" gamer cares about efficiency is false. I know my electric bill cares.
 

shin0bi272

Distinguished
Nov 20, 2007
[citation][nom]shin0bi272[/nom]I cant wait to see how this stacks up against nvidia's 780 (yes they are skipping 600s for the most part it seems). . . "According to the released info, Nvidia’s Next Gen flagship GK-100/GK-112 chip which will feature a total f 1024 Shaders (Cuda Cores), 128 texture units (TMUs), 64 ROP’s and a 512-bit GDDR5 Memory interface. The 28nm Next Gen beast would outperform the current Dual chip Geforce GTX590 GPU."http://wccftech.com/nvidia-kepler- [...] h-q2-2012/Also if you look at the specs for the 590 its just about identical to what the 780 will have.[/citation]


Since Tom's doesn't seem to want to post any news on Kepler... *cough*AMDbias*cough* I'll post this news too:

http://flyingsuicide.net/news/nvidia-kepler-to-run-synchronous-geometry-and-shader-clocks/
 
I take back what I said about the 7k series, that it was merely a rehash of the 6k series. I was totally wrong.

The raw performance of this card in most 1080p titles has given me a bit to think about for the card in my (hopefully upcoming) build. Eyefinity is very interesting, since I'd like a three-display setup, which would require SLI if I went with my preferred brand, Nvidia.
 
[citation][nom]eddieroolz[/nom]I take back what I said about the 7k series, that it was merely a rehash of the 6k series. I was totally wrong.The raw performance of this card in most 1080p titles have given me a bit to think about for my card in my (hopefully upcoming) build. Eyefinity is very interesting, since I'd like a three-display setup which would require SLI if I went with my preferred brand Nvidia.[/citation]

Three-display gaming is still very intensive... At max settings (or close to max), even a single 7970 often has problems keeping decent frame rates. I think a 7990 would have enough horsepower to get excellent frame rates at 5760x1080 without sacrificing other settings.
 

beltzy

Distinguished
Jan 25, 2010
Dear Tom's,

as a happy consumer of your content, I have some feedback and a request. I appreciated you being frank about the rushed paper launch; no other review site called this out as (appropriately) harshly as TH. I was also glad to see you include 5760x1080 in your performance benchmarks. After all, this is becoming a more common setup for enthusiasts, it is interesting, and it represents the most demanding resolution in terms of the number of pixels to push. My excitement over this inclusion was dampened a little when I realized you did not run benchmarks at 2560x1600. As a user of this resolution, I consider it an important performance indicator, not only because we can use it to compare against previous reviews but because many enthusiasts use it (it's sort of the display equivalent of the fastest single-GPU card :p ). If you do a follow-up review, I would ask that you include this setting! Oh, and a CrossFire set of benchies would be great if you can finagle another card. Keep up the great work; you guys are awesome.
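
For what it's worth, a quick pixel-count check (a minimal Python sketch, just multiplying out the resolutions mentioned above) shows how the two settings compare:

# Raw pixel counts for the resolutions discussed in this thread
resolutions = {
    "5760x1080 (triple 1080p Eyefinity)": 5760 * 1080,
    "2560x1600 (30-inch panel)": 2560 * 1600,
    "1920x1080 (single 1080p)": 1920 * 1080,
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")
# 5760x1080 pushes roughly 1.5x the pixels of 2560x1600, and 3x a single 1080p screen.

So 2560x1600 sits usefully between single-screen 1080p and a triple-monitor setup, which is exactly why it makes a good benchmark point.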
 
[citation][nom]Anonymous[/nom]"Decibels are a unit on a logarithmic, not linear scale. A 3db increase is a doubling in intensity. Why don't you abstain from b***s******* if you don't have any clue what you're talking about?"6db is a linear doubling, not 3db, but all the same, it's measured that way for a reason, because the human ear does not perceive loudness linearly.[/citation]

You have some truth in what you say. HOWEVER...
If you wish to use dB where it is mostly used, in audio: to obtain a 3 dB increase in sound, you have to double the power of the amplifier, and most ears perceive that as only slightly louder. To seem twice as loud, you have to increase that power by about 10x.
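
A quick way to sanity-check that math, sketched in Python (the "10 dB sounds about twice as loud" figure is the usual psychoacoustic rule of thumb, not an exact law, and the helper name here is just illustrative):

import math

def db_change(power_ratio):
    # Decibel difference for a given power ratio: dB = 10 * log10(P2 / P1)
    return 10 * math.log10(power_ratio)

print(db_change(2))    # doubling amplifier power -> ~3.01 dB
print(db_change(10))   # 10x amplifier power -> 10 dB, roughly "twice as loud" to most ears

Note that a 3 dB step doubles the physical intensity, but perceived loudness follows the rougher 10 dB rule.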
 
[citation][nom]jerryc[/nom]What many of you are forgetting is that Nvidia will be releasing their new gtx 600 cards very soon. A smart person would wait to see how those new cards perform before taking that plunge. Yes, waiting sucks, but how are you going to feel if the new Nvidia cards perform 25% better than these new ATI cards do?[/citation]

Following your advice, then, I'd never buy a new video card, CPU, or anything else. Why? Because something better is always on the horizon. So... you buy what you feel is best for you at the time you're ready to buy, or you'll still be using a C=64 and wondering why BF3 and Skyrim aren't being recognized by your slow (even for its time) floppy disk drive.
 