AMD Radeon HD 7970: Promising Performance, Paper-Launched

Page 6
Status
Not open for further replies.

julianbautista87

Distinguished
[citation][nom]airborne11b[/nom]Some rumors say Q1 for Nvidia's next generation, some say Q2. But from the benchmarks I'm seeing here, I'm not impressed. In the most important benchmarks (1920 x 1080 w/ AA) it barely outperforms the MUCH older GTX 580. I say most important because this single card isn't capable of handling anything more than 1080p w/ some AA on. So all I'm taking away from these benchmarks is that AMD has a card that plays about the same as (or a tiny bit better than) the one-YEAR-old GTX 580, and in a couple months, Nvidia is going to launch its Kepler cards, which will be better than the 7000-series ATIs. So what's the big deal? All the ATI fanboys drooling over a card that gets 3-8 FPS more than a year-old Nvidia card? LOL, doesn't take much to impress some kids I guess lol.[/citation]

Nvidiaders gonna Nvidia
 
Guest
Hi... your estimate of the 7970's Bitcoin mining performance is probably wrong. Since it has 2000+ stream processors, I would estimate approximately 470 MH/s (stock clock).
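For context, one naive way to sanity-check an estimate like this is to scale a known card's hashrate linearly by shader count and clock speed. The baseline figures below (a Radeon HD 5870 at roughly 400 MH/s with 1600 shaders at 850 MHz, and the 7970's 2048 shaders at 925 MHz) are assumptions for illustration only; GCN's per-shader efficiency differs from VLIW5's, which is why a real-world figure can land above or below this kind of back-of-the-envelope number.

```python
def scale_hashrate(base_mhs, base_shaders, base_mhz, shaders, mhz):
    """Naively scale a known hashrate by shader count and clock speed."""
    return base_mhs * (shaders * mhz) / (base_shaders * base_mhz)

# Assumed baseline: HD 5870 at ~400 MH/s (1600 SPs @ 850 MHz),
# scaled to the 7970's 2048 SPs @ 925 MHz.
est = scale_hashrate(400, 1600, 850, 2048, 925)
print(round(est))  # ~557 MH/s under these assumptions
```

Linear scaling is only a first-order guess; a new architecture can add or lose per-shader throughput on integer-heavy workloads like SHA-256.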
 

intel4eva

Distinguished



Decibels are units on a logarithmic scale, not a linear one. A 3 dB increase is a doubling in intensity. Why don't you abstain from bullshitting if you don't have any clue what you're talking about?
 

intel4eva

Distinguished
[citation][nom]Guest[/nom]Hi... your estimate of the 7970's Bitcoin mining performance is probably wrong. Since it has 2000+ stream processors, I would estimate approximately 470 MH/s (stock clock).[/citation]
I don't know what's stupider, buying an AMD card for gaming, or getting into bitcoin mining at this point.
 

Reynod

Administrator



I would not insult Don or Chris by making an anon comment here on the forums - I'm a mod and therefore part of the team ...

I can still disagree with a story though ... Don isn't going to beat me with a stick for disagreeing with him.

I can certainly understand the drivers and paper launch criticisms.

 
Guest
gto: Probably not going to happen. Did you notice how the entire internet chose to benchmark Bulldozer with Nvidia GPUs to determine that it sucks at gaming? An AMD rep stated that Bulldozer beat any Intel CPU in gaming when both were paired with a 6970, but I guess the internet has no intent of ever testing that configuration, as I still haven't seen Bulldozer benched with any AMD GPU yet.

 
Guest
Intel4eva: That was stupid, please get your facts straight.

[citation][nom]intel4eva[/nom]Decibels are units on a logarithmic scale, not a linear one. A 3 dB increase is a doubling in intensity...[/citation]

A 6 dB increase is a linear (amplitude/sound pressure) doubling, not 3 dB, which doubles power/intensity. But all the same, sound is measured that way for a reason: the human ear does not perceive loudness linearly.
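Both posters have half of the picture: a decibel change is 10·log10 of a power/intensity ratio but 20·log10 of an amplitude (sound pressure) ratio, so doubling power is about 3 dB while doubling amplitude is about 6 dB. A minimal sketch:

```python
import math

def db_power(ratio):
    """Decibel change for a power/intensity ratio (10 * log10)."""
    return 10 * math.log10(ratio)

def db_amplitude(ratio):
    """Decibel change for an amplitude (sound pressure) ratio (20 * log10)."""
    return 20 * math.log10(ratio)

print(round(db_power(2), 2))      # doubling power/intensity -> 3.01 dB
print(round(db_amplitude(2), 2))  # doubling amplitude -> 6.02 dB
```

The factor of 20 for amplitude follows from power being proportional to amplitude squared, so both conventions describe the same physical change.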
 

Marcus52

Distinguished
This is the kind of thing I was hoping to see from AMD. Kudos to them for the 28nm milestone they've reached, and for making good use of it. I'm looking forward to seeing full reviews with optimal drivers, where I expect even better performance.

I support AMD's (and Nvidia's) effort to develop compute hardware that software engineers can take advantage of now. Developers can't work with something that doesn't exist, and they are very reluctant to spend time and effort on something that might come in the future (and who can blame them for that?).

Thanks for including the triple-screen tests in your data, and having a wide variety of types of games. I appreciate the technical aspects of the new architecture, as well. Nicely done.

;)
 

Reynod

Administrator
[citation][nom]Guest[/nom]gto: Probably not going to happen. Did you notice how the entire internet chose to benchmark Bulldozer with Nvidia GPUs to determine that it sucks at gaming? An AMD rep stated that Bulldozer beat any Intel CPU in gaming when both were paired with a 6970, but I guess the internet has no intent of ever testing that configuration, as I still haven't seen Bulldozer benched with any AMD GPU yet.[/citation]

tuckfoms / truth_in_truth ... you're way off base.

BD doesn't perform any better with an AMD vs NVidia GPU (of comparable power).

BD is a good CPU ... it just didn't meet the hype AMD gave it.

The Bulldozer and ATI 2900XT (XTX lol) rollouts are both examples where reality fell short of the marketing hype and the performance touted ... and the community responded accordingly.

The 7970 however is a welcome refresh.
 

dalauder

Splendid
[citation][nom]a4mula[/nom]From a gaming standpoint I fail to see where this card finds a home. For 1920x1080 pretty much any card will work, meanwhile at Eyefinity resolutions it's obvious that a single GPU still isn't viable. Perhaps this will be something that people would consider over 2x 6950, but that isn't exactly an ideal setup either. While much of the article was over my head from a technical standpoint, I hope the 7 series addresses microstuttering in CrossFire. If so, then perhaps 2x 7950 (assuming $449) becomes a viable alternative to 3x 6950 2GB. I was really hoping we'd see the 7970 in at $449, with the 7950 in at $349. Right now I'm failing to see the value in this card.[/citation]If it was over your head, then don't go through so much trouble criticizing it.

Microstutter's only an issue on midrange cards (6870 and lower). This thing stomps a GTX 580 without even having proper driver support/optimizations. If people buy those, this has a market too. On price/performance, this thing pretty much matches 2x 6950s (~6990) when both get overclocked, while coming in at a price that's not completely outrageous and providing single-GPU simplicity and compatibility. Honestly, it's the best high-end GPU price/performance I've seen.
 

shin0bi272

Distinguished



I kind of agree, but with a caveat: the test is a bit flawed. They pit this single-GPU card against a couple of dual-GPU cards, and people are shocked when it doesn't beat the dual-GPU cards but beats all the single-GPU cards. If the test had been run against just the 6970 and GTX 580, this card would have left a different impression.

But as I posted back on page 2, and as you alluded to in your post, the new Nvidia card will blow this one away. The specs for the 780 are almost identical to the 590. Add to that a die shrink, better heat management with those new cooler fins, lower power consumption, and lower noise, and we should be looking at a beast of a card. Price-wise, though, it might cost $700 when it comes out, but that's just a guess.
 

9_breaker

Distinguished
[citation][nom]aznshinobi[/nom]OMG DAT OVERCLOCKING! I can't wait to get the 7850 or 7870 and OC the crap out of it![/citation]

The 7850 will be a rebranded 6950, so the overclocking results should be the same.
 

jprahman

Distinguished
[citation][nom]9_Breaker[/nom]the 7850 will be a re branded 6950 so the overclocking results should be the same .[/citation]
No it won't. Didn't you read the FRONT PAGE of the article, where it specifically said that the 7700 series and up would all be new cards, not rebrands? Reading comprehension fail.
 

masterofevil22

Distinguished
Eyefinity 2.0, Display Support, and Desktop Enhancements

According to AMD, the Radeon HD 7970 is the first graphics card able to provide multiple simultaneous independent output streams, called Discrete Digital Multi-Point Audio (DDMA). This means that each attached screen can output its own unique audio signal.
I can finally play BF3 with non-mixed audio while the gf watches a movie or show on the TV with her own discrete audio... I like it.
 

9_breaker

Distinguished
[citation][nom]jprahman[/nom]No it won't. Didn't you read the FRONT PAGE of the article where it specifically said that the 7700 series and up would all be new cards, not rebrands. Reading comprehension fail.[/citation]

I didn't see that; I just read the benchmarks. My mistake. So it's not a "reading comprehension fail" - besides, "reading comprehension fail" doesn't make sense.
 

silverblue

Distinguished
I believe the 6970 has a memory bandwidth of 176GB/s; I think the figure you're showing is that for the 6950. The 6990 has the same bandwidth per GPU as the 6950, so the 160 there is correct.
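The correction above checks out against the standard GDDR5 bandwidth formula: effective data rate (MT/s) × bus width (bits) ÷ 8 gives bytes per second. Using the cards' published specs (5.5 GT/s effective on a 256-bit bus for the 6970, 5.0 GT/s on a 256-bit bus for the 6950 and per GPU on the 6990):

```python
def mem_bandwidth_gbs(effective_mtps, bus_bits):
    """Memory bandwidth in GB/s: effective transfer rate (MT/s) x bus width (bits) / 8."""
    return effective_mtps * bus_bits / 8 / 1000

print(mem_bandwidth_gbs(5500, 256))  # HD 6970: 176.0 GB/s
print(mem_bandwidth_gbs(5000, 256))  # HD 6950 (and per 6990 GPU): 160.0 GB/s
```

So 160 GB/s is indeed the 6950 figure, and the 6970 should read 176 GB/s.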
 

envolva

Distinguished
[citation][nom]9_Breaker[/nom]I didn't see that I just read the benchmarks , my mistake . so its not a "reading comprehension fail " , besides reading comprehension fail doesn't make sense[/citation]
It's worse than failing to comprehend. You were spreading misinformation.
 

larkspur

Distinguished
[citation][nom]a4mula[/nom]From a gaming standpoint I fail to see where this card finds a home. For 1920x1080 pretty much any card will work, meanwhile at Eyefinity resolutions it's obvious that a single gpu still isn't viable. [/citation]

At 4800x1200 a single 6970 runs games great. No, it won't play BF3 or SW:TOR on ULTRA, but it will play them with most settings on high. If you want ultra on everything with no Xfire micro-stutter, you are out of luck at such high resolutions. But a 7970, with its average 12 fps increase at Eyefinity resolutions, looks very, very attractive to those who love three-monitor gaming but hate the annoyances of SLI/Xfire. I mean, Xfire doesn't even work in SW:TOR (unless I missed the announcement). Imagine someone with an Eyefinity Xfire setup of 2 x 6870 running SW:TOR at 4800x1200 on a SINGLE 6870 because their 2nd card can't be used. Ouch!
 

larkspur

Distinguished
[citation][nom]dalauder[/nom]If it was over your head, then don't go through so much trouble criticizing it.Microstutter's only an issue on midrange cards (6870 and lower). [/citation]

No, micro-stutter is just less noticeable on higher-end cards, and only at 1920x1080. At Eyefinity resolutions micro-stutter is much more noticeable, believe me. I've had both Xfire Eyefinity and single-card Eyefinity and vastly prefer the single card, due to the micro-stutter and the annoying inconsistencies of Xfire with certain games. Clearly, if you want ultra settings at such high resolutions you want to go Xfire/SLI, but I get great frame rates on a single-card setup by turning settings down from ultra to high, and I completely avoid the micro-stutter.
 
