AMD Radeon HD 7970: Promising Performance, Paper-Launched

Guest
Why is your Bitcoin mining benchmark so screwed up? You don't say which miner you used or what arguments you gave it. Also, DiabloMiner seems to work fine on the 79xx series, so why didn't you use that?
 

Mannaman

Distinguished
Sep 22, 2009
43
0
18,530
"If I ever find someone that will buy me a $500 graphics card for Valentine's Day I'll be proposing on the spot."

You've obviously never been married. There is no amount of money worth the hell through which a crazy woman will put you. The most important thing to remember is that virtually all of them are crazy (they all know it too -- they just don't like to admit it to men).

I suggest you ask your dad, your mom, your best friend, and the wisest person you know for their approval of your mate choice, and if they don't all agree, take a hike. "For better or worse" is naturally in a man's heart, but lies fly out of a woman's mouth like moist breath. Women love the romantic fantasy of marriage, but most of them don't give five seconds' thought to the importance of a vow.

Yeah, I know it's off-topic, but I just want to warn you single guys that the reticence you feel toward marriage has a multitude of good reasons behind it, and the best protection against being stupid is actually self-control. If you want a good marriage, keep your zipper zipped until after the vows are made.
 
Guest
Performance per dollar has gone down with this card. A next-gen card should deliver around a 25% performance increase and launch at roughly the same price as its predecessor. The performance-to-dollar ratio used to be about 1:1; now it's lower. I don't see how that's going to slash prices or benefit anyone unless Nvidia's lineup stiffens the competition.
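
To put numbers on that ratio, here's a minimal sketch in Python (the launch prices and the 25% performance gain are illustrative assumptions, not figures from the review):

[code]
# Hypothetical perf-per-dollar comparison between generations.
# All numbers below are illustrative assumptions.
old_price, new_price = 370.0, 550.0  # assumed launch prices (USD)
perf_gain = 1.25                     # assumed: new card is 25% faster

price_ratio = new_price / old_price       # ~1.49x the cost
value_ratio = perf_gain / price_ratio     # performance gained per dollar spent

print(f"Price ratio: {price_ratio:.2f}x")
print(f"Perf-per-dollar vs. predecessor: {value_ratio:.2f}")
# A result below 1.0 means you pay proportionally more per unit of
# performance than the previous generation charged.
[/code]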
 

airborne11b

Distinguished
Jul 6, 2008
466
0
18,790
To say no true gamer cares about efficiency is false.

People who gripe about efficiency have legitimate logic behind their gripes: electricity costs. The problem with that logic is that, comparing equal-level GPUs from ATI and Nvidia (say, GTX 590 vs. HD 6990), the overall difference in wattage is extremely small. Both are gaming GPUs and power hogs. If you work out the difference in wattage running them at load in games 12-16 hours a day (worst case) over the course of 30 days (one billing cycle), the difference in the power bill is less than a couple of bucks at most.

So while I understand that people have legitimate concerns about power consumption, claiming victory for a GPU manufacturer because it saves you 20-30 watts, when the average state charges about 6-7 cents per kilowatt-hour, is not a valid point.

Sorry, but it's not. I live in Hawaii, which has the highest electricity rate at about 15 cents per kWh, and even for me the cost difference between ATI and Nvidia is unnoticeable on my bill. So get over it.
 

airborne11b

Distinguished
Jul 6, 2008
466
0
18,790
Just to prove my point, I did the math on running a GTX 580 vs. this new ATI card, which draws 40 watts less at load. At 12 hours a day over 30 days, the difference is about 2 dollars.

Someone sitting at home playing video games 12 hours a day, 7 days a week, on a $500 GPU should not be complaining about 2 dollars a month (and that's at Hawaii rates). At the typical 6-cent US rate, it drops to under $1 more a month. GG
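
For anyone who wants to check that math, here's a minimal sketch in Python (the 40 W delta, the hours, and the two electricity rates are the figures from the post above):

[code]
# Monthly electricity-cost difference between two GPUs at load.
watt_delta = 40.0      # extra draw at load (GTX 580 vs. HD 7970, per the post)
hours_per_day = 12.0   # gaming hours per day
days = 30              # one billing cycle

kwh_delta = watt_delta * hours_per_day * days / 1000.0  # = 14.4 kWh

for label, rate in [("Hawaii ($0.15/kWh)", 0.15), ("typical US ($0.06/kWh)", 0.06)]:
    print(f"{label}: ${kwh_delta * rate:.2f} per month")
# Hawaii ($0.15/kWh): $2.16 per month
# typical US ($0.06/kWh): $0.86 per month
[/code]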
 

marciocattini

Distinguished
Sep 15, 2010
36
0
18,540
This is really impressive: a single GPU performs close to, and at times even beats, current dual-GPU solutions! I guess we can expect the 7990 to perform twice as fast as current dual-GPU cards. Imagine that! Happy 2012, everyone! :)
 

AnUnusedUsername

Distinguished
Sep 14, 2010
235
0
18,710
I'm glad to see another high-end single-GPU release. It seems like lately the market has been aimed at those who want to run dual GPUs; just look at every SBM article from the past few years. You'd think companies with as much money as ATI and Nvidia could develop single-GPU options that actually compete with a similarly priced dual-card setup, but it seems like they are still stuck in the "why design a better GPU when you can just stack two of them together" mindset. This release seems like a good step in the right direction.

And yeah, Skyrim's a pretty lame benchmark. The game doesn't require much more than Oblivion did (and doesn't look much better, either...).
 

Headspin_69

Distinguished
Nov 9, 2011
917
0
19,010
[citation][nom]nebun[/nom]ok...bad software does not make up for good hardware[/citation]
Good thing Radeon has been on top of their driver optimizations as of late, and I'm sure that will carry over into the 7xxx launch. One of Radeon's trademarks is that their drivers tend to mature and get a lot better with time, whereas Nvidia tends to rush to get the drivers right from the start but never really optimizes further over time like Radeon does.
 

nebun

Distinguished
Oct 20, 2008
2,840
0
20,810
[citation][nom]Headspin_69[/nom]Good thing Radeon has been on top of their driver optimizations as of late, and I'm sure that will carry over into the 7xxx launch. One of Radeon's trademarks is that their drivers tend to mature and get a lot better with time, whereas Nvidia tends to rush to get the drivers right from the start but never really optimizes further over time like Radeon does.[/citation]
If this is true, then why is CUDA computing still better, and why are Nvidia's drivers still more solid? Explain this to me :)
 

Headspin_69

Distinguished
Nov 9, 2011
917
0
19,010
[citation][nom]nebun[/nom]If this is true, then why is CUDA computing still better, and why are Nvidia's drivers still more solid? Explain this to me[/citation]
Nvidia's drivers are not more solid; I have had just as many problems with them, and more. And CUDA is not for gaming, LOL; we are talking about gaming-class GPUs here. CUDA is just another Nvidia marketing gimmick for fanboys and noobs.
 

jprahman

Distinguished
May 17, 2010
775
0
19,060

So the guys at Cray are Nvidia fanboys and noobs, despite the fact that they have decades of experience in the HPC market, AND have been a major customer of AMD for years?
 

Headspin_69

Distinguished
Nov 9, 2011
917
0
19,010

Wrong context, mate. CUDA is for business-class GPU products, and it tried and failed badly to put a monopoly on the market, just like Nvidia's in-house version of PhysX did, when Havok, Unreal Engine, Source, Gamebryo, CryEngine, and many more do a much better and more optimized job, and some are even open source. Plus, Nvidia's newest little 3D Vision gimmick is going to crash and burn; it's pure marketing schlock to hook noobs.
 

jprahman

Distinguished
May 17, 2010
775
0
19,060

So 3D Vision is pure marketing, but Eyefinity isn't? I smell a fanboy.
 

Headspin_69

Distinguished
Nov 9, 2011
917
0
19,010

Not meaning to troll anyone, dude, but honestly I can comfortably go out on a limb and say 3D is a flop until they get it to the point where it needs no glasses and actually looks 3D. PhysX is in literally only 5 games = fail, for how long it's been around. I want more GPU performance for my dollar, not fickle little gimmicks thrown in that are really useless. PEACE.
 

Headspin_69

Distinguished
Nov 9, 2011
917
0
19,010

I did not mention anything about Eyefinity or Nvidia's version, Surround. Multiple monitors are an actual, tangible, useful benefit; 3D as it stands today is pure marketing gimmickry. Glasses? LOL, it's not the 1980s. Also, Nvidia's take on triple-monitor gaming forces us to buy two of their cards to drive three monitors = fail for Nvidia. AMD's cards run at least three monitors off a single card right out of the box, and some Radeon cards even run 4 or 6. Triple-monitor gaming takes a lot of power, so Nvidia's rationale is somewhat warranted given that you really do need two cards anyway to run three monitors smoothly when gaming, but once again Radeon offers more value, because not everyone plays demanding games, and some people just need triple monitors for business apps on the desktop.
 

Ricco1911

Distinguished
Nov 17, 2011
9
0
18,510
Now they just need to fix their cooling issues. My 6970 fries when playing BF3 (102°C, to be exact). Not kidding; I can email you a pic if you don't believe me.
And it's not my case (I have a HAF X); room temp is 28°C. Luckily I'm getting an Accelero Xtreme Plus II in a few days :D
 

Headspin_69

Distinguished
Nov 9, 2011
917
0
19,010
[citation][nom]Ricco1911[/nom]Now they just need to fix their cooling issues. My 6970 fries when playing BF3 (102°C, to be exact). Not kidding; I can email you a pic if you don't believe me. And it's not my case (I have a HAF X); room temp is 28°C. Luckily I'm getting an Accelero Xtreme Plus II in a few days[/citation]
That's a defective cooler or an airflow issue, not a design problem, LOL.
 

Ricco1911

Distinguished
Nov 17, 2011
9
0
18,510
Doesn't bother me anymore; can't wait for my new cooler. I tried to RMA the card (MSI), but they said there was nothing wrong with it. Yeah, right...
 

psiboy

Distinguished
Jun 8, 2007
180
1
18,695
And still not a hint of the Radeon 6950 1 GB in any of your benchmarks... the card that Tom's Hardware claims to be the sweet spot... and yet the hypocrisy of this continued omission smacks you in the face, Don! It would be nice to have the previous-gen "sweet spot" to compare to...
 

Red Team FTW

Distinguished
Jan 6, 2012
73
0
18,640
[citation][nom]psiboy[/nom]And still not a hint of the Radeon 6950 1 GB in any of your benchmarks... the card that Tom's Hardware claims to be the sweet spot... and yet the hypocrisy of this continued omission smacks you in the face, Don! It would be nice to have the previous-gen "sweet spot" to compare to...[/citation]
The 6950 is not really a good card: it is way too high-priced and in the wrong class; it is a mid-range card, not high-end. http://www.overclockersclub.com/reviews/msi_r6850_cyclone/7.htm
 
Guest
Why would you use an i5? To have bottlenecks on purpose? You guys are supposed to use the highest-end hardware in order to prevent bottlenecking, so that the cards you are testing can give their all.
 