AMD Radeon HD 6990 4 GB Review: Antilles Makes (Too Much) Noise


stingstang

Distinguished
May 11, 2009
1,160
0
19,310
There's no heat problem here. If you have enough dough to buy one of these, then you have enough to buy a water cooling block... and a 4-slot radiator per card.
 

K2N hater

Distinguished
Sep 15, 2009
617
0
18,980
[citation][nom]stingstang[/nom]There's no heat problem here. If you have enough dough to buy one of these, then you have enough to buy a water cooling block...and a 4 slot radiator per card.[/citation]
That's absolutely right, but all AMD wants is to release it as soon as possible, no matter how immature the product is. It may take a few months until we see properly cooled models. Until then, early buyers are left to bash AMD for the noise, heat, and driver flaws. These are the same key mistakes Nvidia committed with the GTX 480.

Personally, I see a single-slot 6990 with a factory-sealed water cooling solution plus two 120 mm low-noise fans as much more refined and usable than the solution AMD picked. If a gamer is willing to build a PC, then drilling the case to mount it properly shouldn't be a problem. For the lazy user, there's Dell...
 

romulous75

Distinguished
Sep 20, 2009
12
0
18,510
From the figures comparing the cards on the first page, I can't help but feel that the HD 69xx is being held back by its memory bandwidth. The TFLOPS and texture fillrate are higher, but the real-world performance doesn't reflect that potential. Pity.
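Rough math, going by the spec table (from memory, so treat the numbers as approximate): each GPU on the 6990 has a 256-bit GDDR5 interface at about 5 Gbps effective, so per-GPU bandwidth works out to roughly 5 Gbps x 256 bits / 8 = 160 GB/s. Shader throughput, on the other hand, scales with ALU count and clock (1536 ALUs x 2 ops per clock x ~0.83 GHz ≈ 2.5 TFLOPS per GPU), so the compute and fillrate columns can grow much faster than the memory column does.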
 

superjasperge

Distinguished
Mar 13, 2011
1
0
18,510
I personally think this card is really cool, but a bit unnecessary. If I had the money I would buy it, but for my budget I think I'm going to buy the 6950 from Sapphire (2 GB).
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]juliom[/nom]For crying out loud, Chris, this is a DUAL-GPU card! What did you expect? That it would behave like a passively cooled card? How can you compare it with a SINGLE GTX 480? Your bias never ceases to amaze me.[/citation]

LOL, it's not April Fools' Day for another couple of weeks!
 

nebun

Distinguished
Oct 20, 2008
2,840
0
20,810
[citation][nom]Haserath[/nom]I think AMD will have the performance monster this round. It would be surprising if Nvidia was anticipating something like this. The GTX 570 already uses quite a few more watts than the 6970; what will they do to match two 6970s on one board? This is insanity![/citation]
Not for long... wait until the new GTX 590 comes out. :)
 

billj214

Distinguished
Jan 27, 2009
253
0
18,810
OK, the graphics chip people will have to realize they're hitting a wall on performance, since every new chip requires more power and produces too much heat!
It's only a matter of time before they build 2-core and 4-core processors, all on one die.
 

utengineer

Distinguished
Feb 11, 2010
169
0
18,680
[citation][nom]billj214[/nom] It's only a matter of time before they build 2-core and 4-core processors, all on one die.[/citation]

The GTX 580 already has 512 CUDA cores on one chip. I think you are confusing CPU architecture with GPU tech.
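For illustration only (this sketch isn't from the article, and the kernel and variable names are made up), here's roughly what the GPU programming model looks like: you launch one lightweight thread per data element, and the hardware schedules those threads across the chip's hundreds of CUDA cores, whereas a CPU is built to run a handful of heavyweight threads quickly.

#include <cstdio>
#include <cuda_runtime.h>

// Each thread handles a single element; the hardware spreads these
// lightweight threads across all of the chip's CUDA cores.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n)
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;                          // about a million floats
    float *d_data;
    cudaMalloc((void **)&d_data, n * sizeof(float));
    cudaMemset(d_data, 0, n * sizeof(float));

    // Launch enough 256-thread blocks to cover every element.
    scale<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);
    cudaDeviceSynchronize();

    printf("launched %d threads\n", ((n + 255) / 256) * 256);
    cudaFree(d_data);
    return 0;
}

That many threads in flight is the whole point of the design: the GPU hides memory latency with massive parallelism instead of the big caches and out-of-order logic a CPU core carries.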
 

billj214

Distinguished
Jan 27, 2009
253
0
18,810
[citation][nom]utengineer[/nom]The GTX 580 already has 512 CUDA cores on one chip. I think you are confusing CPU architecture with GPU tech.[/citation]

Didn't know that. I wonder what the die size is currently? There has to be a better way! I hate buying a bigger power supply every time I buy a new graphics card!
Maybe they should utilize the CPU more; it seems lately that even a small CPU can drive the biggest graphics card.
 

sandmanwn

Distinguished
Dec 1, 2006
915
0
18,990
The complaining about noise was a bit silly. The amount of money you are spending on a system to handle this almost guarantees whoever buys it will be ripping off the stock cooler for an aftermarket cooler or water cooling solution. It was worth mentioning, but sometimes authors just go overboard with minutiae.

If I were really interested in keeping the stock cooler, I would probably mount my case intake fans on the side panel, reverse the front fan for exhaust, and move my hard drives up to a 5.25" bay with its own intake fan. There are all kinds of fan adapters for 5.25" bays. Add some sound dampening to the inside and you're done.
 

peddy_peddy

Distinguished
Jan 18, 2011
7
0
18,510
Can anyone help me, please? I have one issue: when I'm playing Aliens vs. Predator on high settings, I get slower performance after about an hour of play. If I close the game and restart it 10 minutes later, it runs fast again. I'm using two cards in slot 1 and slot 3 of my Sabertooth X58 motherboard; could that be slowing down performance? My system configuration is:

Core i7-950 (3.06 GHz), Sabertooth X58 motherboard, two HD 6970 cards in CrossFire with the latest driver, 6 GB Corsair XMS3 1600 MHz RAM in triple channel, 1 TB WD Caviar Black HDD, Corsair TX 950W PSU
 

tmc

Distinguished
Aug 22, 2007
99
0
18,630
Power consumption is way too high... they should try to get it down to 300-325 watts at maximum overclocked specs. Next, only the newest and most expensive motherboards will have PCIe 3.0 compatibility, enabling the card's full bandwidth. Finally, the price will keep this out of the mainstream market almost until it is discontinued and this approach finds its way into single-GPU designs, rendering it obsolete within 24 months from now (thus meeting or exceeding the power-reduction goal I set out in my first point).
 
G

Guest

Guest
Some time ago I bought a Radeon HD 4870 X2, and I had the same "F-16 jet inside the case" problem.

The solution is to remove the #$@%& stock fan and switch to a more efficient one. The catch is that you lose the manufacturer's warranty, even if the card later fails for a reason unrelated to the fan.
 
G

Guest

Guest
It must have been the first driver set that caused the fan issues. I've been completely satisfied with the low fan noise of my 6990... it beats the crap out of the old 4870 X2 and 2900 XT 1 GB.
 
Status
Not open for further replies.