AMD Radeon HD 6990 4 GB Review: Antilles Makes (Too Much) Noise

This would still work fine for me because my computer case is in the other room and I usually keep the door closed anyway. BUT... I still don't think a video card should be that loud. Instead of a super-loud air cooler, they should have just put a water block on it and said: if you want one, it MUST be water-cooled.
 

wasupmike

Distinguished
Oct 13, 2010
I don't understand benchmarking 'niche' (~$500+) cards like this one at 1680x1050 anymore. They're clearly not designed for such resolutions.

That time and energy would be better spent benchmarking a 3-monitor setup, for example, and showing results from configurations that cards like these were actually meant for...
 

evga_fan

Distinguished
Aug 22, 2010
[citation][nom]cangelini[/nom]Hi evga_fan,I didn't have time to run 480 SLI figures, but assume they'd be very close to the 570s. Noise would be another matter entirely. That configuration is notorious for howling in SLI mode.As for CrossFire, I think it's pretty safe to say that really isn't a problem any more. NV is out of the chipset game, and every AMD/Intel-based board I've had through for years has been CrossFire-capable. SLI licensing has been more spotty, I'd say.[/citation]

Thank you for your input and I totally agree.

As for CF, I was referring to the fact that (according to the article) the 6990 in CrossFire wasn't really showing good scaling, which of course isn't a fair representation of what CrossFire can do these days.
However, can you shed some light on what you mean by "NV is out of the chipset game"? Sure, CrossFire mobos have become more readily available, but wouldn't you agree that when it comes to performance, both companies are pretty much neck and neck?

Thanks in advance.
 
If you want to run multiple 2560x1600 displays, this is not the card for you. The performance would be fine, but there is only one dual-link DVI port, and the Mini DisplayPort adapters do not include the DP-to-dual-link-DVI conversion needed to add a second or third 2560x1600 monitor.
 

lradunovic77

Distinguished
Dec 20, 2009
Compared to a single-GPU GTX 580, this card is a joke. I will take GTX 580 SLI over this any time, because CrossFire sucks. It usually takes AMD/ATI two months (two driver releases) to get things working properly. Not worth a dollar in my book. Benchmarks are one story, but real-world gaming is completely another story.
 
This card is like a nuclear weapon. It's a matter of extreme pride to have created it, and tested it, but ok; it needs to be put away in some dark secret place and never used.
This thing is a waste. AMD will sell "a few" of them, some to those unaware of how loud and hot they are, others to teenage gamer-bois who have not yet developed a sense of value, but are still able to moan and whine to their parents who don't have the sense to say "Hell NO!"
 

goodsyntax

Distinguished
Aug 4, 2008
As others have alluded to, this is an "image" release. All it does is garner bragging rights for AMD, at least for a while.

Hopefully AMD and NV have legions of engineers working on next-gen GPUs with the same or better performance at substantially lower power and thermal loads. I suspect that we are seeing the limits of the current design generation, both from a thermal and a performance perspective.

Imagine: with a refreshed architecture and a die shrink, we could get mainstream parts with comparable specs (and substantially lower power, thermal, and noise loads) out of a single die! I hope we see a refresh soon; my electric bill will thank them!

Either way, software is lagging far behind the hardware, both in the gaming space and in the productivity space. What good are mega-core CPUs and GPUs if there is no readily available software to harness them? I don't know about you, but if ffmpeg/Handbrake cannot take advantage of all the electronic goodies available to transcode faster than real-time, Photoshop still takes forever to load, and Visual Studio still has gnomes in my computer compiling code by hand, why am I spending so much on these things again?
 

jasonpwns

Distinguished
Jun 19, 2010
[citation][nom]GeekApproved[/nom]Nvidia still can't beat the 2yr old 5970. Nvidia = Fail[/citation]

GTX 580 = one card that can beat the 5970 in FPS at stock, in games not sponsored by AMD. So yeah, I would correct yourself. And that's before the GTX 590 comes out.
 

nukemaster

Titan
Moderator
[citation][nom]Doom3klr[/nom]No matter how good this card is it will have issues Ati/Amds drivers are garbage just like they were when Ati was on there own. Nvidias drivers are 1000s times better.[/citation]
Well, I still remember waiting 9+ months, even after Vista SP1 came out (I waited for that to upgrade), to be able to use my 8800 GTX. So I think Nvidia has had their share of issues too. Nine months for a $500+ card? I've never had to wait more than a month or two (and that rarely) for ATI to fix an issue I had.
 

HavoCnMe

Distinguished
Jun 3, 2009
Why is noise and power consumption a big deal on a flagship model? High-end anything uses techniques that standard/mid-level models won't use, due to a thing called a budget. To be honest, awesome devices/machines are normally loud and use a good amount of power, or have a big engine/motor/processor, to achieve their high output. You don't buy a Ferrari and complain about it being a gas hog that is loud at high RPMs; that kind of defeats the purpose of having a high-end machine.

I have three 133 CFM fans on my PC. When I turn it on, it sounds like I am vacuuming, but that just tells me that NO HEAT will be trapped in my case; the same goes for the fan on the 6990. If you have ever been in a server room, you know that servers, routers, switches, and firewalls run multiple high-CFM fans to keep cool; it is not a quiet environment. So why can't a flagship graphics card be loud? You are essentially running mini electric stoves on the die and have to keep them cool enough not to melt/fuse/short together.

All in all Flagship models will ALWAYS make noise, to me it’s the sound of power and deep pockets.

That is my rant about people complaining about noise and power consumption on a Dual GPU solution.
 
Guest
Does the 6990 support three displays per DisplayPort? In that case, 12 monitors could be connected to this card.
Also, if you only have two PCIe x16 slots and you want to use a RAID card, you only have one slot left, and this is the only way to put CrossFire in your machine.
With a good aftermarket water-cooling solution, the noise and heat shouldn't be a big problem.
 

Marcus52

Distinguished
Jun 11, 2008
Article quote (Chris A.) about World of Warcraft performance:

When it comes time to turn on anti-aliasing, though, the Radeon HD 6990s take the smallest performance hit. Even still, you can run Cataclysm at 2560x1600 with 8x MSAA on a Radeon HD 6950 and still average more than 60 frames per second. There’s really no reason to buy such a high-end card for this fairly mid-range title.
Shame on you. You report average frame rates and make a statement like that? You know better. There are other reasons too, and you can think of them, I'm sure. Can you say 120 Hz? People are beginning to figure out why people like me say 60 Hz isn't good enough, and are using higher refresh rates even when they aren't running 3D; others are using them for 3D. What about multi-monitor setups? I doubt those single-GPU cards could maintain that on my 1920x1200 @ 85 Hz CRT even if I switched from i7 to Sandy Bridge. I doubt the dual-GPU 6990 could.

Quit telling people they don't need things that they could possibly use. Of course they need to match their hardware; there's no sense spending gobs of money on a card for a monitor that runs at 1024x768, but there are plenty of possibilities for those who can use more power in WoW.

It's up to those who know better to give the people who don't enough information that they can choose correctly for their wants, not tell them what they should buy. It's also up to those who know better to provide information that's as accurate as possible and not fuel arguments between ignorant people on both "sides" of an issue. Your comprehensive WoW hardware article was that kind of article, but a statement like this is not, and could very well lead to someone telling a person his or her video card purchase was pointless when it wasn't.

I also question your conclusion that Gulftown didn't give the video cards the breathing room Sandy Bridge does. The fact is, none of the video cards you tested in your 560 Ti article approached the power of these cards, and the frame rate averages were all lower in those charts. There are plenty of benchmarks published across the internet demonstrating that the "architectural improvements" of Sandy Bridge, aside from the added GPU power, aren't much, and that Gulftown can still more than hold its own. Intel is right to keep it as its flagship, and a perusal of Passmark's CPU benchmark results will show that is the case. (I'm not saying it's priced right; I think it should cost half what it does, at least, but then they might not be screened to the standards that have allowed higher overclocks.)

If the video cards are being limited by the CPUs in WoW, or the CPU is still the bottleneck for other reasons, then of course more graphics power won't help. However, that's not what your comprehensive WoW article showed. The most powerful CPUs available may not be enough for GTX 580s and beyond in WoW, based on your results in that article, but we know they are otherwise powerful enough to differentiate cards, and I suggest Gulftown still is; otherwise we would not have seen any differentiation between the top-end cards in this article, and we did.

Of course, as to this specific card, I'll pass, for WoW. Nvidia is clearly the way to go, and SLI if they don't come out with a dual-GPU card and you want the extra bang. There are of course other reasons to go AMD graphics, outside of WoW considerations.

;)
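The objection to averages above can be made concrete: with per-frame render times you can report the frame rate of the slowest frames alongside the mean. A minimal sketch (the frame-time values are made up purely for illustration, not measured data):

```python
# Sketch: average fps can hide the slow frames that matter at 120 Hz.
# From hypothetical per-frame render times (ms) we derive the average fps
# and the fps implied by the 99th-percentile (slowest 1%) frame time.

def fps_summary(frame_times_ms):
    times = sorted(frame_times_ms)
    avg_fps = 1000.0 * len(times) / sum(times)
    # 99th-percentile frame time -> "1% low" fps (integer index math)
    p99 = times[min(len(times) - 1, (99 * len(times)) // 100)]
    return avg_fps, 1000.0 / p99

# Mostly fast 8 ms frames with an occasional 40 ms hitch
frames = [8.0] * 99 + [40.0]
avg, low1 = fps_summary(frames)
print(f"average: {avg:.0f} fps, 1% low: {low1:.0f} fps")
```

The point the sketch illustrates: a run that averages around 120 fps can still dip to 25 fps on its worst frames, which is exactly what a bare average conceals.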
 

foogoo

Distinguished
Jan 27, 2011
That's one fast card, but I can't help wondering, with all these dual-GPU solutions, whether micro-stutter is still a potential problem. I've heard it's been fixed, but I've seen no proof of this personally. In these reviews I would like to see a micro-stutter test with a single card as a control. A 6870 vs. 6990 micro-stutter test would put my mind at ease before buying another dual-card setup.
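A micro-stutter test like the one requested above can be quantified from a frame-time capture: AFR rendering tends to produce alternating short/long frame times even when the average fps looks fine. A sketch, with the metric and the sample values as assumptions of my own (not from the review):

```python
# Sketch: quantify micro-stutter from a list of per-frame render times (ms).
# AFR micro-stutter shows up as alternating short/long frame times, so we
# average the relative difference between each pair of consecutive frames.

def microstutter_index(frame_times_ms):
    """Mean relative difference between consecutive frame times.
    ~0.0 means perfectly even pacing; larger means more alternation."""
    diffs = [abs(a - b) / ((a + b) / 2)
             for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(diffs) / len(diffs)

# Even pacing (single-GPU-like): every frame ~16.7 ms
even = [16.7] * 10
# Alternating pacing (AFR-like): 10 ms / 23 ms pairs, same average fps
stutter = [10.0, 23.0] * 5

print(microstutter_index(even))     # 0.0
print(microstutter_index(stutter))  # noticeably higher
```

Running this on captures from a 6870 (single GPU) and a 6990 (dual GPU) at the same settings would be one way to do exactly the control comparison the post asks for.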
 
[citation][nom]scrumworks[/nom]Starts with negative comments (noise), so no surprises from Chris. Fermi of course never made so much noise and consume so much power that would require this type commenting. Everything was Power, PhysX and CUDA![/citation]

I'm going with this. Author Fail.

Compares the HD 6990 OC's power consumption at load to a single GTX 480, then fails to note:

GTX 560 SLI: 496 W
GTX 470 SLI: 541 W
GTX 570 SLI: 556 W
GTX 580 SLI: 620 W
GTX 480 SLI: 668 W

Think he will "man up" and make a comparison of watts per frame in various games, or take his hatchet to me?
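The watts-per-frame comparison suggested above is simple arithmetic: load power divided by average fps (which works out to joules per frame). A sketch using the SLI wattages quoted in the post; the fps figures are hypothetical placeholders for illustration, not measured results:

```python
# Sketch: watts per frame = load power / average fps (i.e., joules/frame).
# Wattages are the SLI figures quoted above; the fps numbers are
# hypothetical placeholders, NOT benchmark results.

load_watts = {
    "GTX 560 SLI": 496,
    "GTX 470 SLI": 541,
    "GTX 570 SLI": 556,
    "GTX 580 SLI": 620,
    "GTX 480 SLI": 668,
}

avg_fps = {  # hypothetical values for illustration only
    "GTX 560 SLI": 90,
    "GTX 470 SLI": 85,
    "GTX 570 SLI": 100,
    "GTX 580 SLI": 115,
    "GTX 480 SLI": 105,
}

for setup, watts in load_watts.items():
    print(f"{setup}: {watts / avg_fps[setup]:.2f} J per frame")
```

With real per-game fps numbers plugged in, this would give exactly the efficiency table the post is asking the reviewer to publish.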
 
Guest
Uhh, quit showing an ad page on every page click in this article, and you will quit losing readers.
 
