Radeon HD 7990 And GeForce GTX 690: Bring Out The Big Guns


ronch79

Distinguished
Jan 16, 2010
Pairing one of those power-hungry AMD cards with an overclocked-like-hell AMD FX CPU should make your fuse box blow. Even given the stellar performance (of those GPUs, anyway), I'd hate to run a system that draws that much power, even if I were frickin' rich and didn't give a crap about my power bill.
 

fuzg13z

Honorable
Aug 5, 2012
mayankleoboy1 said: "IMHO, the GTX 690 looks best. There is something really alluring about that shiny white metallic finish and the fine metal mesh, along with the fluorescent green branding. Maybe I am too much of a retro SF buff."
The way it's put together looks great. All the pieces fit so nicely together, and it's not big, bulky, and full of seemingly wasted space like the PowerColor and HIS cards. It looks... streamlined... lol
 

fuzg13z

Honorable
Aug 5, 2012
Why wouldn't they use an Ivy Bridge processor and a Z77 chipset for the PCIe 3.0 support? Wasn't one of the issues in question the fact that one of the cards uses a PCIe 2.0 LucidLogix controller? Sandy Bridge only supports PCIe 2.0, right?
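
For reference, a rough per-direction bandwidth sketch of what's at stake (standard PCIe 2.0/3.0 line rates and encodings, my arithmetic):

# PCIe 2.0: 5 GT/s with 8b/10b encoding -> 0.5 GB/s per lane per direction
# PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~0.985 GB/s per lane
lanes = 16
gen2 = 5e9 * (8 / 10) / 8 * lanes / 1e9     # GB/s for an x16 slot
gen3 = 8e9 * (128 / 130) / 8 * lanes / 1e9  # GB/s for an x16 slot
print(f"PCIe 2.0 x16: {gen2:.1f} GB/s, PCIe 3.0 x16: {gen3:.1f} GB/s")
# -> PCIe 2.0 x16: 8.0 GB/s, PCIe 3.0 x16: 15.8 GB/s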
 

FormatC

Distinguished
Apr 4, 2011
This was a typo :(

I had been working the whole time with an early, pre-patched version (for better comparison) and an override in the driver, but this was the first time with a current, fully patched game. I forgot to change the subtitle in the two charts. This is 4x MSAA; all the other settings have no effect, only Supergrid Sampling works. Thank you for your comment!

I've asked Chris to correct the pictures, and I apologize to our readers for this lapse :)
 

FormatC

Distinguished
Apr 4, 2011
It is always very difficult to keep a game at the same level for comparison (charts, articles) over a period as long as a year. The rest is then automation, shitty Excel, and a night worked through...
:D
 

GoldenI

Distinguished
Nov 11, 2010
As much as I would enjoy impregnating my system with this card, I cannot justify shelling out an extra $1,000 on a GPU.

MAYBE $500, but $1,000 for a GPU? No. No way. It is very aesthetically pleasing and performs ridiculously well, but I honestly could not picture myself spending $1,000 on a GPU. I would much rather buy a MacBook Air or a nice tablet for that amount.

I will most certainly consider buying when the price is sliced in half, however. :eek:
 

thanny

Distinguished
Nov 23, 2010
First, the 690 had annoying coil whine at idle, though not as bad as the PowerColor card. It was very high frequency, so you need a decent pair of headphones to hear it.

Second, how come no one does an additional noise test where the cards are adjusted to the same cooling target? Force the 690 to aim for the same temperature as the AMD cards and see how much louder it gets, or, in reverse, change the fan curve on the AMD cards to match the 690's and see how much quieter they get. People spending this much on graphics cards are able to make adjustments like that, so sticking with out-of-the-box behavior doesn't make a lot of sense.
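
To illustrate the kind of adjustment I mean, here's a toy Python sketch of a two-point linear fan curve (the numbers are made up, not either card's real curve). Matching the cards at the same cooling target amounts to comparing what duty cycle each curve demands at that temperature:

def fan_duty(temp_c, curve=((50, 30), (85, 100))):
    """Linearly interpolate fan duty (%) from a two-point (temp, duty) curve."""
    (t_lo, d_lo), (t_hi, d_hi) = curve
    if temp_c <= t_lo:
        return d_lo
    if temp_c >= t_hi:
        return d_hi
    frac = (temp_c - t_lo) / (t_hi - t_lo)
    return d_lo + frac * (d_hi - d_lo)

# Duty each hypothetical curve demands when chasing an 80 C target:
print(fan_duty(80))                              # aggressive curve -> 90%
print(fan_duty(80, curve=((60, 20), (90, 80))))  # relaxed, quieter curve -> 60%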

As for me, I'll be using two single-slot water-cooled 7970s within a few days. I may never use air-cooled graphics again, given how things are going.
 

kryzzay

Distinguished
Apr 30, 2008
Yeah, here in Australia too. We're paying about US$0.30 per kWh.

Not only is it too expensive to buy, it's too expensive to run. That's if you really give a toss about $, of course.
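
To put a number on it, a quick back-of-envelope sketch in Python, assuming (my figures, not the article's) roughly 350 W of extra draw under load and four hours of gaming a day at that $0.30/kWh rate:

# Back-of-envelope running cost; wattage and hours are assumptions of mine.
watts = 350          # extra draw under gaming load (assumed)
hours_per_day = 4    # gaming time per day (assumed)
rate_per_kwh = 0.30  # USD, the rate quoted above

kwh_per_year = watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year ≈ ${kwh_per_year * rate_per_kwh:.0f}/year")
# -> 511 kWh/year ≈ $153/year; halve or double the hours and it scales linearly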
 
Guest

I noticed that under the first game benchmarks you accidentally listed the Battlefield 3 charts under the heading of "Metro 2033", as well as further down under the "Battlefield 3" heading. Just a heads-up.
 

octoberhungry

Honorable
Apr 30, 2012
Who really needs these kinds of cards for today's games? Are they better used by companies like Pixar and other CGI/animation studios? Is that where they're best utilized?
 

army_ant7

Distinguished
May 31, 2009
They could serve as future-proof solutions (up to a point, of course). I also wonder how they'd perform with, let's say, a 3x2 landscape Eyefinity setup of 2560x1600 monitors (at least for the AMD cards), which has almost 12 times the pixels of a single 1080p monitor. I wonder if you'd need one of these cards (or two) to play the most intensive games at the highest settings, if they even can. That may sound silly, but it's a sort of proof of concept for a possible need for these cards. :D
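
For what it's worth, the arithmetic checks out; a quick Python sanity check:

# Sanity-checking the pixel-count claim above.
eyefinity = 3 * 2 * 2560 * 1600   # 3x2 wall of 2560x1600 panels
full_hd = 1920 * 1080             # a single 1080p monitor
print(f"{eyefinity:,} vs {full_hd:,} pixels: {eyefinity / full_hd:.2f}x")
# -> 24,576,000 vs 2,073,600 pixels: 11.85x ("almost 12 times")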

I think companies like Pixar may use professional cards like AMD FirePros and Nvidia Quadros (and, I think, Teslas in conjunction, because Nvidia has something, at least for Adobe's video editor (not sure of the name, but I think it had the Mercury Playback Engine), where you can use Quadros and Teslas together for workstation-type work).

 


Are you mad??? Have you even looked at the HD 7750 or HD 7770 review??? Or at any card other than the HD 7990? Most of AMD's cards use LESS power for their performance than Nvidia's. I agree with you on the CPU front, but seriously, on GPUs? NO WAY!!!
 

Thor

Distinguished
Jan 5, 2004
More and more video games (e.g., Borderlands 2) use Nvidia PhysX. If you have an ATI card, this system has to be emulated in software, which slows things down a lot and wastes the power of ATI cards.

AMD bought ATI, and AMD keeps sleeping on it: it doesn't compete with Nvidia's PhysX by creating a similar system. So right now it's much more advantageous for gamers to buy an Nvidia card.

And it will be the same in the future. Just look at how there isn't even real competition between Intel and AMD on CPUs anymore. Too bad ATI sold its soul to such a mediocre company. You could say it committed suicide.

It doesn't matter if an ATI card is more powerful than an Nvidia one; by emulating PhysX, the ATI card becomes slower and less powerful. Better to buy an Nvidia card now.
 

army_ant7

Distinguished
May 31, 2009
Your first statement is true AFAIK, though I'm not sure what you meant by "wastes the power of ATI cards." Please elaborate.

But about those other points of yours, I have something to say. How sure are you that AMD is just "sleeping" (doing nothing about it)? Claiming that just like that is both defamatory and unproven, if that's the case. Haven't you seen their efforts promoting general-purpose computing on GPUs? OpenCL? HSA? Fusion? APUs?

For one thing, PhysX is owned by Nvidia, and I don't think AMD could implement support for it on their cards. It's also a problem that some game developers choose to use it; Nvidia might simply have tools that are easier for developers to work with. AMD may be working on that, or rather, some game/physics engines may be implementing things like OpenCL already. I think games like Battlefield 3 already support DirectCompute, though just for ambient occlusion, I believe.

Another thing: I don't think AMD GPUs emulate PhysX at all. If it hasn't changed, the calculations fall back to the CPU (like you said, "software," i.e., not hardware-accelerated, unless that isn't what you meant). Someone once shared an article with me on that topic. It mentions how the PhysX engine may be intentionally left single-threaded by Nvidia, game developers, or both. I was also told (and read) that Metro 2033 ran the software version of PhysX on multiple threads (enough that all of a Phenom II X6's cores were taxed substantially by the game). Here's the article if you want to read it:
CPU PhysX: Multi-Threading?
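
To make the single- vs. multi-threaded point concrete, here's a toy Python sketch (an arbitrary stand-in workload, not real PhysX code) of the same work done on one core versus six workers, like a Phenom II X6's:

import time
from multiprocessing import Pool

def physics_step(chunk):
    """Toy stand-in for a batch of per-object physics updates."""
    total = 0.0
    for i in range(chunk):
        total += (i * 0.001) ** 0.5  # arbitrary math to burn CPU
    return total

if __name__ == "__main__":
    batches = [2_000_000] * 6  # six equal batches of work

    t0 = time.perf_counter()
    for batch in batches:       # "single-threaded software fallback"
        physics_step(batch)
    single = time.perf_counter() - t0

    t0 = time.perf_counter()
    with Pool(processes=6) as pool:  # one worker per core
        pool.map(physics_step, batches)
    multi = time.perf_counter() - t0

    print(f"1 core: {single:.2f}s, 6 workers: {multi:.2f}s "
          f"(~{single / multi:.1f}x speedup)")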

Anyway, if you're saying Nvidia cards are superior (in your opinion) because they support hardware acceleration in PhysX games, that's fine, but bashing AMD with points like that seems wrong to me, and if you do, you can expect comments like this one. :)
 