Radeon HD 7990 And GeForce GTX 690: Bring Out The Big Guns

Status
Not open for further replies.

mayankleoboy1

Distinguished
Aug 11, 2010
IMHO, the GTX 690 looks best. There's something really alluring about the shiny white metal and the fine mesh, along with the fluorescent green branding.
Maybe I'm too much of a retro SF buff :)
 
Guest
Thanks for the in-depth analysis of adaptive V-sync and RadeonPro helping with micro-stutter.

Not to take anything away from the hard work performed, but I would have liked to see Nvidia's latest beta driver, 310.33, included as well, to see if Nvidia is doing anything to improve the card's performance instead of just adding 3D Vision, AO, and SLI profiles.
 

ojas

Distinguished
Feb 25, 2011
Good read!

But I would have liked to see 680s in SLI, to see how they scale now compared to the 690.

Also, would using two single-GPU cards in CrossFire/SLI make a difference to the micro-stuttering charts? IIRC, the PCIe controller is integrated into the CPU on SB/IB chips, so there would be no third-party bridge chip between the two GPUs, as there is on the 7990 and 690. Would that make a difference?

How do you manage to isolate the cards' power consumption at load (idle is simpler)? And noise too: how do you block out the case fans and CPU cooler?
 
Guest
RadeonPro is saving AMD's butt.

In the end, the 690 had a lower average frame rate than the 7990, but with RadeonPro enabled, it's the 7990 that's slower, right?

So yes, it's better than without, but the 690 is faster, just as smooth, and uses a built-in technology.

AMD really needs to work on its CrossFire technology.
 
[citation][nom]amuffin[/nom]Is it just me or do the 7970X2 and 7990 coolers look so fast and fugly?[/citation]

I don't think they look "fast and ugly", although I do think that the HIS model could do with some more finesse.
 

FormatC

Distinguished
Apr 4, 2011
How do you manage to isolate the cards' power consumption at load (idle is simpler)? And noise too: how do you block out the case fans and CPU cooler?
Noise was measured on an open bench table, not in a case (no extra case fans, and an ultra-quiet fan on the hidden CPU cooler).

For the power consumption: three current clamps with monitoring ;)
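Three current clamps map naturally onto a high-end card's three 12 V feeds (the slot plus two auxiliary connectors). A minimal sketch of the arithmetic, with invented readings (the rail count, labels, and amp values are assumptions for illustration, not Tom's actual setup):

```python
# Hypothetical sketch: turning current-clamp readings into card-only power.
# A high-end card draws 12 V power through the slot and its auxiliary
# connectors; clamping each path and summing P = V * I isolates the card
# from the rest of the system. (The slot's small 3.3 V contribution is
# ignored here for simplicity; all readings are invented.)

RAIL_VOLTAGE = 12.0  # every clamped path is a 12 V feed

def card_power(clamp_currents_amps):
    """Sum P = V * I over each clamped feed, in watts."""
    return sum(RAIL_VOLTAGE * amps for amps in clamp_currents_amps)

# slot, first 8-pin, second 8-pin (amps are made-up load figures)
readings = [5.2, 12.5, 11.8]
print(f"Card power at load: {card_power(readings):.1f} W")
```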
 
[citation][nom]Novuake[/nom]Interesting, AMD has a winner at the top tier! That hasn't happened in a while. CODOS to that.[/citation]

Technically, HIS has a winner, not AMD, because AMD didn't launch a reference 7990/7970X2 ;)
 
[citation][nom]twinshadow[/nom]if you are spending 1000$ dollars on a video card paying a Power bill is not an issue[/citation]

Actually, the only person who I ever recommended a GTX 690 to wanted it specifically because of its low power consumption literally being enough to pay for itself compared to his previous graphics setup due to his high cost for power. Some people looking for such high end cards most certainly do care about power consumption.
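Whether lower power draw can really pay for a card is a quick back-of-the-envelope calculation. A sketch with made-up numbers (the price premium, wattage saved, daily usage, and electricity rate below are all assumptions, not that buyer's actual figures):

```python
# Hypothetical payback calculation: how long a lower-power card takes to
# offset its price premium through the electricity bill. Every number in
# the example call is invented for illustration.

def payback_years(price_premium_usd, watts_saved, hours_per_day, usd_per_kwh):
    """Years until the energy savings equal the up-front price premium."""
    kwh_saved_per_year = watts_saved / 1000 * hours_per_day * 365
    return price_premium_usd / (kwh_saved_per_year * usd_per_kwh)

# e.g. a $100 premium, 200 W saved, 8 h/day of gaming, $0.30/kWh
years = payback_years(100, 200, 8, 0.30)
print(f"Payback in roughly {years:.1f} years")
```

With high electricity prices and heavy use, the premium can indeed pay for itself quickly, which matches the scenario described above.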
 

merikafyeah

Honorable
Jun 20, 2012
The GTX 690 is the clear winner in my eyes, especially since there is a two-slot water-cooled version.
"Just Because You're Fastest Doesn't Make You The Best" pretty much says it all.

The Radeons make huge concessions for the sake of performance:

1. Bigger size. Three slots vs two. Quad Crossfire with two cards becomes virtually infeasible.
2. HUGE power draw: Equals more heat, hence more cooling necessary, hence bigger size.
Exceeding PCI-E specs is very worrisome.
I think TWO GTX 690s would consume about the same or maybe even less power.
3. LOUD. +Coil whine which is even more annoying than just loud.
4. LOTS of microstuttering (virtually unplayable without using third-party software).
5. Price. Let's be real. $1300 is optimistic, and availability is a shot in the dark.

Pros:

1. More FPS. Doesn't matter though unless you're using multiple displays, but that comes with the HUGE downside of giant bezels in your face.
2. Little to no microstuttering with third-party software. The only saving grace but doesn't add a whole lot since GTX 690 microstuttering isn't that bad.

Calling these three-slot monstrosities "inelegant" is possibly the nicest thing you could say.
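For what it's worth, micro-stutter is about frame-time spread, not average FPS, which is why a "faster" card can feel worse. A rough sketch of one simple metric, the average swing between consecutive frame times (all frame times below are invented for illustration):

```python
# Hypothetical micro-stutter metric: average absolute difference between
# consecutive frame times. Two runs can share the same average frame time
# (same FPS) while one feels far choppier. Frame times are invented.

def avg_frame_delta_ms(frame_times_ms):
    """Mean absolute change between consecutive frame times, in ms."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

smooth  = [16.7, 16.7, 16.7, 16.7, 16.7]  # ~60 FPS, perfectly even pacing
stutter = [8.3, 25.1, 8.3, 25.1, 16.7]    # same average, alternating pacing

print(f"smooth : {avg_frame_delta_ms(smooth):.1f} ms swing per frame")
print(f"stutter: {avg_frame_delta_ms(stutter):.1f} ms swing per frame")
```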
 
[citation][nom]merikafyeah[/nom]The GTX 690 is the clear winner in my eyes, especially since there is a two-slot water-cooled version. [...] Calling these three-slot monstrosities "inelegant" is possibly the nicest thing you could say.[/citation]

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&Order=BESTMATCH&Description=radeon+hd+7990

PowerColor has two 7990s: one is going for $1000 and the other for $900. Where are you getting this $1300 number from? Sure, availability is poor, but the pricing is not.

Two GTX 690s consume a good deal more power than a single 7990. Yes, the 7990's power consumption is far too high, but let's leave exaggeration out of it.

Why is exceeding PCIe specs that worrisome? The cables are more than capable of handling it; it's fine.

Quad CrossFire is easy. Simply get a board with eight expansion slots, such as the Gigabyte G1.Sniper 3, and a case that also has eight expansion slots (very common among higher-end cases), and you'd have a full two slots for airflow between the top and bottom cards; that's plenty. Heck, even one slot of airflow with a much cheaper board and case would probably be just fine. What I'd be more worried about is getting a PSU that can handle the load, and the ridiculous power bill entailed.

Tom's only said that the PowerColor model had bad coil whine, not the HIS model.
 
Guest
Eh... not a fair comparison, IMO. The 2GB of memory per GPU on the 690 is quite limiting, and these are specialty-run AMD cards. They'd be more comparable to an EVGA GTX 680 Classified 4GB (which doesn't come in a 690 variety), simply because Nvidia has not pushed the 690s very hard.

Furthermore, the limited 256-bit bus on the 690 causes some bandwidth-limitation issues. That's why I returned my two GTX 690s and went with quad GTX 680 Classified 4GB cards. Even then, I have the GPUs running at 1300MHz and the memory at 7300MHz, up from 6000MHz.

V-sync also adds input lag. I'd like to know more about the Radeon app that was supposed to lower micro-stutter, because if it reduces micro-stutter at the cost of increased input lag, it's still not worth it for the hardcore gamers who would be buying these kinds of cards. Substantially better power usage and nearly no micro-stutter are why I got my cards.

And with a 120Hz 1440p monitor from 120hz.net, I really don't need to use V-sync.
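On the V-sync/input-lag question: any frame limiter that holds frames back to even out pacing necessarily delays when a finished frame is shown, which is exactly the trade-off being asked about. A toy sketch of that timing (the target frame time and render time are invented numbers):

```python
# Hypothetical sketch of why frame limiting adds input lag: a frame that
# renders faster than the pacing target is held back until the target
# elapses, so input sampled at the start of the frame waits longer before
# being displayed. All numbers are invented for illustration.

TARGET_FRAME_MS = 16.7  # ~60 FPS pacing target

def paced_frame(render_ms):
    """Return (total ms until the frame is shown, ms spent just waiting)."""
    wait_ms = max(0.0, TARGET_FRAME_MS - render_ms)
    return render_ms + wait_ms, wait_ms

shown_ms, added_lag_ms = paced_frame(render_ms=9.0)
print(f"frame shown after {shown_ms:.1f} ms ({added_lag_ms:.1f} ms of it is waiting)")
```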
 
[citation][nom]HyperMatrix[/nom]Eh...not a fair comparison, IMO. The 2gb memory on the 690 is quite limiting. [...] Substantially better power usage and nearly no micro stutter is why I got my cards.[/citation]

2GB per GPU isn't limiting much at all right now... Even at triple 1080p or triple 1920x1200, there are only a handful of situations where 2GB per GPU becomes a problem, and even then, simply choosing settings that lean less on memory capacity and more on the GPU solves the issue just fine. Unless you have something like triple 2560x1440, 2GB is rarely a bottleneck. At a single 2560x1440 display, I'm not aware of any game that is bottlenecked by a 2GB frame buffer, especially when you're going for the high frame rates a 120Hz display calls for.

Sure, their memory bus is a limiting factor, but if you cared about that, you could have simply gotten 7970s instead...
 
