GeForce GTX 295 In Quad-SLI


Crashman

Polypheme
Former Staff
[citation][nom]cardnyl[/nom]These power draw numbers look completely bogus/fudged. The power consumption article done by Tom's (http://www.tomshardware.com/reviews/geforce-radeon-power,2122-3.html) shows a single GTX 280 test system based on an older and more power hungry chipset and cpu pulling 380watts at the wall peak 3d. Yet somehow their triple SLI gtx280 system with OC'd core i7 only pulls a measly 636watts at the wall peak? One of these articles is completely wrong.[/citation]

This article is completely right: the measurements were taken with a wattage meter at the wall plug. Of course, these numbers shouldn't be confused with other tests that simultaneously stress the CPU at full load using several instances of Prime95.

If you look at the numbers, 478W for two cards (peak 3D power) and 316W for a single card (peak 3D power) means the extra card pulls 162W. If you subtract that number from the single-card result, you get 154W for the platform at low CPU load and low heat (liquid-cooled CPU).
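For anyone who wants to follow the subtraction, here's a rough sketch in Python using only the wattages quoted above; the assumption that the second card draws the same as the first is mine, not something measured separately.

[code]
# Wall-plug readings quoted above (watts, peak 3D)
single_card_peak = 316   # one GTX 295
dual_card_peak = 478     # two GTX 295s

# Assuming the second card adds the same draw as the first
extra_card = dual_card_peak - single_card_peak   # 162 W for the added card
platform = single_card_peak - extra_card         # ~154 W for CPU, board, drives

print(f"Second card: {extra_card} W, rest of platform: {platform} W")
[/code]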
 

romioforjulietta

Distinguished
Sep 5, 2008
12
0
18,510
Hey guys, seriously, the main competitor for the GTX 280 is the 1GB version of the 4870, so can you tell us why you didn't use the 4870 1GB in these benchmarks?
It's obvious that the 4870 X2 and the GTX 295 are at almost the same level.

And what about Far Cry 2? When AA is enabled, the performance of the 4870 X2 goes down. I thought Far Cry 2 was a DX10.1 game, so naturally the 4870 X2 should be the best in this game, like in the other DX10.1 games (S.T.A.L.K.E.R.: Clear Sky, Assassin's Creed, and Call of Juarez).

 

Christopher1

Distinguished
Aug 29, 2006
666
3
19,015
Pretty good showing from Nvidia's parts... however, ATI HAS to work on the power consumption of their cards... wow, I couldn't believe how much juice the two 4870 X2s sucked at load. I doubt my home could support that much of a power draw without the wall wires bursting into flames.
 

Guest

Guest
Just FYI: standard frame rates for television = 25 to 29.97 FPS
(movies, TV shows, DVDs, etc.).
 

Tindytim

Distinguished
Sep 16, 2008
1,179
0
19,280
[citation][nom]vinak[/nom]Just FYI. Standard Frame rates for television = 25 - 29.97 FPS.(movies, tv shows, DvD's, etc)[/citation]
Nope. It's 60 NTSC and 50 PAL.

But why was it relevant?
 

Crashman

Polypheme
Former Staff
[citation][nom]Tindytim[/nom]Nope. It's 60 NTSC and 50 PAL.But why was it relevant?[/citation]

No, actually it's 30 for NTSC and 25 for PAL. TV signals are interlaced, so they display half a frame every cycle. That's how 30 FPS runs at 60Hz and 25 FPS runs at 50Hz.
 

Tindytim

Distinguished
Sep 16, 2008
1,179
0
19,280
[citation][nom]Crashman[/nom]No, actually its 30 NTSC and 25 PAL. TV signals are interlaced, so they display 1/2 frame for every cycle. That's how 30FPS runs at 60Hz and 25FPS runs at 50Hz.[/citation]
Interlacing doesn't change the frame rate, otherwise we wouldn't call it 60i, we'd call it 30p. And even if that were true, plenty of sporting events are in 720p.
 

Crashman

Polypheme
Former Staff
[citation][nom]Tindytim[/nom]Interlacing doesn't change the frame rate, otherwise we wouldn't call it 60i, we'd call it 30p. And even if that were true, plenty of sporting events are in 720p.[/citation]

I said TV signals, not digital video signals. NTSC is an analog standard at 30 FPS. Each frame is divided into odd and even fields, so one image becomes two half-frames, and 60 half-frames make 30 full frames.
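To put that arithmetic in one place (a toy sketch using the standard field/frame definitions, not anything from the article):

[code]
def fields_per_second(frames_per_second):
    # Each interlaced frame is split into two fields (odd and even lines)
    return frames_per_second * 2

print(fields_per_second(30))  # NTSC: 30 frames/s -> 60 fields/s (60 Hz)
print(fields_per_second(25))  # PAL:  25 frames/s -> 50 fields/s (50 Hz)
[/code]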

:p
 

wick001

Distinguished
Dec 30, 2008
126
0
18,680
I have almost this exact same setup and drivers.

Core i7 OC'd to 3.9GHz
G.Skill RAM @ 1480MHz
(2) Sapphire 4870 X2 in CrossFire, OC'd
Gigabyte UD5
WD Caviar Black, 32MB cache, 640GB

I can't pull more than 49 FPS in the Crysis benchmark at 1680x1050, Very High/DX10/VSync off/64-bit/no AA.

Am I missing something?
 

spearhead

Distinguished
Apr 22, 2008
120
0
18,680
This is brutal. The prices are very high too, so it evens out the settings :D. But it leaves me with a choice, a hard one, because there is no way I can afford a Core i7 system and a very high-end graphics card together. So what should I choose? Would a Radeon 4870 or GTX 260 be enough for me when building a well-built Core i7 setup, or should I go for a Radeon 4850 X2, AKA the dustblower? If an aftermarket cooler were affordable for it, I would do that.
 

spearhead

Distinguished
Apr 22, 2008
120
0
18,680
[citation][nom]Pei-chen[/nom]Very very nice review. I have been an advocate for 1.5GB+ RAM on video card ever since I got GTA4. It seems like ATI/Nvidia haven’t realize the need for larger buffer on $400+ cards.Either way, I just bought a 4850 1GB OC to replace a fairly new G92b 9800GT as it only cost me about $20 to upgrade. I think the 512MB RAM on 9800GT is bottlenecking a few game at 1920*1200. Hopefully the 50% faster 4850 will solve that problem.[/citation]
Yes, that would be perfect. I really hope the next-gen single-GPU cards will either be DX11 or be much faster than this gen, because it would make a lot of sense for Crysis.

 

wicked-warlock

Distinguished
Feb 20, 2009
5
2
18,515
I think that AMD still needs to work on drivers, or on a firmware flash for the cards. The reason I say that is they don't scale well at all; more evidence is in the World at War chart. The same applies to Nvidia, which seems to need a tweak for the quad setups. There seem to be conflicts. I think after a few more driver releases, and perhaps a firmware change, we will see things more clearly.

Currently, if you play Crysis at 2500+, a pair of ATI cards is the only solution. We shall see in time. By the way, the Sapphire 4870 X2 on Newegg is dirt cheap if you buy a pair: $414 each. That's nearly 200 bucks cheaper than a pair of GTX 295s. Got to love competition!
 

dingumf

Distinguished
Apr 20, 2009
313
0
18,780
The only problem I have with GTX 295s in quad SLI is micro-stuttering.

I'm going to hold off until DX11 GPUs can provide that kind of power from one GPU.
 