GTX 295

Exactly. Remember ATI's original goal. They wanted to lead in the mid and low end, which they are doing (or at least competing well). The fact that they pulled such an upset was just icing on the cake. A lot of the reputation they lost during G80's reign is back. And if you're still pulling hard for them to hold the top, their new "performance" driver will be out soon (betas are already out).
 

emp

Distinguished
Dec 15, 2004
2,593
0
20,780


If I remember correctly, the 9800, X800, and X1K series all outperformed their direct top-end Nvidia competition by a decent margin, so the statement above just isn't true (not even the 7950 GX2 scaled that well, if I remember correctly).

If you didn't plan to use AA (which is a silly thing to skip for the kind of money you'd be paying for the card), the HD 2900 XT sat somewhere between the 8800 GTS 640 and the 8800 GTX, but once AA was turned on its performance absolutely tanked. For now it's time to sit on what we have and wait for Q2/Q3 2009 to see what the new crop of cards will bring to the table, because this whole GTX 295 is just a gimmick card, like jaydee says.

On a side note, L1qu1d, who knows what Deneb might bring to the table? If it beats Yorkfield like some rumors suggest, it might be worth checking out. Unless you plan on doing stuff besides gaming, I'd hold off on buying any i7 until we get an idea of what Deneb is all about. Just a heads up; you might end up kicking yourself if you don't...

 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
The 7950 GX2 alone did OK; quad SLI barely scaled at all. The 9800 XT, I had it, and it exchanged blows with the top Nvidia card (in all honesty I can't remember its name... FX something. I really didn't like Nvidia then; now I don't really care either way :)). The X800 I can't really remember; I had the XL :p

Thanks for the heads up. Gaming will be all it is. For work and school I have my laptop (which can game on its own, but the CPU is holding it back).

I clearly remember the 2900 XT; it was in direct competition with the 8800 GTX. Given how it performed, I saw in THG's old (and good) charts, not the current ones, that the 8800 GTS 320 beat it a couple of times. Of course, I've seen other reviews too :)

 

emp

Distinguished
Dec 15, 2004
2,593
0
20,780
I'm pretty excited about Deneb, more than anything because of its supposed overclocking potential. (Even though I'm not getting one now because I'm broke :na: )

You need someone to jump-start your long-term memory :p. The FX series was Nvidia's worst mistake ever (and they admitted it). Those were dark times for them, and I'm sure they never wish to return to that place... ever again.
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790


I remember my mother's FX5200, and when we UPGRADED to integrated graphics, the 6100...
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790


No, it will be a great day! It will spur ATI into action instead of just sitting on the 4xxx series, and that will in turn spur Nvidia into action. You get the idea?
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790


I agree totally. I want Deneb to do well because we need some competition and we need it now. If the 3 GHz Phenom gets at least Q9400/Q9550 performance and overclocks to 4 GHz on air for $200 or less, it will be a win for AMD. However, that's not all that likely...
 

Kari

Splendid
I think people are missing something here: the PCI-SIG PCIe specification limits the power draw of a single card to 300 W.

The heat output of a card equals its power draw, and the temperature of the chip is also affected by the efficiency of the cooling solution (of course, not all of that power is dissipated at the core itself).

The 4870 X2 already consumes some 270 W. A single GTX 280 consumes 180 W, so just doubling that totals 360 W, which is way over the 300 W limit, so that's a no-go. The original 192-shader GTX 260 uses 140 W, which is a lot closer, but the 216-core version has to consume more (I don't have any real numbers for that, sorry; guesstimate 155 W).

The die shrink will shave off quite a few watts, but how much exactly remains to be seen. Anyway, it seems very feasible to pull off a 55 nm GTX 260 216-based double-decker card with the same clocks as the normal card. This is, of course, based on the power needs alone. Designing a cooling solution that fits between two PCBs and can dissipate almost 300 W of heat while keeping the cores at a decent temperature might be a bit more difficult. If they can make it happen, that's cool, but if they can't, they are going to have to lower the clocks to bring down the heat output and thus the temperature... (and performance).
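To put rough numbers on that reasoning, here's a minimal sketch in Python. The per-card wattages are the ballpark figures from this post, and the 15% die-shrink saving is purely an assumed illustration, not a real spec:

```python
# Rough dual-GPU power-budget check using the ballpark figures above.
# The 15% die-shrink saving is an assumed illustration, not a measured number.

PCIE_SLOT_LIMIT_W = 300  # max board power allowed for a single card by the PCIe spec


def dual_card_power(single_gpu_w: float, shrink_saving: float = 0.0) -> float:
    """Estimate a dual-GPU card's power: two GPUs, each reduced by the shrink saving."""
    return 2 * single_gpu_w * (1.0 - shrink_saving)


estimates = {
    "2x GTX 280 (180 W each)": dual_card_power(180),
    "2x GTX 260-192 (140 W each)": dual_card_power(140),
    "2x GTX 260-216 (~155 W each, guesstimate)": dual_card_power(155),
    "2x GTX 260-216 @ 55 nm (~15% saving, assumed)": dual_card_power(155, shrink_saving=0.15),
}

for name, watts in estimates.items():
    verdict = "fits" if watts <= PCIE_SLOT_LIMIT_W else "over the limit"
    print(f"{name}: ~{watts:.0f} W -> {verdict}")
```

Running that gives roughly 360 W, 280 W, 310 W, and 265 W, which is the same conclusion as above: only the GTX 260 class parts, ideally on 55 nm, squeeze under the 300 W ceiling at stock clocks.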
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790
That is my exact point. They will barely be able to keep stock clocks. The GTX 260 isn't exactly suffering from much of a vRAM or bandwidth shortage, so it won't gain much from going to 512 MB/1 GB per card, at least not nearly as much as the 4870 did. That said, I just can't see the GTX 295 being much better than the 4870 X2; I expect it to be about exactly the same performance, like the GTX 260 and the 4870.
 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
If that's the case, then there wouldn't be much point in making it, since the 4870 X2 in quad CrossFire doesn't beat the GTX 280 in tri-SLI.

Unless the appeal would be to price it under the 4870 X2 and around a single GTX 280; then that would shift things. Though the GTX 260 won't gain much going from 896 MB to 1024 MB, it would gain a lot if the card had GDDR5 on top of the 448-bit bus.

Either way, we have no hard facts yet, so we'll see :)
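As a rough illustration of why GDDR5 on a 448-bit bus would matter, here's a minimal sketch in Python. The data rates are example values from cards of that era (GDDR3 around 2.0 GT/s on the GTX 260, GDDR5 around 3.6 GT/s on the HD 4870), not confirmed specs for any GTX 295 part:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate in GT/s, giving GB/s.
# Data rates are example values from that generation, not GTX 295 specs.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gts: float) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gts


configs = {
    "GTX 260: 448-bit GDDR3 @ ~2.0 GT/s": peak_bandwidth_gbs(448, 2.0),
    "Hypothetical 448-bit GDDR5 @ ~3.6 GT/s": peak_bandwidth_gbs(448, 3.6),
    "HD 4870: 256-bit GDDR5 @ ~3.6 GT/s": peak_bandwidth_gbs(256, 3.6),
}

for name, bandwidth in configs.items():
    print(f"{name}: ~{bandwidth:.0f} GB/s")
```

Under those assumed clocks, GDDR5 on a 448-bit bus would nearly double the GTX 260's peak bandwidth (~112 GB/s to ~202 GB/s), which is the gain being hinted at here.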
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790
Yeah, lowering the price or GDDR5 is about their only option. Problem is, I keep hearing how nearly impossible it would be for Nvidia to do GDDR5 without at least a partial architectural overhaul, not just a die shrink. It will be interesting, but I think we are on the same page:

Unless something very unlikely happens, what's the point of it?
 

daskrabbe

Distinguished
Feb 3, 2008
213
0
18,680

Which is why they will have low voltage, which is why it will suck at overclocking.