GTX 295

emp

Distinguished
Dec 15, 2004
2,593
0
20,780
I don't think it will be feasible without the 55nm parts... There's a reason why they haven't released this thing yet, heat issues more than anything.


 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
The 4000 series' single-GPU cards have been known to run at 80+ degrees, while the 260s and 280s manage around 60 degrees, yet we're talking about heat issues?

I think power draw would be a reason not to do it, but heat? I doubt it. I mean, the 4870 X2 does great with heat, and again, look at its single-GPU little brothers. :)
 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
So you're saying that the 9800 GX2 could do 90s and the G200 cannot do 80s or 90s?

I'll take you up on your offer: I'll turn the fan speed from 33% to 10% and see what happens. :) I'll downclock them to stock 280 GTXs.

Now, I can run 3 280 GTXs on a 1000 watt PSU, and like I said, I don't think heat will be an issue so much as power draw, which will probably need a 1200 watt PSU. These are just estimates. :)
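(Rough back-of-envelope, assuming nVidia's published ~236 W TDP figure for a single GTX 280: 3 x 236 W ≈ 708 W for the cards alone, plus maybe 150-250 W for the CPU, board, and drives, lands in the 850-950 W range, which is why a good 1000 watt unit is about the floor and 1200 watts leaves comfortable headroom.)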

I mean, 3 280 GTXs, which pretty much blow into each other, run at 65-70 degrees under load.

I don't think this is too far-fetched.

The price might be outrageous, maybe hitting the high $600s-$700s, but with a die shrink I think it can be doable.
 
The point being made here is, it's not the temps coming out, but the temps on the core. nVidia keeps their cards/cores at lower temps for a reason, not convenience, as we've all seen hotter-running nVidia cards, as per your example.

Each card/solution has an ideal TDP, operating temp, and spec.
 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
So you're saying even with a die shrink this is impossible? The 9800 GX2 is just two G92 chips pasted together.

I understand what you're going for, but really, it's not impossible. So you're saying that the core temperatures of the 4000 series are lower than those of the Nvidia series? I said core temperature, not the temperature coming out. :)

I really want to know; I mean, I might have gotten this all wrong. No offense or anything, but your posts are becoming very narrow-minded compared to what they were a couple of months ago. That's where I'll leave it.

I just want to point out that the operating temp of the GX2 is set higher than that of the other G92 cards, yet they are all the same GPU, no G92A or G92B.
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790
Sorry L1qu1d, the 4870 produces less heat than the GTX 260 by a decent margin; the fan speed is why the temperatures seem so high. Run those GTX 280s at 2% fan speed, then tell me I'm wrong. The 9800 GX2 is a bad example again. The problem is that to beat the 4870 X2, they need to not only keep STOCK settings on those cards but also OVERCLOCK them by a pretty large margin to compete with the 4870 X2's extra performance over two 4870s.

That all said, I hope nVidia can manage to get this thing to compete with the 4870 X2 so that prices drop and both companies can be spurred into action on the next architecture. Besides, "GTX 295" sounds like a cool name!
 
What I'm saying is, the G200 isn't a G92. It can't be treated the same as far as thermals, etc. I never said it can't be done; if it can be, it will be.

What I am saying is that between the power draw and their sandwich-style design, it may be even harder. Adding the extras the 4870 X2 doesn't have to have, like the 295 will, makes it draw more power and POSSIBLY makes for a harder-to-cool solution due to the sandwich design, even at 55nm.

It's obvious nVidia had trouble going to 55nm. So far the assumptions have been heat/leakage. If this is true, making these cards will involve those same problems. I'm looking at this from a technical aspect, no fanboyism here. I think nVidia needs this card, and they need to price it right. I'm hoping they do.
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
I dunno... the 4870 does run extremely hot. The fan speed isn't everything; there is still the back of the PCB where the chip is and whatnot. Just by removing my 4870 and getting this 4850 IceQ4, my overall system temps dropped by a good 1-3°C, and that's NB and CPU. At 45% fan speed, I couldn't keep my finger on the back of the PCB of my 4870 for longer than 3 seconds without being stupid, and that was while CCC was reading the GPU temp at 55-60°C, idle.

I won't say that this card is impossible, but it might be stretching standards a bit. By that I mean it's probably not going to be usable without GPU support brackets, aka special cases or custom-made supports. These single cards like the 4870 and even my 8800 GT bend enough; this new card will probably break the damn PCI interface.
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790


You bring up a good point. The amount of heat a card is able to sustain depends on a lot of things. The GTX 295 is very possible, but it will most likely not outperform the 4870 X2 in any real way, kind of like the GTX 260 vs the 4870. At this point I just can't find a good reason why they would release it, but hey, who knows, nVidia might surprise us all.

You also have to understand this GTX 295 will almost certainly cost more to produce than the 4870 X2, even with the die shrink. That raises the question: what does nVidia have to gain from this release? They already own the top-performing setup, albeit not by too much, but at that price point the consumer isn't even conscious of any price/performance ratio, so why do they need to compete at EVERY price point? It just doesn't seem like a good idea on nVidia's part, but we will find out.
 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
What noticeable extra performance? The only thing I see is the gain of going from 512 MB to 1 GB. I mean, the 280 GTX is only 30% slower, tops, than the 4870 X2 (of course ATI-leaning games do the ATI card wonders, just like Nvidia-leaning games do the Nvidia card wonders).

You're saying that another stock 280 GTX couldn't easily close that 30% gap? Come on, I think that can be done. The 280s run at a fan speed of 33% stock, I believe; are you saying the 4870 ran at 2%, or what? I don't get your point. I mean, at 2% isn't the fan basically barely rotating? I really doubt any card could survive at 2% fan speed.
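(To put rough numbers on that 30% claim: if the 4870 X2 does 100 fps where a single GTX 280 does ~70, then two 280s at a typical ~70-80% SLI scaling land around 120-125 fps, comfortably ahead of the X2. That assumes healthy scaling in a game SLI actually likes, so treat it as a ballpark illustration, not a benchmark.)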

I'm not defending any company at the moment, and I'm not even saying that the 280 GX2 would be better. All I'm saying is it's DOABLE; I don't understand where these comparisons come from, blood_raven. I mean, how is the 9800 GX2 a bad example? That's a dual-PCB card, it runs on two everyday G92 chips, yet it's rated for a higher temperature in the manual than the rest.

I'm not arguing who can beat whom; frankly, I don't care, because I don't see myself ever buying a dual-GPU card again. The only one I bought was the 4870 X2 because I got suckered in by the whole "it performs like a single GPU" rumour, and also, in all fairness, I'm a StarCraft fan, and I'd like to see it in DX 10.1. :)
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780
2% speed is about 40 RPM; the card would be dead in less than a minute. Add to that the fact that heatsinks with fans aren't designed to be run passively.
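(Working backwards from that figure: if 2% of full speed is ~40 RPM, full speed is roughly 40 / 0.02 = 2,000 RPM, so at 2% the fan is effectively stopped and the heatsink is being asked to run passively, which it was never sized for.)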
 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
It's always been like this: Nvidia has the best performance, but at the highest cost, and ATI can almost match it at a lower cost. That's why I was with ATI up until the 8 series and ATI's horrible 2900 XT (which sold almost as much as the 8800 GTX while performing at 8800 GTS 320 levels).

Frankly, all I'm doing this Christmas, or possibly my birthday, is grabbing a mid-range i7 CPU + motherboard and seeing what will come this March. I mean, with the amount of power 3 280 GTXs have and the scaling they get, the quad-SLI setup of the 280 GTX isn't that appealing to me; I mean, it's an extra card... IF it happens.

I think a single GPU card solution from both companies would be more appealing to me.
 
The G280 version may be doable, but all the speculation is pointing towards a 260, not a 280, and this changes things a lot as far as performance goes, as even the 216 isn't a killer against the 1 GB 4870. Like I said, if it's done, it'll win, or why do it? But it won't win by huge margins. The 280 SLI solutions don't always win against the X2, not in all games. My guess is 10% faster, which is pretty good actually, but again, nVidia needs to price this thing right.

As to the relevance, I point to the 7950 GX2: it was a horrible card, early EOL, total lack of driver support, and best left forgotten. But nVidia didn't have a response to ATI's 1900 series, so they slapped it together. Why? Because a halo product IS that important to them, and that's why we probably will see something like a 295.
 

spathotan

Distinguished
Nov 16, 2007
2,390
0
19,780


I agree. There are just too many hurdles in multi-card setups, on both the hardware and software end. You almost never get your money's worth; something is always there to screw things up, be it scaling, bus bandwidth, power requirements... Sometimes I feel like SLI/CrossFire are only around as a lazy technology. Why make a stronger single GPU when you can put two "weak" ones together and end up spending more money? I know that's not exactly how things are, but...


 
Here's another thing to look at, which I pointed out before and may have been mistaken about. The die size for the 4870 is 256mm² vs. the G200's 576mm². Shrinking the G200 by 30-some-odd percent isn't going to get it anywhere near 256, which means there's a disparity, a large one, even at the same node.

What does this mean? Costs, possible trouble going from one node to another (thus my references to possible heat issues), and lastly delays. Remember, when ATI moved on from the 2900, the 3xxx series offered NO improvement in performance, just like here. It took ATI three tries to get it right, as it may take nVidia. Just speculation, but seeing the delays, you never know.
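(Rough math, assuming a perfect optical shrink from 65nm to 55nm, which real shrinks rarely achieve: area scales by (55/65)² ≈ 0.72, so 576mm² × 0.72 ≈ 410mm². That's still roughly 60% bigger than RV770's 256mm², so the cost-per-die gap doesn't go away even after the shrink.)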
 

The_Blood_Raven

Distinguished
Jan 2, 2008
2,567
0
20,790
L1qu1d, you missed the point entirely... the GTX 295 should be able to match the 4870 X2, but it won't outperform it in any substantial way, at least I don't see how. No one is talking about a GTX 280 GX2; we are talking about a GTX 260 GX2. Can you honestly say one GTX 260 and one 4870 are not almost exactly the same? I said the 9800 GX2 is a bad example since it couldn't match two 8800 GTS 512s in SLI. If the GTX 295 were like the 9800 GX2, then it could not compete with the 4870 X2, meaning nVidia would not be likely to release it. You need to stop getting so personal about this: no one is saying it is impossible, no one is saying it won't work, and no one is saying it won't at least match the 4870 X2. I am just wondering what nVidia thinks they can gain from this, as I can't see anything.

For the record, I meant 12%, which I think is about the same RPM as the 4870 at its stock 22-23% fan speed.

Regardless, it will be interesting to see what happens; maybe nVidia will be able to make a decent dual-GPU card. The 9800 GX2 was not.

Off topic: is it just me, or is the 4850 X2 inferior to two 4850s in CrossFire? It seems like it only slightly beats the GTX 280, but it should do better and beat the 9800 GX2, which it only does at times.
 
Well, a very large part of the 4000 series' heat issues was the fan, at least for the 4850 (the 4870 does draw a bit more power). At the stupid stock settings, my 4850 (with the initial single-slot cooler) idled well up into the 70s (and topped out who knows where; I never let it try). With the fan at 50%, it now idles in the 40s and tops out in the 60s. It's also worth noting that the original coolers are horrible for the 4850. I'm not really sure they are even copper (they look almost like painted aluminum). Oh well, it will be interesting to see what clocks nVidia can pull off with the 295.
 
Yeah, I come and go. I post a bunch when I am getting ready for a new build, as I am now. My attention sort of floats between comps, other hobbies and stuff, and college. I see your comp's been upgraded a bit since I last saw it.
 

L1qu1d

Splendid
Sep 29, 2007
4,615
0
22,790
Why? It forces competition, which forces price drops.

Nvidia already has the crown for the strongest setup. I mean, ATI has the single-card crown and price/performance, and again, Nvidia has the crown for single GPU.

Value I give to ATI, and performance I give to Nvidia (if you have the cash).