How low can nVidia go?

Page 5 - Tom's Hardware community

rangers

Distinguished
Nov 19, 2007
1,563
0
19,790
Plus, I came to the same conclusion: if a company says they're going to lose $200-300 million on a duff process, you can bet it's going to be double that.


All I can say is wait and see, but if I was a betting man, I'd say all of nVidia's stock from the time they started using that process is duff - motherboards and GFX cards alike.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310

Let's not forget, as the Tom's article hinted, that a die-shrunk version may have to ditch the 512-bit memory interface and fall back on a 256-bit one, much as was the case going from R600 to RV670, and from G80 to G92. Given that the memory bandwidth increase appears to have been the chief improvement of the GTX 280, any revision looks likely to be WEAKER, just as the 8800 Ultra was generally the most powerful card of its line in spite of not being the newest revision.

Couple this with nVidia almost certainly having to cut things down to get two GPUs working within 275W or less (when the GTX 280 alone is some 236W), and you're going to have what will probably be a laughable card... I doubt it could actually compete with what we're seeing of the 4870X2, and it likely won't be better than the GTX 280 by any large margin; perhaps 30-40% more powerful at most. Whoops!
 

firebird

Distinguished
Nov 13, 2004
516
0
18,990
What would the power requirement for tri- or quad-SLI be on a 280? 1600 watts or more? I really want to see this happen just to get a glimpse of the power supply(ies)!
 

About 700 (+/- 50) watts for the cards alone. Add in a system and you're in the 900-1000ish area. You want a bigger supply for more overhead and a buffer to avoid overloading the PSU.

They are only in the 175-185 watt area per card.

At full load, one GTX 280 is about on par (+/- 10 watts) with one 9800GX2 for power use.
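The back-of-the-envelope math above can be sketched out. Note this uses the per-card draw quoted in this thread; the system budget and headroom factor are assumptions for illustration, not official specs:

```python
# Rough PSU sizing for a multi-card GTX 280 setup, using the ~175-185 W
# per-card load figure from this thread.
CARD_WATTS = 185      # worst-case per-card draw quoted above
SYSTEM_WATTS = 250    # assumed CPU/board/drives budget (illustrative)
HEADROOM = 1.25       # ~25% buffer so the PSU isn't run at its limit

def recommended_psu(num_cards: int) -> int:
    """Return a rough recommended PSU wattage for num_cards GTX 280s."""
    load = num_cards * CARD_WATTS + SYSTEM_WATTS
    return round(load * HEADROOM)

tri_sli = recommended_psu(3)   # roughly a kilowatt
quad_sli = recommended_psu(4)  # well past most 2008-era PSUs
```

So nowhere near 1600 W for the cards themselves, but tri- or quad-SLI still lands in serious PSU territory once system draw and headroom are counted.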
 

yipsl

Distinguished
Jul 8, 2006
1,666
0
19,780
The power draw of this generation cards is why I put aside day dreams of CrossfireX. It's simply not worth it for games at the resolution I'll be playing at this fall (which will be @ 1920). Since that's the sweet spot for the 3870x2, I could even get away with not upgrading in this generation.

I do want a 4870x2 though. If Deneb pans out, it will be cool. It would definitely be cool with Nehalem. Anyone else sort of agree with Tom's "what about this other card?" page opinion that it's not worth upgrading unless the new card jumps up 3 spaces on the chart?

 
@nottheking They could move to GDDR5, if the supply's there. That would lower costs, both from a simpler PCB and a 256-bit bus, and it would also lower the power requirements. I just don't see either my solution or yours as a possibility, at least at this stage - mine because of the GDDR5 supply issues and the compatibility/redesign work, and yours because frankly I just don't see it as enough from a performance perspective.
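The bus-width trade-off above comes down to quick peak-bandwidth arithmetic: GDDR5's higher data rate partly offsets the narrower bus. The clock figures below are illustrative assumptions roughly in line with 2008-era parts, not official specs:

```python
# Peak memory bandwidth = (bus width in bytes) * effective data rate.
def bandwidth_gbs(bus_bits: int, data_rate_mtps: float) -> float:
    """Peak bandwidth in GB/s for a bus width (bits) and data rate (MT/s)."""
    return bus_bits / 8 * data_rate_mtps / 1000

# 512-bit GDDR3 at ~2214 MT/s effective (GTX 280-class, assumed figure)
gddr3_512 = bandwidth_gbs(512, 2214)
# 256-bit GDDR5 at ~3600 MT/s effective (HD 4870-class, assumed figure)
gddr5_256 = bandwidth_gbs(256, 3600)
```

With those numbers, GDDR5 on a 256-bit bus gets within striking distance of GDDR3 on a 512-bit bus, which is why a die shrink plus a memory swap isn't automatically a bandwidth disaster - if the supply is there.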
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310

Well, I never said that what I proposed would've turned out to be good... It was more a matter of pointing out that nVidia's got a problem on their hands, and suggesting how it wouldn't pan out too well. And likewise, I do agree with what you say on GDDR5... nVidia's sunk quite a bit into staying with GDDR3, and it'd likely cost too much to redesign their GPU to accommodate it.

In other words, I find it doubtful that nVidia will be able to really manage to provide comparable or better performance than the GTX 280 with any follow-up card, at least given what they have to work with in the next half-year or so. The GTX 280 seems like it kinda WAS nVidia giving it all that they had.
 

homerdog

Distinguished
Apr 16, 2007
1,700
0
19,780

Yep, with 1.4 billion transistors and a 576 mm2 die I'd say they were giving it all they had and then some.
 

ashkon52

Distinguished
Jan 5, 2005
57
0
18,630
OK, I recently bought a GTX 280, prior to the 4870 being released. Despite my knowledge of the HD 4870, I had presumed the GTX 280 would be the clear winner as the single best GPU card. So I ordered it.

Suffice to say, one week later came this mayhem that is the ATI-nVidia war, and I had never seen such an event. Within that week, ATI's 4870 proved to be almost as strong as the GTX 280 at half the price.

Within the same week, I was told the GTX 280 would drop its price.

So you can imagine my horror!

I have spent the last week trying to separate the hype from the facts, and trying to work out a comparison between the 4870 and the 280, irrespective of cost.

Through this discussion with people such as yourselves, it has become clear to me that ATi has equalled or beaten Nvidia in every department.


Therefore, I have managed to cancel my order of the GTX 280.

I will wait 2 weeks and see what is going on... who is going to come out with the winner.

And There is a reason for this post.

My point is that this back and forth war between Ati and Nvidia is killing the consumer.

A price cut...
new cards...
will support 10.1 ... wont support 10.1
This coming out ... that coming out
X2's

I have been burnt and I wont forget this... (looking mostly at you NVIDIA)

Price cuts or no price cuts... X2's or no X2's

JUST RELEASE THE CARDS of this generation.... around the same time. (Release the specs at the same time at least)
And Put a price on it


AND STICK TO IT FFS

Bring them out together

So that we the consumer can make a choice.
Low end...
mid end
OR high end

When did this become a game of cat and mouse?

 

ashkon52

Distinguished
Jan 5, 2005
57
0
18,630
yes it is.

I don't remember the last time a price cut was made less than a few weeks after a card's release.

And now, with X2s and Crossfire/SLI setups... I feel like we are completely in the dark.

There used to be a time... not too long ago...
when there were low-, mid- and high-end cards to choose from,
and all you needed to guess was which brand to wait for.

Now... there are so many variables... it's ridiculous...
I just want to buy a card and know it's going to be king for 6 months.

Sure, they released Extreme editions or clocked-up editions... but there was a respectable period between releases.

These incremental releases are frustrating.

And ultimately the companies suffer too... because we are all in the paralyzing position of waiting it out until everyone has revealed their cards for this generation.

So show your cards FFS and be done with it...
 

homerdog

Distinguished
Apr 16, 2007
1,700
0
19,780

As strangestranger pointed out, there will always be price cuts. I'm sorry if this upsets you :hello:
 

mathiasschnell

Distinguished
Jun 11, 2007
406
0
18,780
I agree that competition is always there, prices always change, there's always confusion, etc, etc. but you have to admit he does have a point that these are probably some of the quickest price cuts we've ever seen happen.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
These might've been the quickest price cuts, but they were also the most predictable price cuts ever... Even nVidia saw, after it was too late to go back and change their designs, that they were very liable to be in for a beating. The cards performed as most mainstream predictions foretold, and as AMD designed and predicted: the 4870 fell between the GTX 260 and the GTX 280, and the 4850 bested the G80/G92 parts.

It was pretty much a given that at the original prices, the GTX 260 utterly lost all reason to exist, and the GTX 280, offering only a tad more performance than the 4870, could hardly be argued to be worth more than twice the price, especially with a dual-GPU version of the 4870 on the way that would've been cheaper as well.

So really, it was a matter of nVidia losing either profits or market share. As standard decision-making in the market goes, they opted to sacrifice profits... with them now making perhaps next to nothing on the GTX 260/280 cards.

Really, dealing with the technology market, it does take a bit of education to properly and accurately predict what will happen. But this time, yes, people were able to predict things. A lot of them predicted wrong because they lacked prior knowledge and education... As a lot of people mention, they've only had their heads in the GPU field for the past couple of years, so they've never seen anything like this, while a few of us were around to remember a similar case, when the Radeon 9700 Pro came out against the GeForce 4 Ti.
 
Yes, unfortunately it is a risk you take when you buy a brand-new item. It really isn't a rational thing to do (though I have done it for my past two graphics cards), but it is still fun for me. My rule of thumb: after you buy a new part, don't look at the prices again, because in this industry they are sure to fall.
 

ovaltineplease

Distinguished
May 9, 2008
1,198
0
19,280
If nVidia and AMD always performed identically on every level, then there would be no competition, and that would be bad for consumers.

Feel lucky that the competition is this fierce between the two companies.
 
nVidia didn't have a clue what ATI had, obviously. I think they'll stick it to consumers when they can, a la the G280 at $650. But this is exactly why we need competition. I'm actually worried for nVidia, and for ATI as well, since AMD is on thin ice and nVidia doesn't have an x86 license. Intel's coming, and that could cause problems, more than we see now. @ashkon52, sorry to hear about your ordeal, but like what's been said, these things happen, and sometimes quickly, when the competition drops a killer card on the other company. I wouldn't write nVidia off as a competitor or as a company that can't take care of its customers. It will, but it may be a while before we see something good from them.
 

rangers

Distinguished
Nov 19, 2007
1,563
0
19,790



"I wouldn't write nVidia off as a competitor or a company that can't take care of its customers" - and one that will not overcharge, or will not sacrifice image quality for FPS? That can't be the nVidia you're talking about; you must be mixed up.

A leopard cannot change its spots. I can see them doing the same things that got them into this situation over and over and over again until Jen-Hsun Huang dies or something; the guy's a feckin' arse.

How's that can of whoop-ass working out then, Jen? Feck, that guy must feel right dumb.
 

mathiasschnell

Distinguished
Jun 11, 2007
406
0
18,780



Is that English?
 
Look, I hated the Crysis water thing, the 3DMark thing, the DX10.1 thing, the AC thing, having to wait 5 months for new WHQL drivers, the PhysX drivers changing Vantage files, etc., etc., and eventually things like this will catch up with you, yes, I agree. But in the end, nVidia did bring out the G80, a great card/series. They're hurting now. The R700 will be 50-80% faster than anything nVidia will be able to produce, within reason. The 4850X2 will beat the G280, shrunk or not. The 4870 and the 4850 handle the upper end, and the other 4xxx series cards to come will handle the mid-to-low end. I just don't see nVidia doing much for a while.