How low can nVidia go?

To me, it's like we just now get thrown a few breadcrumbs about Larrabee. Why now? Could it be that both nVidia and ATI just released their cards? Same sort of crap to me. And what did we get from the Larrabee "announcement"? It'll be made up of PIII's and run x86, which we knew already. So we actually learned hardly anything. This supercard nVidia is sitting on? I'm split on this even if it's true. Didn't they just release their top card less than a month ago, then drastically drop the price on it? And to have another so soon? I'd be madder than hell if I'd bought a G280 at 750 if this is true.
 
I'm not sure a 55nm shrink would be enough for it to fit the power requirements, unless parts are disabled. With the TDP of the 280 at 236W, adding another 60% to that just seems impossible to do. Correct me if I'm wrong, but isn't the max power of a mobo + 8-pin + 8-pin 275 watts?
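For reference, the figures usually quoted are 75 W from the PCIe slot, 75 W from a 6-pin connector, and 150 W from an 8-pin connector. A quick back-of-the-envelope check, using those quoted limits and the (purely assumed) idea that power would scale with the claimed 60%:

```python
# Rough PCIe power-budget sanity check.
# Connector limits below are the commonly quoted spec figures (assumptions, not from this thread).
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PCIe power connector
EIGHT_PIN_W = 150  # 8-pin PCIe power connector

gtx280_tdp_w = 236                   # GTX 280 TDP quoted above
hypothetical_w = gtx280_tdp_w * 1.6  # assuming power grows by the same 60% as the claimed performance

budget_6_plus_8 = SLOT_W + SIX_PIN_W + EIGHT_PIN_W      # 300 W (GTX 280-style 6+8 pin)
budget_8_plus_8 = SLOT_W + EIGHT_PIN_W + EIGHT_PIN_W    # 375 W

print(f"60% more power than a GTX 280: {hypothetical_w:.0f} W")
print(f"6+8-pin budget: {budget_6_plus_8} W, 8+8-pin budget: {budget_8_plus_8} W")
```

On those numbers a straight 60% bump (~378 W) wouldn't even fit an 8+8-pin layout, so something would have to give on clocks or enabled units.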
 
JDJ wrote:
I'd like to ask: why aren't some people excited about these cards? They'll be 60% faster than the current king, and they introduce new tech that may bring with it the ability for future growth in this market. If you're not excited by these cards, then I guess you only get excited by nVidia's offerings, but they aren't the only ones in this market.

It's because ATI fans are different to Nvidia fans. You will also notice, rather bizarrely, that it was the Nvidia fans who started the moaning and bashing when the 260 and 280 came out. "It's not as good as it should have been", etc., etc.
Despite what people are saying, these cards (the 4 series) are currently the undisputed leaders in the value and performance stakes.
I think this X2 will be at least as big an improvement over the 3-series X2 as the single cards are over their 3-series counterparts.

While it's true that the GTX/Ultra were top of the tree for a good while, that's just not realistic; it was only caused by the collusion/goalpost-moving that went on with the launch of DX10. Everybody said DX10.1 wouldn't be any great shakes, but the performance increase says otherwise. I will be very interested to see how the Nvidia cards handle it now that they have just decided it's worth using after all. Remember, the ATI architecture was always made to run it; does that mean a new arch for Nvidia?
Mactronix
 
And Fuad is now saying nVidia isn't going DX10.1. I can't see how never making a true DX10 card (which was the original DX10 before they took out the .1 part) and going straight to DX11 improves our tech. There's no excuse for this either. If someone wants to halt tech improvements to make money, that's just not right. I don't care how good those cards were then, and I mean then, not now. Now those same cards are mid to low end, and they would be even lower if DX10.1 were actually used in games. nVidia has dug themselves a hole, and this one won't be easy to climb out of. The worst part is, this isn't what they need now. There's a rumor of their drivers rewriting Vantage as well, which doesn't look good no matter what it actually does. nVidia had better get up and get going.
 
Yikes. Now Fudzilla claims NVidia and ATI are going to DX11 for their next series! I didn't even know we had any REAL DX10 games yet! How long did we have DX9 for? This seems like a really quick turnaround.
 

spathotan

If Nvidia has "a monster" waiting to release, then that makes the GTX 280 and 260 completely pointless, since ATI beats them on both fronts. They might as well go ahead and fade them out if that's the case. Oh wait, consumers are already doing that themselves.
 

homerdog

The only "monster" I can think of coming from NVIDIA this year would be GT200b, and that should be more of an incremental improvement, 9800GTX+ style. Maybe they'll try to do a GX2 with it, but that would be insane if you ask me.

If they do have something else coming up they are doing a damn good job keeping it under wraps, which is not something NVIDIA is known for...
 

If it's a hidden monster, then they stand the chance of alienating their early adopters. And unless it's several cards, it doesn't help their current overvalued position with the rest of their lineup.
 

spathotan

That's exactly what I was saying. Bringing out a new "better" card at prices competitive with the ATI offerings would make the GTX 260 and 280 literally pointless and would drive them into the ground overnight... which is pretty much where they are right now; they've dug a hole they can't get out of. They can't survive on GTS/GT sales at $180 a pop forever, especially with the 4850 out there.
 
That's what I mean. nVidia put themselves in this hole. Their die is too large for a whole lot of tinkering. It HAS to be the best, as the G90s were meant to be their midrange and lower. So I'm thinking much further price reductions for competition's sake are in order: the G280 at 350 USD, the G260 at 270 USD. No other way around it. And if they say there's something in the works, people won't buy the G200s at all. Also, with them skipping DX10.1, DX11 will include DX10.1 and tessellation, which is already found on ATI cards as well. Then there's their physics driver changing the Vantage files. Can we say, close to FX?
 

jonyb222



That somehow sounds familiar...

[image attachment]


jk, I'm actually way too young for it to sound familiar (only started at the time of the 2XXX series)
 

jonyb222

^^ Yeah, I thought that was pretty funny; people were getting shafted by both sides X years ago (not sure about the number of years).
 

rangers




I thought I'd quote myself.

Well, I was right in thinking that it might run deeper than just laptops:

http://www.theinquirer.net/gb/inquirer/news/2008/07/09/nvidia-g84-g86-bad

The short story is that all the G84 and G86 parts are bad. Period. No exceptions. All of them, mobile and desktop, use the exact same ASIC, so expect them to go south in inordinate numbers as well. There are caveats however, and we will detail those in a bit.


Well, I still think this is just the tip of the iceberg.
Round and round it goes; where it stops is anyone's guess.
 

one-shot




...LOL...what next?
 

jonyb222



"Urge to kill rising!" -Homer J. Simpson

:D Well, ain't that just perfect; those are pretty much the only gaming GPUs you can find in a laptop.
 

homerdog


You left off the disclaimer:
WARNING, THIS ARTICLE WAS WRITTEN BY CHARLIE DEMERJIAN
 

Slobogob


It took me about three sentences until I already knew who wrote it. The ranting, subjective writing style sticks out like a red flag.