Nvidia's in trouble


yipsl

[quotemsg=6572357,75,66863]How is advertising stupid? I don't hear you criticizing Intel or AMD, and Nvidia has PLENTY of $$$$ to go around with their market share being so high. Fact: The 8800GTX has been selling like hotcakes since its release. Can't debunk that.[/quotemsg]

It's not just advertising. They aren't paying for a logo with no purpose. Nvidia is paying for access to game code so they can optimize their drivers before the game's release. That skews benchmarks when new games arrive. ATI eventually catches up, but the few extra fps Nvidia gets early on often work to their advantage. It's not as dishonest a program as Intel's OEM rebates, but it's something that most reviewers and nearly all Nvidia fans don't take into account when comparing ATI to Nvidia cards.

[quotemsg=6572357,75,66863]
@yipsl How is Nvidia hurting because they can't buy AMD? :heink:[/quotemsg]

It was reported that AMD wanted to merge with Nvidia, but Nvidia's CEO wanted to be in charge of the new company. Ruiz didn't go for it, so AMD bought ATI instead. AMD had relied upon Nvidia chipsets for years, and AMD CPUs on Nvidia boards with Nvidia GPUs were popular among enthusiasts.

However, ATI had made great strides in chipsets and often had the better card in each generation, even when Nvidia had the fastest card because they skewed image quality (e.g. the 7xxx series). So, for Swift, ATI is actually the better deal for AMD. Recently, it was reported that Nvidia floated a buyout plan for AMD among its Taiwanese partners, but they didn't go for it. It would have to be a hostile takeover, since AMD would not want to merge with Nvidia at this point. Nvidia has also soured its relations with Intel, to the point that Intel went for the freely licensed Crossfire over Nvidia's expensive SLI.

At any rate, the IGP will move off the chipset onto the CPU, especially in notebooks. That puts Nvidia at a disadvantage with notebooks, OEMs and non-gaming budget builds. Nvidia needs a CPU, and fast. Perhaps they could buy Via and get the old Cyrix x86 license? Perhaps they could buy their own license from Intel? Either way, even enthusiasts will end up with multicore CPUs having at least one graphics core.

That graphics core will probably work in a power-saving mode, powering down the discrete GPU while surfing the net or playing video, but powering it up when gaming, doing 3D graphics or other intensive tasks suited to a discrete GPU. Nvidia will have to ditch SLI to join the club and have their discrete cards work alongside Swift (AMD) and Larrabee (Intel) fusion CPUs on AMD or Intel motherboards. At the end of the day, they may only have the business of loyal fans at the enthusiast end who demand Nvidia chipsets and Nvidia cards.

I see Nvidia in the same position 3dfx was in many years ago: in danger of losing market share in their core business to competition from both ATI and Intel in 2009, especially once ATI's and Intel's GPUs work alongside fusion CPUs. We need standards, so I'd love to see SLI die out and be replaced with Crossfire.
 

systemlord

@yipsl, why don't the developers give free access to the code to both ATI and Nvidia? Don't the game developers want their games to look their best? I am in shock! :ouch:

 

firebird

Let's just hope that marketing gimmicks like "The way it's meant to be played" don't push better technology out of the market because the masses aren't aware of it. Brings the Blu-ray vs. HD DVD war to mind.
 

themyrmidon

I would not doubt it if Nvidia took a stab at the CPU market; both they and Via are coming out with UMPC CPUs, and this could get really interesting if both of them attack desktops. Nvidia will not die; they'll just rake in the dough while ATI catches up.
 

gomerpile

[quotemsg=6572357,75,66863]How is advertising stupid? I don't hear you criticizing Intel or AMD, and Nvidia has PLENTY of $$$$ to go around with their market share being so high. Fact: The 8800GTX has been selling like hotcakes since its release. Can't debunk that.

@yipsl How is Nvidia hurting because they can't buy AMD? :heink:[/quotemsg]

Almost all the ones saying Nvidia is in trouble have not seen the financial report. A $45 billion profit hardly means they are in trouble; some are talking rubbish rather than facts. I dare anyone to show a link, or a stockholder pulling out of Nvidia's monopoly.
 

ImajorI

Nvidia made $448 million in net income last year (2007). They have always had great products. I don't see them in any trouble.
 

hairycat101

[quotemsg=6572430,83,126005]Nvidia made $448 million in net income last year (2007). They have always had great products. I don't see them in any trouble.[/quotemsg]

What are you talking about!? They are going down in flames. They might not make it into the second half of '08. :pt1cable:

Actually the only thing going down in flames is this thread.

Flame on, people! :bounce:
 

mtyermom

I see the 'Nvidia in trouble' thing as 'possible' but not necessarily 'probable'. It seems that maybe, just *MAYBE*, Nvidia's run of success is stalling out while ATi *might* be on an upswing. This is all VERY speculative, just guesses. We all know very well how these two companies have a history of leapfrogging each other, so I don't think it's out of the question to see ATi on top again (if only for a short while). As for 'Nvidia in trouble', I see it more as 'in trouble of not being on top until the next breakthrough they can beat ATi to releasing', not as them being in any serious financial trouble. Nvidia definitely does need to get their chipset business back on track before it takes a complete dive (but that's just an opinion).
 

homerdog

It isn't AMD/ATI that Nvidia should be worrying about. Rumor has it that Intel is refusing to give Nvidia a license for CSI, which would effectively kill Nvidia's chipset business when Nehalem is released. Combine this with the fact that Intel could* be releasing a powerhouse GPU of its own (Larrabee) and I see trouble for the green team.

With no CSI license, Nvidia would be forced to either kill SLI or (more likely) open it up to Intel chipsets. That in itself wouldn't be so bad for Nvidia, but Larrabee has the potential to make Nvidia totally irrelevant.

*Edit: Changed "will" to "could"
 

dev1se

What could Intel call its GPU range?

It'd probably make more sense than the ridiculous model names we currently get.
 

mtyermom

[quotemsg=6572465,86,122612]It isn't AMD/ATI that Nvidia should be worrying about. Rumor has it that Intel is refusing to give Nvidia a license for CSI, which would effectively kill Nvidia's chipset business when Nehalem is released. Combine this with the fact that Intel will be releasing a powerhouse GPU of its own (Larrabee) and I see trouble for the green team.

With no CSI license Nvidia would be forced to either kill SLI or (more likely) open it up to Intel chipsets. This in itself wouldn't be that bad for Nvidia, but Larrabee has the potential to make Nvidia totally irrelevant.[/quotemsg]


Not directed at you in the least, Homer, but as far as Larrabee being a 'powerhouse' goes, I personally will remain very skeptical until we see some working silicon and results. Intel's record with graphics is horrible (yet they remain the largest supplier of IGPs, lol). I know Intel has the R&D and engineering resources to pull off a graphics coup, but do they have the creativity and insight? Only time will tell, but the graphics market can only benefit, IMHO, from more competition.
 

homerdog

Right, I've changed "will" to "could." I'm a bit skeptical myself, but if Larrabee really is all it's cracked up to be, then Nvidia will find itself between a rock and a hard place.
 

spoonboy

[quotemsg=6572128,1,154240]Yes, Nvidia. Despite their 3x market cap advantage and the fact that Nvidia's Geforce 9 series is coming out very soon, I believe they will be the losers over the next two years.

Firstly, their chipset business is in ruins. They are forcing manufacturers to release an entirely new series of expensive motherboards (7xx) with just a bugfix, to make them run what everyone expected to run on the 6xx series: 45nm Intel processors. The power dissipation of their chipsets is extremely high, so much so that Intel is annoyed at having to include two of their chips in Skulltrail. Nvidia's chipsets no longer offer significant advantages over Intel's or AMD's in price or features, especially as Crossfire matures.

Secondly, the Geforce 9 series is just a rebadged Geforce 8 series. The third-highest projected SKU, the 9800GT, is rumoured to be just an 8800GT with Tri-SLI support. The second highest, the 9800GTX, will be an 8800GTS with slightly higher clocks, no more than 15% better than the 8800GTX released well over a year ago. The top SKU, the 9800GX2, will suffer from all of the problems of SLI (Nvidia have shown no commitment to improving SLI, whereas AMD has shown with their dual card, the HD3870X2, that Crossfire can be nearly transparent), and will have extremely high power dissipation due to the 65nm technology and idle-state power reduction that lags AMD's. Also, due to the 65nm process and a lesser degree of integration between the cores than the HD3870X2, production is more expensive, which could put the retail cost of the 9800GX2 $100 or $150 above the HD3870X2. There is a high chance of supply issues too.

Thirdly, you haven't seen AMD's real high-end yet. The HD3870X2 is out, but the CrossfireX driver that will provide even better performance and transparency than the already impressive current Crossfire driver isn't out yet.

Anandtech's CrossfireX preview:

Configuration                        HL2    UT3    Bioshock   CoD4   Crysis
2-way CF improvement over 1 card     83%    80%    71%        98%    87%
3-way CF improvement over 2 cards    30%    34%    37%        44%    0%
4-way CF improvement over 3 cards    10%    3%     7%         29%    4%
4-way CF improvement over 1 card     160%   150%   151%       268%   98%

Those aren't even final numbers - that's just a testing build. When the final driver is released in March, the HD3870X2 has the potential to improve in benchmarks by 20% due to Crossfire improvements. This doesn't account for the monthly single-card driver improvements, which AMD will have had three or more months of by the time the 9 series is established.
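As a rough sanity check on those numbers (the compounding arithmetic below is my own back-of-the-envelope sketch, not Anandtech's), multiplying the per-step gains should roughly reproduce the "4-way over 1 card" row:

[code]
# Compound the per-step CrossfireX gains from the table above and
# compare with the quoted "4-way over 1 card" row. The figures come
# from the table; the compounding arithmetic is mine, so treat this
# as a sketch rather than anything official.
games           = ["HL2", "UT3", "Bioshock", "CoD4", "Crysis"]
two_over_one    = [0.83, 0.80, 0.71, 0.98, 0.87]  # 2-way gain over 1 card
three_over_two  = [0.30, 0.34, 0.37, 0.44, 0.00]  # 3-way gain over 2 cards
four_over_three = [0.10, 0.03, 0.07, 0.29, 0.04]  # 4-way gain over 3 cards

for g, a, b, c in zip(games, two_over_one, three_over_two, four_over_three):
    total = (1 + a) * (1 + b) * (1 + c) - 1  # 4-way gain over 1 card
    print(f"{g}: {total:.0%} over a single card")
# Prints roughly 162%, 148%, 151%, 268%, 94% -- close to the quoted
# 160/150/151/268/98, with the differences down to rounding in the preview.
[/code]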

Fourthly, there have been plenty of independent rumours about R700. Final silicon has come back, and some potential model numbers have been leaked. It looks set for a late Q2/early Q3 release, according to some sources, and with its native multi-core architecture it will show the power of AMD's new Crossfire and be cheaper to manufacture, with better yields than a big single-core card. In contrast, there have been very few rumours about Nvidia's next generation; instead we hear that some low-end 9 series cards could be released in May and June, almost ruling out a big new launch immediately after. I think R700 will be on its own in the field for a quarter or so, and being a new generation, it will destroy both G9x and RV670 in all the benchmarks.

Fifthly, standards support on the Nvidia side is lacking. AMD were the first to DirectX 10.1 (Nvidia doesn't have it in the Geforce 9 series either), the first to PCI-E 2.0, the first to DisplayPort, and the first to offer double precision for GPGPU computing.

Sixthly, take a look at this article: http://www.3dprofessor.org/Reviews%20Folder%20Pages/FireGLV8650/FireGLV8650P1.htm

AMD, with FireGLs based on the 80nm "failed" R600 architecture, has taken back the performance leadership in the workstation market while being significantly cheaper than Quadro FX. With RV670 at 55nm and then R700 at 55nm or 45nm, think how much more of a performance lead AMD could get, with no real improvements forecast on the Nvidia side for six months or more. That was previously a very profitable market for Nvidia; could their market share fall?

Finally, with Intel's new Larrabee GPU architecture and AMD's Fusion project integrating 45nm revised Phenoms with GPU cores, Nvidia won't be able to offer as much value in their products in the longer term. I can't make solid predictions beyond 2008, though, so perhaps Nvidia will have some initiative of their own: Geforce as a PPU, perhaps? AMD have also opened up their GPU specifications and are supporting an open-source driver project. This could end up producing a better quality driver and more support from the growing group of free software advocates (I bought an AMD card for my Ubuntu computer for that reason).

So, Nvidia doesn't have much of a chance over the next few years. I think they will try to offer better price/performance, as AMD did in 2007 - with the corresponding reduction in revenue.[/quotemsg]

I'm glad you said 'finally' instead of 'seventhly' lol
 

mtyermom

[quotemsg=6572477,89,122612]Right, I've changed "will" to "could." I'm a bit skeptical myself, but if Larabee really is all it's cracked up to be then Nvidia will find itself between a rock and a hard place.[/quotemsg]


Agreed, wholeheartedly.
 

Slobogob

[quotemsg=6572355,74,70235]
Yes, Intel has more money to bribe game developers than Nvidia has. All the "The way it's meant to be played" program comes down to is Nvidia paying developers to provide pre-release code so Nvidia cards can be optimized in time for the first benchies of a new game (usually an FPS). ATI eventually catches up, but only after the damage is done.
[/quotemsg]

I know for a fact that it is even worse sometimes. Instead of "buying" a look at the source code, Nvidia sometimes just sends some free graphics cards to the developers, hoping they'll use them. In addition, they pay some money to the publisher or developer to get their "the way..." logo in. As simple as that.
 

rgeist554

[quote]idiot its called x2 because it has 2 processors incorporated on it .. comparing the 3870x2 with a single processing unit card is stupid.

and how many pcie slots does the 8800 ultra take up ?[/quote]
Wow... I hope you're a troll. Otherwise you're the idiot.

A 3870X2 is still a single card. It takes up one PCIe slot on the board, therefore it is one card. In the context of his argument, it makes sense: 1 card vs. 1 card. (I'm not a fanboy, so finish reading the post before you fanboys try to flame me.)

Should you compare the performance of 2 GPUs vs. 1 GPU? No. I do, however, believe it is fair to compare a single-card, dual-GPU solution to a multi-card (SLI/Crossfire) solution.

The way your question was asked made you look like a complete fool. Not to mention the fact that you resorted to name-calling to prove your point, which is stupid in its own right.

Anyways, think before you speak/type/whatever. It will benefit you in the end.
 

ro3dog

Nvidia's next is the G200 GPU and the 780/790 chipsets; AMD's is R700, SB700 and the B3/45nm CPUs. Nvidia has to restructure itself to compete, all fanboy stuff aside. The maturity of the Spider platform comes with the SB700 chipset. Not to mention how much cheaper 55/45nm GPUs are. Intel's Larrabee will not be a threat any time soon; at most it makes them a third GPU maker on the market.
 

tsd16

Why is it that with ATI cards I always hear "wait for the new drivers"? Not to offend you ATI people; I like ATI cards and have one in my general-use PC. I just commonly see "wait for the new drivers" when the performance of an ATI card is questioned. I don't think I have ever heard that with Nvidia.
 

rgeist554

My guess is that almost every time ATI releases a new driver, it increases performance. They also have a semi-regular schedule for pushing out updates.

With Nvidia, once you get the card, you pretty much know what you've got. Drivers may improve performance in one or two games, but it's usually not an "across the board" type of thing. Also, updates from Nvidia (at least from what I have seen) just kind of show up occasionally. There doesn't seem to be any kind of schedule, or any guarantee that new drivers are going to be released immediately following the launch of a new card.
 

Amiga500

[quotemsg=6572519,98,160834]Why is it that with ATI cards I always hear "wait for the new drivers"?[/quotemsg]

Quite simple.

ATI's architecture relies on the driver's shader compiler to optimise the code it runs; Nvidia's depends on the driver far less.

This is compounded by Nvidia's TWIMTBP program, which means ATI doesn't get access to the games until late in the day, and so each driver release is a significant improvement over the previous one.
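To illustrate (a toy sketch of my own, not ATI's actual shader compiler): R600-class hardware issues 5-wide VLIW bundles, and it's the driver's compiler that has to pack independent operations into those slots. Smarter packing in a new driver release means fewer idle slots, which shows up as free performance:

[code]
# Toy VLIW scheduler: greedily packs independent ops into 5-wide bundles,
# roughly the job an R600-class shader compiler (living in the driver) does.
# An op can't share a bundle with anything it depends on, so dependency
# chains leave slots idle -- slots that better driver scheduling can win back.
def pack_vliw(ops, width=5):
    """ops: list of (name, set of dependency names). Returns list of bundles."""
    bundles, done, remaining = [], set(), list(ops)
    while remaining:
        bundle = []
        for op, deps in list(remaining):
            # issue only if all dependencies completed in an earlier bundle
            if deps <= done and len(bundle) < width:
                bundle.append(op)
                remaining.remove((op, deps))
        done.update(bundle)
        bundles.append(bundle)
    return bundles

# hypothetical shader: a..d are independent, e needs a and b, f needs e
shader = [("a", set()), ("b", set()), ("c", set()),
          ("d", set()), ("e", {"a", "b"}), ("f", {"e"})]
for i, bundle in enumerate(pack_vliw(shader)):
    print(f"bundle {i}: {bundle} ({len(bundle)}/5 slots used)")
# bundle 0: ['a', 'b', 'c', 'd'] (4/5), bundle 1: ['e'] (1/5),
# bundle 2: ['f'] (1/5) -- only 6 of 15 slots do real work here.
[/code]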
 

enewmen

I'm still waiting to see ATI's "8800 GTX killer" card. I agree, though, that ATI is more forward-thinking with 10.1, HDMI, etc. But I've never seen even a RUMOR of an ATI monster card that will kill a future 9800 GTX.
So when is this R700/RV770 coming out? 2009?
Time to get a good rest for a while.
 