News Jacketless Jensen Promised "Inexpensive" GPUs Back in 2011

No. Nowhere near. I think you're confusing gross margins with net margins -- but chips don't design themselves, you know. Until around late 2016 (when GPUs began to be so popular for AI tasks), NVidia had net margins in the 10-15% range ... sometimes even negative. They're now at 21% margins ... but this year is, by all accounts, going to be far worse than last.

They had record net and gross margin before the crypto collapse. Now they have record gross margin but horrid sales leading to abysmal net margin. They priced themselves out.

Next apologist excuse?
 
To give Nvidia credit, their midrange and value graphics cards used to be cheap. I recall that I bought an Nvidia 9600 GSO for much less than $100 and later a GTX 460 1GB. Ever since then I've bought ATI/AMD video cards because they give better value.
 
They had record net and gross margin before the crypto collapse. Now they have record gross margin but horrid sales leading to abysmal net margin. They priced themselves out.

Next apologist excuse?
Wow, not only insulting, but wrong on two counts. Over the last year, their gross margins have dipped by more than 7 points. You also have a deep misunderstanding of the accounting involved here. If NVidia "priced themselves out", their resultant revenue drop would have led to a corresponding drop in gross margins. The differential between gross and net has nothing to do with the price of the cards themselves, but rather the sum of non-manufacturing overhead such as R&D, sales, etc.
 
Wow, not only insulting, but wrong on two counts. Over the last year, their gross margins have dipped by more than 7 points. You also have a deep misunderstanding of the accounting involved here. If NVidia "priced themselves out", their resultant revenue drop would have led to a corresponding drop in gross margins. Those margins may be presented to the ignorant unwashed on a per-card basis, but that's not how they're calculated.
You misunderstand gross margins (per unit sold based on BOM for a chipset bundle) and gross income (total of all sales)

Jensen himself said "gamers are willing to pay more than they ever have" at the investor meeting. Gross sales evapped. He was trying to gloss over that fact to paint a misleading picture of the market.

You'd think he would have learned from the first crypto crash. But $

He
Got
Greedy.

He
Got
Stupid
 
You misunderstand gross margins (per unit sold based on BOM for a chipset bundle)
and gross income (total of all sales)
This is just so wrong I don't know where to start. The first step in calculating gross margin is to determine NET revenue. Then subtract total cost of goods, and divide (for the ratio).

For hypothetical purposes, people sometimes pretend gross margin can be calculated on a per-widget basis, but it doesn't work that way in practice. When a company reports margins, they use the means I detail above.
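To make that concrete, here's a rough sketch of that calculation in Python, with purely made-up numbers (these are not NVidia's actual figures):

net_revenue = 26_900_000_000          # hypothetical: total revenue after returns, discounts, allowances
cost_of_goods_sold = 11_600_000_000   # hypothetical: manufacturing cost across ALL units shipped

gross_profit = net_revenue - cost_of_goods_sold
gross_margin = gross_profit / net_revenue     # reported as a ratio of net revenue

print(f"Gross margin: {gross_margin:.1%}")    # ~56.9% with these made-up numbers

Notice there's no per-card price anywhere in that math; it's totals across the whole business.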

Gross sales evapped ...
He
Got
Stupid
NVidia doubled their total profits in 2022, to a record-high $10B (despite the lower margins). I'm curious how you believe he managed that, if their sales "evapped".
 
“We are going to take technology that is only available only in the most expensive workstations. We’re going… to try and reinvent the technology and make it inexpensive.”
I'm not sure that was such a novel concept. By the time Nvidia had GPUs on the market, they already faced competition from the likes of 3D Labs, and there were others in the works.

More to the point, it sounds like SGI was probably singing this tune even before Jensen:

"Silicon Graphics, Inc. (SGI), a long-time leader in graphics computing, was exploring expansion by adapting its supercomputing technology into the higher volume consumer market, starting with the video game market. SGI reduced its MIPS R4000 family of enterprise CPUs, to consume only 0.5 watts of power instead of 1.5 to 2 watts, with an estimated target price of US$40 instead of US$80–200. The company created a design proposal for a video game chipset, seeking an established partner in that market. Jim Clark, founder of SGI, offered the proposal to Tom Kalinske, who was the CEO of Sega of America. The next candidate would be Nintendo.​
Kalinske said that he and Joe Miller of Sega of America were "quite impressed" with SGI's prototype, and invited their hardware team to travel from Japan to meet with SGI. The engineers from Sega Enterprises said that their evaluation of the early prototype had revealed several hardware problems. Those were subsequently resolved, but Sega had already decided against SGI's design. Nintendo disputed this account, arguing that SGI chose Nintendo because Nintendo was the more appealing partner. Sega demanded exclusive rights to the chip, but Nintendo offered a non-exclusive license.​
...and thus was begotten the Nintendo 64. Sounds like it quite likely also influenced the Sega Saturn. (source: https://en.wikipedia.org/wiki/Nintendo_64 )

Separately, I've read that SGI's stated goal, in creating the N64, was essentially "to put an SGI Reality Engine on a chip". I don't know too much about the details, but I've read that the Reality Coprocessor relied heavily on 128-bit vector instructions, much like some of the instructions Intel would later release as part of SSE & SSE2. Although those perhaps drew more from what Intel pioneered with their i860 CPUs, which even SGI itself used in some of its Reality Engine hardware. Interesting circularity, there.
 
In 1995 when Nvidia was founded, what did workstation graphics cards cost? A Sun Ultra 1 Creator3D Model 170E was listed for $28k in 1995 (around $50k in today's money), though that's for the whole system rather than just the GPU (which does not appear to have been something listed separately). Then there's the SGI Onyx, which was in the quarter-million-dollar range (in 1995 money!).
It's not as if those were the only options. SGI had lower-end machines, as well, like their Indigo workstations. Plus, there were workstations from HP and others that also had optional 3D graphics cards.
 
This is ignoring the fact that Nvidia wasn't the first to bring GPU accelerators to consumers. In 1995 the 3Dfx Voodoo was $299...
Your chronology is off. Nvidia's NV1 beat 3dfx to market by more than a year.

That said, the NV1 was practically garbage, compared to the first Voodoo card. IMO, the NV1 very nearly killed Nvidia, especially with their huge gamble on quadrics. That didn't fit at all well into Direct3D. There's an excellent writeup of it, here:



And see the gallery, here:



Lucky for them, they learned from their mistakes and lived to fight another day.
 
You misunderstand gross margins (per unit sold based on BOM for a chipset bundle) and gross income (total of all sales)
This is just so wrong I don't know where to start. The first step in calculating gross margin is to determine NET revenue. Then subtract total cost of goods, and divide (for the ratio).

For hypothetical purposes, people sometimes pretend gross margin can be calculated on a per-widget basis, but it doesn't work that way in practice. When a company reports margins, they use the means I detail above.

NVidia doubled their total profits in 2022, to a record-high $10B (despite the lower margins). I'm curious how you believe he managed that, if their sales "evapped".

Net = gross - cost

And you most certainly can compute net cost per kit sold and base a margin on that. It's not GAAP, but that's the number Jensen was quoting.

If you include the soft costs, then at the corporate level, yes, their overall net is lower, and that is the number you have to report in SEC investor filings. So you can indeed make an awesome margin on a chipset kit, as Nvidia has done. But if you don't sell enough units due to greed, your corporate soft costs will eat you alive.

There was a joke at GM: "we lose $500 on every car we sell." That's because the soft costs were eating them alive; they didn't have enough unit sales overall.

Soft costs are amortized over the number of units sold.

That is how Nvidia made record profits last year. The soft cost per unit was incredibly low because they pushed a record number of cards.
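A quick back-of-the-envelope sketch of that amortization effect in Python, with purely hypothetical numbers (not Nvidia's real figures):

# Hypothetical numbers, just to show how fixed "soft" costs (R&D, sales, admin)
# get spread across unit volume.
soft_costs = 8_000_000_000      # fixed overhead for the year
margin_per_kit = 300            # gross profit per chipset kit sold

for units_sold in (10_000_000, 30_000_000, 50_000_000):
    overhead_per_unit = soft_costs / units_sold
    net_per_unit = margin_per_kit - overhead_per_unit
    print(f"{units_sold:>12,} units: overhead/unit = ${overhead_per_unit:,.0f}, "
          f"net/unit = ${net_per_unit:,.0f}")

# At 10M units the overhead is $800 per unit and every kit loses money;
# at 50M units it drops to $160 per unit and the same kit nets about $140.

Same per-kit margin in every row; only the volume changes.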
 
Your chronology is off. Nvidia's NV1 beat 3dfx to market by more than a year.

That said, the NV1 was practically garbage, compared to the first Voodoo card. IMO, the NV1 very nearly killed Nvidia, especially with their huge gamble on quadrics. That didn't fit at all well into Direct3D. There's an excellent writeup of it, here:



And see the gallery, here:



Lucky for them, they learned from their mistakes and lived to fight another day.

Yep. NV1 was a disaster.

TNT2 was a good card, though.
 
I think he did exactly what he said he would do, and he even talked about how you sometimes have to ignore your customers because they don't understand the business.

In regards to prices, the scalping prices proved that the market could take higher pricing for a percentage of buyers.

I remember talking to a few people at a BBQ, saying if the scalpers can charge double and people want it, why don't Nvidia and others charge double as well? Well, they are; I guess they came to the same conclusion. We may not like it, but some people are willing to pay crazy prices ... and if enough are, then ....

So yeah, the market can sustain it, so it is going to happen.
 
I think he did exactly what he said he would do, and he even talked about how you sometimes have to ignore your customers because they don't understand the business.

In regards to prices, the scalping prices proved that the market could take higher pricing for a percentage of buyers.

I remember talking to a few people at a BBQ, saying if the scalpers can charge double and people want it, why don't Nvidia and others charge double as well? Well, they are; I guess they came to the same conclusion. We may not like it, but some people are willing to pay crazy prices ... and if enough are, then ....

So yeah, the market can sustain it, so it is going to happen.
Yet many RTX 4080s & 4070 Tis remain sitting on the shelves.

https://www.pcmag.com/news/resellers-struggle-to-unload-geforce-rtx-4070-ti-at-inflated-prices
 
The fact that prices went down is not because nVidia's or Jensen's vision came to fruition, but because at the time there was fierce competition and the market was new-ish for consumers. New foundries were spawning, several of them, with healthy competition. The volumes did not saturate any of them, and demand was just ramping up.

What I'm trying to get at is: Jensen just predicted what should happen in a healthy market where there is competition, and how nVidia was going to compete in that market. Never mind all the shenanigans throughout the years; his core "promise" was just a consequence, not something nVidia actively sought out. Why would a company want you to pay less? That's just smoke and mirrors, and you can clearly see the daggers now.

I will say nVidia, alongside 3DFX (later bought by nVidia), Imagination Technologies, ATI (later bought by AMD), VIA (via ImTech, I think?) and even Matrox with SiS were pioneers. Then the market cannibalized itself; they call this "consolidation", but Matrox, SiS and ImTech are still alive, just not in the consumer market and not creating "compute" graphics.

So, all in all, Jensen was right, but for the wrong reasons? Haha.

Regards.
 
That is how Nvidia made record profits last year. ... they pushed a record number of cards.
Two posts ago, you said, quote, " Gross sales evapped. [Jensen]...got stupid." I'm glad to see you've now acknowledged both statements were in error. Record profits on record sales ... and that despite their making less profit per card.

What I'm trying to get at is: Jensen just predicted what should happen in a healthy market where there is competition ... his core "promise" was just a consequence, not something nVidia actively sought out.
The point you're missing is that, when NVidia was founded, there was no "healthy market". There was no market at all for PC graphics acceleration. There were a handful of workstation-class devices, and the only software available for them was engineering and scientific visualization. Like it or not, companies like NVidia and 3dfx created the market from whole cloth.
 
The point you're missing is that, when NVidia was founded, there was no "healthy market". There was no market at all for PC graphics acceleration. There were a handful of workstation-class devices, and the only software available for them was engineering and scientific visualization. Like it or not, companies like NVidia and 3dfx created the market from whole cloth.
There always was demand for graphics acceleration, except that at first, it wasn't economically feasible beyond character generation and there were no standards whatsoever for vendors or software developers to follow. Then we got the normalization of 2D acceleration that ultimately became DirectDraw on Windows followed by 3D acceleration.

3D acceleration was going to happen regardless of Nvidia and 3dfx. They only get credit for pushing it harder earlier in a bid for early adopter cash.
 
The point you're missing is that, when NVidia was founded, there was no "healthy market". There was no market at all for PC graphics acceleration. There were a handful of workstation-class devices, and the only software available for them was engineering and scientific visualization. Like it or not, companies like NVidia and 3dfx created the market from whole cloth.
There was, and still is. Otherwise, nVidia would have gone bankrupt a long time ago. That is what Jensen saw (and ATI, 3DFX, etc) and executed on.

Ironically, they became the same thing Jensen said they were fighting against. They now command most of the high cost GPUs at the pro level after getting rid of the competition throughout the years. AMD and Intel are the only bastions of hope left for competition in almost all markets, including professional and server compute. There's no healthy market anymore and Jensen is taking all the advantages any CEO of a massive corp would.

Regards.
 
There always was demand for graphics acceleration
Sure, but there was no supply for graphics acceleration. There's demand now for a product that converts lead into gold. Without a reasonable supply, however, there is no healthy market.

Your argument is like claiming NASA shouldn't get credit for the Apollo project because, without them, others would have eventually walked on the moon. While true, is that reasonable?

NVidia did what ALL visionary companies do: they saw a need and found a way to fill it ... at substantial risk, I might add. Your hindsight today is, unsurprisingly, 20/20. But the fact remains that a huge number of industry pundits thought hardware graphics acceleration would fall flat on its face, including some editors on this very site. They believed it would follow the same arc that floating-point acceleration did ... a separate chip to start, but ultimately reabsorbed back into the CPU itself.
 
Two posts ago, you said, quote, " Gross sales evapped. [Jensen]...got stupid." I'm glad to see you've now acknowledged both statements were in error. Record profits on record sales ... and that despite their making less profit per card.

The point you're missing is that, when NVidia was founded, there was no "healthy market". There was no market at all for PC graphics acceleration. There were a handful of workstation-class devices, and the only software available for them was engineering and scientific visualization. Like it or not, companies like NVidia and 3dfx created the market from whole cloth.

No, I didn't contradict myself. Different time frames and quarters. When crypto collapsed, along with the economy, sales collapsed year over year. Year over year compares one quarter to the same quarter of the prior year.

You also act like NVIDIA was the only player in the industry. Far from it. They are just the one who won. There was ATI, S3, Matrox, PowerVR, 3dfx, and a few others. The competition is what made the industry grow.

I think I'm done here. Have fun and say all you want.
 