News: Nvidia Reportedly in No Rush to Boost RTX 40-Series Output

Nvidia's OSAT providers, Siliconware Precision Industries (SPIL) and King Yuan Electronics (KYEC), have not received any indication from the company about scaling up production for the GeForce RTX 40-series.
Are these providers also used for their H100 and Grace CPUs? If so, have they said anything about them? I presume Nvidia is happily redirecting its 4N wafer supply to higher-margin datacenter products.
 
A more likely explanation is that Nvidia had to take the production time it booked at TSMC for consumer devices and use it instead for its A100 and H100 lines, which are in high demand because of the generative AI explosion that no one saw coming when that capacity was booked over a year ago. Elon Musk/Twitter alone just ordered 10,000 H100 devices, which means Nvidia has to make 80,000 of its largest chips, and then you have large orders from Microsoft and AWS.

Everyone is so myopic that they think the tech world revolves around gamers... There is only so much time booked at TSMC, and Nvidia is going to use it for its most profitable lines. That simply is not the low-margin consumer market but its high-margin professional lines.
 
The 4070 might be worth recommending for systems with smaller PSUs. The rest of the line-up, not so much.

I'll just keep running the old 3080Ti at 280W until I start running into performance issues or get the itch to swap it out. Maybe 60 series or, fingers crossed, Celestial.

I'm still rocking a 980 Ti. I was going to upgrade to a 3000 series, but the whole crypto/pay-3x-over-MSRP thing turned me off, and I decided that I'm just going to wait until the 4000 series comes down in price (which is probably never) or just go with a low-end 5000 series when it comes out.
 
If I were Nvidia at this juncture, I would slow down production for consumers even more. Right now, with AI, it is paramount that Nvidia focus its resources on data centers. As a stockholder, I want Nvidia to prioritize profits, and it isn't even close: the enterprise side of production is far more profitable.

Frankly, I am so sick and tired of people's bellyaching about GPU prices. It's like they forgot what prices were 10 years ago. The difference in cost between AMD and Nvidia is pennies in comparison. It's like some lunacy has taken over people's thoughts on the subject. Corporations, and in particular publicly traded corporations, absolutely have to maximize their profits by law.
 
A more likely explanation is that Nvidia had to take the production time it booked at TSMC for consumer devices and use it instead for its A100 and H100 lines, which are in high demand because of the generative AI explosion
A100s are made on TSMC N7. None of their gaming GPUs use that node. Only the H100 shares a node with gaming GPUs.

Elon Musk/Twitter alone just ordered 10,000 H100 devices, which means Nvidia has to make 80,000 of its largest chips
According to this, those were 10k GPUs, not DGXs; the 80,000 figure would only follow if they were DGX systems with eight GPUs apiece. So that's only on the order of 10k big dies (plus some margin for yield).

If I were Nvidia at this juncture, I would slow down production for consumers even more.
Check their latest quarterly report - gamers still comprise ~~the majority~~ a significant amount of their income. They can't afford to alienate them too much, just yet.

the enterprise side of production is far more profitable.
What if a purpose-built AI chip outperformed Nvidia's best datacenter products? Their datacenter revenue could dry up rather quickly. That's why they need to remain active in all the markets where they play, and not take any of them for granted.

Corporations and in particular publicly traded corporations absolutely have to maximize their profits by law.
That's an overstatement. Nobody is going to jail if a company doesn't screw over consumers by the maximum possible amount. The jeopardy is that they could be sued by shareholders who think the company is not being properly run in a profit-maximizing way. Civil, not criminal. The plaintiffs also have some burden of proof - it can't be merely their opinion.
 
I'm still rocking a 980 Ti. I was going to upgrade to a 3000 series, but the whole crypto/pay-3x-over-MSRP thing turned me off, and I decided that I'm just going to wait until the 4000 series comes down in price (which is probably never) or just go with a low-end 5000 series when it comes out.

I had two 980s... then a 1080, and I told myself I would get the 80 Ti the next series. Then they announced the price of the 2080 Ti and I decided to wait. I got the 3080 Ti at retail, which was still $1,400 after taxes, so not ideal, but I had kept the same GPU for 6 years and I hope to keep this one for at least 5. In hindsight, I should have waited for the 4080, or for a 3080 12GB when its price was at its bottom.
 
I have an RTX 2070 (the original one, not the Super).
It's still enough for me, but I have to lower the settings in some games.
I'm thinking about what to do... I can still wait a generation before upgrading.
The main thing keeping me from upgrading (besides the high price) is the wattage.

I see no progress: in past years (2000-2016), each generation was more powerful for the same wattage and the same price.
I don't want to upgrade now; I would need to change my PSU too (I have 650 W, with an RTX 2070 and an i7-8700K).

And I don't want to eat so much energy. Nvidia is crazy for making no effort on energy consumption. Or they could at least bundle one or two solar modules / a solar kit with the card.
 
I have an RTX 2070 (the original one, not the Super).
It's still enough for me, but I have to lower the settings in some games.
I'm thinking about what to do... I can still wait a generation before upgrading.
The main thing keeping me from upgrading (besides the high price) is the wattage.

I see no progress: in past years (2000-2016), each generation was more powerful for the same wattage and the same price.
I don't want to upgrade now; I would need to change my PSU too (I have 650 W, with an RTX 2070 and an i7-8700K).

And I don't want to eat so much energy. Nvidia is crazy for making no effort on energy consumption. Or they could at least bundle one or two solar modules / a solar kit with the card.

The 4070 is actually pretty close in power requirements. It's certainly the most efficient card they've launched in a long while.

The RTX 2070 is 175 W for the stock version, 185 W+ for custom cards.
The RTX 4070 is 200 W nominal, with 223 W maximums observed and roughly a 188 W average.

If you have a quality 650 W unit, I would not have any qualms about running an RTX 4070.
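For a rough sanity check, here's a minimal back-of-the-envelope sketch. The CPU and "rest of system" figures are assumptions for illustration, not measurements:

```python
# Rough PSU headroom estimate. All figures besides the 223 W GPU peak quoted
# above are assumptions for illustration, not measurements of a real system.
gpu_peak_w = 223        # observed RTX 4070 peak, from the numbers above
cpu_peak_w = 150        # assumed i7-8700K draw under a heavy all-core load
rest_of_system_w = 75   # assumed: motherboard, RAM, drives, fans, USB devices

total_w = gpu_peak_w + cpu_peak_w + rest_of_system_w
psu_w = 650

print(f"Estimated peak system draw: {total_w} W")        # 448 W
print(f"Headroom on a 650 W PSU: {psu_w - total_w} W "
      f"({1 - total_w / psu_w:.0%} spare capacity)")     # 202 W (~31%)
```

Even with pessimistic assumptions, that leaves a comfortable margin on a quality 650 W unit.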
 
The years 2021-2022 were very hard for gamers in search of new hardware. GPUs were overpriced and nearly nonexistent, and the new-generation consoles were nowhere to be found at big-box retailers or e-commerce sites. The e-commerce sites like Newegg and Amazon that did have listings for said consoles (and GPUs) were full of sellers scalping them at outrageous prices. There were shuffle lotteries for both GPUs and consoles to buy at retail price, and only a handful "won."

After four months in 2021, I won a Newegg shuffle for an EVGA 3080 Ti FTW3, but it was tied to a questionable-quality EVGA SuperNOVA G+ 850W PSU as a package. Fortunately I had a buyer for that PSU, so I was able to buy a better one. I'm wondering how many people just gave up on PC gaming builds and the GPU prices and started buying Xboxes and PS5s for their 4K gaming needs once the consoles started coming back in stock last year. I'm also wondering how many are holding on to older-generation GPUs and just not even caring anymore.
 
Check their latest quarterly report - gamers still comprise the majority of their income. They can't afford to alienate them too much, just yet.
Might want to check that one again. Enterprise revenue surpassed gaming revenue a while back for Nvidia, with the gap widening every quarter.

Q4 results
Gaming - $1.83 billion
Enterprise - $3.62 billion

Add in other non-gaming revenue and gaming is less than 1/3 of Nvidia's revenue (quick back-of-the-envelope below).

FY 2023
Gaming - $9.07 billion
Enterprise - $15.01 billion

The days of Nvidia being a gaming-focused company are gone and not coming back.
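For what it's worth, here's the arithmetic from the Q4 figures above. The $6.05B total is my recollection of that quarter's report, so treat it as an assumption rather than a quoted figure:

```python
# Rough revenue-share calculation using the Q4 FY23 figures quoted above.
gaming = 1.83       # $B, gaming revenue (from the post)
enterprise = 3.62   # $B, data center / enterprise revenue (from the post)
total = 6.05        # $B, assumed total quarterly revenue (not quoted above)

print(f"Gaming vs. enterprise alone: {gaming / (gaming + enterprise):.1%}")  # ~33.6%
print(f"Gaming share of total revenue: {gaming / total:.1%}")                # ~30.2%
```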
 
With stock available at or near MSRP for all Ada GPUs except the 4090, including the just-released 4070, I don't see why anyone would think Nvidia was planning to increase production.
 
If you sell in low volume, you keep margins as high as you can get away with. When you sell a lot, you can relax that a bit if there's competition; otherwise you just keep raking the shekels in.

Thing is, nVidia is squeezing its partners dry in this economic climate, so what I'm wondering is how much more abuse they can take XD

I've read in like two places that nVidia is "helping" its partners with some of this slow-moving inventory, but I wonder how long they'll last like this. I just don't see Jensen as the type of guy that will just lower prices with a smiling face.

Regards.
 
Gaming revenue 3Q2023 was only $1.57B, out of a total of $5.9B. Data Center revenue was $3.83B.
Okay, my bad. I should've said "as recently as Q1 FY23, their gaming revenue was on par with their datacenter revenue".
(Note: this is their Q3 FY23 report, but it also references data from Q1 FY23.)

That was before the launch of the RTX 4000 series, even. I underestimated how much their gaming revenue had dropped since then.

The days of Nvidia being a gaming-focused company are gone and not coming back.
I certainly do expect Nvidia's gaming revenue to bounce back to at least as high as its earlier peak. The precipitous drop in gaming revenues isn't a trend - it's a temporary correction.

And, again, we lack a crystal ball. I wouldn't bet against Nvidia's datacenter success, but the tech is moving fast and developments like processing-in-memory or upstarts like Cerebras can change a lot within a year or less.

For Nvidia, gaming dGPUs are what consoles are to AMD. Not the most profitable market, but a relatively safe fallback... unless they neglect it to the point of alienation.

With stock available at or near MSRP for all Ada GPUs except the 4090, including the just-released 4070, I don't see why anyone would think Nvidia was planning to increase production.
The article seemed to suggest that Nvidia should be putting in orders now, to prepare for the holiday season. That's very different from looking at it through the lens of short-term supply.
 
If you sell in low volume, you keep margins as high as you can get away with. When you sell a lot, you can relax that a bit if there's competition; otherwise you just keep raking the shekels in.
Just so we're clear, here's how profit-maximization works:

[Image: profit maximisation]

If they undersupply too much, they won't be able to raise prices enough to compensate for the lost sales. If they oversupply too much, even the increased sales volumes can't compensate for the price cuts they'd need to make to move the inventory before it becomes obsolete.
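As a toy illustration of that sweet spot (a minimal sketch with a made-up linear demand curve and unit cost, not Nvidia's actual numbers):

```python
# Toy profit-maximization model: the clearing price falls as more units are
# supplied, and there's a fixed unit cost. All numbers are made up.
def profit(quantity, base_price=2000, slope=0.5, unit_cost=400):
    price = base_price - slope * quantity   # more supply -> lower price
    return quantity * (price - unit_cost)

quantities = range(0, 4001, 200)
best = max(quantities, key=profit)
print(f"Profit-maximizing quantity: {best}")             # 1600 in this toy model
print(f"Profit at that quantity: ${profit(best):,.0f}")  # $1,280,000

# Both undersupplying and oversupplying earn less:
print(f"At 400 units:  ${profit(400):,.0f}")    # $560,000
print(f"At 3200 units: ${profit(3200):,.0f}")   # $0
```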

So, if we look at gaming in isolation of other considerations, it's in their interest to keep enough supply on hand to meet demand. It's only when we look at gaming vs. lucrative datacenter products that it would make sense for them to contemplate another under-supply situation, like we had during the pandemic + mining boom.

And I still maintain that Nvidia can't walk completely away from the gaming market. Too much undersupply would hurt their standing, by causing developers to spend more time optimizing for the GPUs consumers can buy (i.e. AMD and Intel).

Thing is, nVidia is squeezing its partners dry in this economic climate, so what I'm wondering is how much more abuse they can take XD
I wonder if it's the reverse. Imagine what happens to partners sitting on millions of dollars of RTX 3000 inventory once Nvidia completes the launch of its full RTX 4000 line; that stock would essentially have to be liquidated. I think Nvidia & AMD are slow-walking the launch of their lower-tier cards precisely for the benefit of their partners, even at the expense of their own near-term finances.

I just don't see Jensen like the type of guy that will just lower prices with a smiling face.
Right. They're dragging out the rollout of their new cards instead of making further price cuts.
 
What if a purpose-built AI chip outperformed Nvidia's best datacenter products? Their datacenter revenue could dry up rather quickly. That's why they need to remain active in all the markets where they play, and not take any of them for granted.
Nvidia will adapt, as usual. Back in 2019, TheNextPlatform ran an interview with one of Nvidia's engineers. That engineer said they have a team dedicated to making AI-specific chips, working separately from the GPU design team, and that they could sell such a chip on its own or integrate it into a GPU design. So if things reach the point where an AI-specific chip outpaces Nvidia's "GPU compute" designs, Nvidia will adapt accordingly. Something similar happened with tensor cores: Nvidia released GP100 in mid-2016, then Google revealed its TPU in late 2016. Nvidia probably hadn't planned to push tensor cores until 2018, with GV100; instead, they released GV100 in mid-2017, retiring GP100 as their top compute solution after just a year. And for those who had already ordered a GP100 but hadn't received it yet, Nvidia gave them a GV100 instead.
 
Nvidia will adapt, as usual.
I agree, but there are no guarantees. I already said I wouldn't bet against Nvidia, but nobody can say that a dark horse like Cerebras won't turn out to have an advantage beyond Nvidia's ability to counter. Because of patents, it's not as if Nvidia can necessarily just copy whatever their competitors do.

10 years ago, Intel looked completely unassailable, and yet look how far they've fallen.

Something similar happened with tensor cores: Nvidia released GP100 in mid-2016, then Google revealed its TPU in late 2016. Nvidia probably hadn't planned to push tensor cores until 2018, with GV100; instead, they released GV100 in mid-2017
That's a stretch. Google doesn't sell TPUs on the open market. I don't even know when they started making those instances available to the public, but I remember it being quite a while after we first heard about them.

Furthermore, chip design takes a long time. V100 launched too soon to be a counter to the TPU. A more plausible explanation is that Nvidia simply took a more sensible approach to optimizing deep learning, by hard-wiring a few matrix multiply instructions (which is all their "tensor cores" actually are), after noticing how wasteful it was to use a series of dot-products for that purpose.
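As a rough way to picture that (a NumPy sketch of the idea, not Nvidia's hardware or any real API): a first-generation tensor core essentially performs one small fused matrix multiply-accumulate, D = A x B + C, over a 4x4 tile, instead of issuing a long series of individual dot products.

```python
import numpy as np

# 4x4 tiles as a stand-in for what a first-gen tensor core operates on.
A = np.random.rand(4, 4).astype(np.float16)
B = np.random.rand(4, 4).astype(np.float16)
C = np.random.rand(4, 4).astype(np.float32)

# The "wasteful" way: 16 separate dot products, one per output element.
D_dots = np.empty((4, 4), dtype=np.float32)
for i in range(4):
    for j in range(4):
        D_dots[i, j] = np.dot(A[i, :].astype(np.float32),
                              B[:, j].astype(np.float32)) + C[i, j]

# The tensor-core-style view: one fused multiply-accumulate over the whole tile.
D_mma = A.astype(np.float32) @ B.astype(np.float32) + C

assert np.allclose(D_dots, D_mma)
```

The arithmetic is identical either way; the win is issuing it as one wide hardware operation instead of many narrow ones.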

retiring GP100 as their top compute solution after just a year. And for those who had already ordered a GP100 but hadn't received it yet, Nvidia gave them a GV100 instead.
It was still perfectly fine for HPC, which relies mainly on fp64. They were available for much more than a year after V100's launch. I hadn't heard about Nvidia offering substitutions, but I'm certain they continued making P100s. For the kind of supercomputer and HPC applications the P100 primarily targeted, you couldn't necessarily just substitute in a V100.
 
Hardly a surprise. Of course they're going to prioritize their silicon for professional use, as their margins are so much higher compared to gaming. It's a no-brainer. Gamers are only a third of the market. They can't ignore us, as we are still worth too much cash, but we are hardly top priority at this point.

Plus, in regards to gaming, Nvidia clearly intended to charge more for their consumer line-up this gen, until the botched RTX 4080 12GB bit them in the backside and cut into what they had intended to charge for the part. And I am guessing that affected the pricing of the whole stack. People upset at the $599 RTX 4070 we got would have been even more riled up if the 4080 12GB hadn't been unlaunched and the card sold as the 4070 had ended up with the 4060 Ti's die, especially if that even-further-cut-down 4070 had cost more than the $599 they charged for the one we got.

At the end of the day Nvidia is going to chase their highest margins which is totally fine. I just hope they don't end up alienating their gaming base all together, something is not necessary but it appears like Nvidia is intent on pissing off their gaming base all the same. If the RTX 5000 series tries to do the same level of mark ups with the lack of performance increase we got this gen (not including the 4090 and maybe, to a much lesser degree, the 4080) then I could see gamers finally moving on from Nvidia to AMD or even *gasp* Intel.