News AMD, Intel, and Nvidia Reportedly Slash Orders with TSMC


InvalidError

Titan
Moderator
Where does this silliness come from? I'm not pointing a finger at you directly, as this is an incredibly widespread misconception. In a free market, companies don't "need excuses" to raise prices; they can set them at whatever level they wish. Any intelligent company sets them at the point that maximizes profit, which is not the highest possible price. Finding that optimal price by locating the maximum of the profit curve is one of the canonical problems of any business calculus class.
It is arbitrary because Nvidia is arbitrarily reducing supply to prop up its grossly inflated prices, thanks to its monopoly power: it has no fear of competition wiping out its market share for being too greedy.
 

Endymio

Reputable
BANNED
It is arbitrary because Nvidia is arbitrarily reducing supply to prop up its grossly inflated prices, thanks to its monopoly power: it has no fear of competition wiping out its market share for being too greedy.
There is nothing "artificial" about a company lowering its production, especially not in a situation where demand has substantially decreased. If NVidia doesn't lower production, it'll be forced to sell product at a loss. Why should it do that, simply so you can see prettier pictures in your videogames this year than last year?

All companies continually adjust production to maximize profits. If Lamborghini didn't mind losing a few trillion dollars, they could produce enough Huracans to drive the price down to below that of a 30-year old Chevette. If Samsung quadrupled the production of their flagship smartphone, it'd sell for one tenth its current price. Are they "manipulating" the market? NVidia's current net margin is about 12%. Adjust that for inflation, and NVidia isn't doing much better than breaking even at present. If you consider that "grossly inflated profits", perhaps you should consider emigrating to a consumer paradise like Cuba or North Korea, where the government firmly enforces rules against such blatant free-market manipulations.

In closing, if you define "greedy" as "selling a product for a price higher than I prefer to pay", you can find it wherever you look.
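For anyone who wants the textbook version of the "price that maximizes profits" point, here's a minimal sketch with a linear demand curve; the symbols a, b, and c are purely illustrative, not anyone's real numbers:

```latex
% profit = (price - unit cost) x quantity sold, with linear demand Q(p) = a - b p
\pi(p) = (p - c)\,(a - b\,p)
% first-order condition: set the derivative with respect to price to zero
\frac{d\pi}{dp} = a + b\,c - 2\,b\,p = 0
\quad\Longrightarrow\quad
p^{*} = \frac{1}{2}\left(\frac{a}{b} + c\right)
```

Pushing the price above p* loses more in volume than it gains per unit, which is exactly why the profit-maximizing price is not the highest possible one.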
 

bit_user

Polypheme
Ambassador
I bought a 16 core 32 thread chip and the rest of the components for a new chess beast.
Chess seems like the perfect application for cloud computing. Your machine probably spends no more than a couple of hours per week computing moves, and it's not exactly latency-sensitive. It'd be interesting to see how much time you could get on a 64-core (or bigger) cloud instance for the same price as your 16-core machine.
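As a rough back-of-envelope, with placeholder prices I made up (real cloud rates vary a lot by provider, region, and spot vs. on-demand, and the build cost below is a guess, not your actual spend):

```python
# How many hours of a big cloud instance would the budget for a local
# 16-core build buy? All numbers here are illustrative assumptions.

local_build_cost = 1500.00   # hypothetical cost of the 16-core machine, USD
cloud_rate = 2.50            # hypothetical on-demand rate for a 64-core instance, USD/hour
hours_per_week = 2           # "a couple hours/week" of move calculation

hours = local_build_cost / cloud_rate
print(f"{hours:.0f} hours of 64-core time")                     # 600 hours
print(f"about {hours / hours_per_week / 52:.1f} years of use")  # ~5.8 years at 2 hrs/week
```

With those made-up numbers, the same money buys years of occasional 64-core time, though of course the local box also does everything else a PC does.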

I am getting stiffed on cat food. That has gone up $6 a bag. And I have 18 cats. Well, 15 are mine, I guess the neighbors stopped feeding theirs or feed them lousy food.
High food prices, high cat food prices, 18 cats and not all of them yours... there's a solution somewhere in this. Have you thought about how many cats you need?

🤔
 

bit_user

Polypheme
Ambassador
It has nothing to do with being a "chiplet snob". AMD can most definitely produce products for the sub-$300 range in chiplets and/or monolithic, but they don't want to because they want higher margins.
Are you on a different planet? AMD can and does produce lower-cost chips. Primarily, they're monolithic, because that's more cost-effective at the low end. But that means they come after the chiplet-based CPUs, because AMD needs chiplets for its server CPUs and those take priority. Plus, that lets them debug & refine the cores somewhat before integrating them into the monolithic dies.

maybe accelerating Phoenix could be good? The laptop market could definitely use a new APU.
Did you miss the Ryzen 6000 series? Those are laptop-only parts, launched about 1 year ago. Compared to those, the chiplet-based Ryzens and EPYC CPUs were more urgently in need of an update, hence they get priority. Also, Phoenix uses a smaller node and RDNA 3. So, it's not like there's no benefit from it being staggered.

What I don't understand about posts like yours or @InvalidError's is this sense of entitlement that AMD should be all things to all people. Furthermore, they're still much smaller than Intel, and growth takes time to do well. Their priority needs to be solid execution, or nothing else they do matters.
 

bit_user

Polypheme
Ambassador
A while ago, Nvidia announced that it was moving wafers from consumer GPUs to Hopper and AD100.
Yes. Post crypto-crash. I think that was the primary motivation for the reallocation.

Nvidia can afford to tell gamers to go screw themselves when it can make $10,000+ datacenter cards with 820 mm² AD100 and Hopper dies instead of $1,600 cards using 620 mm² dies. That is 4-5X as much income per wafer.
Obviously, the yield of such large dies will be a little lower, and the main customers for them are probably able to negotiate a bit on price.
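As a quick sanity check on that income-per-wafer ratio, here's a sketch using the die sizes and card prices quoted above and the usual first-order dies-per-wafer approximation. It ignores yield, scribe lines, and the fact that a card's price isn't pure die revenue, so treat it as a ballpark only:

```python
import math

WAFER_DIAMETER_MM = 300.0  # standard 300 mm wafer

def candidate_dies_per_wafer(die_area_mm2: float) -> float:
    """First-order approximation: wafer area divided by die area,
    minus a correction term for dies lost at the wafer edge."""
    radius = WAFER_DIAMETER_MM / 2
    return (math.pi * radius**2 / die_area_mm2
            - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

# (die area in mm^2, card price in USD) -- the figures quoted in this thread
cards = [("datacenter", 820, 10_000), ("consumer", 620, 1_600)]

for label, area, price in cards:
    n = candidate_dies_per_wafer(area)
    print(f"{label}: ~{n:.0f} candidate dies/wafer, ~${n * price:,.0f} per wafer at card price")
```

That works out to roughly $630K vs. $140K per wafer, i.e. about 4.5X, which lines up with the 4-5X figure.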

However, what we know from Nvidia's financials is that gaming accounts for more of their revenue than all other business segments combined. So, they really can't afford to turn their back on consumers too much.

it can then use as an excuse to jack up prices on artificially limited supply of consumer parts some more.
This only works in segments of the market where it doesn't face robust competition from AMD. And any loss in volume has to be made up by increased margins.
 

bit_user

Polypheme
Ambassador
The way GPU prices are going, it makes me consider waiting until a new generation of GPUs is releasing and then buying the previous higher-end cards on sale.
I think the last time there was a firesale-type price drop on older GPUs was the launch of Pascal, in 2016. That's when I got my $700+ GTX 980 Ti for $450 (new). These days, it seems like they're more careful about timing launches so the new products don't depress prices of existing inventory in the channel.
 

InvalidError

Titan
Moderator
However, what we know from Nvidia's financials is that gaming accounts for more of their revenue than all other business segments combined. So, they really can't afford to turn their back on consumers too much.
You may want to look at Nvidia's 2023Q3 earnings call: Datacenter was up 31% to 3.8G$ while Gaming was down 51% to 1.5G$. Datacenter now accounts for more than double Gaming's revenue.

This only works in segments of the market where it doesn't face robust competition from AMD. And any loss in volume has to be made up by increased margins.
What robust competition? They are priced almost linearly with performance. That looks more like oligopoly price-fixing than meaningful competition.
 

bit_user

Polypheme
Ambassador
What robust competition? They are priced almost linearly with performance. That looks more like oligopoly price-fixing than meaningful competition.
GPU performance scales roughly linearly with area within the same generation, and area in turn scales somewhat linearly with price (assuming the same node). So, just because it's linear doesn't mean it's artificial.
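In symbols, the first-order intuition (a rough proportionality within one generation on one node, not a hard law):

```latex
\text{price} \propto \text{die area}
\quad\text{and}\quad
\text{performance} \propto \text{die area}
\;\Longrightarrow\;
\text{price} \propto \text{performance}
```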
 

tamalero

Distinguished
GPUs did not cause a recession.
I recommend you stop... re-read what I wrote, please. :)



It's not even clear they "shot themselves in the foot", because the RTX 4090 indeed seems to be selling rather well. The reason they would slash future production targets is that they expect to sell fewer GPUs across the entire range, but it could also mean they see demand clustering more toward the lower end of the range, where the GPU dies are smaller.

It's also not clear what you'd have them do instead. I gather you want them to sell the same GPUs for less $$$, but what if they actually can't? If you want GPUs that perform like the RTX 4090 and 4080, they need to be a certain size and we know that new manufacturing nodes are more expensive. That puts them in a certain ballpark on pricing, which limits sales. Lots of people would love it if they sold these at cost or even a loss, but that would be devastating for them.

So, about the only thing they could've done differently is to make the 4080 the top product. That would've put them at risk of losing the performance crown, but we only know in hindsight that AMD couldn't match it. They couldn't have known that at the time.
As for the 4090, it seems to be the only one that offers reasonable value in terms of performance per dollar.
The 4080 is none of those things.

"What if they actually can't"? You're kidding, right?
There is no way the cost of building an entire 4080 or 4090 board is 2X what it was for the 3000 generation.

Hence all the talk about how Nvidia treats its AIBs, which drove EVGA to tell them to go to hell.
 

InvalidError

Titan
Moderator
GPU performance scales roughly linearly with area within the same generation, and area in turn scales somewhat linearly with price (assuming the same node). So, just because it's linear doesn't mean it's artificial.
The build cost is under $400. Nothing in a competitive market gets 100+% gross profit margins and 40+% net margins. The pricing is completely disconnected from costs.
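For reference, here's how those percentages work out if you combine the $400 build-cost figure above with the $1,600 card price mentioned earlier in the thread (both are this thread's numbers, not confirmed BOM data), keeping markup and gross margin distinct:

```latex
\text{markup over cost} = \frac{1600 - 400}{400} = 300\%,
\qquad
\text{gross margin} = \frac{1600 - 400}{1600} = 75\%
```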
 

SemiChemE

Prominent
One factor I haven't seen mentioned is the impact of the pandemic on the typical business laptop/computer upgrade cycle. These days, when a business buys a laptop for an employee, the expected lifetime for that computer is probably 4-6 years. Prior to the pandemic, these purchases were dispersed in time; in other words, the number of purchases was relatively constant and low, since each year only about 1/6 of employees (corresponding to a 6-year upgrade cycle) needed new laptops. The pandemic compressed this cycle, since the need to work from home, support Zoom meetings, and so on drove employees to demand earlier upgrades. Instead of replacing only 6-year-old laptops, many businesses replaced or upgraded 4-, 5-, and 6-year-old laptops. As a result, demand skyrocketed in 2020, 2021, and the first half of 2022, but now all of these employees already have relatively new laptops that will not need to be replaced until late 2024 and beyond. Thus, we should expect to see a year or two of reduced demand, followed by another 2-3 year uptick in demand. This cycle will continue into the future, but with the magnitude of each "wave" gradually diminishing.

Of course, predicting the future is always challenging, so AMD, Intel, Nvidia, and TSMC likely assumed the pandemic-driven market growth from 2020 through 1H2022 would continue. Clearly it hasn't, probably because most of those business upgrades are now complete. Consumer equipment upgrades would normally be on a different cycle, but the pandemic "synchronized" these as well, since consumers couldn't spend money on travel or entertainment, so they spent it on computers, TVs, and other electronics instead.
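Below is a toy model of that replacement wave, if anyone wants to play with it. The 6-year lifetime, 2-year pull-forward, and uniform starting fleet are all assumptions of mine for illustration; in reality, lifetimes are spread out, which is what gradually damps the wave, whereas this deterministic version only shows the spike, the trough, and the echo:

```python
# Toy simulation of a synchronized laptop-replacement cycle.
LIFETIME = 6        # assumed service life in years
PULL_FORWARD = 2    # in the pandemic year, also replace anything within 2 years of end-of-life
PANDEMIC_YEAR = 2
YEARS = 14

# fleet[a] = fraction of the fleet that is 'a' years old (uniform to start)
fleet = [1.0 / LIFETIME] * LIFETIME

for year in range(YEARS):
    fleet = [0.0] + fleet        # everyone ages one year; slot 0 is reserved for new machines
    retired = fleet.pop()        # cohort that just hit end-of-life

    if year == PANDEMIC_YEAR:    # pull replacements forward for the oldest cohorts too
        for age in range(LIFETIME - PULL_FORWARD, LIFETIME):
            retired += fleet[age]
            fleet[age] = 0.0

    fleet[0] = retired           # replacements enter the fleet as brand-new machines
    print(f"year {year:2d}: replacement demand = {retired:.2f} of the fleet")
```

With these assumptions you get the 1/6-per-year baseline, a spike to half the fleet in the pandemic year, two years of near-zero demand, and then an echo six years later.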
 

hannibal

Distinguished
GPU performance scales roughly linearly with area within the same generation, and area in turn scales somewhat linearly with price (assuming the same node). So, just because it's linear doesn't mean it's artificial.

Yes and no... The GPU chip price is linear, but the cooling and other parts of the finished card do not scale linearly. That is why low-cost GPUs seem like bad value: the cost of the cooling, relative to the speed and chip size, is comparatively high. Of course a bigger chip needs better cooling, but for example the 4080 and 4090 use mainly the same cooler, while the speed (size) of the GPU chip is different. And if they want the same margins, that means the 4080 ends up being the product with relatively bad pricing...
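A quick illustration of that with made-up numbers (the die areas are rough, and the per-mm² silicon cost, fixed board/cooler cost, and margin are pure assumptions, not actual BOM data):

```python
# If die cost scales with area but the cooler/board/memory cost is roughly
# fixed and shared, the smaller card ends up with worse price per unit of
# performance when the same margin is applied to both.

COST_PER_MM2 = 0.25        # assumed silicon cost, USD per mm^2
FIXED_BOARD_COST = 250.0   # assumed shared cooler + PCB + VRM + memory cost, USD
GROSS_MARGIN = 0.60        # assumed margin target applied to both cards

cards = [("4090-class, ~600 mm^2 die", 600), ("4080-class, ~380 mm^2 die", 380)]

for name, area in cards:
    bom = FIXED_BOARD_COST + COST_PER_MM2 * area
    price = bom / (1 - GROSS_MARGIN)
    performance = area  # assume performance roughly tracks die area, per the posts above
    print(f"{name}: BOM ~${bom:.0f}, price ~${price:.0f}, "
          f"price per unit of performance ~{price / performance:.2f}")
```

With those assumptions, the smaller card comes out roughly a third worse on price per unit of performance, purely because the fixed board and cooler cost doesn't shrink along with the die.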
 

Endymio

Reputable
BANNED
The build cost is under $400. Nothing in a competitive market gets 100+% gross profit margins and 40+% net margins. The pricing is completely disconnected from costs.
NVidia's net margin is 11% at present, not 40%. And the gross margin in highly competitive markets like the luxury perfume industry is well over 1,000%. Your assumptions are wrong, top to bottom.
 

InvalidError

Titan
Moderator
NVidia's net margin is 11% at present, not 40%. And the gross margin in highly competitive markets like the luxury perfume industry is well over 1,000%. Your assumptions are wrong, top to bottom.
Nvidia's Q3 earnings show 1.5G$ net income on 5.9G$ revenue, which is just over 25% overall, and that includes marketing, subsidies, and other expenses not strictly tied to the products that could be skipped altogether. My 60% net profit figure comes from the last time I remember Nvidia breaking down revenue and costs on a per-division basis, and that was before the massive price increases.

As for "luxuries", GPUs are a commodity, though AMD and Nvidia are apparently successful at leading some people to believe otherwise and getting them to pay stupid amounts of money for them.
 

bit_user

Polypheme
Ambassador
The build cost is under $400.
You're talking about the RTX 4090, right? I simply don't believe this number. I suspect any data you're using is outdated and incomplete. I remember back when Vega launched, and people were saying AMD was on the verge of selling it at a loss. And that was at a higher price than you're quoting, plus pre-inflation, older node, simpler cooling solution, and lower power.

Furthermore, when you consider the massive amount of resources it takes to bring these chips to market and support the software on them, you need to account for those engineering costs in the final price. As Jensen has been keen to point out, most of their engineering costs are for software.
 

bit_user

Polypheme
Ambassador
These days, when a business buys a laptop for an employee, the expected lifetime for that computer is probably 4-6 years.
My employer gets them on 3-year leases. Over that time period, they pay about double what it'd cost to buy the machines outright, but for various accounting reasons, they prefer to lease them.

now all of these employees already have relatively new laptops that will not need to be replaced until late 2024 and beyond. Thus, we should expect to see a year or two of reduced demand, followed by another 2-3 year uptick in demand. This cycle will continue into the future, but with the magnitude of each "wave" gradually diminishing.
Poetically, one might call these ripples or echoes of the pandemic.

AMD, Intel, Nvidia, and TSMC likely assumed the pandemic-driven market growth from 2020 through 1H2022 would continue.
We don't exactly know. They're not naive. This business is cyclical and they'll all have been through several cycles, so I wouldn't underestimate their wariness.

However, under-predicting demand is almost as bad as over-predicting it. At least AMD's chiplet strategy lets it order chiplets with the confidence they'll find a home in servers, if not desktops or workstations. But, that doesn't help with their laptop products, which use monolithic dies, or consumer GPUs.

Unlike AMD, Nvidia can sell some of its gaming GPUs for use in datacenters. This gives them a secondary market where they can possibly redirect some spillover from the gaming products. All of the models listed here use the same chips as consumer GPUs, except for the models ending in 100:

 

peterf28

Distinguished
If they lower prices 50%, I will buy 3 computers instead of 1. They would get more money from me.
 

Endymio

Reputable
BANNED
As for "luxuries", GPUs are a commodity.
You work for a site that devotes a majority of its content to comparing and reviewing various graphics cards, and moderate a forum where anyone who chooses one GPU maker over the other is immediately attacked by fanboys loyal to the other brand -- and you call graphics chips a commodity? Are you even aware of what the word means?

Corn is a commodity. Oil is a commodity. FCOJ and pork bellies are commodities. Graphics boards are luxury items.
 

bit_user

Polypheme
Ambassador
Corn is a commodity. Oil is a commodity. FCOJ and pork bellies are commodities. Graphics boards are luxury items.
I was also thinking about this, after @InvalidError said it.

I pretty much agree with him that graphics is somewhat commodity-like, at the extreme low-end. Like iGPUs that exist mainly for GUI & video acceleration, plus basic 3D rendering for stuff like maps. One key point about commodities is that different suppliers are interchangeable. Graphics APIs enable that to quite a large degree, but the "interchangeability" requirement also implies functional equivalence, to the extent that it matters, and some degree of uniform market pricing on the basis of specs. Those are all things that break down towards the top end of the graphics card market.

You can see this same dynamic with other commodities, like flour. There are artisanal flour producers whose pricing is largely decoupled from the market rate and who cater to boutique bakeries and gourmet food producers.

Perhaps a better analogy would be cars. You can see roughly commodity-like pricing at the bottom end of the market. But that starts to break down as you move further up-market.
 

JamesJones44

Reputable
Source? That number seems pretty high. I can't think of a whole lot of professions which pay over $1M. It's pretty much: CEOs, pro athletes, and financial traders.

I actually think this might be right. I've seen quotes that about 1% of US federal taxpayers make over a million each filing season (about 1.4 million of the 148 million filers). Sadly, this is the best source for numbers that I can find currently, but I've seen it posted at about that percentage in the past.

I think some of it has to do with people who have a boatload of money making decent returns on it, vs. people actually drawing a $1M salary from a company, which I think is actually fairly rare, but that's just a guess.
 

InvalidError

Titan
Moderator
I pretty much agree with him that graphics is somewhat commodity-like, at the extreme low-end. Like iGPUs that exist mainly for GUI & video acceleration, plus basic 3D rendering for stuff like maps.
Only because competition is effectively dead. Back when we had 4+ GPU manufacturers, the mainstream stayed solidly anchored around $200.
 

Endymio

Reputable
BANNED
Back when we had 4+ GPU manufacturers, the mainstream stayed solidly anchored around $200.
You mean 15 years ago, when a million-dollar house could be bought for $400K, and when mainstream GPUs were made in fabs that cost millions, not billions? Well, take heart -- that same graphics card that you were happy to pay $200 for, you can now purchase for $20 or less, while everything else has seen rampant cost increases from inflation. That 15-year-old GPU is an entirely different beast from the cutting-edge processors of today, which are being used more for AI, scientific modeling, and cryptology than simply for gaming.
 

bit_user

Polypheme
Ambassador
That 15-year-old GPU is an entirely different beast from the cutting-edge processors of today, which are being used more for AI, scientific modeling, and cryptology than simply for gaming.
FYI, @InvalidError probably knows more about chip & board design than just about anyone here, besides maybe a couple of industry folks known to haunt the place on occasion. When it comes to technical details, he's usually spot-on.