News: AMD, Intel, and Nvidia Reportedly Slash Orders with TSMC


bit_user

Titan
Ambassador
So the price of 3 x RX 6600 XT + 3 x RAM + 3 x CPU + 3 x motherboard + 3 x case + 3 x R&D + 3 x packaging + 3 x times etc = 1 x RTX 4080

Something seems off to me...
The RX 6600 XT uses a 237 mm^2 N7 die and has 8 GB of 17.5 GHz GDDR6 on a 128-bit bus. It needs a 160 W VRM and cooling solution. Its die has 11.1 billion transistors.

The RTX 4080 uses a 378.6 mm^2 ~N5 die and has 16 GB of 22.4 GHz GDDR6X on a 256-bit bus. It needs a 320 W VRM and cooling solution. Its die has 46.9 billion transistors.

So, you can immediately appreciate that the RTX 4080 is in a different class. Does it need to cost about 4x as much? I don't honestly know, but it's certainly not hard to understand why it should be more than 2x. Off the cuff, I'd expect about 3x (i.e. $900) to be easily justifiable, at least when we near the end of its product cycle. I think & hope it'll be repriced, but not as low as it'd have to be for me to buy one.
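
If anyone wants to sanity-check the comparison, here's a quick Python sketch that just takes the ratios of the figures above. It's only a crude proxy for cost: it ignores that N5 wafers cost more per mm^2 than N7, that GDDR6X carries a premium over GDDR6, and that the bigger VRM and cooler add to the BOM.

Python:
# Rough spec ratios between the RTX 4080 and RX 6600 XT, using the figures quoted above.
rx_6600_xt = {"die_mm2": 237.0, "transistors_b": 11.1, "mem_gb": 8, "board_w": 160}
rtx_4080   = {"die_mm2": 378.6, "transistors_b": 46.9, "mem_gb": 16, "board_w": 320}

for key in rx_6600_xt:
    ratio = rtx_4080[key] / rx_6600_xt[key]
    print(f"{key:14s} ratio: {ratio:.2f}x")

# Prints roughly: 1.60x (die area), 4.23x (transistors), 2.00x (memory), 2.00x (board power).

Note the ~4.2x transistor ratio on only 1.6x the area, which reflects the density jump from N7 to ~N5; the newer node also costs more per wafer, so the silicon cost ratio is likely somewhere above 1.6x.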
 

InvalidError

Titan
Moderator
The natural hurdles to enter the GPU market are not to be underestimated, as we've been reminded by Intel's stumbles.
The software challenges can be crowd-sourced and fixed over time. The patent walled garden, though, is practically insurmountable by any means other than buying an existing player with sufficient relevant patents to ward off the other players.
 

Endymio

Reputable
BANNED
Aug 3, 2020
715
258
5,270
The RX 6600 XT uses a 237 mm^2 N7 die and has 8 GB of 17.5 GHz GDDR6 on a 128-bit bus. It needs a 160 W VRM and cooling solution. Its die has 11.1 billion transistors.

The RTX 4080 uses a 378.6 mm^2 ~N5 die and has 16 GB of 22.4 GHz GDDR6X on a 256-bit bus. It needs a 320 W VRM and cooling solution. Its die has 46.9 billion transistors.

So, you can immediately appreciate that the RTX 4080 is in a different class. Does it need to cost about 4x as much? I don't honestly know, but it's certainly not hard to understand why it should be more than 2x. Off the cuff, I'd expect about 3x (i.e. $900) to be easily justifiable, at least when we near the end of its product cycle. I think & hope it'll be repriced, but not as low as it'd have to be for me to buy one.
Your recent posts are giving me very few flaws to pick at. However, I must point out that die size speaks ONLY to manufacturing cost. The R&D effort to design the chip has to be amortized in there somewhere ... and nearly all firms choose to recover those costs as early in the product lifecycle as possible, and generally much more so on the high end than on the low end.
 
  • Like
Reactions: bit_user

bit_user

Titan
Ambassador
The software challenges can be crowd-sourced and fixed over time.
Really? Intel's Linux drivers are fully open source, have been for many years, and take advantage of the same Mesa infrastructure as AMD's, and yet Intel's dGPU performance on Linux is even worse than on Windows!


It seems to me there's more than a touch of wishful thinking in that statement. Especially if you look at who's really doing the vast majority of the work on the Linux GPU drivers.
 

InvalidError

Titan
Moderator
It seems to me there's more than a touch of wishful thinking in that statement. Especially if you look at who's really doing the vast majority of the work on the Linux GPU drivers.
My point is that the driver challenge is potentially surmountable with enough time and people while the patent one isn't.

The Linux kernel, GCC, OpenJDK, WINE and so on aren't exactly trivial projects either. The open-source community isn't exactly new to compilers, profilers, and large collections of complex APIs. The talent and knowledge base to make it work is out there, if you can poach enough of the relevant people from other projects or train and retain new talented people.
 

bit_user

Titan
Ambassador
My point is that the driver challenge is potentially surmountable with enough time and people
So, you think Intel was lacking the time & people? Because a smaller company than Intel would've probably gone under, after an offering as uncompetitive as their A-series.

The Linux kernel, GCC, OpenJDK, WINE and so on aren't exactly trivial projects either.
Irrelevant. If a GPU launches without competitive performance, it'd be game over for most, with the possible exception of those operating in a protected market and with government backing.

Sure, with an unbounded amount of time & community buy-in, you can write big & complex software. That's just not relevant to the question at hand, because new hardware companies don't have unbounded time or budgets to last them until they can become competitive.
 
  • Like
Reactions: Endymio

Endymio

Reputable
BANNED
Aug 3, 2020
715
258
5,270
My point is that the driver challenge is potentially surmountable with enough time and people while the patent one isn't.
Patents expire after 20 years; the basic technology for using transistors to calculate pixels has long since been in the public domain.

A GPU startup thus has two options: it can design a competitive product around 20-year-old technology (very difficult), or it can design and develop its own innovations and patent them itself (expensive, but doable). What it cannot do, however, is simply copy Nvidia's GPU and sell it as its own. Nvidia has paid tens of thousands of engineers for decades to innovate. The company is entitled to the fruits of their labor.

Companies even possess a third option, which many choose. Rather than innovate an entirely new chip, you merely need to generate enough of your own IP that you can trade access to those patents for those of other firms. That's a win-win (technically a win-win-win, since it helps us consumers too), but it does require you to have skin in the game. But you simply cannot steal an entire chip design outright, no.
 

vanadiel007

Distinguished
Oct 21, 2015
381
376
19,060
The RX 6600 XT uses a 237 mm^2 N7 die and has 8 GB of 17.5 GHz GDDR6 on a 128-bit bus. It needs a 160 W VRM and cooling solution. Its die has 11.1 billion transistors.

The RTX 4080 uses a 378.6 mm^2 ~N5 die and has 16 GB of 22.4 GHz GDDR6X on a 256-bit bus. It needs a 320 W VRM and cooling solution. Its die has 46.9 billion transistors.

So, you can immediately appreciate that the RTX 4080 is in a different class. Does it need to cost about 4x as much? I don't honestly know, but it's certainly not hard to understand why it should be more than 2x. Off the cuff, I'd expect about 3x (i.e. $900) to be easily justifiable, at least when we near the end of its product cycle. I think & hope it'll be repriced, but not as low as it'd have to be for me to buy one.

I don't understand why a consumer should justify a 2x or 3x price increase on a product. It should be up to the manufacturer to explain those numbers.

You sound like one of those buyers who would side with the car salesman when he tells you the latest generation costs another 20% more than the previous generation. You're supposed to disagree with him so he can "give you a deal" and drop the price, still making a hefty profit on a discounted new product.

You make it too easy to defend Nvidia and AMD...
 

Endymio

Reputable
BANNED
Aug 3, 2020
715
258
5,270
I don't understand why a consumer should justify a 2x or 3x price increase on a product.
If you don't understand, why not learn? The die is much larger, it's produced on a more expensive process, and inflation is tacking on another 8-10% every single year. And where do you get this "2 to 3 times" higher figure? The 2080 was $800, the 3080 was $1,200, and the 4080 is $1,300. That's a 60% increase across five years and two generations.
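
Making that arithmetic explicit, with the launch prices cited above (a rough sketch; people can argue about which MSRPs to use):

Python:
# Generational price increase, using the figures cited in this post.
prices = {"RTX 2080": 800, "RTX 3080": 1200, "RTX 4080": 1300}

total = prices["RTX 4080"] / prices["RTX 2080"] - 1
print(f"2080 -> 4080: {total:.1%} total")                      # ~62.5%

years = 5
annualized = (prices["RTX 4080"] / prices["RTX 2080"]) ** (1 / years) - 1
print(f"annualized over {years} years: {annualized:.1%}")      # ~10.2% per year

That works out to roughly 10% per year compounded, for whatever that's worth against the inflation figure above.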

It should be up to the manufacturer to explain those numbers.
No manufacturer needs to "explain" themselves to you, just as you don't need to explain to them if you decide to not purchase their product. They set the price: you choose to pay it or not. It's really very simple.

You sound like one of those buyers who would side with the car salesman when he tells you the latest generation costs another 20% more than the previous generation.
I don't remember any new car that runs twice as fast as the generation before it. Do you?
 

InvalidError

Titan
Moderator
Patents expire after 20 years; the basic technology for using transistors to calculate pixels has long since been in the public domain.
Patents may expire after 20 years but countless patents for trivial stuff and minor improvements to existing stuff get filed every year. You aren't going to be able to make a competitive GPU when every sensible 3D rendering hardware and software architecture is covered by a boatload of patents and you have to limit yourself to designs covered by expired patents.

Companies even possess a third option, which many choose. Rather than innovate an entirely new chip, you merely need to generate enough of your own IP that you can trade access to those patents for those of other firms.
I already covered that under "buy a company that has a large enough relevant patent portfolio to ward off other players."
 

bit_user

Titan
Ambassador
I don't understand why a consumer should justify a 2x or 3x price increase on a product.
Is your question about costs or pricing? I was focused on the cost side, but pricing has to do with complex dynamics like supply, demand, competition, price-sensitivity, and elasticity of supply and demand.

It should be up to the manufacturer to explain those numbers.
Is it? From their perspective, the story is basically: "This is the price. If you can't afford it, we have lower-end options that should interest you."

They don't really owe anyone an explanation. The only reason they might offer one is if they think it would help sell more GPUs, either now or in the future. Typically, as you move up into the premium product lines, you tend to see less and less of that sort of thing. It's mostly commodities where brands fear losing customer loyalty when prices go up.

You make it too easy to defend Nvidia and AMD...
A sophisticated customer (with negotiating power) should always seek to understand the cost structure of what they're buying. That gives you the upper hand in negotiations, because you'll know what their best price is likely to be, meaning you'll know how much to push. That doesn't mean you're on their side, however.

FWIW, I get the sense they could afford to come down a bit on at least the RTX 4080. Let's try a little comparison, since we now have three products with three dies and their list prices.

GPU          | Die Area (mm^2) | Memory (GB) | MSRP   | Cost/Area ($/mm^2) | Cost/GB ($/GB)
RTX 4070 Ti  | 294.5           | 12          | $799   | $2.71              | $66.58
RTX 4080     | 378.6           | 16          | $1,199 | $3.17              | $74.94
RTX 4090     | 608.5           | 24          | $1,599 | $2.63              | $66.63

So, we see that the RTX 4080 is indeed an outlier. To fall in line with the others, its MSRP should probably be around $1049.

Now, that doesn't really tell us anything about pricing of the overall product line, but it does justify why many people consider the RTX 4080 to be overpriced.
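
For anyone who wants to reproduce the table, here's a minimal sketch. The "implied" price at the end is nothing more rigorous than pricing the 4080's die area and memory at the average $/mm^2 and $/GB of its two siblings.

Python:
# Reproduces the $/mm^2 and $/GB columns above, plus an implied "in line" RTX 4080 price.
cards = {
    # name: (die area in mm^2, memory in GB, MSRP in USD)
    "RTX 4070 Ti": (294.5, 12, 799),
    "RTX 4080":    (378.6, 16, 1199),
    "RTX 4090":    (608.5, 24, 1599),
}

for name, (area, mem, msrp) in cards.items():
    print(f"{name:12s}  ${msrp / area:.2f}/mm^2  ${msrp / mem:.2f}/GB")

# Average the 4070 Ti and 4090 unit costs, then apply them to the 4080's specs.
per_mm2 = (799 / 294.5 + 1599 / 608.5) / 2
per_gb  = (799 / 12 + 1599 / 24) / 2
area, mem, _ = cards["RTX 4080"]
print(f"implied RTX 4080 price: ${per_mm2 * area:.0f} (by die area), ${per_gb * mem:.0f} (by memory)")

That lands around $1,011 by die area and $1,066 by memory, so roughly $1,040 on average, in the same neighborhood as the $1049 figure above.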
 
Last edited:
  • Like
Reactions: Endymio

Endymio

Reputable
BANNED
Aug 3, 2020
715
258
5,270
Patents may expire after 20 years but countless patents for trivial stuff and minor improvements to existing stuff get filed every year.
By definition, a trivial improvement isn't necessary to design a product. All the rest of your post is merely bemoaning the fact that, to make a competitive GPU, a company is required to innovate rather than simply sign a contract and start receiving chips from TSMC. What did you expect? These firms aren't baking pretzels or brewing malt liquor. They need to either develop new technology on their own, or purchase licensing rights from someone who has.
 

InvalidError

Titan
Moderator
By definition, a trivial improvement isn't necessary to design a product.
It is enough to effectively refresh the duration of patent coverage, because you now have to go out of your way to make sure you don't accidentally implement those trivial improvements, or re-invent them while attempting to come up with an alternative implementation.
 

vanadiel007

Distinguished
Oct 21, 2015
381
376
19,060
If you don't understand, why not learn? The die is much larger, it's produced on a more expensive process, and inflation is tacking on another 8-10% every single year. And where do you get this "2 to 3 times" higher figure? The 2080 was $800, the 3080 was $1,200, and the 4080 is $1,300. That's a 60% increase across five years and two generations.

No manufacturer needs to "explain" themselves to you, just as you don't need to explain to them if you decide to not purchase their product. They set the price: you choose to pay it or not. It's really very simple.

I don't remember any new car that runs twice as fast as the generation before it. Do you?

What I was trying to point out is that you seem very determined to explain why the current pricing of GPUs is justified, while most others believe it's (still) too high.
That I don't understand, unless you have a vested interest in high GPU prices, which could well be the case.

I honestly do not care about die sizes etc... I just care about the cost to me as a consumer and where this is heading. In my opinion it's heading in the wrong direction, considering the current MSRP pricing of these cards, and I am not buying into all the manufacturing explanations as to why it's this costly.
 

vanadiel007

Distinguished
Oct 21, 2015
381
376
19,060
Is your question about costs or pricing? I was focused on the cost side, but pricing has to do with complex dynamics like supply, demand, competition, price-sensitivity, and elasticity of supply and demand.


Is it? From their perspective, the story is basically: "This is the price. If you can't afford it, we have lower-end options that should interest you."

They don't really owe anyone an explanation. The only reason they might offer one is if they think it would help sell more GPUs, either now or in the future. Typically, as you move up into the premium product lines, you tend to see less and less of that sort of thing. It's mostly commodities where brands fear losing customer loyalty when prices go up.


A sophisticated customer (with negotiating power) should always seek to understand the cost structure of what they're buying. That gives you the upper hand in negotiations, because you'll know what their best price is likely to be, meaning you'll know how much to push. That doesn't mean you're on their side, however.

FWIW, I get the sense they could afford to come down a bit on at least the RTX 4080. Let's try a little comparison, since we now have three products with three dies and their list prices.

GPU          | Die Area (mm^2) | Memory (GB) | MSRP   | Cost/Area ($/mm^2) | Cost/GB ($/GB)
RTX 4070 Ti  | 294.5           | 12          | $799   | $2.71              | $66.58
RTX 4080     | 378.6           | 16          | $1,199 | $3.17              | $74.94
RTX 4090     | 608.5           | 24          | $1,599 | $2.63              | $66.63

So, we see that the RTX 4080 is indeed an outlier. To fall in line with the others, its MSRP should probably be around $1049.

Now, that doesn't really tell us anything about pricing of the overall product line, but it does justify why many people consider the RTX 4080 to be overpriced.

I consider all of them to be overpriced. I will provide an example as to why I believe this to be the case.

65" OLED LG TV 3 years ago: $3,500 (I bought one, very nice tv)
65" OLED LG TV today: $1,500 or so if you look around a bit

So even with supply chain issues, R&D costs, rising costs of components, etc., the pricing has come down significantly.

The pricing of these GPUs is out of control.
 

Endymio

Reputable
BANNED
Aug 3, 2020
715
258
5,270
What I was trying to point out is that you seem very determined to explain why the current pricing of GPUs is justified, while most others believe it's (still) too high.
That I don't understand, unless you have a vested interest in high GPU prices
I have a vested interest in citizens that understand the basic facts of economics. Ignorant people tend to support ignorant policies, which harms us all. If you doubt that, you may wish to visit the consumer paradises of Cuba, North Korea, or some former Soviet Republics. When people are gaslit into believing they're being economically exploited, bad things tend to happen.
 

InvalidError

Titan
Moderator
I consider all of them to be overpriced. I will provide an example as to why I believe this to be the case.

65" OLED LG TV 3 years ago: $3,500 (I bought one, very nice tv)
65" OLED LG TV today: $1,500 or so if you look around a bit

So even with supply chain issues, R&D costs, rising costs of components, etc., the pricing has come down significantly.
I remember hearing back when LEP/OLED were first hyped up as future display technologies that we'd have cheap printable displays in about 10 years, so cheap that replacement of the display surface would be an acceptable solution to the burn-in issue. We're now about 20 years later, and they aren't anywhere near cheap enough for people to consider throwing them away an acceptable solution to burn-in.
 

bit_user

Titan
Ambassador
What I was trying to point out is that you seem very determined to explain why the current pricing of GPUs is justified, while most others believe it's (still) too high.
Both can be true. It might just be that, after years of seeing GPUs sell for crazy prices, AMD and Nvidia both decided the market was now different and would support higher prices, which emboldened them to build larger GPUs than they otherwise would've. Now that demand has collapsed and the market-manipulating power of scalpers has diminished, everyone is expecting GPUs to return to the prices of 3-5 years ago, yet some real cost increases have been built into the current generation of GPUs.

That I don't understand, unless you have a vested interest in high GPU prices, which could well be the case.
Here's an interesting thought experiment. Let's say Nvidia is selling each AD104 GPU for $450 and their costs are $350. If they cut the price to $400, their margin drops by half, so they'd have to sell twice as many to make the same total profit. When we follow this through to the price of the finished good, it might mean the RTX 4070 Ti drops from $800 to $720, since the board maker, channel, and retailers each have their own markups. Are twice as many people going to buy RTX 4070 Ti graphics cards at $720 rather than $800? I doubt it.
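
To put numbers on that thought experiment (the $450 price and $350 cost are the made-up figures from above, not actual Nvidia numbers):

Python:
# Hypothetical margin math from the thought experiment above; all figures are illustrative.
cost = 350
price_now, price_cut = 450, 400

margin_now = price_now - cost   # $100 per GPU
margin_cut = price_cut - cost   # $50 per GPU

# Units needed at the lower price to earn the same total profit as before.
multiplier = margin_now / margin_cut
print(f"margin falls from ${margin_now} to ${margin_cut}; "
      f"unit sales must grow {multiplier:.1f}x to keep profit flat")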

So, if Nvidia and AMD let costs get a little out of hand, it's not hard to see how they could be in a bind where prices are higher than most buyers want and expect, but there's not a lot they can do about it.

I honestly do not care about die sizes etc... I just care about the cost to me as a consumer and where this is heading.
That's fair, and you're certainly entitled to feel that prices are too high for your budget or the value that these products represent to you as a user (I do!). But, when you start trying to argue about whether the prices make sense, that's when you really have to try and look at the input costs.

In my opinion it's heading in the wrong direction
Really? Street prices are lower than a year or two ago. I think what you mean is that it's not going far enough in the direction of affordability. I think you're right, and we might see more focus on cost containment in future generations. I'm not sure how much better they can really do, but I'm sure it's more than they've done recently.
 

bit_user

Titan
Ambassador
I consider all of them to be overpriced. I will provide an example as to why I believe this to be the case.

65" OLED LG TV 3 years ago: $3,500 (I bought one, very nice tv)
65" OLED LG TV today: $1,500 or so if you look around a bit

So even with supply chain issues, R&D costs, rising costs of components, etc.., the pricing has come down significantly.
It's often not relevant to compare products of different types. It's even more fraught when you pick such a specific class of products from a single manufacturer, as all sorts of particulars can come into play.

I can think of two examples which have gone against the GPU pricing trend. In Dec. 2021, I got a new high-end Seasonic PSU for nearly 50% off. Why? Maybe they secured more components and materials than needed, in anticipation of even higher demand than what materialized.

And, for much of 2021 and 2022, we saw good deals on gaming monitors. You'd think they would be inflated like other things, but if display manufacturers did a good job of securing components before prices increased much, they could've built more units under the original cost structure. The scarcity of GPUs resulting from miners buying up so much supply depressed demand for gaming monitors, leaving a glut of supply to fill up the channel. I think that's because people tend not to upgrade to a larger or higher-refresh display unless they also have a faster GPU to drive it.

So, it could be a similar story with OLED TVs, where LG prepared for demand to extend well beyond what actually played out. And, if we look at the particulars of LG, there's also the interesting fact that the broader LG group is a massive corporation with 2018 revenues equivalent to $130B. In Japan and Korea, much of the industrial production capacity is owned by a small number of mega corporations, called Zaibatsu or Chaebol. It could be that many of the components in LG OLED TVs are actually sourced from within LG, which could give them favorable terms. Even those components sourced externally could be negotiated with the heft and clout of the larger LG group, perhaps even by combining orders with other business units. For all I know, the South Korean government could even have protectionist policies that favor or prioritize domestic buyers of components and materials.

There are just too many particulars that separate the respective companies and products. I think it's not a useful predictor of GPU pricing. To treat it as such, you really should establish a strong correlation and make a compelling argument for why we should expect their prices to move in tandem. That would involve doing a cost and technical analysis of OLED panel technology and its manufacturing process, in order to properly account for its contribution and the changes or variations that have occurred over the years.

If you just look at the pricing of various commodities, you'll often see deviations between them. For instance, egg prices are way up right now, way more than most other agricultural products. Why? Bird flu. If you'd happened to pick eggs as a predictor of, say, milk, you'd see a similar disconnect.
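
If you really wanted to use one product's price history as a predictor of another's, the first step would be something like this (made-up numbers, purely to illustrate the kind of check I mean):

Python:
# Illustrative only: check how strongly two price series actually move together before
# treating one as a predictor of the other. The figures below are invented.
from statistics import correlation  # Python 3.10+

oled_tv_prices = [3500, 3000, 2400, 1900, 1500]   # hypothetical yearly prices
gpu_prices     = [700, 1200, 1500, 1200, 1100]    # hypothetical yearly prices

r = correlation(oled_tv_prices, gpu_prices)
print(f"Pearson r = {r:.2f}")  # near +/-1: move together (or opposite); near 0: unrelated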
 
Last edited:

bit_user

Titan
Ambassador
I have a vested interest in citizens that understand the basic facts of economics. Ignorant people tend to support ignorant policies, which harms us all.
Yes, but we'd also do well to remember that the purpose of capitalism is to optimize resource allocation and distribution, not simply "to get rich," as too many seem to think. Where there are market failures, we need regulators to intervene. This is ultimately in the interest of capitalists: if the masses don't see capitalism working to their benefit, they'll surely turn against it. We've seen this time and again.

Capitalism is only one of many possible ways for societies to manage resources. I happen to think that regulated capitalism is the best of them, but it's a tricky balance to get right and involves continual oversight and intervention by regulators. The market participants also tend to evolve faster than regulators and regulations; in the worst case, you get forms of regulatory capture. So, it's not without its perils, especially when politics and economics are as deeply entangled as in the USA.

you may wish to visit the consumer paradises of Cuba, North Korea, or some former Soviet Republics.
China has had some notable successes. I wouldn't claim it's particularly communist, though. I think the system is getting rather creaky, as its government changes from a technocratic to an authoritarian one, where political loyalty is valued more than competence.

BTW, you omitted Venezuela from your list, though a significant part of its problems was caused by the "resource curse" of massive oil deposits distorting the economy and giving the government the largesse to prop up its socialist system.

When people are gaslit into believing they're being economically exploited, bad things tend to happen.
We frequently see the reverse. The backlash against free trade and globalization is a good example: economists proclaim its virtues, yet people turn to ever more extreme forms of protectionism as the domestic factory closures accumulate.

It's funny how even a term like "meritocracy" gets spun around. It was originally used as a critique of capitalism, and it's now often touted as one of its virtues. The flip side is an implied victim-blaming if you don't succeed, no matter what headwinds you face or how badly the deck is stacked against you.
 
Last edited:
  • Like
Reactions: Endymio

bit_user

Titan
Ambassador
I remember hearing back when LEP/OLED were first hyped up as future display technologies that we'd have cheap printable displays in about 10 years, so cheap that replacement of the display surface would be an acceptable solution to the burn-in issue.
I never heard that - and I was following OLED rather avidly, for a time, as I had hoped to skip LCD entirely.
 

InvalidError

Titan
Moderator
I never heard that - and I was following OLED rather avidly, for a time, as I had hoped to skip LCD entirely.
DuPont thought it'd manage to make printable OLEDs 40% cheaper than LCDs back in 2011. Kateeva thought it would have cheaper-than-LCD (also by ~40%) polymer-LED displays "in just two years" back in 2005.
https://www.oled-info.com/duponts-printable-oleds-be-cheaper-lcds-40

So far, OLED is still ~10 years behind LCDs on cost for a given panel size.
 
  • Like
Reactions: bit_user

sitehostplus

Honorable
Jan 6, 2018
404
163
10,870
I wonder whether any of the big US automakers use TSMC for car chips. I have seen a number of reports of cutbacks for TSMC/GloFo/foundries, and yet all of the automakers say supply constraints still affect them. I know there is a lot of lead time between getting orders and actually producing the required output, so cutbacks now might be due to expected order reductions for future output. So I guess TSMC probably doesn't do a lot of chips for the auto industry, since it focuses on higher-end nodes, but I'd be curious where the real supply constraints are right now.
Auto chips are too simple to make, and do not generate much profit as a result.

So, no, I don't think TSMC makes those at all. More likely, they come from China.