News TSMC's wafer pricing now $18,000 for a 3nm wafer, increased over 3X in 10 years: Analyst

I do wonder what the "limit" is on the price of electronics.

Most people are effectively making less (inflation means their money doesn't go as far), yet goods keep getting pricier, and a company can only charge so much before too few people buy and it starts losing money.
 
Why is anyone surprised? Chips on the 3nm node pack in way more transistors than 28nm, so you have a choice of making the chips smaller with the same processing power, or the same size with a lot more processing power. Given the propensity of apps to add more features and grow increasingly complex, the latter seems to be happening, hence costs increase. But with some new smartphones hitting $1,500, there will be a point where people stop paying.
 
Not completely blaming the article writer here but it seems a pretty serious oversight to go by wafer price and not even mention the cost per transistor over the same timeframe. At 28nm, transistor density is around 14M/mm2 while 3nm is at about 300M/mm2. In other words, all else being equal, you would get about 21x the number of chips on a 3nm wafer vs. a 28nm wafer (never mind order-of-magnitude better efficiency/clock scaling etc.)

Accounting for the 3x cost increase, an equally accurate headline could be:

TSMC reduces cost per transistor by 7x over 10 years
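The density figures above make this easy to sanity-check. A quick back-of-envelope script (density and wafer price numbers taken from this post; "all else being equal" is doing a lot of work here):

```python
# Rough check of the cost-per-transistor claim above.
# Density figures (transistors/mm^2) and the ~3x wafer price
# increase are the estimates quoted in the post, not official data.
density_28nm = 14e6    # ~14M transistors per mm^2 at 28nm
density_3nm = 300e6    # ~300M transistors per mm^2 at 3nm
price_ratio = 3.0      # 3nm wafer costs ~3x a 28nm wafer

density_gain = density_3nm / density_28nm        # ~21.4x
cost_per_transistor_ratio = price_ratio / density_gain

print(f"Density gain: {density_gain:.1f}x")
print(f"3nm cost per transistor vs 28nm: {cost_per_transistor_ratio:.2f}x")
print(f"Cost-per-transistor reduction: {1 / cost_per_transistor_ratio:.1f}x")
```

With those inputs the reduction comes out to roughly 7x, which is where the alternative headline comes from.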

The math doesn't check out. Cost per transistor hasn't dropped in 15 years. It's been slowly going up over that time.

Not completely blaming the article writer here but it seems a pretty serious oversight to go by wafer price and not even mention the cost per transistor over the same timeframe. At 28nm, transistor density is around 14M/mm2 while 3nm is at about 300M/mm2. In other words, all else being equal, you would get about 21x the number of chips on a 3nm wafer vs. a 28nm wafer (never mind order-of-magnitude better efficiency/clock scaling etc.)

Accounting for the 3x cost increase, an equally accurate headline could be:

TSMC reduces cost per transistor by 7x over 10 years
You do realize you can't get 21x the number of chips just because you have 21x the transistors, right? You can't just infinitely shrink everything. SRAM uses something like 30% of the transistor budget and doesn't scale nearly as much as logic on modern nodes, not to mention things like massively overbuilt modern branch predictors that recognize insanely complicated patterns to keep lost work from killing your efficiency.
 
According to Yahoo from last year:

https://finance.yahoo.com/news/tsmc-may-increase-wafer-pricing-104033184.html

Negotiations with AI and HPC customers, such as Nvidia, suggest these clients can tolerate approximately 10% price hikes for 4nm-class wafers from around $18,000 per wafer to around $20,000 per wafer. As a result, the 4nm and 5nm nodes, primarily used by companies like AMD and Nvidia, are expected to see an 11% blended average selling price (ASP) hike. This means that prices of N4/N5 wafers may increase by approximately 25% since Q1 2021, at least for some customers.

4nm already costs $20k per wafer now. A 25% increase in cost for the same node since 2021. This is why Gelsinger was trying to get Intel into fabbing for external customers.

While wafer prices depend on actual agreements and volumes, some believe that a wafer produced on TSMC's N3 node costs around $20,000 or higher, and will increase next year. Morgan Stanley believes there should be room for companies to 'pass through the additional costs to end users.'

$18k per 3nm wafer seems like a really optimistic estimate. Ironically, the link in the quote above points to an article on this site about 3nm wafers costing $20k.
 
It could make sense as an Apple exclusive price given that they've bought first runs of TSMC's last few nodes entirely. I certainly agree that likely isn't the standard rate for the node.
I think it probably is now. The $20,000-$25,000 Apple was paying was for a first-gen node with only one production line available. Now they have something like four different 3nm sites, with multiple production lines in each.
 
According to Yahoo from last year:

https://finance.yahoo.com/news/tsmc-may-increase-wafer-pricing-104033184.html

4nm already costs $20k per wafer now. A 25% increase in cost for the same node since 2021. This is why Gelsinger was trying to get Intel into fabbing for external customers.

$18k per 3nm wafer seems like a really optimistic estimate. Ironically, the link in the quote above points to an article on this site about 3nm wafers costing $20k.
Prices on new nodes go down very quickly when the next one is getting ready for production ramping, and 2nm is getting very close to production ramping. The gains are getting small enough that new nodes can’t justify massive price hikes as well.
 
Prices on new nodes go down very quickly when the next one is getting ready for production ramping, and 2nm is getting very close to production ramping. The gains are getting small enough that new nodes can’t justify massive price hikes as well.
Did you miss the part where TSMC announced price hikes? That typically doesn't indicate rapid drops in price are coming soon.

The spiraling costs of fab construction are going to dictate continuing cost increases whether they can be justified or not. Gov't subsidies already play a big role in getting these built, sooner or later they will likely become a requirement for leading edge fab construction.
 
Accounting for the 3x cost increase, an equally accurate headline could be:

TSMC reduces cost per transistor by 7x over 10 years
This is also misleading, because it implies they're still on that trend. However, cost per transistor has been increasing for a few generations, already.

As the article pointed out, the big gains in transistor density are well behind us, yet price increases for new nodes are probably as big or bigger than ever before.
 
... SRAM ... doesn’t scale as nearly much as logic on modern nodes ...
Not only that, but I/O isn't scaling well, which is why Intel and AMD are both using an older node for things like memory and PCIe controllers. That's even while their L3 cache is (for the most part) still using the same node as their compute cores!

Another scaling issue that's recently been covered is wire scaling.


Intel is experimenting with a new technology based on Ruthenium and air gaps, though they didn't specify how much improvement it achieved. Even if everything works as they hope, it'll be a while before it enters production.
 
This is pretty old news. The overall cost, when you factor in not just logic but SRAM (which every SoC or CPU has), basically started rising after 7nm. The cost per wafer at 7nm was $9,350 vs. $16,988 for 5nm wafers. Going to 3nm was 'only' 25% more, but 3nm has 4/5nm-like SRAM density, and SRAM is a huge portion of modern SoC/CPU dies.

Barring some new technologies, 3/2 nm is likely to be the end of the line for affordable CPUs and SoCs. Really I'm not even sure 7nm won't hold that distinction.
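To see why stagnant SRAM scaling eats the savings, here's a rough sketch. The wafer prices are from my post above; the 70/30 logic/SRAM area split and the shrink factors are assumptions for illustration, not TSMC data:

```python
# Illustrative: why a die with lots of SRAM barely gets cheaper on 3nm.
wafer_5nm = 16988.0            # USD per 5nm wafer (figure quoted above)
wafer_3nm = wafer_5nm * 1.25   # "only 25% more" for 3nm

die_5nm = 100.0                # mm^2, hypothetical SoC on 5nm
logic_frac, sram_frac = 0.70, 0.30   # assumed area split
logic_shrink = 0.6             # logic area scales to ~60% (assumed)
sram_shrink = 1.0              # SRAM barely shrinks (approximation)

die_3nm = die_5nm * (logic_frac * logic_shrink + sram_frac * sram_shrink)
# Cost per die ~ wafer price * die area / wafer area (ignoring yield
# and edge loss), so the ratio reduces to price ratio * area ratio.
cost_ratio = (wafer_3nm * die_3nm) / (wafer_5nm * die_5nm)
print(f"3nm die area: {die_3nm:.0f} mm^2")
print(f"3nm die cost vs 5nm: {cost_ratio:.2f}x")
```

Under these assumptions the die shrinks to 72 mm^2 but only gets about 10% cheaper, despite being a full node ahead.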

 
Barring some new technologies, 3/2 nm is likely to be the end of the line for affordable CPUs and SoCs. Really I'm not even sure 7nm won't hold that distinction.
I think CPUs have an easier time than GPUs, simply because the transistor count of CPU cores doesn't increase as fast and core counts have somewhat stabilized.

What does seem to be circling the drain is GPU scaling, due to the extent they're dependent on more transistors for generational performance increases. It's perhaps ironic that Nvidia only started using lots of on-die SRAM at a time when the SRAM is no longer scaling well. On that point, I think AMD's chiplet strategy had merit, but perhaps just needs a couple more iterations for them to iron out the wrinkles.

What I expect will happen is that we just see longer gaps between generations. Maybe, instead of a new generation every 2 years, it'll stretch out to 3 years and beyond.
 
Barring some new technologies, 3/2 nm is likely to be the end of the line for affordable CPUs and SoCs. Really I'm not even sure 7nm won't hold that distinction.
TSMC is touting the N4C node (5nm variation) as a more affordable node, with volume production this year. And they will have more nodes like this in the future. Even if it's not a true budget node (-8.5%), there will usually be an opportunity to take costs a step back. If a node initially costs $20k, the long-lived budget version could go down over time.

What would be funny is if almost all chips eventually get 3D SRAM chiplets to cut down costs.
 
LOL, it's a phone. It does the SAME thing a phone did 15 years ago. Why anyone would spend the money Apple thinks their products are worth is beyond me. I have never paid more than $300 for a smartphone and never will. People don't need 8 cameras on their phones.
 
LOL, it's a phone. It does the SAME thing a phone did 15 years ago. Why anyone would spend the money Apple thinks their products are worth is beyond me. I have never paid more than $300 for a smartphone and never will.
I hear this argument, but one problem we face is that app developers tend to use and target newer devices. So, they pack in features and only tune performance well enough to run on those devices. This leads to older phones feeling slower and more sluggish over time.

It's a similar story with web developers. So, if you use your phone to browse the web, it'll be getting increasingly bogged down by sites using more videos and other heavy-weight web features.

I'd say it's probably easier to get along with a low-cost phone if you upgrade more frequently. I had a 6-year-old phone that I finally upgraded last year, but it was a flagship model when it launched (or at least had a top-tier SoC). The other thing that was starting to be a concern is that it got only one Android update, and some apps don't support older Android versions.
 
LOL, it's a phone. It does the SAME thing a phone did 15 years ago. Why anyone would spend the money Apple thinks their products are worth is beyond me. I have never paid more than $300 for a smartphone and never will. People don't need 8 cameras on their phones.
There are benefits to having multiple cameras. But you can continue to slum it on old phones if you want. Nobody asked.

The pricing of these nodes also affects CPUs and GPUs.
 
The other thing that was starting to be a concern is that it got only one Android update and some apps don't support older Android versions.
For all of Microsoft's faults, this is one place where Windows is far superior to iOS/Android. Apple was better than Google about updates for a long time, but even there they had arbitrary cutoffs (though I think the 64-bit cutoff was reasonable). Google took way too long to get Android in shape for easier updates, but at least now it's about the same as Apple, depending on the device purchased.

The awful updates did get me to buy a Pixel and stay with them, though, and now Samsung is as good or better with updates, though they're equally arbitrary. This is generally an even bigger problem with budget phones (though the cheapest Pixels still have the same update cycle as the others).
 
Why is anyone surprised? Chips on the 3nm node pack in way more transistors than 28nm, so you have a choice of making the chips smaller with the same processing power, or the same size with a lot more processing power. Given the propensity of apps to add more features and grow increasingly complex, the latter seems to be happening, hence costs increase. But with some new smartphones hitting $1,500, there will be a point where people stop paying.
I was going to say the same thing: yes, cost per mm^2 has gone up, but cost per transistor has gone down. So if all you want to do is produce the same design on a newer node, area will go down, allowing for more chips per wafer, and if the defect rate is the same between the old and new nodes, the move will result in more usable dies.

Edit: in writing this, I implied 5nm to 3nm but did not state that. Sorry for the confusion.
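The defect-rate point can be sketched with a simple Poisson yield model (yield ≈ exp(-defect density × die area)). All the numbers here (wafer size, defect density, die areas) are illustrative assumptions, not real node data:

```python
import math

# Smaller die -> more candidate dies per wafer AND higher yield per die,
# assuming the same defect density on both nodes.
wafer_area = math.pi * (300 / 2) ** 2   # 300 mm wafer, ~70,686 mm^2
defect_density = 0.001                  # defects per mm^2 (assumed equal)

def good_dies(die_area_mm2):
    dies = wafer_area / die_area_mm2                   # ignore edge loss
    yield_frac = math.exp(-defect_density * die_area_mm2)  # Poisson model
    return dies * yield_frac

old = good_dies(150.0)   # design on the older node (assumed area)
new = good_dies(100.0)   # same design shrunk on the newer node
print(f"Good dies (old node): {old:.0f}")
print(f"Good dies (new node): {new:.0f}")
```

With these numbers the shrink yields roughly 640 good dies vs. roughly 406, i.e. the gain compounds: more dies fit, and each one is less likely to catch a defect.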
 