News Imec patterns first logic and DRAM transistors using High-NA litho tools

This significant accomplishment underscores High-NA technology's ability to replace several mask layers with a single exposure, simplifying manufacturing processes, shrinking cycle times, and reducing costs.
I hope our journey towards $0.10/GB 3D DRAM goes well.
 
I hope our journey towards $0.10/GB 3D DRAM goes well.
Well, single-patterning is ultimately about cost reduction, which is key to increasing transistor count in whatever chip we're talking about.
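To put rough numbers on the cost angle, here's a toy comparison. Every figure below is made up for illustration (none of it comes from imec or ASML); the point is just that a High-NA pass can cost more per exposure and still win, because it replaces multiple passes plus the etch/deposition steps between them:

```python
# Back-of-the-envelope sketch of why single patterning cuts cost.
# Every figure below is hypothetical; none of this comes from imec or ASML.

def layer_cost(exposures, cost_per_exposure, other_steps_cost):
    """Total patterning cost for one layer, in arbitrary units."""
    return exposures * cost_per_exposure + other_steps_cost

# Hypothetical triple-patterned 0.33-NA EUV layer vs. one High-NA exposure:
low_na_triple = layer_cost(exposures=3, cost_per_exposure=100, other_steps_cost=60)
high_na_single = layer_cost(exposures=1, cost_per_exposure=180, other_steps_cost=20)

print(low_na_triple, high_na_single)  # 360 vs. 200: fewer passes win despite a pricier tool
```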

Every time new CPUs or GPUs launch, much of the commentary is about prices. It would seem that ever-higher pricing risks stifling industry growth, yet higher prices are the basic trajectory we're on.
 
Every time new CPUs or GPUs launch, much of the commentary is about prices. It would seem that ever-higher pricing risks stifling industry growth, yet higher prices are the basic trajectory we're on.
DRAM pricing swings wildly based on supply and demand cycles, but if it becomes cheaper to make, it will trend down. That is my expectation if they can get it on a trajectory similar to 3D NAND, which is in the works. DRAM requirements may never again go up as fast as the prices can come down.
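Just to frame the $0.10/GB hope above, a toy compounding calculation; the starting price and decline rate here are assumptions, not data:

```python
import math

# How long until $0.10/GB at a steady annual price decline? The starting
# price and decline rate are assumptions, purely to frame the trajectory.

def years_to_target(price_per_gb, target, annual_decline):
    return math.log(target / price_per_gb) / math.log(1 - annual_decline)

# From a hypothetical $2/GB, declining 20% per year:
print(years_to_target(2.00, 0.10, 0.20))  # ~13.4 years
```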

I don't know how bad things actually are after we take inflation out of the equation. PCs and DIY are threatened by various developments, but if sub-$200 CPUs and GPUs are what we want, there's always the used market, and APUs or sub-APUs (any desktop CPU with a weak iGPU) will do the trick for some people.
 
DRAM pricing swings wildly based on supply and demand cycles, but if it becomes cheaper to make, it will trend down.
This year, I bought more than I really required, just to fill both channels with dual-rank memory. It certainly doesn't help the "demand" problem that nobody in the industry makes lower-capacity chips, so I have to buy 32 GB DIMMs just to get dual rank.
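For the curious, the rank arithmetic works out like this; a rough sketch, assuming ordinary x8 chips on a standard 64-bit non-ECC DIMM:

```python
# Why 32 GB DIMMs just to get dual rank: assuming ordinary x8 chips on a
# standard 64-bit (non-ECC) DIMM, one rank needs 64 / 8 = 8 chips.

CHIPS_PER_RANK = 64 // 8

def dimm_capacity_gb(chip_density_gbit, ranks):
    chip_gb = chip_density_gbit / 8  # gigabits per chip -> gigabytes
    return chip_gb * CHIPS_PER_RANK * ranks

print(dimm_capacity_gb(16, ranks=2))  # 16Gb chips: dual rank forces 32.0 GB
print(dimm_capacity_gb(8, ranks=2))   # 8Gb chips would allow a 16.0 GB dual-rank DIMM
```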

I don't know how bad things actually are after we take inflation out of the equation.

[Chart: cost per transistor stopped dropping a decade ago, at 28nm]

Source: https://www.tomshardware.com/tech-i...nsistor-stopped-dropping-a-decade-ago-at-28nm

While I'm at it, density and efficiency are also flattening:

PCs and DIY are threatened by various developments, but if sub-$200 CPUs and GPUs are what we want, there's always the used market,
I wonder if smaller nodes will see more instances of accelerated wear-out, like Intel's Raptor Lake problem. I had always expected us to reach a point where CPUs just started failing after a certain amount of use and effectively became "consumables". If that happens, it would substantially undermine the used market.

and APUs or sub-APUs (any desktop with weak iGPU) will do the trick for some people.
Since I don't play video games, I have typically just used integrated graphics since Sandy Bridge.
 
This year, I bought more than I really required, just to fill both channels with dual-rank memory. It certainly doesn't help the "demand" problem that nobody in the industry makes lower-capacity chips, so I have to buy 32 GB DIMMs just to get dual rank.
I don't care about ranks but I was able to get 64 GB DDR4 cheap. Basically buy the dip and go big at the same time, even if you have to wait years in between purchases. I don't buy RAM when I don't have a system to test it in, so I would skip a great DDR5 deal for now, like this YMMV one.

While I'm at it, density and efficiency are also flattening:
Not good trends, obviously, but on the other hand Zen 5 is packing 28% more transistors with lower MSRPs than the previous generation. Not that +28% transistors means 28% better performance.

We might see some (positive) cost scaling with High-NA EUV or other developments. Even if we don't, chips could stay at around the same transistor count (less die area, no core-count increases) and you would still get mild performance/efficiency benefits. There's a slight silver lining in the dark clouds ahead.
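To illustrate why cost scaling isn't automatic, a toy die-cost model; the wafer prices, die areas, and scaling factors below are invented, not foundry figures:

```python
import math

# Toy die-cost model: same transistor count on a denser node -> smaller die
# -> more dies per wafer. Wafer costs and areas below are invented numbers.

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation: gross dies minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_die(die_area_mm2, wafer_cost):
    return wafer_cost / dies_per_wafer(die_area_mm2)

# Same design after a hypothetical 0.8x area shrink, on a wafer costing 1.3x more:
print(cost_per_die(80, wafer_cost=10_000))        # older node: ~12.4
print(cost_per_die(80 * 0.8, wafer_cost=13_000))  # newer node: ~12.7, i.e. no savings
```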

I wonder if smaller nodes will see more instances of accelerated wear-out, like Intel's Raptor Lake problem. I had always expected us to reach a point where CPUs just started failing after a certain amount of use and effectively became "consumables". If that happens, it would substantially undermine the used market.
I think we'll have to take it on a case-by-case basis. For all we know, we'll adopt some QFET-like transistor that has amazing endurance despite being smaller. But simulations and predictions might not accurately tell you whether things are going to start failing after 10 years in the real world (a slow decline, not a rapid Raptor Lake-style failure), and 10 years is plenty of time for products to spread on the used market.
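If you want to see how a slow wear-out mode could play out, here's a sketch using an assumed Weibull failure curve; the shape and scale values are invented, not measured reliability data:

```python
import math

# Assumed Weibull wear-out curve (shape/scale are invented, not measured).
# A gradual mode like this leaves years of usable life on the used market.

def fraction_failed(years, scale, shape):
    """Weibull CDF: probability a part has failed by `years`."""
    return 1 - math.exp(-((years / scale) ** shape))

print(fraction_failed(5, scale=15, shape=3))   # ~0.036: ~3.6% dead by year 5
print(fraction_failed(10, scale=15, shape=3))  # ~0.256: ~25.6% dead by year 10
```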

My advice is to hoard. See a good $100 desktop computer on sale? I did, and I put it in the hoard. This strategy works better with small form factor systems, but I would even consider stripping larger ones for parts.
 
I don't care about ranks but I was able to get 64 GB DDR4 cheap. Basically buy the dip and go big at the same time, even if you have to wait years in between purchases.
But then you might be stuck with slower memory. I basically waited as long as I could, so that I could try to get faster RAM that I could potentially reuse in a future machine.

Not good trends, obviously, but on the other hand Zen 5 is packing 28% more transistors with lower MSRPs than the previous generation. Not that +28% transistors means 28% better performance.
It had to use an older node, however; N4P is still N5-derived. It also reused the same N6-based IO die, with implications for memory support and idle power.

We might see some (positive) cost scaling with High-NA EUV or other developments. Even if we don't, chips could stay at around the same transistor count (less die area, no core-count increases) and you would still get mild performance/efficiency benefits. There's a slight silver lining in the dark clouds ahead.
Yeah, it's not like Moore's Law suddenly dies a fiery death. The benefits just arrive more slowly, especially if you aren't willing to spend more.

My advice is to hoard. See a good $100 desktop computer on sale? I did, and I put it in the hoard. This strategy works better with small form factor systems, but I would even consider stripping larger ones for parts.
I buy some stuff when I see deals. Last year, I bought some SSDs, because I expected to use them this year and knew prices would be higher.

I'm more likely to do that sort of thing with cases, power supplies (although this is how I got stuck with an ATX 3.0 model; I don't really care since I don't run high-power GPUs), and some networking gear (I got a great deal on a multi-gigabit switch in early 2020 that is only now starting to get surpassed).
 