AI Tools Take Chip Design Industry by Storm: 200+ Chips Tape Out

JTWrenn

Distinguished
Aug 5, 2008
I understand that AI can help, but can someone explain which stage of chip design is improved, and in what way?
Probably something that's present in every stage, in some form or another. Likely used as double checks or optimizations for layouts, or maybe even as a debugger. Used to recommend things when people get stumped on some part of a project. Used to create rough drafts that are then tweaked by humans. Used to do something complex for a human to apply, but that has been done before: say, reorganizing a group of data the way it was done previously, in a pattern a normal algorithm couldn't reproduce but AI can mimic.

In short: as a tool in every step, rather than something that completely takes over any step.
 
Probably something that's present in every stage, in some form or another. Likely used as double checks or optimizations for layouts, or maybe even as a debugger. Used to recommend things when people get stumped on some part of a project. Used to create rough drafts that are then tweaked by humans. Used to do something complex for a human to apply, but that has been done before: say, reorganizing a group of data the way it was done previously, in a pattern a normal algorithm couldn't reproduce but AI can mimic.

In short: as a tool in every step, rather than something that completely takes over any step.

Every chip has functional blocks or subsystems. Placing these blocks by hand meant educated guesses: which arrangement would have the least signal propagation time, or the best path for high-current routing? APU here, look-ahead buffer there. Decoder here, cache there, OoO execution there, security there, memory controller there...

There are tons of these logic blocks. They used to be placed by hand, with intuition built from experience and some rough numbers. AI does a better job at placement.
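To make the placement problem above concrete: pick positions for the blocks that minimize total wire distance between connected blocks. Here is a toy sketch using simulated annealing; the block names, nets, and grid are made up for illustration, and this is not any vendor's actual algorithm.

```python
import math
import random

random.seed(0)

# Hypothetical blocks and the nets (wires) connecting them.
blocks = ["decoder", "cache", "alu", "mem_ctrl"]
nets = [("decoder", "alu"), ("alu", "cache"), ("cache", "mem_ctrl")]

# Arbitrary starting placement on a 2x2 grid, one block per site.
pos = {"decoder": (0, 0), "cache": (1, 1), "alu": (1, 0), "mem_ctrl": (0, 1)}

def wirelength(p):
    # Total Manhattan distance over all nets -- the cost to minimize.
    return sum(abs(p[a][0] - p[b][0]) + abs(p[a][1] - p[b][1]) for a, b in nets)

temp, cost = 2.0, wirelength(pos)
best, best_cost = dict(pos), cost
for _ in range(2000):
    a, b = random.sample(blocks, 2)
    pos[a], pos[b] = pos[b], pos[a]           # propose swapping two blocks
    new_cost = wirelength(pos)
    # Accept improvements; occasionally accept worse moves to escape
    # local minima, with probability shrinking as the "temperature" cools.
    if new_cost <= cost or random.random() < math.exp((cost - new_cost) / temp):
        cost = new_cost
        if cost < best_cost:
            best, best_cost = dict(pos), cost
    else:
        pos[a], pos[b] = pos[b], pos[a]       # reject: undo the swap
    temp = max(0.01, temp * 0.995)            # cooling schedule

print(best_cost, best)
```

Real placers juggle timing, congestion, and power on millions of cells, which is exactly where ML-guided search is being applied; the mechanics of "perturb, score, keep the best" are the same idea in miniature.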
 

JTWrenn

Distinguished
Aug 5, 2008
Every chip has functional blocks or subsystems. Placing these blocks by hand meant educated guesses: which arrangement would have the least signal propagation time, or the best path for high-current routing? APU here, look-ahead buffer there. Decoder here, cache there, OoO execution there, security there, memory controller there...

There are tons of these logic blocks. They used to be placed by hand, with intuition built from experience and some rough numbers. AI does a better job at placement.
Yup, as I said: "Likely used as double checks or optimizations for layouts, or maybe even as a debugger."
 

bit_user

Titan
Ambassador
In short as a tool in every step rather than to completely take over any step.
I think that's not accurate. It's probably used for placement & routing, exclusively.

Sort of like how programmers tend to use compilers, instead of hand-coding assembly language. Not a perfect analogy, but the point is that nobody is taking compiler output as a "first pass" and then tweaking it by hand.

I always laugh when they claim that "AI" does not make mistakes.
The resulting design is already subjected to a validation process when humans do it. So they would simply do the same with an AI-driven process.

that imperfection is what makes humans perfect.
Human fallibility is no virtue, in engineering or manufacturing.
 

bit_user

Titan
Ambassador
Yes? I just don't believe in perfection.
Yeah, it's weird that the article even brings that up. The resulting design has to meet its timing targets; you wouldn't fab a chip without checking that, whether it was designed by humans or AI.
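For illustration, that timing check boils down to: find the longest delay path through the logic and confirm it fits within the clock period. A toy sketch, with made-up gate names and delays (real static timing analysis also models setup/hold, wire delay, and process corners):

```python
# Hypothetical gate graph: per-gate delays in ns, and each gate's fan-in.
gate_delay = {"in": 0.0, "dec": 0.3, "alu": 0.9, "mux": 0.2, "out": 0.1}
fanin = {"in": [], "dec": ["in"], "alu": ["dec"],
         "mux": ["dec", "alu"], "out": ["mux"]}

def arrival(g):
    # Latest signal arrival time at gate g's output:
    # its own delay plus the slowest of its inputs.
    return gate_delay[g] + max((arrival(p) for p in fanin[g]), default=0.0)

critical = max(arrival(g) for g in gate_delay)   # critical-path delay
clock_period = 2.0                               # ns, i.e. a 500 MHz target
slack = clock_period - critical                  # positive slack = timing met

print(f"critical path {critical:.1f} ns, slack {slack:.1f} ns")
```

If slack goes negative anywhere, the chip fails timing and the placement/routing has to change, no matter who or what produced it.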

What's interesting is that they're also talking about using AI in verification, which suggests there are some difficult-to-measure factors that tend to defy conventional verification approaches.
 

JTWrenn

Distinguished
Aug 5, 2008
I think that's not accurate. It's probably used for placement & routing, exclusively.

Sort of like how programmers tend to use compilers, instead of hand-coding assembly language. Not a perfect analogy, but the point is that nobody is taking compiler output as a "first pass" and then tweaking it by hand.
For now, maybe, but it's coming.