News Lisa Su: AI Will Dominate Chip Design

AMD pivoted to using more machine learning sooner than Intel, largely due to its smaller R&D budget, but I think in the end it has served them well. It sounds like most of what AMD/Intel are using it for is time savings. It'll be interesting to see whether the increased AI usage translates into additional supported features.

I don't really think this is going to have much of an impact on the timing/generations of products we see on shelves. Though it may allow for more diverse SKUs as both companies move towards more extensive tile/chiplet designs, since the engineering time would be lower.
 
I want to make something clear. If they focus on TRUE AI processing advancements, edge computing will not benefit.

Make no mistake, the major providers have invested HEAVILY (as in: all in) in cloud computing. They want you to treat the cloud as the computer. No single PC is going to be doing any meaningful amount of AI-driven work, as it takes too much computational power. Now, could someone build a neural processing P2P network? Sure. But companies like AMD, Intel, and others will simply disable features or, as I'm starting to believe, these AI-focused future chips will in practice stay in the server chip lineups.

Yes, manufacturers might tout "AI POWEREDZ" on chip retail boxes and commercials (do those still exist?), but they can't let this genie out of the bottle. They have to drive sales towards their clouds.
 
Yes, manufacturers might tout "AI POWEREDZ" on chip retail boxes and commercials (do those still exist?), but they can't let this genie out of the bottle. They have to drive sales towards their clouds.

Why not? I saw a commercial for an "AI" washing machine.
 
It sounds like most of what AMD/Intel are using it for is time savings. It'll be interesting to see whether the increased AI usage translates into additional supported features.
According to prior claims, it seems it can deliver a performance/power improvement nearly equivalent to a node shrink. So, I don't see it as either/or.

I don't really think this is going to have much of an impact on the timing/generations of products we see on shelves.
It could enable more rapid turnaround of spins tailored to specific market niches or in response to the competitive or market environment.
 
No single PC is going to be doing any meaningful amount of AI-driven work, as it takes too much computational power.
Depends on what. You can still train smaller networks on desktop hardware.
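To put a rough shape on "smaller networks": something like the sketch below (PyTorch, with a made-up toy dataset and layer sizes chosen purely for illustration) trains comfortably on a single consumer GPU, or even a CPU, in seconds.

```python
# Minimal sketch: training a small classifier on one desktop GPU/CPU.
# Assumes PyTorch is installed; the dataset and model sizes are illustrative only.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

# Toy dataset: 10k random samples, 128 features, 10 classes (stand-in for a real set).
X = torch.randn(10_000, 128)
y = torch.randint(0, 10, (10_000,))

model = nn.Sequential(
    nn.Linear(128, 256), nn.ReLU(),
    nn.Linear(256, 10),
).to(device)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    for i in range(0, len(X), 256):          # mini-batches of 256
        xb = X[i:i + 256].to(device)
        yb = y[i:i + 256].to(device)
        opt.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()
        opt.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```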


Training does take an awful lot of compute and memory bandwidth, however. Certain networks simply can't be trained with an amount of compute and infrastructure a consumer could possibly afford, much less power. It was rumored to take something like a month on 10,000 A100 GPUs to train GPT-3.
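As a back-of-the-envelope check on why that class of model is out of reach for a single machine, here's a rough calculation. Both figures are assumptions, not measurements: the commonly cited ~3×10^23 FLOPs estimate for GPT-3's training run, and an assumed ~50 TFLOP/s sustained on one high-end consumer GPU.

```python
# Rough arithmetic only; both numbers are assumptions, not measurements.
total_flops = 3e23          # commonly cited estimate for GPT-3's total training compute
desktop_flops_per_s = 5e13  # assumed ~50 TFLOP/s sustained on one high-end consumer GPU

seconds = total_flops / desktop_flops_per_s
years = seconds / (3600 * 24 * 365)
print(f"~{years:,.0f} years on a single desktop GPU")  # on the order of two centuries
```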

Now, could someone build a neural processing P2P network? Sure. But companies like AMD, Intel, and others will simply disable features or,
In general, no. It would be too expensive to have a substantial amount of compute on-die that's just disabled. Intel's client Golden Cove cores physically don't have AMX, for instance.
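If you want to see that for yourself on Linux, the kernel only reports the AMX feature flags when the silicon (and kernel) actually expose them, so a quick scan of /proc/cpuinfo is enough. A minimal sketch, Linux-only and assuming a reasonably recent kernel:

```python
# Check whether the running CPU exposes AMX via the kernel's reported
# feature flags ("amx_tile", "amx_bf16", "amx_int8") in /proc/cpuinfo.
def has_amx(cpuinfo_path="/proc/cpuinfo"):
    with open(cpuinfo_path) as f:
        text = f.read()
    return any(flag in text for flag in ("amx_tile", "amx_bf16", "amx_int8"))

if __name__ == "__main__":
    print("AMX present:", has_amx())
```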

However, the main instance of this that we do actually know about wasn't by AMD.

"NVIDIA gave the GeForce cards a singular limitation that is none the less very important to the professional market. In their highest-precision FP16 mode, Turing is capable of accumulating at FP32 for greater precision; however on the GeForce cards this operation is limited to half-speed throughput. This limitation has been removed for the Titan RTX, and as a result it’s capable of full-speed FP32 accumulation throughput on its tensor cores."

Source: https://www.anandtech.com/show/13668/nvidia-unveils-rtx-titan-2500-top-turing
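As a rough illustration of why FP32 accumulation is the capability worth gating (a toy numerical sketch, not a model of how tensor cores work internally): summing many small FP16 values in an FP16 accumulator loses precision that an FP32 accumulator preserves.

```python
import numpy as np

# Illustrative only: accumulate the same FP16 inputs in FP16 vs. FP32.
x = np.full(100_000, 0.0001, dtype=np.float16)

sum_fp16 = np.float16(0)
for v in x:                              # accumulate in FP16
    sum_fp16 = np.float16(sum_fp16 + v)

sum_fp32 = x.astype(np.float32).sum()    # accumulate in FP32

print("FP16 accumulator:", float(sum_fp16))  # stalls far below the true total of ~10
print("FP32 accumulator:", float(sum_fp32))  # ~10.0
```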
 
AI will only make CPUs/GPUs even more expensive, for no good reason. And eventually it will create another chip shortage.
Why do you think so?

EDA (Electronic Design Automation) tools have been used to enable bigger and more complex semiconductor designs since the 1980s. As transistor counts have ballooned and the design constraints of newer manufacturing technologies have increased, EDA tools have had to keep pace, with less and less being done by hand. One way to look at this is as just the next generation of those tools.

I believe that chip designers wouldn't use these AI-enabled tools if they didn't offer real benefits in cost, performance, efficiency, or time-to-market.
 
nope

 
This is one of those areas where real AI can be used, and it doesn't require massive amounts of training either, as the scope is extremely limited. Mostly it's the next level in testing and design automation, with the result being massive savings in man-hours, allowing for faster product development. This doesn't mean faster product releases, but that each release will have more in it.
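As a purely hypothetical sketch of what "limited scope" can mean in practice: the kind of small, narrowly trained model a design flow might use to screen candidates cheaply before running full analysis. The features, data, and model here are all invented for illustration, not taken from any real EDA tool.

```python
# Hypothetical, minimal sketch of narrow-scope ML in a design flow:
# a small regressor predicts a timing figure from a handful of made-up
# design features, so a tool can skip full analysis on obviously bad candidates.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Fake training set: [fanout, wirelength_mm, gate_count] -> path delay (ns)
features = rng.uniform([1, 0.1, 10], [64, 5.0, 500], size=(2_000, 3))
delay_ns = (0.02 * features[:, 0]
            + 0.15 * features[:, 1]
            + 0.001 * features[:, 2]
            + rng.normal(0, 0.05, 2_000))

model = GradientBoostingRegressor().fit(features, delay_ns)

# Screen new candidate paths cheaply before running full timing analysis.
candidates = rng.uniform([1, 0.1, 10], [64, 5.0, 500], size=(5, 3))
print(model.predict(candidates))
```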
 
You're certainly entitled to your opinion, so no need to apologize. I just wondered if you might like to explain why, because maybe you had information or a perspective different from mine.
I misunderstood the title; I thought the domination meant AI cores in CPUs/GPUs. For design it's OK.
 