News: Nvidia Tech Uses AI to Optimize Chip Designs up to 30X Faster

OneMoreUser

Prominent
BANNED
Jan 2, 2023
Is Nvidia using AI supposed to be a selling point? If not, why should we care?

Judging from the 30-series (and not the 40-series) of Nvidia GPUs, their AI isn't working, since their products haven't gotten cheaper and the efficiency gains aren't exactly revolutionary either.

Or maybe that's because they've used AI to improve their pricing, as in how much more money they make on each GPU they sell.

Before anyone buys, do yourself a favor: look at what AMD and Intel are bringing in terms of GPUs, and make sure to check the current prices rather than those from the launch reviews (and note also that DLSS and FSR are damn close to equal in many situations).
 
Reactions: gg83
Mar 28, 2023
Is Nvidia using AI supposed to be a selling point? If not, why should we care?

Judging from the 30-series (and not the 40-series) of Nvidia GPUs, their AI isn't working, since their products haven't gotten cheaper and the efficiency gains aren't exactly revolutionary either.

Or maybe that's because they've used AI to improve their pricing, as in how much more money they make on each GPU they sell.

Before anyone buys, do yourself a favor: look at what AMD and Intel are bringing in terms of GPUs, and make sure to check the current prices rather than those from the launch reviews (and note also that DLSS and FSR are damn close to equal in many situations).

I hate Nvidia right now just as much as the next gamer, but their 40-series chips are the most efficient GPUs on the market. The 4090 might not seem that way with the stock power limit, but that's because Nvidia was worried about AMD before launch and pushed the power limit WAY past its efficiency curve. With the power limit set to just under 300 W, it only loses about 5-10% performance. That makes it more than twice as efficient as the 3080 from last gen, and MUCH more efficient than the 3090. It's also much more efficient than the 7900 XTX at that wattage, while having more performance.

That said, I don't recommend anyone buy current-gen GPUs. With Nvidia charging $800 for what essentially should've been the 4060, and AMD not delivering a big enough jump in performance over last gen while moving a GPU up the stack to charge more, buying them is telling both companies the prices are okay. As much as I hate Intel for all their anti-consumer practices, I really hope they succeed. Nvidia and AMD need competition to put them in their places.
 

bit_user

Titan
Ambassador
their 40-series chips are the most efficient GPUs on the market. The 4090 might not seem that way with the stock power limit, but that's because Nvidia was worried about AMD before launch and pushed the power limit WAY past its efficiency curve. With the power limit set to just under 300 W, it only loses about 5-10% performance. That makes it more than twice as efficient as the 3080 from last gen, and MUCH more efficient than the 3090. It's also much more efficient than the 7900 XTX at that wattage, while having more performance.
The reason why the RTX 4090 is so expensive is that it's big. And all desktop GPUs are normally clocked above their peak-efficiency point. The recipe for making an efficient GPU is basically just to make it big and clock it low. That's exactly what you're describing.

Because size costs money, the dilemma is that you cannot have a GPU that's both cheap and efficient. That said, as we've seen in the past, die size is not a perfect predictor of performance, so I don't mean to say that architecture doesn't matter. Still, a bigger, lower-clocked GPU will generally tend to be more efficient.

Disclaimers:
  • When comparing die size, I'm assuming the GPUs in question are on a comparable process node.
  • When comparing efficiency, I'm assuming the GPUs in question are at comparable performance levels. If you're willing to throw performance out the window, you can obviously make a very efficient GPU.
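
To put the "clock it low" point in rough, concrete terms, here's a toy perf/W model. The voltage/frequency constants below are made up purely for illustration (real V/f curves are measured per chip), but the shape of the result is the point: perf/W falls as clocks climb.

```python
# Toy model: dynamic power scales roughly with C * V^2 * f, and voltage has to
# rise as frequency climbs toward the top of the V/f curve. All constants here
# are invented for illustration -- not measured from any real GPU.

def perf_per_watt(freq_ghz, v_at_1ghz=0.75, v_per_ghz=0.15, capacitance=1.0):
    """Relative performance per watt at a given clock (arbitrary units)."""
    voltage = v_at_1ghz + v_per_ghz * (freq_ghz - 1.0)  # voltage rises with clock
    power = capacitance * voltage**2 * freq_ghz          # dynamic power estimate
    performance = freq_ghz                               # assume perf ~ clock
    return performance / power

for f in (1.5, 2.0, 2.5, 3.0):
    print(f"{f:.1f} GHz -> {perf_per_watt(f):.2f} perf/W (relative)")

# The printed perf/W falls as the clock rises, which is the whole argument:
# a bigger die run at lower clocks does the same work at better perf/W.
```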
 

Johnpombrio

Distinguished
Nov 20, 2006
Starting with the 8086 CPU, Intel became concerned with the integrated circuit's layout very early on. Within a couple of generations, the company started to develop tools to aid the engineers in designing the layout. Less than a decade later (in the late 1980s), these design tools were fully responsible for the entire layout. Intel didn't call it "AI", just a design tool. This is VERY old news.
 

bit_user

Titan
Ambassador
Intel didn't call it "AI", just a design tool. This is VERY old news.
I think we all know that automated routing and layout isn't new. However, they seem to be tackling a higher-level floorplanning problem, which involves instantiating multiple macros. I doubt they're the first to automate such a task, but they do claim that it's "traditionally" a somewhat manual process. The paper indeed references benchmarks by the TILOS-AI-Institute, which immediately tells us this is by no means a new optimization target.
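
To give a sense of what that macro-floorplanning problem looks like, here's a deliberately tiny sketch: choose positions for a few large blocks so that estimated wirelength stays low and nothing overlaps. The macro names, sizes, nets, and weights are all invented for illustration, and the brute-force random search merely stands in for whatever optimizer a real tool (or Nvidia's method) actually uses.

```python
import random

DIE_W, DIE_H = 100, 100
MACROS = {"core": (30, 20), "cache": (20, 20), "phy": (15, 10)}  # name -> (width, height)
NETS = [("core", "cache"), ("cache", "phy")]                     # connected macro pairs

def random_placement():
    # pick a lower-left corner for each macro, kept inside the die outline
    return {m: (random.uniform(0, DIE_W - w), random.uniform(0, DIE_H - h))
            for m, (w, h) in MACROS.items()}

def wirelength(place):
    # Manhattan distance between macro centers, one term per net
    total = 0.0
    for a, b in NETS:
        (ax, ay), (aw, ah) = place[a], MACROS[a]
        (bx, by), (bw, bh) = place[b], MACROS[b]
        total += abs((ax + aw / 2) - (bx + bw / 2)) + abs((ay + ah / 2) - (by + bh / 2))
    return total

def overlap_area(place):
    # total pairwise overlap between macros (a legal floorplan has zero)
    names = list(MACROS)
    area = 0.0
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            (ax, ay), (aw, ah) = place[a], MACROS[a]
            (bx, by), (bw, bh) = place[b], MACROS[b]
            ox = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
            oy = max(0.0, min(ay + ah, by + bh) - max(ay, by))
            area += ox * oy
    return area

def cost(place):
    # weight overlap heavily so non-overlapping floorplans win
    return wirelength(place) + 10.0 * overlap_area(place)

best = min((random_placement() for _ in range(5000)), key=cost)
print(f"best cost: {cost(best):.1f}")
for name, (x, y) in best.items():
    print(f"{name}: x={x:.1f} y={y:.1f}")
```

Real floorplanners juggle far more macros plus timing, congestion, and power terms in the cost, which is why the optimizer itself matters so much.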

To understand where/how they used "AI" in the process - exactly what they mean by that - you'd have to read their paper or perhaps just the blog post linked from the article. Unfortunately, those links aren't working for me (I get a "bad merchant" error). So, I've gone to their site and found the links for myself:
 

Madkog

Distinguished
Nov 27, 2016
I'm thinking that, technically, we're probably at a stage where no one person understands these chips down to every minute detail.

I can see AI making this even more the case. Maybe to the extent where whole teams can't grasp it either.

Pretty scary stuff.
 
Reactions: Why_Me

bit_user

Titan
Ambassador
I'm thinking that, technically, we're probably at a stage where no one person understands these chips down to every minute detail.
It's basically like an optimizing compiler, but for chip design. I know that's a gross simplification, but that's all it's doing - taking an existing design and just figuring out an optimal layout.

IMO, GPT-4 is way more scary. It can create the actual designs for chips and then write programs for them. It lacks agency, though, which people seem to miss.

Teaching AI to build more efficient and better versions of chips could never backfire on us!
The same argument applies to virtually all forms of automation. If you really want to live in a world where no rogue AI or hacker could cause any damage, that world looks like the developed countries of the 1950s. It's easy to forget how manual things were back then, but think typewriters and telephone operators at switchboards.

The problem with regressing our technology, which nobody seems to appreciate, is that our current population size couldn't be supported by 1950s-era tech. There's no going back - at least, not without an apocalyptic calamity.
 

kjfatl

Reputable
Apr 15, 2020
We're well beyond that stage, by at least a decade, when it comes to large dies like CPUs and GPUs.
The ability to understand everything down to every minute detail was lost by the mid-1980s. That's when we moved from transistor-based designs to standard-cell designs, allowing most designers to work at a higher level of abstraction. This process has continued. For example, if a designer wants a PCIe interface in their design, they will pull in an existing PCIe design and trust that the designers of the core knew what they were doing. For larger designs, there are literally thousands of people involved in the design and manufacturing process. It is truly amazing what level of technology we have reached in such a short period of time.
 
Reactions: bit_user

cirdecus

Distinguished
BANNED
Am I the only one who's freaked out by AI developing the fundamental building blocks of its own brain? Imagine if ChatGPT could construct and design its own chips, used in the GPUs that power it.
 
Reactions: Why_Me

PlaneInTheSky

Commendable
BANNED
Oct 3, 2022
Might want to write an update.

Researchers claim Nvidia fabricated the data in a Nature publication to sell AI chips.

In fact, Google's AI performed worse than humans.

A former Google employee has now also come out and said Google hyped and fabricated the data.

https://www.theregister.com/2023/03/27/google_ai_chip_paper_nature/?td=rt-3a

A Google-led research paper published in Nature, claiming machine-learning software can design better chips faster than humans, has been called into question after a new study disputed its results.

The university academics ultimately found their own recreation of the original Google code, referred to as circuit training (CT) in their study, actually performed worse than humans using traditional industry methods and tools.
 

bit_user

Titan
Ambassador
Might want to write an update.

Researchers claim Nvidia fabricated the data in a Nature publication to sell AI chips.

In fact, Google's AI performed worse than humans.

A former Google employee has now also come out and said Google hyped and fabricated the data.

https://www.theregister.com/2023/03/27/google_ai_chip_paper_nature/?td=rt-3a
Slow your roll, bro. That article is about Google, not Nvidia!

Here's the actual paper that Nvidia published about their technique. Something is screwy with links in forum posts, so you might need to copy/paste it into your address bar:



Note that every one of the authors is listed as an Nvidia employee, and the paper makes only one reference to Google. That article you linked says nothing about Nvidia whatsoever.

Same subject, different research papers by different research groups.
 

bit_user

Titan
Ambassador
I don't trust this for two reasons:
  1. I think that everyone knows better than to trust nVidia in anything
People who trust their AI workloads to Nvidia seem to keep coming back for more. The film industry trusts its rendering to Nvidia OptiX. I'm sure there are plenty of others who have done quite well with them.

2. The last time a computer designed a chip, it resulted in Bulldozer (AMD FX)
So, you're claiming EDA technology can't improve in more than a decade, and with orders of magnitude more compute and memory to throw at the problem? And no, Bulldozer wasn't "the last time a computer designed a chip". As far as I can tell, Bulldozer isn't even a particularly relevant example to what this project is concerned with.
 
So, you're claiming EDA technology can't improve in more than a decade, and with orders of magnitude more compute and memory to throw at the problem? And no, Bulldozer wasn't "the last time a computer designed a chip". As far as I can tell, Bulldozer isn't even a particularly relevant example to what this project is concerned with.
Bulldozer wasn't designed by a human being because AMD couldn't afford to do so at the time. The problem was that a computer-based program (even AI) would design a chip that minimised weaknesses instead of maximising strengths. It did this to ensure maximum flexibility in multiple tasks, even theoretical ones. Along comes Jim Keller and shows why a human designer is better with Zen. A human will instinctively know what other humans want, while an AI can make assumptions but will never be certain, because a lot of what humans want defies logic. Of course it can improve, but I don't think that it will be capable of superior designs until the technology has matured for at least two decades, because the AI has to be able to do more than just think. It has to be able to "think like a human", something that is, in itself, inherently dangerous.

Are you denying that nVidia tries to push immature technologies on people to improve their own bottom line? Have you ever heard of hardware-accelerated ray-tracing?
 

bit_user

Titan
Ambassador
Bulldozer wasn't designed by a human being because AMD couldn't afford to do so at the time. The problem was that a computer-based program (even AI) would design a chip that minimised weaknesses instead of maximising strengths. It did this to ensure maximum flexibility in multiple tasks, even theoretical ones. Along comes Jim Keller and shows why a human designer is better with Zen. A human will instinctively know what other humans want, while an AI can make assumptions but will never be certain, because a lot of what humans want defies logic. Of course it can improve, but I don't think that it will be capable of superior designs until the technology has matured for at least two decades, because the AI has to be able to do more than just think. It has to be able to "think like a human", something that is, in itself, inherently dangerous.
I wrote a longer reply, but it's pointless to get into the details because it's clear that you have no idea what you're even talking about. I think you're too triggered by the mention of "Nvidia" and "AI" to take a step back and educate yourself suitably, before trying to take some sort of stand on it.

Here are some key points you might've missed & other food for thought:
  1. This is simply a research project. It's not a product, service, or anything to do with video games.
  2. As Nvidia's blog post says, the paper was presented at the International Symposium on Physical Design (https://ispd.cc/ispd2023/index.php). Funnily enough, Nvidia wasn't even a sponsor of the event, but AMD was.
  3. This is tackling a very limited form of the broader chip layout problem.
  4. Above, I already posted a link to the blog post & paper, in case you want a starting point for becoming more informed about the subject.
  5. If this article has you upset, definitely don't read these:
 
Reactions: JohnBonhamsGhost