Interesting to know that they used a new AI approach to design a RISC-V CPU automatically, by generating the Boolean function/logic circuit represented as a BSD (Binary Speculation Diagram), which also lets the tool discover instruction-level parallelism on its own. That scales much better than the conventional BDD (Binary Decision Diagram) construction problems tackled by existing EDA tools.
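For anyone unfamiliar with decision diagrams, here is a minimal sketch (my own illustration in Python, not the paper's BSD algorithm) of the basic idea: represent a Boolean function as a diagram of variable tests via Shannon expansion, using a 1-bit full adder's sum output as the example function.

# Minimal sketch (my own illustration, not the paper's BSD algorithm) of
# representing a Boolean function as a binary decision diagram via Shannon
# expansion. Real BDD packages also merge isomorphic subgraphs; that is omitted.

def full_adder_sum(a, b, cin):
    # Example function to encode: the sum bit of a 1-bit full adder.
    return a ^ b ^ cin

def build_bdd(f, nvars, var=0, assignment=()):
    # Each node tests one variable and branches on its 0/1 value.
    if var == nvars:
        return f(*assignment)                # leaf: constant 0 or 1
    low = build_bdd(f, nvars, var + 1, assignment + (0,))
    high = build_bdd(f, nvars, var + 1, assignment + (1,))
    if low == high:                          # both branches agree -> node is redundant
        return low
    return (var, low, high)                  # internal node: (tested variable, 0-branch, 1-branch)

def evaluate(node, inputs):
    # Walk the diagram for one concrete input vector.
    while isinstance(node, tuple):
        var, low, high = node
        node = high if inputs[var] else low
    return node

bdd = build_bdd(full_adder_sum, 3)
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            assert evaluate(bdd, (a, b, cin)) == full_adder_sum(a, b, cin)
print("decision diagram reproduces the full-adder sum for all 8 inputs")

The exact expansion shown here blows up exponentially with the number of inputs, which is exactly the scaling wall the new approach is trying to get around.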
That matters because conventional machine-learning techniques usually fail when asked to design CPUs purely from input-output examples: they can only generate correct circuits of around 300 logic gates, nowhere near the scale of industrial CPUs. Even the old Intel 80486 comes to roughly 300,000 logic gates.
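To put that scale problem in perspective, a quick back-of-the-envelope sketch (my own numbers, not from the article): an exhaustive input-output description of a circuit with n input bits has 2^n rows, and there are 2^(2^n) candidate Boolean functions over those inputs, so pure input-output learning hits a wall very quickly.

# Back-of-the-envelope illustration (my own, not from the article) of why
# learning logic purely from input-output examples stops scaling: both the
# truth table and the space of candidate functions grow double-exponentially.
import math

for n_inputs in (4, 8, 16, 32, 64):
    rows = 2 ** n_inputs                    # truth-table rows needed for an exhaustive spec
    log10_functions = rows * math.log10(2)  # log10 of 2**(2**n) possible Boolean functions
    print(f"{n_inputs:>3} input bits: {rows:>20,} rows, ~10^{log10_functions:,.0f} candidate functions")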
It appears the CPU has been codenamed "Qimeng 1" ("Enlightenment 1" in translation), and its roughly 4 million logic gates were generated in about 5 hours, around 4,000 times larger than the largest design GPT-4 can manage (implying GPT-4 tops out at roughly 1,000 gates).
Not directly related to this news:
View: https://www.youtube.com/watch?v=yTMRGERZrQE&ab_channel=TechTechPotato%3AClips%27n%27Chips