News: Engineer 'builds a GPU from scratch' in two weeks — process much harder than he expected

cyrusfox

Distinguished
I will be interested to see how they navigate the display drivers, as well as what resolutions and output technologies this can support. Doing all this solo in two weeks is amazing; I hope it powers on with the first tape-out. Best of luck.
 
Reactions: snemarch
Apr 11, 2024
So, how powerful is this thing? I mean, on a 130nm process node it can't be that fast, considering 130nm was the kind of node GPUs were built on back in the early 2000s.

cyrusfox said:
I will be interested to see how they navigate the display drivers, as well as what resolutions and output technologies this can support. Doing all this solo in two weeks is amazing; I hope it powers on with the first tape-out. Best of luck.

It's not a GPU. Back in the day (like 30 years ago) something like this might have been slotted in as an accelerated math unit or co-processor to help a CPU with certain math functions. It's an interesting project, but far from what it is being reported as. The same applies to the earlier CPU article, whose design was even further from a CPU than this design is from a GPU.
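
To put a very rough number on the "how powerful is this thing?" question above, a back-of-envelope peak-throughput estimate is about all that's possible from the outside. The clock frequencies below are assumptions for illustration (the article does not state a target), and the lane count simply uses the 4 compute units x 4 cores figure discussed later in the thread.

```python
# Rough peak-throughput estimate for a small 16-lane integer design.
# All numbers here are assumptions for illustration, not figures from
# the article or the project itself.

def peak_ops_per_second(clock_hz: float, lanes: int, ops_per_lane_per_cycle: float = 1.0) -> float:
    """Peak integer ops/sec = clock * lanes * ops issued per lane per cycle."""
    return clock_hz * lanes * ops_per_lane_per_cycle

lanes = 4 * 4  # assumed: 4 compute units x 4 cores
for clock_mhz in (25, 50, 100):
    gops = peak_ops_per_second(clock_mhz * 1e6, lanes) / 1e9
    print(f"{clock_mhz:>4} MHz x {lanes} lanes -> ~{gops:.1f} GOPS peak (integer)")
```

Even at the optimistic end, that is far less work per clock than commercial GPUs of the 130nm era delivered, which is consistent with treating this as a learning exercise rather than a usable graphics part.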
 

Conor Stewart

Prominent
Oct 31, 2022
It's not a GPU. Back in the day (like 30 years ago) something like this might have been slotted in as an accelerated math unit or co-processor to help a CPU with certain math functions. It's an interesting project, but far from what it is being reported as. The same applies to the earlier CPU article, whose design was even further from a CPU than this design is from a GPU.
Yeah, it doesn't seem like much of a GPU: it has 4 compute units with 4 processors each, and nothing about it seems specific to graphics. Like you say, it seems much more like a maths accelerator.

It's also going to be very limited, with only 11 instructions.

I can't help but feel that this guy is rushing it. First a CPU in two weeks, then a GPU in two weeks, and neither is that good. He has likely missed a lot of things and made quite a few bad design choices. In my opinion it would have been better to take more time and do more research to produce a better CPU from the start, rather than rushing through a pretty bad design and calling it done.

He would also likely learn more by slowing down and learning best design practices, rather than pushing through with bad design decisions.
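
To make the "maths accelerator" point concrete, here is a minimal sketch of what a 4-compute-unit x 4-core machine with a tiny integer instruction set amounts to. The instruction names and program below are hypothetical and simplified; they are not the project's actual 11-instruction ISA.

```python
# Minimal sketch of a "4 compute units x 4 cores" integer machine: every
# lane runs the same small program over different data (lanes are stepped
# one at a time here purely for simplicity).
# The instruction set is hypothetical, not the project's actual ISA.

LANES = 4 * 4  # 4 compute units x 4 cores

def run(program, memory, n=LANES):
    regs = [[0] * 8 for _ in range(n)]  # 8 registers per lane
    for lane in range(n):
        regs[lane][0] = lane            # r0 = lane id, like a thread index
        pc = 0
        while pc < len(program):
            op, *args = program[pc]
            if op == "CONST":  regs[lane][args[0]] = args[1]
            elif op == "ADD":  regs[lane][args[0]] = regs[lane][args[1]] + regs[lane][args[2]]
            elif op == "MUL":  regs[lane][args[0]] = regs[lane][args[1]] * regs[lane][args[2]]
            elif op == "LDR":  regs[lane][args[0]] = memory[regs[lane][args[1]]]
            elif op == "STR":  memory[regs[lane][args[1]]] = regs[lane][args[0]]
            elif op == "RET":  break
            pc += 1
    return memory

# Element-wise multiply c[i] = a[i] * b[i], with a at 0..15, b at 16..31, c at 32..47.
program = [
    ("CONST", 1, 0),  ("ADD", 1, 1, 0),   # r1 = &a[lane]
    ("CONST", 2, 16), ("ADD", 2, 2, 0),   # r2 = &b[lane]
    ("CONST", 3, 32), ("ADD", 3, 3, 0),   # r3 = &c[lane]
    ("LDR", 4, 1), ("LDR", 5, 2),         # r4 = a[lane], r5 = b[lane]
    ("MUL", 6, 4, 5),                     # r6 = a[lane] * b[lane]
    ("STR", 6, 3),                        # c[lane] = r6
    ("RET",),
]
mem = list(range(16)) + [2] * 16 + [0] * 16
print(run(program, mem)[32:48])  # [0, 2, 4, ..., 30]
```

Nothing in a design like this touches rasterisation, texturing, or display output, which is why "parallel maths accelerator" is a fairer label than "GPU".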
 
Apr 11, 2024
Conor Stewart said:
Yeah, it doesn't seem like much of a GPU: it has 4 compute units with 4 processors each, and nothing about it seems specific to graphics. Like you say, it seems much more like a maths accelerator.

It's also going to be very limited, with only 11 instructions.

I can't help but feel that this guy is rushing it. First a CPU in two weeks, then a GPU in two weeks, and neither is that good. He has likely missed a lot of things and made quite a few bad design choices. In my opinion it would have been better to take more time and do more research to produce a better CPU from the start, rather than rushing through a pretty bad design and calling it done.

He would also likely learn more by slowing down and learning best design practices, rather than pushing through with bad design decisions.

Agreed. It seems like he just wanted a crash course in digital design to better understand the overall flow, which is commendable, and it is impressive how much he was able to accomplish in the short time frame. However, as you mention, given the small amount of time spent and the seemingly fairly heavy reliance on AI to help, a lot of core understanding was most likely missed. I would be curious to see what the target frequency is and how he intends to actually test the chip after tapeout. Bad design or not, if the chip is at least functional at even a halfway decent frequency for the process node, it would be a nice accomplishment given the very short design time.
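
On the question of how such a chip would actually be tested after tapeout, a common bring-up pattern is a host-driven smoke test: load a tiny known kernel and inputs, run it, and compare the readback against a golden software model. The sketch below is purely illustrative; the device interface (MockDevice and its methods) is invented here and says nothing about what access path the real chip exposes.

```python
# Hypothetical post-tapeout smoke test. MockDevice stands in for the real
# chip so the sketch runs end to end; on silicon it would be replaced by
# whatever access path the design actually exposes (UART, SPI, JTAG, ...).

def golden_model(a, b):
    """Reference result the hardware is expected to reproduce."""
    return [x * y for x, y in zip(a, b)]

class MockDevice:
    def __init__(self):
        self.mem = [0] * 64
        self.program = None

    def write_mem(self, addr, values):
        self.mem[addr:addr + len(values)] = values

    def read_mem(self, addr, count):
        return self.mem[addr:addr + count]

    def load_program(self, program):
        self.program = program

    def run(self):
        # The mock just computes the expected result directly; real silicon
        # would execute the loaded kernel and signal completion instead.
        a, b = self.read_mem(0, 16), self.read_mem(16, 16)
        self.write_mem(32, golden_model(a, b))

def smoke_test(dev):
    a, b = list(range(16)), [3] * 16
    dev.write_mem(0, a)                        # input vector a
    dev.write_mem(16, b)                       # input vector b
    dev.load_program("elementwise_multiply")   # placeholder kernel handle
    dev.run()
    result = dev.read_mem(32, 16)              # output vector c
    assert result == golden_model(a, b), f"mismatch: {result}"
    print("smoke test passed")

smoke_test(MockDevice())
```

Whether the silicon ends up running at 25 MHz or 100 MHz matters less for a first tapeout than simply getting a correct result like this back at all.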