A Redditor repurposed his Ryzen 5 4600G into a 16GB-VRAM GPU to run Stable Diffusion.
$95 AMD CPU Becomes 16GB GPU to Run AI Software : Read more
> Unfortunately, he only provided demos for Stable Diffusion, an AI image generator based on text input. He doesn't detail how he got the Ryzen 5 4600G to work with the AI software on his Linux system.
> Logically, the APU doesn't deliver the same performance as a high-end graphics card, but at least it won't run out of memory during AI workloads, as 16GB is plenty for non-serious tasks.

Stable Diffusion doesn't really run out of memory during AI workloads; at least the implementations I'm familiar with (Automatic1111 and ComfyUI) can work even on low-VRAM (≤4GB) GPUs, at a speed penalty, by moving data between DRAM and VRAM. Kobold and similar programs can do the same for text generation, but in my experience the speed penalty there is so large that it isn't worthwhile.
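To see why ≤4GB cards can cope, here's a rough back-of-the-envelope sketch using approximate published parameter counts for SD 1.x (the counts are assumptions for illustration, not measurements):

```python
# Rough estimate of Stable Diffusion 1.x weight memory in fp16.
# Parameter counts below are approximate/assumed, for illustration only.
FP16_BYTES = 2

components = {
    "unet": 860_000_000,          # ~860M params
    "vae": 83_000_000,            # ~83M params
    "text_encoder": 123_000_000,  # ~123M params (CLIP text model)
}

def gib(params, bytes_per_param=FP16_BYTES):
    """Memory in GiB for `params` parameters at the given precision."""
    return params * bytes_per_param / 2**30

total = sum(gib(p) for p in components.values())
largest = max(gib(p) for p in components.values())

print(f"all weights resident in fp16: ~{total:.1f} GiB")    # ~2.0 GiB
print(f"largest single component:     ~{largest:.1f} GiB")  # ~1.6 GiB (UNet)
# Offloading keeps only the component currently executing on the GPU and
# parks the rest in system RAM, so peak VRAM for weights is closer to the
# largest component than to the total.
```

The gap between "all weights resident" and "largest component resident" is exactly the headroom that DRAM↔VRAM shuffling buys, at the cost of transfer time per step.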
> This is a 0GB GPU. Shared ram is nowhere near the same thing. It's like calling a 3.5in floppy disk a Hard Drive or a USB 2.0 flash drive an SSD.

When you set the RAM on an iGPU, you are reserving that amount of RAM specifically for the iGPU. That in turn means it is a 16GB GPU. You do not get the same performance as if it had its own VRAM, due to shared bandwidth, but the frame buffer is the full 16GB.
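The shared-bandwidth penalty can be quantified with a quick sketch. The figures below are textbook peak numbers (dual-channel DDR4-3200 for the APU, and, as an assumed point of comparison, a 128-bit 14 Gbps GDDR6 bus typical of entry-level dedicated cards), not measurements:

```python
# Peak-bandwidth comparison: shared system DDR4 vs. a dedicated card's GDDR6.
# All figures are theoretical peaks for illustration.

def ddr_bandwidth_gbs(mt_per_s, channels=2, bus_bits_per_channel=64):
    # transfers/s * bytes per transfer per channel * number of channels
    return mt_per_s * 1e6 * (bus_bits_per_channel / 8) * channels / 1e9

def gddr6_bandwidth_gbs(gbps_per_pin, bus_bits):
    # per-pin data rate * bus width in bytes
    return gbps_per_pin * bus_bits / 8

ddr4 = ddr_bandwidth_gbs(3200)        # dual-channel DDR4-3200
gddr6 = gddr6_bandwidth_gbs(14, 128)  # assumed: 14 Gbps pins, 128-bit bus

print(f"dual-channel DDR4-3200: {ddr4:.1f} GB/s")   # 51.2 GB/s
print(f"128-bit 14 Gbps GDDR6:  {gddr6:.1f} GB/s")  # 224.0 GB/s
```

And the iGPU doesn't even get that 51.2 GB/s to itself; it competes with the CPU cores for the same controller.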
> This is a 0GB GPU. Shared ram is nowhere near the same thing. It's like calling a 3.5in floppy disk a Hard Drive or a USB 2.0 flash drive an SSD.

But for $95, is it worth it?
Typically, 16GB is the maximum amount of memory you can dedicate to the iGPU. However, some user reports claim that certain ASRock AMD motherboards allow for higher memory allocation, rumored up to 64GB.
> We wonder if AMD's latest mobile Ryzen chips, like Phoenix that taps into DDR5 memory, can work and what kind of performance they bring.

It's worth noting that Phoenix and later desktop APUs would not only include substantially better iGPUs and DDR5 support, but also the XDNA AI accelerator. I don't know if it would be any faster than using the RDNA graphics, but it could let you game on the APU while running Stable Diffusion or whatever on the accelerator at the same time.
> as 16GB is plenty for non-serious tasks

Lol. Ok. Good luck.

> Lol. Ok. Good luck.

They are probably thinking about inference, and 16 GB, because consumer cards only come with about 4-24 GB, not 80-141 GB, or the hundreds of terabytes pooled from thousands of cards.
Idk why people only think about inference with deep learning. Running something already built has completely different resource requirements than training it.
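The inference/training gap is easy to put numbers on. A minimal sketch, assuming fp16 inference versus plain fp32 Adam training (weights + gradients + two optimizer moments per parameter), and ignoring activations, which are workload-dependent and make training even worse:

```python
# Illustrative arithmetic: memory for weights/optimizer state only.
# Assumes fp16 inference and fp32 Adam training; activations are ignored.

def inference_gib(params, bytes_per_weight=2):
    """fp16 inference holds just the weights."""
    return params * bytes_per_weight / 2**30

def training_gib(params):
    """fp32 Adam holds weights, gradients, and two moment buffers."""
    fp32 = 4
    per_param = fp32 * (1      # weights
                        + 1    # gradients
                        + 2)   # Adam first + second moments
    return params * per_param / 2**30

params = 7_000_000_000  # a 7B-parameter model, purely for scale
print(f"inference (fp16 weights only): ~{inference_gib(params):.0f} GiB")  # ~13
print(f"training  (fp32 Adam states):  ~{training_gib(params):.0f} GiB")  # ~104
```

So a 16 GB pool that comfortably fits an inference workload is off by most of an order of magnitude for training the same model.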
> When you set the RAM on an iGPU, you are reserving that amount of RAM specifically for the iGPU. That in turn means it is a 16GB GPU. Now you do not get the same performance as if it had its own VRAM due to shared bandwidth, but the frame buffer is the full 16GB.

It still needs to access that RAM pool via the CPU, because the iGPU does not have its own memory interface.
> It still needs to access that RAM pool via the CPU, because the iGPU does not have its own memory interface.

As far as I know, the iGPUs have DMA to the RAM, so access doesn't go through the CPU.
And if you need to go through the CPU to access RAM anyway, you may as well use a discrete GPU, so you have access to its internal memory plus an arbitrarily large pool of fenced-off RAM connected via the CPU.
> As far as I know the iGPUs have DMA to the RAM. Therefore it doesn't go through the CPU for access.

So can GPUs on the PCIe bus. But DMA is a logical process: in both cases the RAM is physically connected to the memory controller on the CPU die, and neither a dGPU nor an iGPU can access that RAM without using the CPU's memory controller. This is not like Kaby Lake-G, where the GPU had its own independent memory controller.
> There goes Tom's Hardware again. "You can turn a cheap CPU with iGPU into a GPU for AI with this one simple trick!" And it's just an OEM APU with 16 GB of slow DDR4.

There you go again, falling for the headline, when you should know it can't be quite true in the sense that you imagine...