Question: What hardware for AI...

SPECOPS70

Hello all.

Short of buying a Threadripper for $1,200 and a 4090 for $1,800 or so, what would y'all recommend for hardware?

I installed Ollama, Docker Desktop, and Open WebUI. From there I pulled models like DeepSeek R1, Llama 3.2, etc.
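
(For anyone reproducing this setup: Open WebUI is just the front end; Ollama itself listens on a local HTTP API, so you can also script it directly. A rough Python sketch, assuming Ollama is running on its default port 11434 and you've already pulled a tag such as llama3.2 — the model name and prompt here are just placeholders:)

```python
import requests

# Ask the local Ollama server (default port 11434) to generate a reply.
# "llama3.2" is only an example tag -- use whatever `ollama pull` fetched.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",
        "prompt": "Explain VRAM vs. system RAM in one paragraph.",
        "stream": False,  # wait for the full answer instead of streaming tokens
    },
    timeout=600,
)
resp.raise_for_status()
print(resp.json()["response"])
```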

I want to build a system that can run models like Llama or DeepSeek at higher parameter counts. I know that the higher-end the hardware, the better the performance with these AI programs.

What CPU, mobo, and GPU would y'all recommend, short of the above, that would run smoothly?

Example: I was just chatting with DeepSeek at 7B parameters and it was smooth. I tried 8B and it was still good. At 14B it took around two minutes for it to answer the question I asked.

This occurred on my spouse's laptop, an ROG Zephyrus G16 GU605MI. It has an Ultra 9 CPU and an RTX 4070.
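
(If you want to put an actual number on "smooth" vs. "took two minutes": as far as I can tell, Ollama's generate endpoint reports how many tokens it produced and how long it spent, so you can compute tokens per second for each model size. A rough Python sketch — the model tags are just examples, and the response field names are from the Ollama API as I understand it:)

```python
import requests

def benchmark(model: str, prompt: str) -> None:
    """Time one generation and report tokens/sec using Ollama's own counters."""
    r = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=1200,
    )
    r.raise_for_status()
    data = r.json()
    # eval_count = tokens generated, eval_duration = nanoseconds spent generating
    toks = data.get("eval_count", 0)
    secs = data.get("eval_duration", 0) / 1e9
    if secs > 0:
        print(f"{model}: {toks} tokens in {secs:.1f}s -> {toks / secs:.1f} tok/s")

# Example tags -- adjust to whatever you've actually pulled.
for tag in ("deepseek-r1:7b", "deepseek-r1:8b", "deepseek-r1:14b"):
    benchmark(tag, "Summarize the plot of Hamlet in three sentences.")
```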

Any and all advice on different types of hardware and prices would be greatly appreciated.

thx
 
From what I can tell, the higher the #B, the more RAM (and VRAM) you're going to need. As for the CPU, pretty much every guide says you'll need a multicore processor, but without a doubt all of these AI apps will want an RTX card to get your feet wet.
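
Rough rule of thumb for the memory side: a quantized model needs roughly (parameters × bits-per-weight ÷ 8) of memory plus some overhead for the context, and if that doesn't fit in the GPU's VRAM it spills into system RAM and slows way down. A back-of-envelope sketch in Python — the 4-bit assumption and the overhead figure are guesses, and real usage varies with the quant and context length:

```python
def approx_memory_gb(params_billion: float, bits_per_weight: int = 4,
                     overhead_gb: float = 1.5) -> float:
    """Back-of-envelope memory needed to load a quantized model.

    params_billion: model size in billions of parameters (the "#B").
    bits_per_weight: ~4 for common Q4 quants, 16 for full fp16 weights.
    overhead_gb: rough allowance for KV cache and runtime buffers (a guess).
    """
    weight_gb = params_billion * bits_per_weight / 8  # 1B params at 8 bits ~ 1 GB
    return weight_gb + overhead_gb

for b in (7, 8, 14, 32, 70):
    print(f"{b:>3}B @ ~Q4 : ~{approx_memory_gb(b):.1f} GB")
# An 8 GB card (like the laptop 4070 above) fits roughly 7-8B at Q4;
# 14B spills into system RAM, which is likely why answers took minutes.
```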

https://github.com/open-webui/open-webui/discussions/736
https://vagon.io/blog/a-step-by-step-guide-to-running-deepseek-r1-on-vagon-cloud-desktops

The question is: what size (#B) model are you aiming for, and what budget cutoff are you looking at for your build?
 