Hello all.
Short of buying a Threadripper for ~$1,200 and a 4090 for ~$1,800, what would y'all recommend for hardware?
I installed Ollama, Docker Desktop, and Open WebUI, then pulled models like DeepSeek R1, Llama 3.2, etc.
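For context, the setup is roughly equivalent to this sketch using the ollama Python client instead of the Open WebUI frontend (the model tags are just my guess at what I pulled, treat them as placeholders):

```python
# Rough equivalent of what I'm doing through Open WebUI, using the
# ollama Python client instead (pip install ollama).
import ollama

for tag in ["deepseek-r1:7b", "llama3.2:3b"]:
    ollama.pull(tag)  # downloads the model if it isn't already cached locally

reply = ollama.chat(
    model="deepseek-r1:7b",
    messages=[{"role": "user", "content": "Hello, can you hear me?"}],
)
print(reply["message"]["content"])
```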
I want to build a system that can run models like Llama or DeepSeek at higher parameter counts. I know that the higher-end the hardware, the better the performance with local AI.
What CPU, mobo, and GPU would y'all recommend, short of the above, that would run smoothly?
Example: I was just chatting with DeepSeek at 7B parameters and it was smooth. I tried 8B and it was still good. At 14B it took around two minutes to answer the question I asked.
This was on my spouse's laptop, an ROG Zephyrus G16 (GU605MI) with a Core Ultra 9 CPU and an RTX 4070.
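For anyone who wants to compare sizes more precisely than "it felt smooth," here is a rough timing sketch with the ollama Python client (the tags and prompt are placeholders, and I'm assuming the eval_count/eval_duration fields Ollama reports in the final response to work out tokens per second):

```python
# Rough way to put numbers on the 7b vs 8b vs 14b comparison.
import ollama

PROMPT = "Explain the difference between RAM and VRAM in two sentences."

for tag in ["deepseek-r1:7b", "deepseek-r1:8b", "deepseek-r1:14b"]:
    reply = ollama.chat(model=tag, messages=[{"role": "user", "content": PROMPT}])
    # Ollama reports eval_count (tokens generated) and eval_duration
    # (nanoseconds) in the final response, so tokens/sec is easy to derive.
    tok_per_s = reply["eval_count"] / (reply["eval_duration"] / 1e9)
    print(f"{tag}: {tok_per_s:.1f} tokens/sec")
```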
Any and all advice on hardware options and prices would be greatly appreciated.
thx