
llm

Forum discussion tagged with llm.
  1.

    Question Advice on a motherboard/case capable of accommodating 4 x 4090 GPUs?

    I'm interested in building a system with 2x 4090 GPUs initially, with the ability to add two more later if the need arises. This is for running LLMs and perhaps fine-tuning small ones. So far I have found a very limited number of motherboards that have more than 3 PCIe slots, and of those...
  2.

    Question Best value GPU with 27+GB VRAM for running LLMs?

    My use case: I have installed LLMs on my GPU using the method described in this Reddit post. I want to run WizardLM-30B, which requires 27GB of VRAM. (I also have a quantized version with lower VRAM requirements, but I do not have a number for those requirements. I just know that I can fit...
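
    For questions like this one, a common back-of-the-envelope check is weights-only VRAM: parameter count times bytes per parameter, plus some headroom for the KV cache and activations. The sketch below uses a hypothetical `overhead_factor` of 1.2 as an illustrative assumption; real usage depends on the runtime, context length, and batch size, so treat the numbers as rough lower bounds, not guarantees.

    ```python
    def estimate_vram_gb(n_params_billion: float,
                         bits_per_param: int,
                         overhead_factor: float = 1.2) -> float:
        """Rough VRAM estimate (in decimal GB) for loading model weights.

        overhead_factor is an assumed fudge factor for KV cache and
        activation buffers; actual overhead varies by runtime and settings.
        """
        weight_bytes = n_params_billion * 1e9 * (bits_per_param / 8)
        return weight_bytes * overhead_factor / 1e9

    # A 30B-parameter model at fp16 vs. 4-bit quantization:
    print(round(estimate_vram_gb(30, 16), 1))  # fp16 weights
    print(round(estimate_vram_gb(30, 4), 1))   # 4-bit quantized weights
    ```

    The fp16 figure shows why 30B-class models don't fit on a single 24GB card unquantized, and why quantized variants (as in the thread above) come in well under the full-precision requirement.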
  3.

    Question How to shop for GPUs (or other hardware) for LLM workloads?

    Hello, I would appreciate some guidance on what hardware (GPU or otherwise) I should purchase to enable me to run LLMs locally on my machine. Here are my system specs. CPU: AMD Ryzen 9 7950X. Motherboard: ASRock X670E PG Lightning (AM5 ATX). RAM...