ollama

Forum discussions tagged with ollama.
  1. Question: Modern GPU on older motherboard

    Hello everyone! I have an old server from 2013 with a Tesla M40 GPU. I'm trying to run LLMs locally on that GPU through Ollama but haven't had any luck. I know it's not officially supported since it has a compute capability of 3.5, but I tried a modified version of Ollama and it still falls...