Question: Is it worth upgrading my GPU to one with more VRAM?

Donkey Shot

Thinking about upgrading my RTX 2070 8GB to either a new RTX 5060 Ti 16GB (€459) or a used RTX 3090 24GB (roughly €700-€800 in my country). My main focus is running AI models locally; gaming isn't a priority.

The 5060 Ti has a memory bandwidth of 448 GB/s, which is the same as my current 2070. Will the 5060 Ti work smoothly with my existing hardware, or are there any compatibility issues I should expect? There are also rumors of a 5080 with 24 GB of VRAM; is that worth considering? The bandwidths of the 5080 (960 GB/s) and the 3090 (936 GB/s) are close. From what I've read, the 3090 doesn't suffer much of a performance drop on PCIe 3.0 systems, so I assume the same would apply to the 5080? And what is the lifespan of a used GPU if it has been kept in good condition?
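To get a feel for why the bandwidth numbers matter to me, here's the back-of-envelope estimate I've been using. It's just a rule of thumb, not a benchmark: single-stream LLM generation is roughly limited by how fast the card can read the weights, and the model sizes below are only examples I picked for illustration.

```python
# Rough ceiling only (my assumption, not a measurement): single-stream LLM decoding
# is approximately memory-bandwidth-bound, so tokens/s <= bandwidth / weight size.

def est_tokens_per_s(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper-bound decode speed: every generated token reads all weights once."""
    return bandwidth_gb_s / model_size_gb

cards = {
    "RTX 2070 / 5060 Ti (448 GB/s)": 448,
    "RTX 3090 (936 GB/s)": 936,
    "RTX 5080 (960 GB/s)": 960,
}
models = {"~4 GB model (7B @ 4-bit)": 4.0, "~8 GB model (13B @ 4-bit)": 8.0}

for card, bw in cards.items():
    for model, size in models.items():
        print(f"{card:30s} {model:26s} ~{est_tokens_per_s(bw, size):4.0f} tok/s ceiling")
```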

Just trying to figure out which option gives the best value for my use case.

My system specs:

Motherboard Gigabyte Z390 M Gaming (Intel Z390)
CPU Cooler Noctua NH-D15
Processor Intel Core i9-9900K
RAM Corsair DDR4 Vengeance LPX 4x16GB 3000 C15
Samsung SSD 860 EVO 500GB
Samsung SSD 970 EVO 1TB
WD HDD 3.5" 4TB S-ATA3 64MB WD40EZRZ Blue
Gigabyte GeForce RTX 2070 WINDFORCE 8G
Corsair PSU RM750x 750W
Fractal Design Define Mini C

Thanks!
 
It's a wide range of applications. I'm mainly using ComfyUI with Flux dev and schnell at the moment, but I'm also interested in experimenting with some open-source AI video generators. I'm running a few LLMs as well (Gemma 3, DeepSeek and others), mostly the scaled-down or quantized versions. Overall, I'm making do with what fits in my 8 GB of VRAM.
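For reference, this is roughly how I've been sizing models against VRAM before trying them. It's my own rule of thumb rather than something from a specific tool, and the model names, quantization levels and overhead figures below are just illustrative guesses.

```python
# Rough VRAM estimate: weights (params * bits / 8) plus KV cache and runtime
# overhead. The 1 GB cache/overhead defaults are my own assumptions.

def est_vram_gb(params_billion: float, bits_per_weight: float,
                kv_cache_gb: float = 1.0, overhead_gb: float = 1.0) -> float:
    weights_gb = params_billion * bits_per_weight / 8  # 1B params at 8-bit ~ 1 GB
    return weights_gb + kv_cache_gb + overhead_gb

for name, params, bits in [
    ("Gemma 3 12B @ ~4.5 bpw", 12, 4.5),
    ("DeepSeek distill 14B @ ~4.5 bpw", 14, 4.5),
    ("Gemma 3 27B @ ~4.5 bpw", 27, 4.5),
]:
    print(f"{name:32s} ~{est_vram_gb(params, bits):4.1f} GB")
```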
 
I'm running a few LLMs as well (Gemma 3, DeepSeek and others), mostly the scaled-down or quantized versions. Overall, I'm making do with what fits in my 8 GB of VRAM.
This. When running LLMs at home, VRAM size is the biggest constraint, so on that basis the 3090 is theoretically the one to choose. With 24 GB you have quite a few LLMs at your disposal for local runs, and in terms of tokens/s it probably delivers inference performance in the same range as, or slightly better than, a 5070 Ti.
The other side of the picture is that it is more power-hungry, supports older CUDA versions, and, of course, is a used card rather than a brand-new one.
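Whichever card you end up with, a quick runtime check like this (plain PyTorch, nothing specific to any of these cards) tells you how much VRAM is actually free before you load a model:

```python
import torch

# Report free vs. total VRAM on the first CUDA device.
if torch.cuda.is_available():
    free_bytes, total_bytes = torch.cuda.mem_get_info(0)  # (free, total) in bytes
    print(f"{torch.cuda.get_device_name(0)}: "
          f"{free_bytes / 1e9:.1f} GB free of {total_bytes / 1e9:.1f} GB")
else:
    print("No CUDA device visible")
```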

When it comes to AI for image and video generation, different considerations may apply, and the 5070 Ti could make more sense. You know that landscape better than I do.
 
Thanks for the insight! What's your take on used GPUs and how long they can last if you buy one in good condition? I could get a 3090 that is about 4.5 years old and water-cooled with a Bykski FE block.
 
Eh, I don't have experience with that, unfortunately. I'd guess the used card most likely comes from a gamer rather than an AI user. I believe some of the gentlemen here may be able to provide a hint about it.
 
