News $650 buys lucky eBay shopper two Nvidia RTX 3090 GPUs instead of the 3080s they ordered — scoops Dell OEM cards worth thousands for a fraction of t...

I'm pretty sure graphics software support for SLI was never really there for the RTX 30-series.

At this point that connector was really more of an NVLink bridge, which allows CUDA to access VRAM on the peer card at roughly 112 GB/s on the 3090, versus maybe 8 GB/s per card when both share 16 lanes of PCIe 3.0 in an x8/x8 split.
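A rough back-of-envelope comparison makes the gap concrete. The bandwidth figures below are nominal spec-sheet numbers, not measurements, and the 2 GB tensor size is purely illustrative:

```python
# Back-of-envelope transfer-time comparison for moving data between two
# GPUs. Assumed nominal bandwidths: RTX 3090 NVLink bridge ~112 GB/s,
# PCIe 3.0 x8 ~8 GB/s per direction (two cards sharing 16 lanes).
NVLINK_GBPS = 112.0
PCIE3_X8_GBPS = 8.0

def transfer_ms(size_gb: float, bandwidth_gbps: float) -> float:
    """Milliseconds to move size_gb at the given bandwidth."""
    return size_gb / bandwidth_gbps * 1000.0

# Moving a hypothetical 2 GB chunk of weights/activations:
print(f"NVLink:      {transfer_ms(2, NVLINK_GBPS):.1f} ms")
print(f"PCIe 3.0 x8: {transfer_ms(2, PCIE3_X8_GBPS):.1f} ms")
```

An order of magnitude either way, which is why model-parallel inference across two cards cares so much about the bridge.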

So in these days of LLMs this could be a benefit, if he can get hold of a bridge and has a board that fits both cards close enough together to make it work.

Most 3090 cards were built extra wide, so for some odd reason you could never manage to put two 3090s into a single system. Some Chinese vendors retrofitted consumer cards with blower coolers just so they'd fit into boards offering x8/x8 bifurcation with two-slot spacing.

I was evaluating that type of setup for lab use at the time, but it fell through because a) those form-factor-modified GPUs from China would never have made it through corporate purchasing, and b) the budget had already been spent on V100s.

But it doesn't seem worth the effort now: the smaller quantizations that run well on more modern hardware might offset the gain of having 48GB for INT8 or BF16 weights.
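The trade-off is easy to sketch with weight-footprint arithmetic. The 70B parameter count is just an example; real usage adds KV cache, activations, and framework overhead on top:

```python
# Rough VRAM footprint of model weights at different precisions.
# Illustrative only: actual usage adds KV cache and runtime overhead.
BYTES_PER_PARAM = {"bf16": 2.0, "int8": 1.0, "int4": 0.5}

def weight_gb(params_billions: float, precision: str) -> float:
    """Weight storage in GB for a model of the given size."""
    return params_billions * BYTES_PER_PARAM[precision]

for prec in ("bf16", "int8", "int4"):
    print(f"70B at {prec}: {weight_gb(70, prec):.0f} GB")
```

A 70B model needs about 140 GB at BF16 and 70 GB at INT8, both over the paired 3090s' 48 GB, while INT4 lands around 35 GB, which fits. So the combined 48 GB mostly pays off for mid-size models or when you refuse to quantize below INT8.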
 
Personally, I'm not a fan of the power requirements of the highest-end cards (3090, 4090, 5090), so even if I had received a higher-end card than what I thought I was getting, I would not be happy about it. I'll stick with the RTX 4080, which is about 10 TFLOPS higher than the 3090 and uses just a little more power than the 3080. The 5080 is marginally better and not worth getting, in my opinion. Those super-high-wattage cards have been troublesome: worrying about heat issues and a higher electricity bill, just to play games in 4K with everything turned up. Not worth it.
 
In this situation, where someone was expecting 3080s and got 3090s, it's a universal win. If you don't want the extra power consumption, that's what power limits are for: the 3090 will still be a fair bit faster than a 3080 when capped to the 3080's 320W limit.
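The electricity-bill side of the worry is small, as a quick estimate shows. The usage hours and price per kWh below are illustrative assumptions; the cap itself is set with the driver's standard `nvidia-smi -pl` power-limit option (requires admin rights):

```python
# Rough yearly electricity cost of a GPU at a given board power.
# Hours/day and $/kWh are illustrative assumptions, not measurements.
def yearly_cost_usd(watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    return watts / 1000.0 * hours_per_day * 365 * usd_per_kwh

stock  = yearly_cost_usd(350, 3, 0.30)  # 3090 at its 350 W default limit
capped = yearly_cost_usd(320, 3, 0.30)  # capped to a 3080-like 320 W
print(f"stock:  ${stock:.2f}/yr")
print(f"capped: ${capped:.2f}/yr")
print(f"saving: ${stock - capped:.2f}/yr")
```

At three hours of gaming a day, the 30 W difference is on the order of ten dollars a year, so the cap is more about heat and noise than money.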
 
The truth is, $599 is the cost of a 12GB GDDR7 5070, and in EVERY SINGLE BENCHMARK it beats the 3090 by 1-5%, even in gaming at 4K.

Although two 3090 cards are a better deal, if a person is in the market for a single 3090, a 5070 slaughters it: core frequency, VRAM bus speed and transfer speed, and so many other factors.

The 24GB of GDDR6X moves data back and forth more slowly per pin, so the 12GB of GDDR7 makes up for the smaller capacity with Nvidia's new data compression, higher frequencies, and more. (If you doubt what I say, check out GamersNexus or any major tech channel for an array of benchmarks.)
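For reference, aggregate memory bandwidth is just per-pin data rate times bus width, and the commonly quoted spec figures (assumed here, worth double-checking) cut both ways: GDDR7 is faster per pin, but the 3090's much wider bus gives it more raw bandwidth, which is exactly why the newer card leans on compression and cache:

```python
# Aggregate memory bandwidth = per-pin data rate (Gbps) * bus width / 8.
# Assumed spec-sheet figures: RTX 3090 GDDR6X at 19.5 Gbps on a 384-bit
# bus; RTX 5070 GDDR7 at 28 Gbps on a 192-bit bus.
def bandwidth_gbs(gbps_per_pin: float, bus_bits: int) -> float:
    return gbps_per_pin * bus_bits / 8

print(f"RTX 3090: {bandwidth_gbs(19.5, 384):.0f} GB/s")
print(f"RTX 5070: {bandwidth_gbs(28.0, 192):.0f} GB/s")
```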
 
Until the 5070's 12GB VRAM buffer gets filled; then playability goes out the window.
 
In some countries, if you don't return it you can be charged with fraud...

The same countries that do nothing when companies cheat you out of your money...