Discussion Would the Nvidia l40s outperform the 4090 in gaming?

Order 66

Grand Moff
Apr 13, 2023
2,166
912
2,570
I was browsing TechPowerUp and came across the Ada line of datacenter GPUs; the most recent one is the L40S, and according to TPU, the L40S outperforms the 4090 by 10%. I haven't actually seen any gaming benchmarks for it, though. I understand these cards aren't meant for gaming, but I'm curious.
 
They use the same AD102 GPU, so I would expect similar results in gaming. The L40S seems to have a little less memory bandwidth as well, and the clocks are close enough.

With the same driver loaded there should be almost no difference.
 
  • Like
Reactions: Order 66
I was browsing TechPowerUp and came across the Ada line of datacenter GPUs; the most recent one is the L40S, and according to TPU, the L40S outperforms the 4090 by 10%. I haven't actually seen any gaming benchmarks for it, though. I understand these cards aren't meant for gaming, but I'm curious.
You've not seen any gaming benchmarks for these because they're so far out of a typical gaming budget as to be laughable.

$27,687
https://www.amazon.com/NVIDIA-L40S-GDDR6-Graphics-900-2G133-0080-000/dp/B0CNDY37ZG

$10,749
https://www.amazon.com/Graphics-Accelerator-900-2G133-0080-000-PG133G-TCSL40SPCIE-PB/dp/B0CLTDCZ82

$12,581
https://www.cdw.com/product/nvidia-l40s-gpu-computing-processor-nvidia-l40s-48-gb/7582191
 
  • Like
Reactions: Order 66
They use the same AD102 GPU, so I would expect similar results in gaming. The L40S seems to have a little less memory bandwidth as well, and the clocks are close enough.

With the same driver loaded there should be almost no difference.
The L40S has a fully enabled AD102: 18,176 CUDA cores vs. 16,384, or about 11% more.
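The core-count gap works out like this (a quick sanity check; the counts are the ones quoted from TPU's spec pages):

```python
# Quick check of the core-count gap quoted above (counts from TPU's spec pages)
l40s_cores = 18176      # fully enabled AD102
rtx_4090_cores = 16384  # cut-down AD102
extra = l40s_cores / rtx_4090_cores - 1
print(f"{extra:.1%}")   # ~10.9% more shader cores on the L40S
```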
 
Correct, but this isn't a gaming card. The voltage/frequency curve is likely to be more conservative, and the TDP is fixed at 300W instead of 450W, so this thing will not boost as high. Now, if you watercooled both of them and managed to unlock the BIOS to allow higher power limits, you might be able to get it to beat a 4090 in gaming. 48GB of slower memory won't help either; if you could find a game that uses more than 24GB, it would certainly have an advantage there, but until that point it will be slower.
 
  • Like
Reactions: Order 66
Correct, but this isn't a gaming card. The voltage/frequency curve is likely to be more conservative, and the TDP is fixed at 300W instead of 450W, so this thing will not boost as high. Now, if you watercooled both of them and managed to unlock the BIOS to allow higher power limits, you might be able to get it to beat a 4090 in gaming. 48GB of slower memory won't help either; if you could find a game that uses more than 24GB, it would certainly have an advantage there, but until that point it will be slower.
Sure, but that doesn't explain why TPU seems to think it would beat the 4090 in gaming, unless they're going purely off the difference in CUDA core counts. I wonder how much more VRAM 8K requires than 4K. That's a use (albeit an extremely niche one) I could see for its 48GB of VRAM.
 
8K resolution, maybe, but that will mostly just slow down the frame rate from the compute demands. You'd really need 8K (or larger) texture packs or something from a mod to utilize that much memory.
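For scale, the raw framebuffer grows 4x going from 4K to 8K but is still tiny next to 48GB. A rough sketch, assuming 4 bytes per pixel and ignoring MSAA, intermediate render targets, and compression:

```python
# Rough framebuffer math: width * height * bytes per pixel
# (assumes 4 bytes/pixel; real engines keep many more buffers than this)
def framebuffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

fb_4k = framebuffer_mib(3840, 2160)   # ~31.6 MiB
fb_8k = framebuffer_mib(7680, 4320)   # ~126.6 MiB
```

So the pixels themselves aren't what fills 48GB; it's the textures and assets that would have to do it.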

My guess would be that that's exactly what they did: just scale it up based on cores. But if the L40S tops out at 2500 MHz or so while the 4090 pushes near 3000 MHz, that's a 20% difference right there, plus the faster memory.
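Scaling cores by clock bears that out. A back-of-envelope FP32 estimate (2 FLOPs per core per clock; the clock figures are the assumed numbers from this post, not measured boosts):

```python
# Theoretical FP32 throughput: cores * 2 FLOPs/clock * clock rate
# Clock values are assumptions from the discussion, not measured boosts.
def fp32_tflops(cores, clock_mhz):
    return cores * 2 * clock_mhz * 1e6 / 1e12

l40s = fp32_tflops(18176, 2500)      # ~90.9 TFLOPS at an assumed 2500 MHz
rtx_4090 = fp32_tflops(16384, 3000)  # ~98.3 TFLOPS at an assumed 3000 MHz
```

By this crude model the 4090's clock advantage more than cancels the L40S's extra cores, before memory bandwidth even enters the picture.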
 
  • Like
Reactions: Order 66
8K resolution, maybe, but that will mostly just slow down the frame rate from the compute demands. You'd really need 8K (or larger) texture packs or something from a mod to utilize that much memory.

My guess would be that that's exactly what they did: just scale it up based on cores. But if the L40S tops out at 2500 MHz or so while the 4090 pushes near 3000 MHz, that's a 20% difference right there, plus the faster memory.
Couldn't you overclock and undervolt (to a point) to get more performance out of it? I've done that with my 7700X: I undervolted (possibly; I'm not sure what the true stock voltage is, but from what I've read it's 1.25V) and overclocked to 5.45 GHz. Couldn't you do the same with this GPU?
 
Couldn't you overclock and undervolt (to a point) to get more performance out of it? I've done that with my 7700X: I undervolted (possibly; I'm not sure what the true stock voltage is, but from what I've read it's 1.25V) and overclocked to 5.45 GHz. Couldn't you do the same with this GPU?

For a little bit of gain, possibly. It just depends on the individual GPU. But that 300W power budget is going to limit access to higher clock speeds. You can't just undervolt and overclock until it matches something with 50% more available power; the card will just become unstable without enough power to the cores.
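A toy dynamic-power model shows why: CMOS switching power scales roughly as V² × f (leakage ignored), so holding 4090-level clocks inside a 300W cap instead of 450W would demand a voltage cut the silicon can't stay stable at. A sketch under that assumption:

```python
# Toy CMOS dynamic-power model: P proportional to V^2 * f
# (ignores leakage/static power, so this understates the real problem)
def rel_power(voltage_ratio, freq_ratio):
    return voltage_ratio ** 2 * freq_ratio

# Holding the same clocks (freq_ratio = 1) at 300W instead of 450W
# would need the voltage cut to ~82% of stock -- well past a stable undervolt.
needed_voltage = (300 / 450) ** 0.5
print(f"{needed_voltage:.3f}")  # ~0.816
```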

The power limit is usually the first wall you hit on Ada, then voltage, which you don't have control over without modding. From what I recall, at about 520-540W you are pretty much done overclocking a 4090; no additional voltage or power makes much of a difference. At that point you would have to go cryogenic to see gains.
 
  • Like
Reactions: Order 66
For a little bit of gain, possibly. It just depends on the individual GPU. But that 300W power budget is going to limit access to higher clock speeds. You can't just undervolt and overclock until it matches something with 50% more available power; the card will just become unstable without enough power to the cores.

The power limit is usually the first wall you hit on Ada, then voltage, which you don't have control over without modding. From what I recall, at about 520-540W you are pretty much done overclocking a 4090; no additional voltage or power makes much of a difference. At that point you would have to go cryogenic to see gains.
Exactly how difficult is it to mod a card so it can reach higher power limits? I remember seeing LTT mod a 4090 to 1000W.
 
Test to destruction.

Raise the power until the thing no longer works.
What exactly is the cause of death? Too much voltage, and therefore heat? I could also see the coolers not being able to handle much more than they're designed for, but it does seem like 4090 coolers in particular are overbuilt, e.g. the Asus ROG Strix 4090 running at only 56.4°C (according to TH's review). Surely a card like that could theoretically handle a hefty overclock and stay cool.
 
What exactly is the cause of death? Too much voltage, and therefore heat? I could also see the coolers not being able to handle much more than they're designed for, but it does seem like 4090 coolers in particular are overbuilt, e.g. the Asus ROG Strix 4090 running at only 56.4°C (according to TH's review). Surely a card like that could theoretically handle a hefty overclock and stay cool.
Not just heat, but actual voltage.

Push too much voltage through something, and it may break.
 
  • Like
Reactions: Order 66
In that particular LTT video they were using a leaked pre-release BIOS that had a 1000W limit. They didn't actually push it that far; that would have killed it. That BIOS also had no over-temperature protection, so it was quite dangerous. They also installed a module that allows direct control of the VRMs through I2C (a serial communications protocol) so that they could directly change the voltage to the GPU core.

Getting a power-modded BIOS for an L40S is probably not a thing. Tricking it with a shunt mod might be possible, but I don't think that works on Ada; they fixed that little loophole. Probably the 'easiest' route would be swapping out the whole power delivery to the GPU. It would be stupid to do that with such an expensive card when you could just get a 4090 and do pretty much the same thing.

Like any electronics, there are limits in all things. Keep in mind that current silicon lithography operates on a very tiny scale. That is why chips run at 0.8 to 1.1 volts; they can't take much more without a complete breakdown of the junctions and gate oxide inside the transistors (modern chips use MOSFETs, not the PNP/NPN bipolar transistors of older designs). Once a transistor can't 'switch' any more, it won't work and basically becomes a lump of metal. If that happens anywhere critical to the GPU, it stops working.

Too much power, voltage, or temperature can straight-up crack the silicon. Arcing can occur, and the chip can actually explode.
 
  • Like
Reactions: Order 66