News: Nvidia GeForce RTX 3080: Everything We Know

Not only does Nvidia not buy back old stock; if they have extra chips, they force board partners to buy them if they want larger allotments of the new chips.

Nvidia putting the squeeze on AIB partners

"Claims"

A report by Taiwanese analysts at Digitimes claims that graphics card makers are being pressured into doing more than their share for Nvidia.

This is one.

Two, that was at the time of the cryptocurrency demand spike. Of course it's a different issue when you demand Nvidia make three times the normal number of chips... you can't then tell them, "oh, take them back."

Don't bring the cryptocurrency disaster into it; that was a special case. And don't bring up "claims."

Moreover, you also said Intel and AMD do the same; that is not true at all.

Huge retailers make return agreements with chip makers for a credit, but at the same time they need to order large volumes for that kind of contract.
 
I still don't see any significant price drop on the RTX series. The 2080 Super is just €30 down and the new cards are out in two weeks!
Anyone buying a graphics card right now must be totally crazy!

If the 3080 is the same price as the 2080 at double the performance, does that mean 20-series prices will drop 50%?
It already has. Sell GPU is only offering 50% of the original cost for a 2080S. But there just isn't a large overhang of top 2000-series GPUs at retailers. My local Microcenter only has refurbished 2080 Tis, and very limited inventory of 2080s, including only EVGA waterblock-equipped cards and Founders Editions that weren't selling.
 
The Nvidia 3080 uses DisplayPort 1.4a - so, the output bandwidth will not change from what was available 5 years ago.
It does support HDMI 2.1 - so, it will be interesting to see if that provides a significant improvement.

I have been running 1080p 60fps 8-bit for the last 9 years using an Nvidia GTX 690.
I would like to upgrade to 4k 120fps 10-bit. I'm not sure I will do that until Nvidia graphics cards support DisplayPort 2.0.
 
I talked about this more with the RTX 3090, but I suspect real-world gaming performance is going to be limited by the CPU and rest of the system -- even at 4K.

Earlier TH made the statement that at least a 9700K was needed to get peak performance. From the reviews you posted for Flight Sim, etc., you show a 9600K as one of the top CPUs in gaming performance.

Thus my question: is this a CPU frequency issue, i.e. is better-than-5GHz speed the issue, or is it a core/thread limit? In all the gaming CPU rankings, the six-core 9600K and the 10600K overclocked to ~5GHz rank just a smidgen below the overclocked 9900K or 10900K.

If this is a frequency issue, then the six-core "K" chips should continue to rank as top gaming CPUs on the 3000-series Nvidia GPUs. Or is six cores, no matter the frequency, going to start being a limiting factor?
 
The Nvidia 3080 uses DisplayPort 1.4a - so, the output bandwidth will not change from what was available 5 years ago.
It does support HDMI 2.1 - so, it will be interesting to see if that provides a significant improvement.

I have been running 1080p 60fps 8-bit for the last 9 years using an Nvidia GTX 690.
I would like to upgrade to 4k 120fps 10-bit. I'm not sure I will do that until Nvidia graphics cards support DisplayPort 2.0.
DisplayPort 1.4 already supports 4K 120/144Hz and 8K 60Hz. It's going to be a while before we see any DisplayPort 2.0 displays. None have even been announced yet. The capabilities of DP 1.4a are good enough for the hardware we have today.
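For anyone who wants to sanity-check the raw numbers, here's a rough back-of-the-envelope estimate (a sketch only; the blanking overhead and link payload figures are approximations, and DSC changes the picture entirely) of what 4K 120Hz 10-bit needs versus what DP 1.4a, HDMI 2.1, and DP 2.0 can carry:

```python
# Rough, uncompressed-RGB bandwidth estimate (illustrative figures, not a spec reference).

def required_gbps(h_active, v_active, refresh_hz, bits_per_channel,
                  h_blank=80, v_blank=120):
    # Assumes reduced-blanking-style timings; real values come from the display's EDID.
    pixels_per_second = (h_active + h_blank) * (v_active + v_blank) * refresh_hz
    return pixels_per_second * bits_per_channel * 3 / 1e9  # RGB, no subsampling

# Approximate usable payload after link-layer encoding overhead:
DP_1_4A = 25.92   # HBR3 x4 lanes (32.4 Gbps raw, 8b/10b encoding)
HDMI_2_1 = 42.67  # FRL 48 Gbps raw, 16b/18b encoding
DP_2_0 = 77.37    # UHBR20 x4 lanes (80 Gbps raw, 128b/132b encoding)

need = required_gbps(3840, 2160, 120, 10)
print(f"4K 120Hz 10-bit RGB: ~{need:.1f} Gbps needed")
print(f"DP 1.4a:  ~{DP_1_4A} Gbps -> needs DSC or chroma subsampling")
print(f"HDMI 2.1: ~{HDMI_2_1} Gbps -> fits uncompressed")
print(f"DP 2.0:   ~{DP_2_0} Gbps -> plenty of headroom")
```

In other words, for that specific target, DP 1.4a has to lean on DSC (or drop to 8-bit / 4:2:2), while the 3080's HDMI 2.1 output can carry it uncompressed.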
 
Is DP 2.0 a GPU issue or is it a monitor issue?
Well, yes - good point.

I'm not saying this is an Nvidia issue. It is a technology industry issue.

My take is that the Nvidia 3080/3090 is the technology industry's very first tiny step toward supporting 4K, 120fps, 10-bit color.
Obviously you have to have DisplayPort 2.0 and HDMI 2.1 to do that.

I'm guessing that Nvidia 3080/3090 isn't quite powerful enough to support DisplayPort 2.0 - so, maybe that will come next year.

I've been waiting since 2012 for the next big step up in video and I'm thinking I should wait 1 more year.
I would have never thought I would be running the equivalent of DisplayPort 1.4a for 10 years.
 
DisplayPort 1.4 already supports 4K 120/144Hz and 8K 60Hz. It's going to be a while before we see any DisplayPort 2.0 displays. None have even been announced yet. The capabilities of DP 1.4a are good enough for the hardware we have today.

Yes, I absolutely agree.

It's just me having to decide when to upgrade. And yes, DP 1.4a is almost there. But I decided about 5 years ago that I was absolutely happy running 1080p at 60fps and 4K at 30fps, and I needed to decide what would make me upgrade my 2012 PC. That was 4K at 120fps with 10-bit color. And it looks like I'll wait until next year, when Nvidia will probably start supporting DisplayPort 2.0.
 
Earlier TH made the statement that at least a 9700K was needed to get peak performance. From the reviews you posted for Flight Sim, etc., you show a 9600K as one of the top CPUs in gaming performance.

Thus my question: is this a CPU frequency issue, i.e. is better-than-5GHz speed the issue, or is it a core/thread limit? In all the gaming CPU rankings, the six-core 9600K and the 10600K overclocked to ~5GHz rank just a smidgen below the overclocked 9900K or 10900K.

If this is a frequency issue, then the six-core "K" chips should continue to rank as top gaming CPUs on the 3000-series Nvidia GPUs. Or is six cores, no matter the frequency, going to start being a limiting factor?
It depends on the game. Some games prefer no Hyper-Threading / SMT. Others work better with SMT enabled. Hitman 2 DX12 mode is a good example of a game that actually scales with more threads, not just more cores. There are a lot of games that still don't scale much beyond 6-core, almost none scale beyond 8-core. The Core i9-10900K is the fastest CPU overall in gaming benchmarks, but some of that is certainly from the higher clocks. If you had a 10-core/10-thread Intel Comet Lake chip, running at the same clocks, it would almost certainly come out ahead overall.

The problem is that games and game engines change. Lower level APIs are usually better than DX11 at scaling with higher core/thread counts. We're seeing more games and game engines that support DX12 or Vulkan. But we also still see games (e.g., MS Flight Simulator) that launch without DX12 or Vulkan, and suffer for it in terms of CPU scaling.

Realistically, if you're willing to overclock, the Core i7-9700K, or maybe even an i7-10700K but with Hyper-Threading disabled, will probably come out as the overall fastest CPUs for gaming today. i9-10900K is probably a few percent slower, depending on what games you test.
 
Well then, the 3080 in SLI is the answer to this, not paying 114% more for 20% more performance... the worst SLI scenario will give you 50% more, if not up to 90% more.

No wonder Nvidia did not allow SLI for the 3080...

I know developers are not bothering to optimize their games for SLI, but that is because Nvidia's policies are not encouraging it.
Neither Nvidia nor AMD are really encouraging it. And it doesn't make financial sense for developers to pour money/effort into it, since only a tiny fraction of users have or even consider an SLI setup.
 
Neither Nvidia nor AMD are really encouraging it. And it doesn't make financial sense for developers to pour money/effort into it, since only a tiny fraction of users have or even consider an SLI setup.

But Nvidia is still allowing NVLink on the RTX 3090 and not on the 3080... this is just pure marketing. They know that no one would pay double for an RTX 3090 and ignore SLI on an RTX 3080.
 
Well then, the 3080 in SLI is the answer to this, not paying 114% more for 20% more performance... the worst SLI scenario will give you 50% more, if not up to 90% more.
Not really. The worst-case SLI scenarios are:
  1. games refuse to load or crash
  2. worse and less stable frame rates than disabling SLI/CF to run single-GPU in many games
  3. frequent stutters and freezes

Ongoing problems with getting SLI and CrossFire to work properly across a decent selection of games, plus the fact that games typically need some degree of explicit support to achieve decent scaling, are the main reasons both AMD and Nvidia have mostly given up on pushing their house-brand driver-level multi-GPU in favor of letting game developers use generic DX12/Vulkan explicit multi-GPU... and most game devs haven't bothered with that either, since a single GPU is more than good enough to play their games mostly maxed out at the developers' target resolution.

But Nvidia is still allowing NVLink on the RTX 3090 and not on the 3080... this is just pure marketing. They know that no one would pay double for an RTX 3090 and ignore SLI on an RTX 3080.
Nvidia is 'allowing' NVLink on the 3090 because if you want more performance than that, you have nowhere else to go but multi-GPU.

AMD and Nvidia have scrapped multi-GPU support on their lower-end GPUs because SLI and CF cause too many user experience issues. Almost everybody here will tell you that a single faster GPU is preferable over a bunch of cheaper GPUs of higher theoretical aggregate throughput due to the single GPU route being far more predictable and trouble-free.
 
Yes, I absolutely agree.

It's just me having to decide when to upgrade. And yes, DP 1.4a is almost there. But I decided about 5 years ago that I was absolutely happy running 1080p at 60fps and 4K at 30fps, and I needed to decide what would make me upgrade my 2012 PC. That was 4K at 120fps with 10-bit color. And it looks like I'll wait until next year, when Nvidia will probably start supporting DisplayPort 2.0.
You're probably looking at 2 years before Nvidia implements DP 2.0. Next year will be a mid-generation refresh, and Nvidia doesn't typically make any architectural or hardware changes for those.

HDMI 2.1 displays are already on the market, and HDMI 2.1 will be the dominant format going forward. There is nothing tangibly better about DP 2.0 that's worth waiting for when HDMI 2.1 is already supported.
 
But Nvidia is still allowing NVLink on the RTX 3090 and not on the 3080... this is just pure marketing. They know that no one would pay double for an RTX 3090 and ignore SLI on an RTX 3080.
The 3090 has NVLink support for compute tasks and for competitive benchmarking, and that's about it. It will theoretically support gaming, but you will not see any SLI profiles from Nvidia, nor will game developers support it, making it largely worthless for gaming. Ray tracing has probably killed off SLI for good in gaming.
 
I talked about this more with the RTX 3090, but I suspect real-world gaming performance is going to be limited by the CPU and rest of the system -- even at 4K. The games used for testing are also going to impact performance gains, because a lot of games simply don't scale beyond a certain point. In Control, for example, if you double your GPU performance you probably only get ~50-75% more fps.

But we also know Nvidia has reworked the underlying architecture, and it's possible that in some ways it gets less benefit from higher TFLOPS. I mean, that would be weird and I don't think that's the case, because it would imply lower efficiency and performance per watt on some level.

Very likely it's just the games and system bottlenecks, though. There will be games where performance is going to scale more with TFLOPS, so maybe RTX 3090 ends up twice as fast as Titan RTX in some cases. And other games won't scale as well and it will only be 30-50% faster.
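To put rough numbers on that, here's a toy frame-time model (a sketch with made-up example figures, not measurements from any game): if the CPU/system portion of each frame stays fixed, a 2x faster GPU gets you well under 2x the fps.

```python
# Toy frame-time model: what happens to fps when only the GPU portion gets faster.
# The 60 fps / 80% GPU-bound numbers below are made-up examples, not measured data.

def fps_after_gpu_upgrade(base_fps, gpu_bound_fraction, gpu_speedup):
    frame_ms = 1000.0 / base_fps
    cpu_ms = frame_ms * (1.0 - gpu_bound_fraction)       # unchanged by the new GPU
    gpu_ms = frame_ms * gpu_bound_fraction / gpu_speedup  # scales with GPU throughput
    return 1000.0 / (cpu_ms + gpu_ms)

new_fps = fps_after_gpu_upgrade(base_fps=60, gpu_bound_fraction=0.80, gpu_speedup=2.0)
print(f"{new_fps:.0f} fps")  # ~100 fps, i.e. only ~67% more from a 2x faster GPU
```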

I think I have found the answer to this. It is the change to 2 operations per cycle vs 1 operation per cycle in the previous architecture. While the theoretical performance is there, it is probably not immediately applicable in current games. It could be exploited in the future however, with new code designed around this new architecture.
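For what it's worth, the headline numbers are consistent with that: theoretical FP32 TFLOPS is just shader (FMA) units x 2 ops x clock, and Ampere doubled the FP32 units per SM, so the paper figure roughly doubles even when game frame rates don't (a sketch using Nvidia's published shader counts and boost clocks):

```python
# Theoretical FP32 throughput: FMA units x 2 ops per FMA per clock x boost clock.
# Shader counts and boost clocks are the official spec-sheet values.

def fp32_tflops(fma_units, boost_ghz):
    return fma_units * 2 * boost_ghz / 1000.0

print(f"RTX 2080 Ti: {fp32_tflops(4352, 1.545):.1f} TFLOPS")   # ~13.4
print(f"RTX 3080:    {fp32_tflops(8704, 1.710):.1f} TFLOPS")   # ~29.8
print(f"RTX 3090:    {fp32_tflops(10496, 1.695):.1f} TFLOPS")  # ~35.6
```

Whether games can actually keep all of those FP32 units busy is a separate question, which is the point being made above.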
 
Great! Looks like I'll finally be upgrading my GTX 1080. Then I read this:

"if you're eyeing the RTX 3080 you should probably have at least a Core i7-9700K or better CPU first "

I have an i7-7700K on a pretty awesome 2017-2018 build. I'm sure I'll be fine at 1440p 144Hz.
 
Just buy it, guys, just buy it... (here we go again)

780 Ti, 980 Ti, 1080 Ti, 2080 Ti, no 3080 Ti? I DON'T THINK SO.

Moreover, when the 3080 Ti is released for $999, a 3090 Ti/Super will be released for $1,499 with 20% more speed than the 3090, and the 3090 will be canceled.

That is, the 3080 Ti will be a 3090 but with 12GB VRAM, and the 3090 Ti will be 20% faster than the 3080 Ti/3090, with 24GB VRAM.

This is my prediction.

I was all in for a 3090, then put my thinking cap on and read the AMA on Reddit about the Ampere cards. The 3090 is essentially a Titan re-branded. Unless you're trying to play games at 8K or doing intensive graphics work rather than gaming, the 3080 is supposed to be the card you want. This is where I fall right in line with your comment. At this point, waiting out AMD's offering and a possible 3080 Super or Ti seems like the sensible idea.

Essentially, with the 3090 all the extra cores only help at higher resolutions, where the 3080 is plenty for 4K; with a boost in GPU frequencies, waiting for the Ti will land you better framerates, and the price range is a lot better. The last time I bought a GPU was a GTX 1080, only to learn about the 1080 Ti; as good as the 1080 was, the 1080 Ti is what I needed at the time, and I was always let down by the difference between the two and felt I should have waited.

The "flagship" title they are giving the 3080 for gaming is misleading. I won't be duped again. At least at this point I can hold off building a new PC around the Ampere generation, to make sure I get the full benefit of it, PCIe Gen 4, and good I/O for the DirectStorage benefits the upcoming consoles are getting.
 
Great! Looks like I'll finally be upgrading my GTX 1080. Then I read this:

"if you're eyeing the RTX 3080 you should probably have at least a Core i7-9700K or better CPU first "

I have an i7-7700K on a pretty awesome 2017-2018 build. I'm sure I'll be fine at 1440p 144Hz.
An i7-7700K is generally going to be slower than a Core i5-9600K. We've done various tests where we have both 9900K and 9600K in the CPU performance results. With the higher performance being offered by Ampere, running a 1440p display is going to be similar to a 2080 Ti running at 1080p. Basically, you're more likely to hit CPU bottlenecks. If you're overclocked to 5.0GHz, you'll reduce that a bit, but long-term I expect more games to start hitting CPU bottlenecks with anything below the i7-9700K.

The good news is you can easily transfer the GPU to a new PC if you determine the CPU is holding you back. But I'll be testing the 3080/3090 with 9900K, 3900X, possibly 10900K ... and if I have time (probably a follow up article) i5-9600K.
 
An i7-7700K is generally going to be slower than a Core i5-9600K. We've done various tests where we have both 9900K and 9600K in the CPU performance results. With the higher performance being offered by Ampere, running a 1440p display is going to be similar to a 2080 Ti running at 1080p. Basically, you're more likely to hit CPU bottlenecks. If you're overclocked to 5.0GHz, you'll reduce that a bit, but long-term I expect more games to start hitting CPU bottlenecks with anything below the i7-9700K.

The good news is you can easily transfer the GPU to a new PC if you determine the CPU is holding you back. But I'll be testing the 3080/3090 with 9900K, 3900X, possibly 10900K ... and if I have time (probably a follow up article) i5-9600K.

Can you please test Threadripper as well? We need to know how it performs with the RTX 3090.

3960X or 3970X
 
NVIDIA is very good at playing Psych. 101 with these prices. They produce an outlandishly expensive card (RTX 3090) that many laugh at, saying "That card is way too expensive but I won't be fooled! I'll just buy the $700 card! HA!"

NVIDIA is making people feel that THEY, the consumers, got the good deal by spending ONLY $700 on a video card!?! They are trying to 'normalize' a $700 GPU purchase. Absolutely ridiculous, IMHO.

I was really hoping that AMD's Big Navi cards would come close to the 3080 in performance to force NVIDIA to drop prices further, but some of the latest info (still rumors, of course) says they won't even match the 3070.
 
NVIDIA is very good at playing Psych. 101 with these prices. They produce an outlandishly expensive card (RTX 3090) that many laugh at, saying "That card is way too expensive but I won't be fooled! I'll just buy the $700 card! HA!"

NVIDIA is making people feel that THEY, the consumers, got the good deal by spending ONLY $700 on a video card!?! They are trying to 'normalize' a $700 GPU purchase. Absolutely ridiculous, IMHO.

I was really hoping that AMD's Big Navi cards would come close to the 3080 in performance to force NVIDIA to drop prices further, but some of the latest info (still rumors, of course) says they won't even match the 3070.
When the 1080 Ti was announced at $700 three and a half years ago, it was met with genuine surprise, as the consensus was that it would be more expensive since it was expected to perform similarly to the $1,000 Titan. The 1080 Ti was universally praised in reviews for being a great value, and will go down as one of the best cards of all time. You're 3.5 years late on the "a $700 card from Nvidia is a great deal" observation.
 
When the 1080 Ti was announced at $700 three and a half years ago, it was met with genuine surprise, as the consensus was that it would be more expensive since it was expected to perform similarly to the $1,000 Titan. The 1080 Ti was universally praised in reviews for being a great value, and will go down as one of the best cards of all time. You're 3.5 years late on the "a $700 card from Nvidia is a great deal" observation.
I'm not really sure the 1080 Ti could be called "normalized," though. It was the absolute best you could get (with the Titan always being a niche within a niche).