Nvidia GeForce RTX 3080 Founders Edition Review: A Huge Generational Leap in Performance

"And last but not least, there's no VirtualLink port this round — apparently, VirtualLink is dead. RIP. "

The link leads to an unrelated page about the best processors of 2020. I searched for "virtual" in that article and got no results, so I believe the link is wrong.
Fixed. Weird, because I use an internal linking tool so it should have grabbed the right link. I've checked now (in our CMS) and it says it has the right link. But just in case it messes up again: https://www.tomshardware.com/news/virtuallink-is-dead
 
Jarred, I'll start by saying that your review is absolutely spectacular! You have definitely outdone yourself!

I do have a comment to make, but it's not about your review; it's about the product itself. You said something about the VRAM that I 100% agree with, but there is a problem with our kind of thinking:

"If you're worried about 10GB of memory not being enough, my advice is to just stop. Ultra settings often end up being a placebo effect compared to high settings — 4K textures are mostly useful on 4K displays, and 8K textures are either used for virtual texturing (meaning, parts of the texture are used rather than the whole thing at once) or not used at all. We might see games in the next few years where a 16GB card could perform better than a 10GB card, at which point dropping texture quality a notch will cut VRAM use in half and look nearly indistinguishable. "

On a "FLAGSHIP" card like this that is hailed as a card that's specifically for gaming at 2160p, should people have to turn their textures down because nVidia didn't give the card the 12-16GB of VRAM that it should have? Sure, it doesn't matter, I agree with that, but I also think that nVidia shouldn't have made it necessary because this is their "FLAGSHIP" card. Your thoughts?
The biggest issue with 8K textures is that to actually use them ... you need an 8K display. The basic MIPMAPPING algorithm goes like this:

  1. Check on-screen pixel size of polygon
  2. Select closest MIPMAP that's one step above the pixel size

So, if you have a 1920x1080 display, the maximum width of a texture would be 1920 pixels (unless the game allows you to get so close that you're effectively seeing half of a texture spread over the entire screen -- not really a useful metric). Which means at most, playing at 1080p, a game will use a 2K texture. But the vast majority of textures are going to be further back and only cover a fraction of the display, so they'll use 1K or 512 or 256 or even 128 size textures.
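As a rough illustration of step 2 (a minimal sketch; the function name and the 1080-pixel polygon are just made-up examples, not how any particular engine implements it), mip selection boils down to walking down the chain until the next level would be smaller than the polygon's on-screen footprint:

```python
def select_mip_level(base_texture_size: int, on_screen_pixels: int) -> int:
    """Pick the mip level whose resolution is the first one at or above
    the polygon's on-screen size (mip 0 = the full-resolution texture)."""
    size = base_texture_size
    level = 0
    # Step down the mip chain while the next level would still cover
    # the polygon's on-screen footprint.
    while size // 2 >= on_screen_pixels and size > 1:
        size //= 2
        level += 1
    return level

# An 8192-pixel (8K) texture on a wall that covers 1080 screen pixels:
# the chain is 8192, 4096, 2048, 1024, ... and 2048 is the first level
# still at or above 1080, so mip level 2 is what actually gets sampled.
print(select_mip_level(8192, 1080))  # -> 2
```

In other words, on a 1080p display the 8192- and 4096-pixel levels of that texture never get sampled; they mostly just take up space.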

To get 4K textures to be useful, you need a 4K screen. Most of the textures will still be 2K, 1K, or lower. But the close ground/wall textures (and maybe sky, depending on how it's done) could use the 4K textures. If you have a 4K screen with 8K textures, the 8K textures get loaded into system RAM but shouldn't end up being used on any objects, because an object would never cover enough of the display that an 8K texture would be needed.

This is why, if a game actually tells you the base texture size it's using (2K, 4K, 8K) and you compare screenshots at 4K using all of the texture options, there's almost no visible difference between 2K and above. Also keep in mind that it's possible to use a 1K texture on a 2K polygon and still not see much of a difference. And if a game were to implement some form of smart texture upscaling (put those tensor cores to use doing something else!) it could potentially reduce the memory footprint while delivering similar quality.
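To put numbers on why texture resolution is what eats VRAM, here's a quick back-of-envelope sketch. The 4 bytes per texel (uncompressed RGBA8) and the one-third mip-chain overhead are simplifying assumptions; real games use block compression, which cuts these figures by roughly 4-8x:

```python
def texture_mib(size_px: int, bytes_per_texel: int = 4, with_mips: bool = True) -> float:
    """Approximate VRAM footprint of one square texture.
    A full mip chain adds roughly one third on top of the base level."""
    base = size_px * size_px * bytes_per_texel
    total = base * 4 / 3 if with_mips else base
    return total / (1024 ** 2)

for size in (1024, 2048, 4096, 8192):
    print(f"{size}x{size}: ~{texture_mib(size):.0f} MiB uncompressed")
# 1024x1024: ~5 MiB, 2048x2048: ~21 MiB, 4096x4096: ~85 MiB, 8192x8192: ~341 MiB
```

Each step down in base resolution cuts a texture's footprint by 4x, which is why dropping the texture setting one notch frees up so much memory while looking nearly the same on a 4K display.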
 
Wait, isn't Variable Refresh Rate just GSync for Nvidia (with grudging support for FreeSync)? Or is there something more involved?
Yeah, I think he just doesn't believe VRR works with HDMI 2.1 or something? It's part of the spec, so it should work. But the only HDMI 2.1 displays tend to be 4K or 8K TVs, which most people don't have. G-Sync is Nvidia tech for it, FreeSync is AMD's knock-off version that's royalty free, and Adaptive Sync is basically FreeSync without the AMD branding -- so Nvidia always says Adaptive Sync or G-Sync Compatible and never FreeSync Compatible.
 
I did not see any VR tests. I only do VR gaming now and don't really get why anyone would need this card otherwise... I can barely see the difference between 2K and 4K in pancake games now.
 
Oh yes, I completely agree with you about the textures. It's not a screen-image thing, it's a product-image thing. Look at it from the perspective of a consumer who doesn't know better the way you and I do.

So, nVidia released their new "flagship" GPU, but it has less VRAM than their top card from the previous generation. To that consumer, it looks like they deliberately sandbagged the card, because a new flagship shouldn't have less VRAM than its predecessor. As a semi-ignorant consumer, you'll be turned off by this, because all you know is that it has less of something important than the card that came before it. You'll still buy it, of course, but you might also join the chorus of ignorant sheep bleating loudly about nothing.

As a tech expert, I'm sure that you find few things to be as annoying as some noob whining about something completely irrelevant. I know that it drives me absolutely bonkers sometimes and I just wish that they'd shut up. Like that dumb lawsuit against AMD about what constitutes a CPU core. Anyone with half a brain and some knowledge in tech knows that a CPU core is an integer core, but here we have idiots using up taxpayer money in court. It's dumb, it's wrong and it's irrelevant but it's nevertheless a thing.

So yes, I know that it doesn't matter and you know that it doesn't matter. I'm sure that filling a 10GB frame buffer with less than 8K textures would be borderline impossible (which is why I'm not the least bit worried about the "only" 8GB of GDDR6 in my RX 5700 XT). However, there are a bunch of people whining about the 10GB frame buffer being too small. I'm just wondering if it would have been wiser for nVidia to just put 12GB on it if only to prevent this very thing from happening. They must have seen it coming.

Then, instead of bleating sheep, we'd have Silence of the Lambs. :ROFLMAO:
 
If anything, less-informed customers will either think the 2080 Ti is better because it's still more expensive, or that the 3080 is better because it has a higher model number. Or they'll think one card is better because it clocks higher than another, or because it has more buzzwords on the box.

There are a dozen-plus factors that less-informed customers will use to decide that one product is better than another.
 
I agree with you, but this would be one less reason and that's not a bad thing.
 

King_V

Then buy my gently used ZXXX TRIPLE MAX 10,000!!! Only $3,495.99.

Please note that it was an experimental prototype that was never supposed to be publicly available, AND it was secretly disguised as a GTX 1080 FE to throw off leakers.
 
I agree, but bigger numbers ALWAYS sound better, and it's not like nVidia can't afford it. The most valuable thing a company can buy is consumer goodwill. ;)
In this instance, they may not have been able to. 12GB of VRAM requires 12 memory channels to be fully functional, and going by the 3090's specs, 12 memory channels appears to be the full complement. Requiring all 12 channels would reduce yields and further strain supply, which, as we know, appears to be pretty tight at the moment. Plus, considering the number of GPCs that are disabled on the 3080, there's a question of whether the GPU would even make the most of it.

Or NVIDIA could just give it 12GB and run into another GTX 970 situation (or, more accurately, a GTX 660 Ti situation); I'm sure people would love to have 2GB segmented off. Using 2 GiB chips instead of 1 GiB chips could also be an option, but considering that normal GDDR6 runs about $10 per GiB at volume pricing, and assuming GDDR6X is somewhere between 1.5x and 2.0x that, we could've seen a 3080 with a launch price more like $850-$1,000, and we'd have Turing pricing all over again.
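To put rough numbers on that, here's a quick sketch. The per-GiB price and the GDDR6X premium are the assumptions from the paragraph above (not published pricing), the one-chip-per-32-bit-channel layout is the simple non-clamshell case, and the card names are just labels for hypothetical configurations:

```python
CHANNEL_WIDTH_BITS = 32          # one GDDR6/GDDR6X channel
GDDR6_COST_PER_GIB = 10.0        # assumed volume price, per the post above
GDDR6X_MULTIPLIER = (1.5, 2.0)   # assumed premium over plain GDDR6

def memory_config(channels: int, gib_per_chip: int):
    """Capacity, bus width, and rough GDDR6X cost with one chip per channel."""
    capacity = channels * gib_per_chip
    bus_width = channels * CHANNEL_WIDTH_BITS
    low = capacity * GDDR6_COST_PER_GIB * GDDR6X_MULTIPLIER[0]
    high = capacity * GDDR6_COST_PER_GIB * GDDR6X_MULTIPLIER[1]
    return capacity, bus_width, low, high

for name, channels, gib in [("RTX 3080 as shipped", 10, 1),
                            ("hypothetical 12GB 3080", 12, 1),
                            ("hypothetical 20GB 3080", 10, 2)]:
    cap, bus, lo, hi = memory_config(channels, gib)
    print(f"{name}: {cap} GiB on a {bus}-bit bus, ~${lo:.0f}-${hi:.0f} in memory alone")
```

Doubling to 20GB roughly doubles the memory bill before the GPU, board, and margin are counted, which is where the Turing-style pricing worry comes from.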

I don't get this fetish for more VRAM when there's no dire need for it.
 
I completely agree with you because you and I are knowledgeable enough to think rationally about it. I don't think that it's worth it either but that's because we know better.

Wayyyy back in the day, I bought a Palit GeForce 8500 GT as a stop-gap replacement for my PC. If you can imagine, it had 1GB of DDR3 VRAM. There's no way that card could use more than 256MB, but it was on sale for the same price as the 256MB model, so I bought it; I never would've paid more for it. The next card on the market with a full 1GB of VRAM was the ATi Radeon HD 4870, a card in a whole other universe of performance from the 8500 GT. They put the 1GB on there to attract the uninformed, and I'm certain there was literally no advantage to that much VRAM on such a weak card. However, I'm quite certain that nVidia scammed quite a few people into paying more for it.

The same kind of thing happened when some of AMD's AIB partners offered the RX 480 and RX 580 with 8GB of VRAM when the cards could barely use 4GB (zero performance difference in benchmarks and games). That's how little the public actually knows.
 

Phaaze88

Now there's another problem with the RTX 3080 and it has nothing to do with availability. Apparently, when any of the AIB custom models hit 2GHz, the drivers crash. These are nVidia cards, right? LOL
I have to question how hot the cards were running.
With Nvidia's GPU Boost, temperature = stability. The cooler the cards run, the better, both for reaching higher core frequencies and for sustaining them, assuming the power limits don't cause further hiccups.
 

ddferrari

First, thanks for your efforts and hard work reviewing the card.

Now to the serious stuff:

1- No 8K benchmarks? COME ON!!! This card should be tested at 8K as well. You tested the GTX 1080 Ti at 4K, and this card is better at 8K than the 1080 Ti was at 4K. I don't care if it shows 30 fps, it should be benchmarked at 8K.

2- Why didn't you include memory usage in each benchmark? VRAM usage should be part of ANY benchmark table at ANY resolution from now on. Add it! Make it min/max memory usage!

3- You are a REVIEW site. You claim the VRAM a game uses is not the actual memory it needs, and that some of it is caching. FINE, TEST IT. TEST IT, we won't take your word on it, and we won't take "just buy it" advice anymore. It is EASY to test: you have 8GB cards, 10GB cards, and 11GB cards, so you can find the spot where a game slows DOWN and you can test how much VRAM is REALLY needed (see the sketch right after this list).

4- No, we won't "just stop" and we won't "just buy it".

DO YOUR HOMEWORK AND TEST MEMORY USAGE, or we will move to another review site.

5- Funny that you did not mention the RTX 3070 Ti with 16GB of VRAM leaked by accident in Lenovo documents? And you still say stop it and buy the 10GB RTX 3080?
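For what it's worth, logging allocated VRAM during a benchmark run is easy to do at home. A minimal sketch, assuming the nvidia-smi CLI that ships with the NVIDIA driver is on PATH, and keeping in mind it reports memory the game has allocated, which is only an upper bound on what it actually needs:

```python
import csv
import subprocess
import time

def sample_vram_mib() -> int:
    """Return the currently used VRAM (MiB) of the first GPU via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    # Poll once per second while the game or benchmark runs; Ctrl+C to stop.
    with open("vram_log.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "vram_used_mib"])
        start = time.time()
        while True:
            writer.writerow([round(time.time() - start, 1), sample_vram_mib()])
            f.flush()
            time.sleep(1)
```

Even so, allocation alone doesn't settle the argument; the real test is comparing frame times on 8GB, 10GB, and 11GB cards at identical settings, which is exactly what point 3 is asking for.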
Stop saying "we". Don't drag the rest of us into your silly rant.

How about a nice cup of reality?

A) You don't own an 8K monitor, nor will you in the foreseeable future.
B) You don't get to complain about something that is free, as in this website.
C) Do your OWN homework if you want answers to questions that the rest of us wouldn't waste our breath on.

Everyone LOVES a cocky NOOB!
 

nofanneeded


A) I am planning to get an 8K TV soon. This one ($2,450):

https://www.bhphotovideo.com/c/product/1483338-REG/samsung_qn55q900rbfxza_q900_55_class_hdr.html

B) This site is not free. You pay for it in ads, clicks, and views. Every product you buy has ad money built into it, from water bottles to cars.

C) I don't care what you want. But this is a review site, and it should review all possibilities.
 

mac_angel

Wait, isn't Variable Refresh Rate just GSync for Nvidia (with grudging support for FreeSync)? Or is there something more involved?

Variable Refresh Rate is one of the new features in the HDMI 2.1 specification. For a company to say a product is HDMI 2.1, or any other generation, it has to support the technologies the HDMI Forum has standardized for that generation, much like the standards for USB, PCIe Gen 3 or Gen 4, etc.
Companies can add newer features to lower-generation products and then advertise that feature specifically, but they can't get the certification and put it on the box/website/advertisements without supporting everything specified for that generation. Samsung did this with their 2019 TVs: the RU8000 (55" and up) and their Q-series TVs were all HDMI 2.0 (maybe "b", I'm not sure) but also had support for VRR; you could turn on "FreeSync" in the settings when in Game mode. For some reason, Samsung ditched FreeSync/VRR on all of this year's versions of the same TVs.
So many displays and TVs that support VRR/FreeSync should work properly with Nvidia's 30-series GPUs. But for some reason no one has tested it yet, even though a lot of people have been asking about it, especially since you can get a 55" 4K HDR TV with pretty low input lag for a lot less than an actual monitor; people wouldn't have to spend $4,000+ on the Alienware and similar displays.