What I really meant to say was: "The 3070 is plenty good enough for you, so leave the 3080s so they are available for people who really need them".

If I were thinking about a 3070, I would buy the 3080 anyway...
I still can't find anything online about the RTX 3080 and Variable Refresh Rate or the other new tech they were advertising.
Fixed. Weird, because I use an internal linking tool so it should have grabbed the right link. I've checked now (in our CMS) and it says it has the right link. But just in case it messes up again: https://www.tomshardware.com/news/virtuallink-is-dead

"And last but not least, there's no VirtualLink port this round — apparently, VirtualLink is dead. RIP."
The link is leading to an unrelated page about the best processors 2020. I searched for "virtual" in that article and there were no results, so I believe the link is wrong.
Jarred, I'll start by saying that your review is absolutely spectacular! You have definitely outdone yourself!
I do have a comment to make but it's not about your review, it's about the product itself. You were saying something about the VRAM that I 100% agree with but there is a problem with our kind of thinking:
"If you're worried about 10GB of memory not being enough, my advice is to just stop. Ultra settings often end up being a placebo effect compared to high settings — 4K textures are mostly useful on 4K displays, and 8K textures are either used for virtual texturing (meaning, parts of the texture are used rather than the whole thing at once) or not used at all. We might see games in the next few years where a 16GB card could perform better than a 10GB card, at which point dropping texture quality a notch will cut VRAM use in half and look nearly indistinguishable. "
On a "FLAGSHIP" card like this that is hailed as a card that's specifically for gaming at 2160p, should people have to turn their textures down because nVidia didn't give the card the 12-16GB of VRAM that it should have? Sure, it doesn't matter, I agree with that, but I also think that nVidia shouldn't have made it necessary because this is their "FLAGSHIP" card. Your thoughts?
Yeah, I think he just doesn't believe VRR works with HDMI 2.1 or something? It's part of the spec, so it should work. But the only HDMI 2.1 displays tend to be 4K or 8K TVs, which most people don't have. G-Sync is Nvidia tech for it, FreeSync is AMD's knock-off version that's royalty free, and Adaptive Sync is basically FreeSync without the AMD branding -- so Nvidia always says Adaptive Sync or G-Sync Compatible and never FreeSync Compatible.

Wait, isn't Variable Refresh Rate just GSync for Nvidia (with grudging support for FreeSync)? Or is there something more involved?
Oh yes, I completely agree with you. It's not a screen image thing, it's a product image thing. Look at it like a consumer who doesn't know better like you and I do.

The biggest issue with 8K textures is that to actually use them ... you need an 8K display. The basic MIPMAPPING algorithm goes like this:
So, if you have a 1920x1080 display, the maximum width of a texture would be 1920 pixels (unless the game allows you to get so close that you're effectively seeing half of a texture spread over the entire screen -- not really a useful metric). Which means at most, playing at 1080p, a game will use a 2K texture. But the vast majority of textures are going to be further back and only cover a fraction of the display, so they'll use 1K or 512 or 256 or even 128 size textures.
- Check on-screen pixel size of polygon
- Select closest MIPMAP that's one step above the pixel size
To get 4K textures to be useful, you need a 4K screen. Most of the textures will still be 2K, 1K, or lower. But the close ground/wall textures (and maybe sky, depending on how it's done) could use the 4K textures. If you have a 4K screen with 8K textures, the 8K textures get loaded into system RAM but shouldn't end up being used on any objects, because an object would never cover enough of the display that an 8K texture would be needed.
This is why, if a game actually tells you the base texture size it's using (2K, 4K, 8K) and you compare screenshots at 4K using all of the texture options, there's almost no visible difference between 2K and above. Also keep in mind that it's possible to use a 1K texture on a 2K polygon and still not see much of a difference. And if a game were to implement some form of smart texture upscaling (put those tensor cores to use doing something else!) it could potentially reduce the memory footprint while delivering similar quality.
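If it helps to see the idea in code, here's a minimal sketch of that mip selection logic in Python. The function name and the "on-screen width in pixels" input are just illustrative, not any engine's actual API:

```python
import math

def select_mip_level(texture_size, onscreen_pixels):
    """Pick the mip whose resolution is the closest one at or above the
    polygon's on-screen footprint ("one step above the pixel size")."""
    onscreen_pixels = max(onscreen_pixels, 1)
    max_level = int(math.log2(texture_size))        # smallest mip is 1x1
    if onscreen_pixels >= texture_size:
        return 0                                    # base (full-res) texture
    level = int(math.log2(texture_size / onscreen_pixels))
    return min(level, max_level)

# A wall covering ~960 pixels on a 1080p screen never samples past the
# 1024-wide mip, even from an 8K (8192) source texture:
print(select_mip_level(8192, 960))   # -> 3, i.e. the 1024-wide mip
print(select_mip_level(2048, 960))   # -> 1, also the 1024-wide mip
```

Same visible result either way at that size, which is the whole point: the 4K/8K mips just sit in memory.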
If anything, less-informed customers will either think the 2080 Ti is better because it's more expensive still or the 3080 is better because it's a higher number. Or they think one card is better because it clocks higher than another. Or another card is better because it has more buzzwords on the box.

Oh yes, I completely agree with you. It's not a screen image thing, it's a product image thing. Look at it like a consumer who doesn't know better like you and I do.
So, nVidia released their new "flagship" GPU but it has less VRAM than their top card from the previous generation. This means that they deliberately sandbagged the card because a new flagship card shouldn't have less VRAM than its predecessor. Now, as a semi-ignorant consumer, you'll be turned off by this because all you know is that it has less of something important than the card that came before it. You'll still buy it of course, but you might also join the chorus of the other ignorant sheep bleating loudly about nothing.
As a tech expert, I'm sure that you find few things to be as annoying as some noob whining about something completely irrelevant. I know that it drives me absolutely bonkers sometimes and I just wish that they'd shut up. Like that dumb lawsuit against AMD about what constitutes a CPU core. Anyone with half a brain and some knowledge in tech knows that a CPU core is an integer core, but here we have idiots using up taxpayer money in court. It's dumb, it's wrong and it's irrelevant but it's nevertheless a thing.
So yes, I know that it doesn't matter and you know that it doesn't matter. I'm sure that filling a 10GB frame buffer with less than 8K textures would be borderline impossible (which is why I'm not the least bit worried about the "only" 8GB of GDDR6 in my RX 5700 XT). However, there are a bunch of people whining about the 10GB frame buffer being too small. I'm just wondering if it would have been wiser for nVidia to just put 12GB on it if only to prevent this very thing from happening. They must have seen it coming.
Then, instead of bleating sheep, we'd have Silence of the Lambs.
I agree with you, but this would be one less reason and that's not a bad thing.

If anything, less-informed customers will either think the 2080 Ti is better because it's more expensive still or the 3080 is better because it's a higher number. Or they think one card is better because it clocks higher than another. Or another card is better because it has more buzzwords on the box.
There are a dozen-plus factors that less-informed customers will use to convince themselves one product is better than another.
Trying to cater to the everyman who at the end of the day will probably buy what the equally ill-informed customer service rep tells them to buy isn't a sound product development strategy.

I agree with you, but this would be one less reason and that's not a bad thing.
I agree, but bigger numbers ALWAYS sound better and it's not like nVidia can't afford it. The most priceless thing a company can buy is consumer goodwill.

Trying to cater to the everyman who at the end of the day will probably buy what the equally ill-informed customer service rep tells them to buy isn't a sound product development strategy.
In this instance, they may not be able to. 12GB of VRAM requires 12 memory channels to be fully functional, and if we go by the 3090's specs, 12 memory channels appears to be the full complement. Requiring all 12 channels would reduce yields and further strain supply, which as we know appears to be pretty low at the moment. Plus, considering the number of GPCs that are deactivated on the 3080, there's a question of whether the GPU would even make the most of it in theory.

I agree, but bigger numbers ALWAYS sound better and it's not like nVidia can't afford it. The most priceless thing a company can buy is consumer goodwill.
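To make the channel math above concrete, here's a trivial sketch. The only hard assumption is the standard 32-bit interface per GDDR6X package; the helper function is just for illustration:

```python
CHANNEL_WIDTH_BITS = 32   # each GDDR6X package sits on its own 32-bit channel

def memory_config(chip_count, gib_per_chip=1):
    """Capacity and bus width move together when you use one chip per channel."""
    return chip_count * gib_per_chip, chip_count * CHANNEL_WIDTH_BITS

print(memory_config(10))  # RTX 3080: (10 GiB, 320-bit bus)
print(memory_config(12))  # full 12-channel GA102 bus: (12 GiB, 384-bit)
# A 12GB card with 1 GiB chips therefore needs dies with all 12 channels
# working -- which is the yield/supply point being made above.
```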
I completely agree with you because you and I are knowledgeable enough to think rationally about it. I don't think that it's worth it either, but that's because we know better.

In this instance, they may not be able to. 12GB of VRAM requires 12 memory channels to be fully functional, and if we go by the 3090's specs, 12 memory channels appears to be the full complement. Requiring all 12 channels would reduce yields and further strain supply, which as we know appears to be pretty low at the moment. Plus, considering the number of GPCs that are deactivated on the 3080, there's a question of whether the GPU would even make the most of it in theory.
Or NVIDIA could just give it 12GB and run into another GTX 970 situation (or more accurately, a GTX 660 Ti situation). I'm sure people would love to have 2GB segmented off. Using 2 GiB chips instead of 1 GiB chips could also be an option, but considering that normal GDDR6 runs about $10 per GiB at volume pricing, and assuming GDDR6X is somewhere between 1.5x and 2.0x more in cost, we could've seen a 3080 with a launch price of more like $850-$1000, and we'd just have Turing all over again.
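Putting rough numbers on that (the per-GiB figures are the estimates above, not official BOM data; the $699 launch MSRP is the only hard number):

```python
MSRP_3080 = 699           # USD, 10GB launch price
GDDR6_PER_GIB = 10        # USD at volume, per the estimate above
G6X_PREMIUM = (1.5, 2.0)  # assumed GDDR6X cost multiplier range

EXTRA_GIB = 10            # 10x 2 GiB chips instead of 10x 1 GiB chips
for mult in G6X_PREMIUM:
    added = EXTRA_GIB * GDDR6_PER_GIB * mult
    print(f"{mult:.1f}x premium: +${added:.0f} of memory -> ~${MSRP_3080 + added:.0f}+ card")
# 1.5x premium: +$150 of memory -> ~$849+ card
# 2.0x premium: +$200 of memory -> ~$899+ card
# ...and that's before any extra margin, which is how you land in the $850-$1000 range.
```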
I don't see this fetish for having more VRAM when there's no dire need for it.
I have to question how hot the cards were running.

Now there's another problem with the RTX 3080 and it has nothing to do with availability. Apparently, when any of the AIB custom models hit 2GHz, the drivers crash. These are nVidia cards, right? LOL
Stop saying "we". Don't drag the rest of us into your silly rant.First , thanks for your efforts and hard work reviewing the card.
Now, to the serious stuff:
1- No 8K benchmarks? COME ON!!! This card should be tested at 8K as well. You tested the GTX 1080 Ti at 4K, and this card is better at 8K than the 1080 Ti was at 4K. I don't care if it shows 30 fps, it should be benchmarked at 8K.
2- Why didn't you include memory usage in each benchmark? VRAM usage should be part of ANY benchmark table at ANY resolution from now on. Add it! Make it min/max memory usage!
3- You are a REVIEW site. You claim the VRAM a game allocates is not the memory it actually needs, and that some of it is just caching. FINE, TEST IT. TESSSST IT. We won't take your word for it, and we won't take "just buy it" advice anymore. It is EASY TO TEST: you have 8GB cards, 10GB cards, 11GB cards. You can find the spot where the game slows DOWN, and YOU CAN TEST HOW MUCH VRAM IS REALLY NEEDED.
4- No, we won't "just stop" and we won't "just buy it". DO YOUR HOMEWORK AND TEST MEMORY USAGE, or we will move to another review site.
5- Funny you did not mention the RTX 3070 Ti with 16GB VRAM that was accidentally leaked in Lenovo documents? And you still say to stop worrying and buy the 10GB VRAM RTX 3080?
Stop saying "we". Don't drag the rest of us into your silly rant.
How about a nice cup of reality?
A) You don't own an 8K monitor, nor will you in the foreseeable future.
B) You don't get to complain about something that is free -- as in this website.
C) Do your OWN homework if you want answers to questions that the rest of us wouldn't waste our breath on.
Everyone LOVES a cocky NOOB!
If this was a requirement of all "review sites", there would be no review sites.

C) I don't care what you want. But this is a review site and should review all possibilities.