News Nvidia GeForce RTX 3080 Founders Edition Review: A Huge Generational Leap in Performance

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
First, thanks for your efforts and hard work reviewing the card.

Now to the serious stuff:

1. No 8K benchmarks? COME ON!!! This card should be tested at 8K as well. You tested the GTX 1080 Ti at 4K, and this card handles 8K better than the 1080 Ti handled 4K. I don't care if it only shows 30 fps, it should be benchmarked at 8K.

2. Why didn't you include memory usage in each benchmark? VRAM usage should be part of EVERY benchmark table at EVERY resolution from now on. Add it! Make it min/max memory usage!

3. You are a REVIEW site. You claim the VRAM in use isn't the memory actually needed and that some of it is caching. Fine, TEST IT. We won't take your word for it, and we won't take "just buy it" advice anymore. It is easy to test: you have 8GB cards, 10GB cards, and 11GB cards, so you can find the spot where a game slows down and measure how much VRAM is really needed.

4.
If you're worried about 10GB of memory not being enough, my advice is to just stop.

No, we won't "just stop", and we won't "just buy it".

DO YOUR HOMEWORK AND TEST MEMORY USAGE, or we will move to another review site.

5. Funny that you didn't mention the 16GB RTX 3070 Ti accidentally leaked in Lenovo documents, yet you still tell us to stop worrying and buy the 10GB RTX 3080?
 
Last edited:
  • Like
Reactions: Dean0919 and LB23
Aug 16, 2020
21
6
15
What about temperatures? Is the fan enough for running those games long term? I take it the fan was enough for the benchmarks themselves.
 

King_V

Illustrious
Ambassador
Ok, I will admit, I generally prefer AMD to Nvidia. I, like many people, was disappointed in Nvidia's pricing with the Turing (edit, because I clearly can't keep the names straight) cards, given the poor ratio of price increase to performance increase.

This, however, is impressive, both in pricing and in the increase in performance. This is Nvidia doing it right.

The engineers at AMD certainly have their work cut out for them. If they can't keep up, their saving grace is that they've offered much better bang for the buck and have generally been stronger in the mainstream, high-volume market, with some exceptions (5500 XT, I'm glaring angrily at you!).
 
Last edited:

cknobman

Distinguished
May 2, 2006
1,117
263
19,660
This is very impressive in terms of price/performance.

But honestly, given the massive power draw and die size, it's not as impressive as I thought it would be.
Nvidia just made everything bigger and hungrier.
 

mikepellegrini

Commendable
Apr 27, 2020
9
5
1,515
In terms of a CPU bottleneck, how's a socket 2066 processor likely to stack up? I've got a Core i7-7820X (8 cores) currently running at stock speeds. I'm heavily lusting after Microsoft Flight Simulator running at 4K ultra with an RTX 3080. Am I going to be CPU limited?

I delidded my CPU and had it running at 4.7 GHz, but I returned to stock settings because in current games (e.g., RDR2) my CPU hardly ever gets above about 10% load. It seemed pointless to run an OC, but if MFS needs more CPU, I can redo it.

Just wondering what I'm in for. Nobody runs benchmarks with socket 2066 systems.


Asus ROG Rampage VI Apex
Core i7-7820X delidded - Thermal Grizzly Conductonaut TIM
NZXT Kraken X62
CORSAIR Vengeance LPX 32GB (4x8GB) DDR4 3000MHz (PC4-24000) C15
Samsung 960 PRO Series - 512GB PCIe NVMe - M.2 Internal SSD (MZ-V6P512BW)
Toshiba 3.5-Inch 2TB 7200 RPM SATA3/SATA 6.0 GB/s 64MB Hard Drive DT01ACA200
Evga GTX 980
SilverStone Technology Strider 1000W 80 Plus Platinum Modular PSU 1000 Power Supply (PS-ST1000-PT)
Phanteks Enthoo EVOLV ATX Mid Tower Chassis, Black Cases PH-ES515E_BK
 
2. Why didn't you include memory usage in each benchmark? VRAM usage should be part of EVERY benchmark table at EVERY resolution from now on. Add it! Make it min/max memory usage!
3. You are a REVIEW site. You claim the VRAM in use isn't the memory actually needed and that some of it is caching. Fine, TEST IT. We won't take your word for it, and we won't take "just buy it" advice anymore. It is easy to test: you have 8GB cards, 10GB cards, and 11GB cards, so you can find the spot where a game slows down and measure how much VRAM is really needed.

4.

No, we won't "just stop", and we won't "just buy it".

DO YOUR HOMEWORK AND TEST MEMORY USAGE, or we will move to another review site.
Because it's moot and only useful as a data point for people to misinterpret.

Using something like GPU-Z or MSI Afterburner isn't a good idea because they report VRAM usage in its entirety, meaning the game plus whatever everything else is using. Windows already uses a good 500MB on my system; it could be different for anyone else, and it can change over time. I know of a method to gather an individual app's VRAM usage that involves PerfMon (a built-in Windows tool), but it's a pain to collect: the tool records VRAM usage by PID, which changes every time the game is launched.
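For anyone who'd rather script that than click through PerfMon, here's a minimal sketch of the same idea. It assumes Windows 10's "GPU Process Memory" counter set and the built-in typeperf tool (which reads the same counters PerfMon exposes); the function name and the CSV handling are my own:

```python
# Minimal sketch: sum one process's dedicated VRAM by sampling the
# "GPU Process Memory" performance counters once via typeperf.
# Counter instance names embed the PID (e.g. "pid_1234_luid_..."),
# so we filter on that instead of recording every process.
import csv
import io
import subprocess
import sys

def dedicated_vram_bytes(pid: int) -> int:
    out = subprocess.run(
        ["typeperf", r"\GPU Process Memory(*)\Dedicated Usage", "-sc", "1"],
        capture_output=True, text=True, check=True,
    ).stdout
    # typeperf emits CSV: row 0 = counter paths, row 1 = sampled values.
    # Skip blank lines and status messages, which have a single column.
    rows = [r for r in csv.reader(io.StringIO(out)) if len(r) > 1]
    header, sample = rows[0], rows[1]
    total = 0
    for path, value in zip(header[1:], sample[1:]):  # column 0 is the timestamp
        if f"pid_{pid}_" in path and value:
            total += int(float(value))
    return total

if __name__ == "__main__":
    pid = int(sys.argv[1])
    print(f"PID {pid}: {dedicated_vram_bytes(pid) / 2**20:.1f} MiB dedicated")
```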

In the end, though, it's as Jarred said: an app may have more VRAM allocated to it than necessary but not actually use it. Various games already fill up VRAM yet suffer no real performance degradation for it (FFXV and Call of Duty come to mind). And I'm almost certain that not everything in VRAM is actually necessary to render a frame.

Although if @JarredWaltonGPU is interested, I did find an app that tries to allocate a bunch of VRAM and keep it resident, in case he wants to go down this rabbit hole.
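That balloon approach is simple enough to sketch yourself. Assuming an Nvidia card and the CuPy library (my choice here, not the app the poster found), something like this reserves a fixed chunk of VRAM and holds it while a game runs:

```python
# Hypothetical "VRAM balloon" sketch: grab a fixed amount of GPU memory
# and hold it while a game runs, shrinking what's left for the game.
# Assumes an Nvidia card and CuPy installed (pip install cupy-cuda11x).
import sys
import cupy as cp

gib = int(sys.argv[1]) if len(sys.argv) > 1 else 4

# cp.zeros both allocates and writes the memory, so the driver actually
# commits it rather than lazily reserving address space.
balloon = cp.zeros(gib << 30, dtype=cp.uint8)

print(f"Holding {gib} GiB of VRAM; press Enter to release...")
input()
del balloon  # returned to CuPy's pool; fully freed when the process exits
```

Run it with progressively larger sizes while watching frame times; the point where the game starts stuttering approximates its real working set rather than just its allocation.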
 
  • Like
Reactions: Shadowclash10

spongiemaster

Admirable
Dec 12, 2019
2,273
1,277
7,560
What about temperatures? Is the fan enough for running those games long term? I take it the fan was enough for the benchmarks themselves.
JayzTwoCents, using the stock fan curve, never saw the temperature go above 72C during testing on an open test bench. When he used the default user-defined fan curve in Afterburner, temps peaked at only 61C.
 
  • Like
Reactions: Olivier_00

JarredWaltonGPU

Senior GPU Editor
Editor
In terms of a CPU bottleneck, how's a socket 2066 processor likely to stack up? I've got a Core i7-7820X (8 cores) currently running at stock speeds. I'm heavily lusting after Microsoft Flight Simulator running at 4K ultra with an RTX 3080. Am I going to be CPU limited?

I delidded my CPU and had it running at 4.7 GHz, but I returned to stock settings because in current games (e.g., RDR2) my CPU hardly ever gets above about 10% load. It seemed pointless to run an OC, but if MFS needs more CPU, I can redo it.

Just wondering what I'm in for. Nobody runs benchmarks with socket 2066 systems.

Asus ROG Rampage VI Apex
Core i7-7820X delidded - Thermal Grizzly Conductonaut TIM
NZXT Kraken X62
CORSAIR Vengeance LPX 32GB (4x8GB) DDR4 3000MHz (PC4-24000) C15
Samsung 960 PRO Series - 512GB PCIe NVMe - M.2 Internal SSD (MZ-V6P512BW)
Toshiba 3.5-Inch 2TB 7200 RPM SATA3/SATA 6.0 GB/s 64MB Hard Drive DT01ACA200
Evga GTX 980
SilverStone Technology Strider 1000W 80 Plus Platinum Modular PSU 1000 Power Supply (PS-ST1000-PT)
Phanteks Enthoo EVOLV ATX Mid Tower Chassis, Black Cases PH-ES515E_BK
There's a reason we avoid 2066... ;)

But seriously, Intel's Skylake-X and Cascade Lake-X CPUs are fine and generally land between Coffee Lake and Ryzen 3000 in gaming performance. With a 4K monitor, you're not going to lose much performance -- see the RTX 3080 CPU scaling article.
 
  • Like
Reactions: mikepellegrini

JarredWaltonGPU

Senior GPU Editor
Editor
Basically nothing about all the new HDMI 2.1 features?! I've been dying to hear about how these work for over a year.
There's not a ton to say about HDMI 2.1. It's here, the card has it, it supports 8K60 HDR. I don't have any displays that support HDMI 2.1, unfortunately, but I'm not sure what more you want to know. Does the HDMI 2.1 connectivity work? It better! :)
 

tummybunny

Reputable
Apr 24, 2020
33
31
4,560
Are there any 2.1 monitors on the market yet?

No. I've copied the list below from elsewhere.

| Brand | Model | Size | Refresh (Hz) | Panel | Nits | HDR | Availability |
|---|---|---|---|---|---|---|---|
| Acer | XV282K KV | 28" | 144 | IPS | 600 | Yes | January 2021 in China |
| Asus | - | 27"/32"/43" | 120 | - | - | - | Holidays |
| Eve | Spectrum | 27" | 144 | Nano IPS | 600 | Yes | - |
| Philips | 328M1R | 32" | 120 | VA | 600 | Yes | Early 2021 |
| ViewSonic | Elite XG320U | 32" | 144 | | | | |
 

tummybunny

Reputable
Apr 24, 2020
33
31
4,560
There's not a ton to say about HDMI 2.1. It's here, the card has it, it supports 8K60 HDR. I don't have any displays that support HDMI 2.1, unfortunately, but I'm not sure what more you want to know. Does the HDMI 2.1 connectivity work? It better! :)

Thanks for the interest and the review!

What is 4K 120Hz 4:4:4 gaming like? It's never been possible before.

What is Variable Refresh Rate like? It's brand new.

What's Auto Low Latency Mode like? Also brand new.

Is Quick Media Switching any good?

What's Quick Frame Transport like?

To be fair, there aren't many HDMI 2.1 displays available yet, but I own one and am super interested in what it can do! This GPU is the first opportunity to find out.
 
  • Like
Reactions: saunupe1911

saunupe1911

Distinguished
Apr 17, 2016
203
74
18,660
There's not a ton to say about HDMI 2.1. It's here, the card has it, it supports 8K60 HDR. I don't have any displays that support HDMI 2.1, unfortunately, but I'm not sure what more you want to know. Does the HDMI 2.1 connectivity work? It better! :)

You may not be familiar with the AV world, but people are itching to see how these cards perform on the new 2020 HDTVs that have HDMI 2.1 and various VRR features. The LG OLEDs even have G-Sync, so a lot of people want to know how these cards pair with those TVs. The LG OLEDs, Sony 900H, Samsung Q80T/Q90T, and the Vizio 2020 Quantum series are the sets where people are itching to see VRR, 4K 120Hz, and even 4K full RGB at 60Hz.
 
  • Like
Reactions: mac_angel

JarredWaltonGPU

Senior GPU Editor
Editor
Thanks for the interest and the review!

What is 4K 120Hz 4:4:4 gaming like? It's never been possible before.

What is Variable Refresh Rate like? It's brand new.

What's Auto Low Latency Mode like? Also brand new.

Is Quick Media Switching any good?

What's Quick Frame Transport like?

To be fair, there aren't many HDMI 2.1 displays available yet, but I own one and am super interested in what it can do! This GPU is the first opportunity to find out.
Most of these are display questions, not GPU questions. 4K 98Hz was possible before at 24-bit color, which is going to be nearly the same as 120Hz with VRR, because even with a 3080 a lot of games won't reach more than 98 fps. The rest obviously requires that I have an HDMI 2.1 display or TV, which I don't have and don't plan on purchasing. Hopefully our display reviewer can test some of the new HDMI 2.1 displays when they become available.
 

Phaaze88

Titan
Ambassador
Impressive indeed. Looks like Jensen was telling the truth to us 1080 Ti owners :ROFLMAO:
They still lied about 2x performance, whatever that was supposed to mean.

I watched the Gamers Nexus video, and at 1440p I'm looking at a probable 70%-ish uplift on average - across their sample of games, anyway.
Also, what I gathered from that video: the 3080 is dumb for 1080p. People are going to do it anyway and complain about fps, only to find that one or more CPU threads are maxed out... :pfff:

I also watched the other Steve's video, and his testing landed closer to 60% overall compared to the 1080 Ti - still impressive nonetheless.

I'm in no rush to grab one - there's still the RX 6000 series, plus I have to wait for Alphacool's cooling solutions for these cards anyway.
 

JarredWaltonGPU

Senior GPU Editor
Editor
You may not be familiar with the AV world, but people are itching to see how these cards perform on the new 2020 HDTVs that have HDMI 2.1 and various VRR features. The LG OLEDs even have G-Sync, so a lot of people want to know how these cards pair with those TVs. The LG OLEDs, Sony 900H, Samsung Q80T/Q90T, and the Vizio 2020 Quantum series are the sets where people are itching to see VRR, 4K 120Hz, and even 4K full RGB at 60Hz.
Yeah, I get that ... but that's less a GPU review topic and more a display review topic. The experience with all of these things is going to vary quite a bit depending on what display you're using.
 
  • Like
Reactions: ddferrari