News AMD Radeon RX 6800 XT and RX 6800 Review

giorgiog

Distinguished
Jun 6, 2010
3
7
18,515
0
As someone with an EVGA 3080 FTW... who gives a <Mod Edit> about raytracing? I turn it off in CoD CW to get way better framerates at 4k. Raytracing isn't worth it. Yes it's beautiful, but hardly anything more than a marketing gimmick at this point. If you can get a 3080 or a 6800, enjoy your awesome video card!
 
Last edited by a moderator:

dmoros78v

Reputable
Oct 31, 2017
13
10
4,515
0
After watching Digital Foundry's video comparing Watch Dogs Legion's ray tracing on Xbox Series X to the PC version, I already suspected what this review found. Digital Foundry found that an RTX 2060 Super with console ray tracing settings offered the same (and sometimes better) performance as the console version... and that's a "mere" 7.2 TFLOPS GPU, so rasterization performance surely wasn't the culprit; ray tracing performance had to be, and this review just confirmed it. And that was without DLSS! Enable DLSS and the 2060 Super beats the console every time.
 
Reactions: DMAN999
Ray tracing uses random point sampling with denoising between frames to reduce the number of samples necessary. The fewer samples you take, the blurrier the image looks. It looks like AMD is really reducing this sampling in an effort to speed up ray tracing. I think they reduced the size of the bounding boxes as well to reduce collisions. There are light reflections that show up in NVIDIA's RT but fail to show up in AMD's.

One way to test whether they are using temporal AA is to look at scenes with sudden light changes: "afterglows" appear when lights suddenly flip off, causing temporary "ghost images."

The quality is obviously suffering.
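The temporal accumulation described above can be sketched as a per-pixel exponential moving average over frames. This is a hypothetical, heavily simplified model (real denoisers also use motion vectors, variance clamping, etc.), but it shows where the "afterglow" ghosting comes from:

```python
import numpy as np

def temporal_accumulate(history, new_sample, alpha=0.1):
    """Blend this frame's noisy ray-traced sample into the running
    history buffer. A low alpha gives a smoother image but more
    ghosting (the 'afterglow' when a light suddenly turns off)."""
    return (1.0 - alpha) * history + alpha * new_sample

# A light that is on for 50 frames, then switches off:
history = np.zeros(1)
for frame in range(100):
    light = np.ones(1) if frame < 50 else np.zeros(1)
    history = temporal_accumulate(history, light)

# 50 frames after the light went out, the history buffer still
# holds a faint residual 'ghost' of it:
print(history[0] > 0.0)
```

The lower the per-frame sample count, the lower alpha must be to hide the noise, and the longer these ghosts linger.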
 
Last edited:
As someone with an EVGA 3080 FTW... who gives a <Mod Edit> about raytracing? I turn it off in CoD CW to get way better framerates at 4k. Raytracing isn't worth it. Yes it's beautiful, but hardly anything more than a marketing gimmick at this point. If you can get a 3080 or a 6800, enjoy your awesome video card!
Plenty of people do, including Sony and Microsoft, who both promote it as the "next big thing" on their consoles.

Not all games are trigger-happy twitch fests. Some are more experiential, where quality matters.
 
Last edited:
Reactions: dmoros78v

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
617
510
1,270
1
Ray Tracing uses a random point sampling denoising feature between frames to reduce the number of samples necessary. The less samples you take, the blurrier the image looks. Looks like AMD is really reducing this sampling in an effort to speed up ray tracing. The quality is obviously suffering.
This should be up to the game dev, though, not the GPU drivers or whatever. DXR generally says, "trace these rays" and expects an answer. Not "trace some of these rays and give me your best guess."
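The point about DXR's contract can be illustrated with a toy, non-DXR ray/sphere intersection in Python (hypothetical code; real DXR is a GPU API, but the idea is the same: tracing a ray is a deterministic query, not an estimate the driver may skip):

```python
import math

def trace_ray(origin, direction, sphere_center, sphere_radius):
    """Deterministic ray/sphere intersection: for a given ray the API
    must answer 'hit at distance t' or 'miss' -- there is no room for
    the implementation to trace only some rays and guess the rest."""
    ox, oy, oz = (origin[i] - sphere_center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx*dx + dy*dy + dz*dz
    b = 2.0 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - sphere_radius**2
    disc = b*b - 4*a*c          # discriminant of the quadratic
    if disc < 0:
        return None             # miss
    t = (-b - math.sqrt(disc)) / (2*a)  # nearest intersection distance
    return t if t >= 0 else None

# Ray down the z axis toward a unit sphere centered at z=5: hits at t=4
print(trace_ray((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))
```

Where sampling and denoising happen is a choice the game's renderer makes around calls like this, not inside them.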
 

dmoros78v

Reputable
Oct 31, 2017
13
10
4,515
0
As someone with an EVGA 3080 FTW... who gives a <Mod Edit> about raytracing? I turn it off in CoD CW to get way better framerates at 4k. Raytracing isn't worth it. Yes it's beautiful, but hardly anything more than a marketing gimmick at this point. If you can get a 3080 or a 6800, enjoy your awesome video card!
That's the thing, that's your opinion, which is mostly driven by what you value more (fps), but not all people think the same. For CoD or any other first-person shooter I agree, but for other games like Control, Watch Dogs, Tomb Raider, or mostly "cinematic" third-person games, I will prefer eye candy over fps every time. I'm more than happy with 30 fps in those types of games with maximum eye candy, also because 30 fps looks more "cinematic" (do you remember all the criticism Peter Jackson got for shooting The Hobbit at 48 fps, and how it looked like a "soap opera" rather than a big-budget movie because of it? Well, I think most people agree, as the experiment has not been repeated and we are still watching movies at 24 fps).
 
Reactions: Jim90
As someone with an EVGA 3080 FTW... who gives a <Mod Edit> about raytracing? I turn it off in CoD CW to get way better framerates at 4k. Raytracing isn't worth it. Yes it's beautiful, but hardly anything more than a marketing gimmick at this point. If you can get a 3080 or a 6800, enjoy your awesome video card!
At 1440p 144Hz using a 3080, I like having RT on in COD MW while getting over 120-130 fps, often over 140 fps depending on the map. If you're really bothered about very high fps, why would you choose to game at 4K?
 
Reactions: Makaveli and RodroX

Phaaze88

Glorious
Ambassador
Ohh SNAP!
A) Nvidia's FP32 compute changes on RTX 30 gave AMD an edge at 1080p... but I'm still on the fence about spending that kind of money on that puny little resolution.
Sorry to 1080p gamers, but that's not a cost-effective investment.

B) AMD also has longer-lived driver support - sans Radeon VII, coughcough - compared to Nvidia.
So for people who tend to hold on to GPUs for a while, the RX 6800/XT should age a bit better than the 3080.

C) Well, look at that... AMD isn't the power hog here either, LOL.

D) A shame about their reference cooler, though.

E) The ray tracing thing I can live without on both ends. Higher resolution > RT, from my perspective.
 
Last edited:
Reactions: Makaveli and King_V

RodroX

Estimable
I'm guessing there's still a lot of work to do on the developer side, and probably on the driver side. But it looks pretty impressive compared to the RX 5000 series.

I think RT is worth it; on the other hand, I can live without it, for now.

Time will tell if AMD can tune the software to deliver at least the same quality as Nvidia, but if performance remains close to what this review shows, then I guess we will have to wait for the RX 7000 series; maybe then RT on AMD will be worth our time and money.
 
Reactions: Makaveli

Vladimir Iliev

Reputable
Dec 21, 2015
7
1
4,515
0
Who cares about ray tracing when on Nvidia it is STILL UNPLAYABLE other than for showing off to a friend? It sucks so much that I can't even believe there are people saying they actually use it for real gaming. Getting like 50-70 fps sounds to me like going 10 years back to a low-end machine. All monitors now are 144Hz+, and I personally hate dropping below 100 fps in any game at 2K.
Also, I'm really happy that AMD is finally on par with Nvidia in rasterization, which is the only thing that matters for real gamers currently, and finally there is some real choice. We already saw that new competition in action with the pricing of the 3000 series. I might switch to AMD for my next GPU.
 
Reactions: Makaveli

mac_angel

Distinguished
Mar 12, 2008
298
8
18,785
0
Still reading the article, but something I have been curious about for a long time: why not offer ray tracing as an add-in card? It seems mostly compute-based, and SLI/Crossfire wouldn't be needed. Would PCIe bandwidth be a problem?
 

Phaaze88

Glorious
Ambassador
Who cares about ray tracing when on Nvidia it is STILL UNPLAYABLE other than for showing off to a friend? It sucks so much that I can't even believe there are people saying they actually use it for real gaming. Getting like 50-70 fps sounds to me like going 10 years back to a low-end machine. All monitors now are 144Hz+, and I personally hate dropping below 100 fps in any game at 2K.
Also, I'm really happy that AMD is finally on par with Nvidia in rasterization, which is the only thing that matters for real gamers currently, and finally there is some real choice. We already saw that new competition in action with the pricing of the 3000 series. I might switch to AMD for my next GPU.
No offense intended, but your perspective on what playable framerates are, and your designation of 'real gaming/gamers', is narrow.
 
Jun 20, 2020
18
7
15
0
I own 200 games and ZERO of them have ray tracing.

I want ray tracing - but not until it's hardware agnostic and implemented in everything. I'm not buying an early generation of proprietary tech that only works in 10 or 15 games total. That's silly. Keep working on it, guys, and maybe the next or next-next gen will be worth considering ray tracing ability as a prime consideration.
 
Reactions: artk2219
With games ever increasing in complexity, the 128MB of Infinity Cache will age quickly. It's showing its limits at 4K now.

Anyone want to bet that in two years the 6800 XT will run slower than the 3080 at 1440p? It's a lot like Fury: it will age poorly due to limited (fast) memory.

I'm willing to bet donation money to a favorite charity.
 

dmoros78v

Reputable
Oct 31, 2017
13
10
4,515
0
I own 200 games and ZERO of them have ray tracing.

I want ray tracing - but not until it's hardware agnostic and implemented in everything. I'm not buying an early generation of proprietary tech that only works in 10 or 15 games total. That's silly. Keep working on it, guys, and maybe the next or next-next gen will be worth considering ray tracing ability as a prime consideration.
Well, unless you plan on staying on the GTX 1000 or Radeon 5000 series, it seems like you are out of luck, aren't you?

Besides, ray tracing IS hardware agnostic; it's part of DX12, and with the arrival of these new cards you can activate it on either NVIDIA or AMD. If it weren't hardware agnostic, you would not be seeing this review run ray tracing in Control, Tomb Raider, etc. on an AMD card (unlike DLSS, which is NVIDIA proprietary tech). Technically you can even activate it WITHOUT a ray tracing accelerated card, if you want to watch a slide show. Brings back memories of the first 3D accelerators from Rendition and 3dfx... good ol' days.

Finally, with ray tracing now on both major consoles, you can bet it will be implemented on almost every major game from now on.
 

in_the_loop

Distinguished
Dec 15, 2007
135
6
18,685
0
Who cares about ray tracing when on Nvidia it is STILL UNPLAYABLE other than for showing off to a friend? It sucks so much that I can't even believe there are people saying they actually use it for real gaming. Getting like 50-70 fps sounds to me like going 10 years back to a low-end machine. All monitors now are 144Hz+, and I personally hate dropping below 100 fps in any game at 2K.
Also, I'm really happy that AMD is finally on par with Nvidia in rasterization, which is the only thing that matters for real gamers currently, and finally there is some real choice. We already saw that new competition in action with the pricing of the 3000 series. I might switch to AMD for my next GPU.
60 Hz is more than enough for many people (not all).
Not all monitors are 144Hz+, and lots and lots of people, including me, play on 55-inch TVs at 4K and 60Hz. I can even play at 120Hz (1080p) natively (not interlaced) if I want to, but I don't.
I did a test some time ago, and the highest framerate at which I could see a difference was about 47 fps; beyond that, no difference for me.
So for me, at 60 Hz there is no difference from higher refresh rates.
That is not to say that some people aren't more sensitive to it. And even if you are not sensitive to it, you may still benefit from higher refresh rates in competitive online gaming, where lag and latency mean everything. But I never play multiplayer games! And 4K on a 55-inch TV at max settings beats lower-res 27-inch monitors at 144Hz big time for me.
 

Gurg

Distinguished
Mar 13, 2013
450
48
18,820
1
A couple of takeaways:
At $299 at Microcenter, the 9900K seems like a great CPU for gaming when paired with a top card at higher resolutions.
All the talk about AMD's Smart Access Memory seems overblown, as the RX 6800 XT combined with a 5900X CPU still underperforms the same RX 6800 XT with a 9900K or 10900K at higher resolutions in gaming.

And that's despite "the slightly slower RAM might be a bit of a handicap on the Intel PCs."

Hopefully availability issues will be resolved for AMD and Nvidia, with closer performance differentials in both CPUs and GPUs eventually leading to lower prices for consumers.
 
Last edited:
Nov 18, 2020
1
0
10
0
What I want to know, which wasn't covered, is the performance impact on both sides when OBS is offloading video encoding to the GPU while you play, aka the streaming setup.
 

Specter0420

Distinguished
Apr 8, 2010
85
5
18,635
0
So Microsoft Flight Simulator has already fallen off your benchmark list? That was on there for what, one CPU generation and half a GPU generation? Is this a joke, Jarred, or did I just miss it while skimming the article?
 
Jul 1, 2020
4
0
10
0
Who cares about ray tracing when on Nvidia it is STILL UNPLAYABLE other than for showing off to a friend? It sucks so much that I can't even believe there are people saying they actually use it for real gaming. Getting like 50-70 fps sounds to me like going 10 years back to a low-end machine. All monitors now are 144Hz+, and I personally hate dropping below 100 fps in any game at 2K.
Also, I'm really happy that AMD is finally on par with Nvidia in rasterization, which is the only thing that matters for real gamers currently, and finally there is some real choice. We already saw that new competition in action with the pricing of the 3000 series. I might switch to AMD for my next GPU.
Anyone who's used to a really good display (like NEC's PA series, or even EA models) would never buy anything else, for gaming or anything else. I'm not fond of multi-display setups, and I'm entirely happy with 60 fps and all that eye candy.
 
