News This Is What AMD's Radeon RX 6000 "Big Navi" Looks Like

Given the size of the card, a narrow 256-bit memory bus seems unlikely; 16GB over a 512-bit bus makes more sense. That's consistent with the Xbox's 10GB over a 320-bit bus. I suspect AMD's RT implementation is much more memory-intensive than Nvidia's.
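For a rough sense of scale, peak GDDR6 bandwidth grows linearly with bus width. A quick back-of-the-envelope sketch (the 14 Gbps per-pin rate is an assumption for illustration only; Big Navi's actual memory spec hadn't been announced):

```python
# Peak bandwidth (GB/s) = (bus width in bits / 8 bits per byte) * per-pin rate (Gbps).
# 14 Gbps GDDR6 is assumed purely for illustration.
def gddr6_bandwidth(bus_bits, pin_gbps=14):
    return bus_bits / 8 * pin_gbps

for bus in (256, 320, 512):
    print(f"{bus}-bit @ 14 Gbps -> {gddr6_bandwidth(bus):.0f} GB/s")
# 256-bit -> 448 GB/s, 320-bit -> 560 GB/s, 512-bit -> 896 GB/s
```

Note that 320-bit at 14 Gbps works out to 560 GB/s, which matches the published figure for the Series X's fast 10GB pool.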
 
Four fans? Sounds like amateur hour. I'm waiting for a graphics card to be released where you'd need an HVAC technician to install it for you.
If you can make do with only an HVAC tech, you are still in the amateur league!

Big-boy HVAC jobs require structural engineers, electrical engineers, construction crews if the engineers conclude that significant site modifications are required, a crane crew or two to safely get the several-ton unit(s) on the roof, etc. :)
 
fellas, do you know if the 5700 XT has ray tracing like Nvidia or not? Can it be enabled in Crimson?
No, sorry. The current, released Navi cards don't have ray tracing, and AFAIK AMD never said they'll get it down the line - the hardware simply isn't baked in. However, there's a possibility they'll get ray tracing like the GTX 10 series did - mostly to show that older cards just can't ray trace. Now, Big Navi, such as this card, should have ray tracing.

Do note that the RX 5700 XT is a very good card, though - above a 2060 Super and a bit below a 2070 Super. Trust me, ray tracing doesn't really matter - and that's coming from a 2060 Super owner. Really, the only game where ray tracing truly shines and looks awesome is Minecraft. The problem is: Minecraft is one game, Minecraft ray tracing kills your framerates, and only a few prebuilt maps actually work with ray tracing.

However, if you're asking about it as a purchase, I wouldn't buy a mid-tier GPU right now, as long as you're fine waiting a few months. Or just buy a ridiculously cheap 5700 XT/2060S/2070/2070S/2080/2080S/2080 Ti - everyone wants those off their hands because of Ampere.
 
The 5700 XT Red Devil is a fast card, but the drivers suck. On my new system with a 3700X, the GPU drivers crash 10 times a day. I don't trust AMD. I'm going with a 3080 even if it costs more.

Might it be your power supply?

The power delivery on most manufacturers' non-reference designs was substandard compared to AMD's reference design. That made those cards a lot more sensitive to transients in the power delivery, so a quality power supply was a must.
 
If you can make do with only an HVAC tech, you are still in the amateur league!

Big-boy HVAC jobs require structural engineers, electrical engineers, construction crews if the engineers conclude that significant site modifications are required, a crane crew or two to safely get the several-ton unit(s) on the roof, etc. :)

A single copper coil (radiator) to cool a white room can weigh several tons by itself.
 
fellas, do you know if the 5700 XT has ray tracing like Nvidia or not? Can it be enabled in Crimson?

There are things you need to understand.

  1. There's old-school ray tracing (computationally expensive, like in movies).
  2. There's software-emulated ray tracing (very fast approximations, like Sonic Ether's SEUS implementation).
  3. There's Microsoft's ray tracing (DXR, which is an OPEN standard like DX).
  4. There's NVIDIA's RTX ray tracing (which is closed and proprietary, like PhysX).
NVIDIA's ray tracing is currently the fastest and is found in most AAA titles. But it has drawbacks. Mainly, it's proprietary, which makes support among game makers questionable. And in AAA titles it's SAF (Slow As Freak); only "Control" seemed to do okay with most options enabled. The 20 series is simply underpowered as a ray tracing engine for AAA games.
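For anyone wondering why ray tracing is so heavy in the first place: every ray has to be tested against scene geometry, and that intersection math is repeated millions of times per frame. A minimal ray/sphere test as a toy sketch (not any engine's actual code):

```python
import math

# One ray/sphere intersection test: the innermost operation a ray tracer repeats
# millions of times per frame. Returns the distance to the nearest hit, or None.
# The ray direction is assumed to be normalized.
def ray_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    oc = [origin[i] - center[i] for i in range(3)]
    b = 2 * sum(oc[i] * direction[i] for i in range(3))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c  # since |direction| == 1, the quadratic's 'a' term is 1
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

# A ray fired down -z from the origin hits a unit sphere centered 5 units away:
print(ray_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1))  # -> 4.0
```

Now multiply that by one or more rays per pixel, per bounce, against thousands of objects, and it's clear why dedicated acceleration hardware matters.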

However, I run ray tracing on my RX 5700 XT in Minecraft using SEUS, and I get a really good 50-60 fps at 1080p.

Microsoft recently released DX12 Ultimate with DXR (DirectX Raytracing extensions), which is GPU-agnostic. This means any developer can code once and not worry so much about architecture (NVIDIA/Intel/AMD). These extensions will be used by the next-gen Xbox, will port over to desktop PCs well, and will likely win out.

That said, Microsoft's DXR will not run well (if at all) on the RX 5000 series. The RX 6000 series is supposed to support it, but we won't find out until cards reach reviewers' hands.

In the end, I believe DXR will win out over NVIDIA's proprietary implementation. (Much like how FreeSync/Adaptive-Sync won out over G-Sync; open standards always appeal to a wider audience.) And only DXR will run on the Xbox Series X. The PS5 uses an AMD chipset, so I imagine there will be equivalent extensions there as well, much as Vulkan exists alongside DX. So PC ports that support DXR and RT extensions will be easier on all hardware.

Right now, ray tracing performance on the RX 6000 series is a big unknown. I've heard rumors ranging from slightly faster than 20-series Turing to 1.5x as fast as Turing. As always, WAIT FOR THE REVIEWS if it's an important feature to you.

The advantage NVIDIA has is DLSS 2.0. NVIDIA renders the newest games at 4x the resolution you would expect and applies artificial intelligence to them to perform pattern matching. The result is downloaded and applied via the tensor cores on the 20 and 30 series to fill in missing pixels without having to do the computationally expensive rendering work. This lets you run considerably faster, because you can render at a lower resolution and let the AI upsample to 4K. That said, this technology depends on NVIDIA supporting it on their servers, so it's per-game dependent. AI training is expensive, so don't expect them to do it for every game.

AMD has their own upsampling based on an open standard, and it works on all games, but it's generally regarded as inferior to NVIDIA's DLSS 2.0 in terms of detail. I believe it uses a form of bilinear or bicubic interpolation with some temporal AA thrown in, but I may be mistaken on that.
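To illustrate what interpolation-based upscaling means, here's a generic bilinear filter. To be clear, this is for illustration only and is not AMD's actual algorithm (which isn't documented in this thread); it just shows the contrast with a learned model like DLSS:

```python
# Generic bilinear upsampling on a grayscale image (a list of rows of values).
# NOT AMD's actual upscaler -- just an illustration of interpolation-based
# upscaling, as contrasted with an AI-trained approach like DLSS.
def bilinear_upscale(img, new_h, new_w):
    h, w = len(img), len(img[0])
    out = []
    for y in range(new_h):
        # Map each output pixel back to a fractional source coordinate.
        sy = y * (h - 1) / (new_h - 1)
        y0, fy = int(sy), sy - int(sy)
        y1 = min(y0 + 1, h - 1)
        row = []
        for x in range(new_w):
            sx = x * (w - 1) / (new_w - 1)
            x0, fx = int(sx), sx - int(sx)
            x1 = min(x0 + 1, w - 1)
            # Blend the four surrounding source pixels by distance.
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

small = [[0, 100], [100, 200]]
print(bilinear_upscale(small, 3, 3))  # center pixel interpolates to 100.0
```

Interpolation like this can only blend existing pixels; it can't invent detail, which is why a learned upsampler can look sharper.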
 
No, sorry. The current, released Navi cards don't have ray tracing, and AFAIK AMD never said they'll get it down the line - the hardware simply isn't baked in. However, there's a possibility they'll get ray tracing like the GTX 10 series did - mostly to show that older cards just can't ray trace. Now, Big Navi, such as this card, should have ray tracing.
The thing that bugged me is that AMD could support DXR (and possibly Vulkan's equivalent) through a fallback layer. Some old posts lead me to believe that at some point AMD did support the fallback layer, or at least that someone managed to hack it in - http://boostclock.com/show/000198/DXR-fallback-layer-perf-preview.html is one such example. There was another I can't seem to find, where someone tested a Radeon VII with Microsoft's DXR samples and found performance severely lacking. Outside of possible driver support, my guess is that DXR strongly prefers the graphics pipeline over the compute one.

Either way, there's a lot of compute power AMD GPUs could leverage with "software-based" ray tracing, but I guess it's moot to even try to convince AMD to add it in.

At least it's not as bad as AMD not bothering with DX11 deferred contexts.

There are things you need to understand.

  1. There's old-school ray tracing (computationally expensive, like in movies).
  2. There's software-emulated ray tracing (very fast approximations, like Sonic Ether's SEUS implementation).
  3. There's Microsoft's ray tracing (DXR, which is an OPEN standard like DX).
  4. There's NVIDIA's RTX ray tracing (which is closed and proprietary, like PhysX).
NVIDIA's ray tracing is currently the fastest and is found in most AAA titles. But it has drawbacks. Mainly, it's proprietary, which makes support among game makers questionable. And in AAA titles it's SAF (Slow As Freak); only "Control" seemed to do okay with most options enabled. The 20 series is simply underpowered as a ray tracing engine for AAA games.

However, I run ray tracing on my RX 5700 XT in Minecraft using SEUS, and I get a really good 50-60 fps at 1080p.

Microsoft recently released DX12 Ultimate with DXR (DirectX Raytracing extensions), which is GPU-agnostic. This means any developer can code once and not worry so much about architecture (NVIDIA/Intel/AMD). These extensions will be used by the next-gen Xbox, will port over to desktop PCs well, and will likely win out.

That said, Microsoft's DXR will not run well (if at all) on the RX 5000 series. The RX 6000 series is supposed to support it, but we won't find out until cards reach reviewers' hands.

In the end, I believe DXR will win out over NVIDIA's proprietary implementation. (Much like how FreeSync/Adaptive-Sync won out over G-Sync; open standards always appeal to a wider audience.)

Right now, ray tracing performance on the RX 6000 series is a big unknown. I've heard rumors ranging from slightly faster than 20-series Turing to 1.5x as fast as Turing. As always, WAIT FOR THE REVIEWS if it's an important feature to you.
NVIDIA's proprietary ray tracing API is OptiX. As far as I know, no game that does path tracing uses this. They all use DXR or Vulkan.

If there's anything proprietary, you must be thinking of DLSS.
 
The thing that bugged me is that AMD could support DXR (and possibly Vulkan's equivalent) through a fallback layer. Some old posts lead me to believe that at some point AMD did support the fallback layer, or at least that someone managed to hack it in - http://boostclock.com/show/000198/DXR-fallback-layer-perf-preview.html is one such example. There was another I can't seem to find, where someone tested a Radeon VII with Microsoft's DXR samples and found performance severely lacking. Outside of possible driver support, my guess is that DXR strongly prefers the graphics pipeline over the compute one.

Either way, there's a lot of compute power AMD GPUs could leverage with "software-based" ray tracing, but I guess it's moot to even try to convince AMD to add it in.

At least it's not as bad as AMD not bothering with DX11 deferred contexts.


NVIDIA's proprietary ray tracing API is OptiX. As far as I know, no game that does path tracing uses this. They all use DXR or Vulkan.

If there's anything proprietary, you must be thinking of DLSS.

From what I understand, the AAA game titles were tapping directly into OptiX - thus it's proprietary. NVIDIA was releasing tech demos for things like Tomb Raider, BF5, etc. well before DXR was released. And this falls in line with NVIDIA's business model of using closed standards to lock people into ecosystems.
 
Gigabyte already released a five-fan card, their Super OverClock HD 7970...

https://www.techspot.com/news/49385-gigabytes-five-fan-super-overclock-radeon-hd-7970.html

Sure, they're five tiny fans, but five fans nonetheless, running at up to 10,000 RPM on a thick triple-slot cooler with nine heatpipes and a vapor chamber. They later did a version of the GTX 680 using the same cooler.

The problem with such a design is that you lose a large percentage of your fan area to the hub, and the area right behind the hub typically has dead airflow. To overcome this limitation you need faster fans (which are noisier) and diffusers/flow-mixing dampers/blades. I have never seen those on a consumer card even though they would be cheap to implement.
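The hub loss is easy to quantify: swept area scales with the square of diameter, so a modest-looking hub eats a surprising share. A quick sketch with made-up dimensions (not measurements of any specific cooler):

```python
# Fraction of a fan's swept disc occupied by the hub. Area scales with the
# square of diameter, so even a small hub covers a surprising share.
# The dimensions below are hypothetical, not taken from a real cooler.
def hub_area_fraction(fan_diameter_mm, hub_diameter_mm):
    return (hub_diameter_mm / fan_diameter_mm) ** 2

# A hypothetical 40 mm fan with a 16 mm hub:
print(f"{hub_area_fraction(40, 16):.0%} of the disc pushes little to no air")  # -> 16%
```

And that's before counting the dead-airflow shadow directly behind the hub, which extends the loss further downstream.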
 
From what I understand, the AAA game titles were tapping directly into OptiX - thus it's proprietary. NVIDIA was releasing tech demos for things like Tomb Raider and such well before DXR was released.
If OptiX was being used, I'm pretty sure it would've been said. But every game that has a path tracing render path released since the GeForce 20 series is using either DXR or NVIDIA's ray tracing extensions on Vulkan.

I mean, if you can point me to a commercially released game that uses OptiX instead of DXR, I'll be happy to see it.
 
If OptiX was being used, I'm pretty sure it would've been said. But every game that has a path tracing render path released since the GeForce 20 series is using either DXR or NVIDIA's ray tracing extensions on Vulkan.

I mean, if you can point me to a commercially released game that uses OptiX instead of DXR, I'll be happy to see it.

Such code is hidden from us peons. However, a dependency-chain check on the DLL as it executes will show that it's loaded. It's entirely possible that the games are Vulkan-extensions-only and OptiX is merely loaded, not used, but that would be odd. Maybe the Vulkan path taps into OptiX. Hard to say. Either way, it's not agnostic.
 
From what I understand, the AAA game titles were tapping directly into OptiX - thus it's proprietary. NVIDIA was releasing tech demos for things like Tomb Raider, BF5, etc. well before DXR was released. And this falls in line with NVIDIA's business model of using closed standards to lock people into ecosystems.

OptiX is designed for offline rendering. While it offers an order-of-magnitude improvement over CPU-based ray tracing, we're still talking minutes per frame here.
 
fellas, do you know if the 5700 XT has ray tracing like Nvidia or not? Can it be enabled in Crimson?
I feel late to the party, but the short answer is: it's possible to run DXR on AMD hardware from before Navi 2. The problem is that AMD doesn't want to enable support for it.

Ray tracing doesn't require special hardware to run. In fact, screen-space reflections use basic ray tracing. A lot of so-called ray tracing mods add SSR, or at least improve it, and since they're technically using ray tracing they can call it that. The solution NVIDIA went with doesn't require special hardware either; NVIDIA just has hardware acceleration built in. DXR (and possibly Vulkan) has a so-called fallback layer that the API can use to run ray tracing on GPUs that don't have hardware acceleration.
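To make the SSR point concrete, here's a toy screen-space ray march in one dimension: the core loop of screen-space reflections, which really is just ray tracing against the depth buffer. Real SSR marches a 2-D depth buffer in view space; this sketch only illustrates the idea:

```python
# A toy screen-space ray march: step a ray across a 1-D "depth buffer" until
# it dips behind the depth recorded there. This is the core loop of
# screen-space reflections (SSR); real SSR works on 2-D buffers in view space.
def ssr_march(depth_buffer, start_x, ray_depth, depth_step, max_steps=64):
    x, d = start_x, ray_depth
    for _ in range(max_steps):
        x += 1
        d += depth_step
        if x >= len(depth_buffer):
            return None  # ray left the screen: no reflection data available
        if d >= depth_buffer[x]:
            return x     # ray passed behind geometry: reflection hit found
    return None

# A floor at depth 5 with a closer wall (depth 2) starting at pixel index 6:
depths = [5, 5, 5, 5, 5, 5, 2, 2, 2, 2]
print(ssr_march(depths, start_x=0, ray_depth=0.0, depth_step=0.5))  # -> 6
```

No special hardware involved: it's plain arithmetic over a buffer the rasterizer already produced, which is exactly why SSR runs on any GPU.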
 
  1. There's old-school ray tracing (computationally expensive, like in movies).
  2. There's software-emulated ray tracing (very fast approximations, like Sonic Ether's SEUS implementation).
  3. There's Microsoft's ray tracing (DXR, which is an OPEN standard like DX).
  4. There's NVIDIA's RTX ray tracing (which is closed and proprietary, like PhysX).
"RT old school" (e.g. movies) is high fidelity, 'full RT' (at least in big budget movies from the last ~10 years), done without HW acceleration AFAIK, and not real time. SEUS is another RT implementation (that either doesn't use or at least doesn't require HW acceleration), designed for real time applications and to be compatible with existing graphics card drivers/APIs. DXR is a real time RT API, i.e. a way for developers to add RT to their applications (as compared to 'rolling their own' implementation a la SEUS), with or without HW acceleration. So far RT games for the most part only use RT for part of their rendering (and possibly at a lower fidelity compared to movies), I believe largely because full RT rendering still isn't feasible. Although full RT looks to be becoming a possibility in less demanding games, e.g. Minecraft RTX.

Nvidia's RTX is their driver/hardware (including fixed function RT acceleration) implementation that supports DXR/Vulkan RT (among other things). Yes, it's proprietary, but as others have pointed out it's not an alternative to DXR, it generally works with DXR. AMD's software/hardware implementation of RT for video games will also very likely be proprietary (definitely the hardware anyway).

The advantage NVIDIA has is DLSS 2.0. [...] This allows you to run considerably faster because you can render at a lower resolution and let AI upsample to 4K. That said, this technology is dependent on NVIDIA supporting it on their game servers. So it is per game dependent. AI training is expensive so don't expect them to do it on every game.
To expand/clarify a bit: DLSS 2.0 no longer requires game-specific AI models, i.e. NVIDIA trains and maintains a single model that is used for all games. Games still need DLSS (2.0) support added on a case-by-case basis, though. Also, DLSS now allows arbitrary target resolutions, so you could choose e.g. 1440p, 4K, or 8K as your output resolution. Render resolution is determined by your DLSS quality setting, I believe.
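For a rough idea of the render resolutions involved, here's a sketch using commonly cited per-axis scale factors for the DLSS 2.0 quality modes. These factors are assumptions taken from public reporting, not official NVIDIA numbers, so treat the outputs as approximate:

```python
# Approximate DLSS 2.0 render resolutions for a given output resolution.
# The per-axis scale factors are commonly cited figures used here as
# assumptions, not official NVIDIA specifications.
SCALE = {"Quality": 1 / 1.5, "Balanced": 1 / 1.72, "Performance": 1 / 2}

def render_resolution(out_w, out_h, mode):
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = render_resolution(3840, 2160, mode)
    print(f"4K output, {mode}: renders at about {w}x{h}")
```

So at a 4K output, Performance mode renders roughly a 1080p image and lets the model fill in the rest, which is where the speedup comes from.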

NVIDIA was releasing tech demos for things like Tomb Raider, BF5 etc way before DXR was released.
I don't see how this is significant; although DXR may not have been formally released, its development was obviously well under way when they were releasing these demos. Nvidia probably had a fair bit of input into creating the DXR API, given that they were the only graphics vendor working on a real-time RT ecosystem while DXR was being created. RTX and DXR may very well have developed in parallel to some extent.
 
Ray tracing doesn't require special hardware to run.
It may not 'require' special hardware, but dedicated hardware is far more efficient in every respect at solving billions of intersections per second than an equivalent shader-based implementation. You don't need special hardware to render graphics either, but few people would consider the DirectX reference rasterizer anywhere near fast enough to be playable beyond pretty old games.
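For a sense of what "solving billions of intersections" means per ray, here's the classic Moller-Trumbore ray/triangle test: roughly the arithmetic that dedicated RT hardware runs in fixed function, and that a shader-based fallback has to execute on general-purpose ALUs instead. Illustrative sketch only, not any driver's actual implementation:

```python
# Moller-Trumbore ray/triangle intersection: the per-ray, per-triangle work
# that RT cores accelerate. Returns the distance along the ray to the hit
# point, or None on a miss. Pure-Python sketch for illustration.
def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:
        return None  # ray is parallel to the triangle's plane
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0 or u > 1:
        return None  # hit point falls outside the triangle (barycentric u)
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0 or u + v > 1:
        return None  # outside the triangle (barycentric v)
    t = f * dot(e2, q)
    return t if t > eps else None  # accept only hits in front of the ray

# A ray straight down -z hits a unit right triangle lying in the z=0 plane:
print(ray_triangle((0.25, 0.25, 1), (0, 0, -1),
                   (0, 0, 0), (1, 0, 0), (0, 1, 0)))  # -> 1.0
```

Running this per ray against a BVH's worth of triangles is exactly the workload where fixed-function units pull ahead of shader code.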
 
It may not 'require' special hardware, but dedicated hardware is far more efficient in every respect at solving billions of intersections per second than an equivalent shader-based implementation. You don't need special hardware to render graphics either, but few people would consider the DirectX reference rasterizer anywhere near fast enough to be playable beyond pretty old games.
It's kind of a problem when people think that ray tracing requires some sort of special hardware to even use. I encountered someone who thought NVIDIA was scamming us after Crytek showed off Neon Noir.