News: This Is What AMD's Radeon RX 6000 "Big Navi" Looks Like


Chung Leong

Reputable
Dec 6, 2019
493
193
4,860
It's kind of a problem when people think that ray tracing requires some sort of special hardware to even use. I encountered someone who thought NVIDIA was scamming us after Crytek showed off Neon Noir.

They used to demo ray-tracing on the Amiga 500. So of course we don't need special hardware to ray trace. The complication is all due to God. If He had only created Adam and Eve as two perfectly shiny spheres...
 
It does. The rays just go in the reverse direction, from the objects' surfaces to the camera.
That would imply the pixel should receive a cone of "samples", but from what I've seen, it's more like a straight line from the pixel to the closest object.

EDIT: In most articles I've read on the subject, rasterization itself also doesn't resolve the color of the pixel; it only resolves which object the pixel covers. Ray tracing can do both.
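Here's a minimal sketch of that idea, for anyone curious - one ray per pixel, cast from the camera through the pixel, where the closest hit resolves which object the pixel covers and a crude normal-vs-light shade resolves its color. The whole scene (two spheres, one light) is made up purely for illustration:

```python
import math

# One-ray-per-pixel ray casting: the closest hit along each ray resolves
# coverage (which object the pixel sees) and shading resolves its color.
WIDTH, HEIGHT = 60, 30
SPHERES = [((0.0, 0.0, 3.0), 1.0), ((1.3, 0.4, 4.0), 0.8)]  # (center, radius)
LIGHT = (-0.577, 0.577, -0.577)  # unit vector pointing toward the light
SHADES = ".:-=+*#%@"             # darkest to brightest, for hit surfaces

def ray_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest sphere hit, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # quadratic's a == 1 since direction is unit length
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

for py in range(HEIGHT):
    row = ""
    for px in range(WIDTH):
        # Perspective camera at the origin: ray direction depends on the pixel.
        dx = (px - WIDTH / 2) / WIDTH
        dy = (HEIGHT / 2 - py) / HEIGHT
        inv = 1.0 / math.sqrt(dx * dx + dy * dy + 1.0)
        d = (dx * inv, dy * inv, inv)
        # Closest hit across all objects resolves coverage.
        best = None
        for center, radius in SPHERES:
            t = ray_sphere((0.0, 0.0, 0.0), d, center, radius)
            if t is not None and (best is None or t < best[0]):
                best = (t, center, radius)
        if best is None:
            row += " "  # ray escaped: background
        else:
            t, center, radius = best
            hit = (d[0] * t, d[1] * t, d[2] * t)
            normal = tuple((h - c) / radius for h, c in zip(hit, center))
            lit = max(0.0, sum(n * l for n, l in zip(normal, LIGHT)))
            row += SHADES[min(len(SHADES) - 1, int(lit * len(SHADES)))]
    print(row)
```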
 

Chung Leong

Reputable
Dec 6, 2019
493
193
4,860
That would imply the pixel should receive a cone of "samples", but from what I've seen, it's more like a straight line from the pixel to the closest object.

There is no cone if the camera is only capable of capturing rays perfectly perpendicular to the lens. Rasterization can be thought of as an extremely limited form of ray tracing. Screen-space reflections are extremely limited too. You can't do a mirror. You can't do shiny balls. Why call it "a form of ray tracing"?
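To put the "perpendicular rays" point in code terms: take the sketch a few posts up and replace the per-pixel ray direction with one fixed direction, and you get an orthographic projection - every pixel fires the same parallel ray from a different origin, which is the degenerate case being described. This fragment assumes the hypothetical ray_sphere(), SPHERES, WIDTH and HEIGHT from that earlier sketch:

```python
# Orthographic "camera": every ray is perpendicular to the lens plane, so
# only the ray origin varies per pixel; the direction never does.
DIRECTION = (0.0, 0.0, 1.0)  # one fixed, parallel direction for all pixels

for py in range(HEIGHT):
    row = ""
    for px in range(WIDTH):
        origin = ((px - WIDTH / 2) / (WIDTH / 4),
                  (HEIGHT / 2 - py) / (HEIGHT / 4), 0.0)
        covered = any(ray_sphere(origin, DIRECTION, c, r) is not None
                      for c, r in SPHERES)
        row += "#" if covered else "."
    print(row)
```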
 

spongiemaster

Admirable
Dec 12, 2019
2,273
1,277
7,560
Gigabyte already released a five-fan card, their Super Overclock HD 7970...

https://www.techspot.com/news/49385-gigabytes-five-fan-super-overclock-radeon-hd-7970.html

Sure, they're five tiny fans, but five fans nonetheless, running at up to 10,000 RPM on a thick triple-slot cooler with nine heatpipes and a vapor chamber. They later did a version of the GTX 680 using the same cooler.
I had forgotten about that one. This is one of those things that is so stupid it's great. As far as functionality goes, I don't see why this would work any better than the traditional design with fans blowing onto the PCB.
 

InvalidError

Titan
Moderator
As far as functionality goes, I don't see why this would work any better than the traditional design with fans blowing onto the PCB.
If you are going to crossfire those things, then the top GPUs' fans end up dead-ended on the back of the next GPU below. You don't have that problem with fans that suck air from the card edge.

Also, since the fans are on the card's edge, hot air exiting on the motherboard side of the GPU heatsink has to make its way all the way back to the card's other edge to re-enter the fans, which gives case airflow a better chance of carrying it away. With fans that blow directly at the PCB, all the air going toward the motherboard gets immediately sucked back in, since typical front/top/rear fan setups won't provide any meaningful case airflow there to flush it out.
 
If you are going to crossfire those things, then the top GPUs' fans end up dead-ended on the back of the next GPU below. You don't have that problem with fans that suck air from the card edge.

Also, since the fans are on the card's edge, hot air exiting on the motherboard side of the GPU heatsink has to make its way all the way back to the card's other edge to re-enter the fans, which gives case airflow a better chance of carrying it away. With fans that blow directly at the PCB, all the air going toward the motherboard gets immediately sucked back in, since typical front/top/rear fan setups won't provide any meaningful case airflow there to flush it out.
Actually, those fans apparently work in a pull configuration, so the warm air is exhausted out the fan side, away from the motherboard. That's probably better, since less heated air is likely to get trapped under the card.
 

animekenji

Distinguished
Dec 31, 2010
196
33
18,690
With 2x8-pin connectors this thing had better be a lot faster than an RTX 3080, or else it's just another underperforming power hog like we have been getting from AMD for years already.
 

animekenji

Distinguished
Dec 31, 2010
196
33
18,690
No, sorry. The current, released Navi cards don't have ray tracing. AFAIK, AMD never said they'll get ray tracing down the line - they simply don't have the hardware baked in. However, there is a possibility that they'll get ray tracing support the way the GTX 10-series did - if only to show that older cards just can't ray trace. Now, Big Navi, such as this card, should have ray tracing.

Do note that the RX 5700 XT is a very good card though, above a 2060 Super and a bit below a 2070 Super. Trust me, ray tracing doesn't really matter - and that's coming from a 2060 Super owner. Really, the only game where ray tracing truly shines and looks awesome is Minecraft. The problem is: Minecraft is one game, Minecraft ray tracing kills your framerates, and there are only a few prebuilt maps that actually work with ray tracing.

However, if you are asking about it as a purchase, I wouldn't buy a mid-tier GPU right now, as long as you are fine with waiting a few months. Or just buy a ridiculously cheap 5700 XT/2060S/2070/2070S/2080/2080S/2080 Ti - everyone wants those off their hands because of Ampere.

Raytracing didn't matter last gen because of the performance hit the cards took when it was turned on, but that won't be the situation this time. The cards will have more cores and the RT cores will be greatly improved, so performance will suffer less when it's turned on. Also, a lot of people weren't willing to spend the money that Nvidia was asking for RTX 20 cards and made do with their old cards, AMD cards, or the GTX 16 cards.

There won't be any more GTX cards this time around, and people who have been waiting on the sidelines for raytracing to mature to the point where it's finally worth having will be dumping all their old cards that don't have it. The market for games that use raytracing is going to explode overnight. Raytracing didn't matter last generation, but it will matter a lot this generation. If AMD doesn't offer it, or offers only a weak implementation that isn't as good as RTX 30, they are going to be left in the dust. They can get away with that on console, where the customer is locked in to the GPU that came with the system, but PC gamers have options. A PCIe card can be removed and replaced if the customer doesn't like it.
 

King_V

Illustrious
Ambassador
With 2x8-pin connectors this thing had better be a lot faster than an RTX 3080, or else it's just another underperforming power hog like we have been getting from AMD for years already.

So, you mean, like how the RX 5700 and the RTX 2070 are both rated at 185W, with approximately equal performance? Or how the RX 5600 XT at 150W outperforms the RTX 2060 at 160W?
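If anyone wants to sanity-check that kind of claim, perf-per-watt is just average framerate divided by board power. A quick sketch - the fps numbers here are made-up placeholders, not benchmark results; only the wattages are the rated figures quoted above:

```python
# Perf-per-watt helper. The avg_fps values are hypothetical placeholders;
# the wattages are the rated board powers mentioned above.
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    return avg_fps / board_power_w

cards = {
    "RX 5600 XT": (95.0, 150.0),  # (hypothetical avg fps, rated watts)
    "RTX 2060":   (90.0, 160.0),
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {perf_per_watt(fps, watts):.2f} fps/W")
# With these placeholder numbers the 5600 XT wins on both raw fps and
# fps per watt, which is the comparison being made above.
```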
 
With 2x8-pin connectors this thing had better be a lot faster than an RTX 3080, or else it's just another underperforming power hog like we have been getting from AMD for years already.
2x8-pin power requirements don't really mean anything other than that the card requires more power than AMD is comfortable delivering over a 6+8 pin setup. So the card could be starting anywhere from around 225W to 250W.
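For what it's worth, the back-of-the-envelope arithmetic behind that uses the nominal PCIe delivery limits - 75W from the slot, 75W per 6-pin, 150W per 8-pin:

```python
# Nominal PCIe power ceilings per the spec: these are delivery limits,
# not actual card draw.
SLOT_W, PIN6_W, PIN8_W = 75, 75, 150

configs = {
    "6+8 pin": SLOT_W + PIN6_W + PIN8_W,  # 300 W ceiling
    "8+8 pin": SLOT_W + PIN8_W + PIN8_W,  # 375 W ceiling
}
for name, ceiling in configs.items():
    print(f"{name}: up to {ceiling} W deliverable")
# A 2x8-pin card only tells you the board power sits above what AMD wants
# to run on a 6+8 setup, i.e. likely ~225-300 W or so.
```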
 
Anyone know if they'll release a SAPPHIRE NITRO+ version of this?
While I imagine various versions will get released eventually, it was rumored that board partners had still not received hardware samples some weeks back, so I suspect we might only have reference models initially, much like what we saw with the 5700 and 5700 XT for a few months or so following their launch. Of course, with AMD moving away from blower-style reference coolers, that might be less of a concern this time around.

With 2x8-pin connectors this thing had better be a lot faster than an RTX 3080, or else it's just another underperforming power hog like we have been getting from AMD for years already.
The 12-pin connectors used on the Founders Edition RTX 3070, 3080, and 3090 are equivalent to two 8-pin connectors, just wired a bit differently on the card's end to make the wiring look a little more compact. And at least the 3080 and 3090 could be considered "power hogs" as you put it, both having higher power draw than any AMD graphics card from recent years.

Considering AMD is claiming relatively large efficiency gains for RDNA2, and the efficiency of the 3080 doesn't seem to have improved all that much over the 2080 Ti, it's very possible that Big Navi could match or beat the 3080 in terms of efficiency. The RX 5000-series cards were already not far behind the 20-series cards in terms of efficiency, after all.

Raytracing didn't matter last gen because of the performance hit the cards took when it was turned on, but that won't be the situation this time. The cards will have more cores and the RT cores will be greatly improved, so performance will suffer less when it's turned on.
I think you may have posted this prior to reviews coming out, but judging by the 3080's performance in games with raytracing enabled, RT performance didn't improve much. The relative performance hit for enabling RT seems to be nearly as large as it was with the 20-series; it's just that there's more performance to go around at a given price level to begin with (assuming cards are actually available for those prices any time soon). I agree that we will probably see significantly more games utilizing RT this generation though.
 

InvalidError

Titan
Moderator
The relative performance hit for enabling RT seems to be nearly as large as it was with the 20-series; it's just that there's more performance to go around at a given price level to begin with
The difference in RT performance varies considerably depending on what set of RT features is being used and will likely change some more as games get tweaked around the 3000-series' feature set.
 
The difference in RT performance varies considerably depending on what set of RT features is being used and will likely change some more as games get tweaked around the 3000-series' feature set.
Maybe, but all the benchmarks I've seen with Ampere so far have shown the performance hit of enabling RT to be at least 90% of what it was for Turing. Much of this could be because a lot of the raytracing work is being done outside the RT cores. The actual ray-casting part that can kill performance on cards without RT cores might have been improved, but that may already have amounted to a relatively small portion of the total performance hit on Turing. So things like initialization of the scene for RT and noise cleanup might be where much of the remaining performance hit lies.
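To make "90% of what it was" concrete: the hit is the fraction of framerate lost when RT is switched on, and you compare that fraction across generations. A quick sketch - all fps numbers here are made-up placeholders, not measurements:

```python
def rt_hit(fps_rt_off: float, fps_rt_on: float) -> float:
    """Fraction of performance lost when ray tracing is enabled."""
    return 1.0 - fps_rt_on / fps_rt_off

# Hypothetical placeholder framerates, purely for illustration:
turing_hit = rt_hit(fps_rt_off=90.0, fps_rt_on=54.0)    # 40% hit
ampere_hit = rt_hit(fps_rt_off=140.0, fps_rt_on=88.2)   # 37% hit

print(f"Turing hit: {turing_hit:.0%}, Ampere hit: {ampere_hit:.0%}")
print(f"Ampere's hit relative to Turing's: {ampere_hit / turing_hit:.1%}")
# With these placeholders, Ampere's relative hit is ~92-93% of Turing's -
# the kind of ratio being described above.
```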