News: Watch AMD's Radeon RX 6000 'Big Navi' RDNA 2 GPU Launch Here at 12pm ET

ezst036

Honorable
I have high hopes for Big Navi. I'm planning a new Linux gaming box, and one of these is going to be in it.

With AMD's open-source driver on Linux and Valve's constant striving for better gaming on Linux, it's going to be a lot of fun!
 

Joseph_138

Distinguished
I'm not optimistic. AMD hasn't been competitive for a few years, ever since Nvidia dropped Pascal on us. Turing was 'meh', but Nvidia got away with it because AMD wasn't there to compete with them tier for tier, and dollar for dollar. The rumors I have been seeing online all seem to point to Big Navi being equal to, or faster than Ampere, but at a cost of greater power consumption. If power consumption isn't an issue for you, then it's awesome, but for anyone wanting to squeeze another year out of their power supply, it's not an option if the equivalent Big Navi card draws 20-30 more watts of power, regardless of whether it's faster or not. Many people can afford to upgrade their video card, but not their power supply at the same time. That makes the AMD card a more expensive upgrade than Nvidia's.

AMD is also late to the ray tracing party, so who knows if their implementation of it is compatible with Nvidia's? Nvidia also has technologies unique to them that AMD won't be able to use like GDDR6X memory and DLSS. DLSS is particularly impressive because it allows a less expensive card to run at higher resolutions than it would normally be capable of in native mode. You can play in 4K using DLSS rendering at 1080p and the quality is impressive, judging by the stills and videos that are being shared. There are losses, yes, but they are not so noticeable that they would completely ruin the 4K experience. So you can play in 4K using a cheaper card than what you would require on the AMD side, making Nvidia an even better value proposition.
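For a sense of scale, here's a quick back-of-the-envelope sketch (plain pixel counts, nothing DLSS-specific) of why rendering at 1080p and presenting at 4K saves so much shading work:

```cpp
// Back-of-the-envelope: pixel counts only, nothing DLSS-specific.
#include <cstdio>

int main() {
    const long long px1080p = 1920LL * 1080;  // 2,073,600 pixels
    const long long px4k    = 3840LL * 2160;  // 8,294,400 pixels

    // An upscaler that renders at 1080p and outputs 4K shades only
    // a quarter of the pixels a native-4K frame would.
    std::printf("4K/1080p pixel ratio: %.1fx\n",
                static_cast<double>(px4k) / static_cast<double>(px1080p));
    return 0;
}
```

So the per-frame shading cost drops by roughly 4x before any reconstruction overhead, which is why a cheaper card can plausibly present a 4K image.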
 

King_V

Illustrious
Ambassador
Joseph_138 said:
I'm not optimistic. AMD hasn't been competitive for a few years, ever since Nvidia dropped Pascal on us. ... So you can play in 4K using a cheaper card than what you would require on the AMD side, making Nvidia an even better value proposition.

These are very strange things to say, particularly given that the RX 5600 XT outdid the RTX 2060 in both performance and power efficiency. And for what you get per dollar, the only disappointment was the RX 5500 XT; the rest of the Navi cards blew Nvidia away on price-to-performance.

Further - why do you think the definition of "being competitive" absolutely MUST be matching Nvidia tier-for-tier, dollar-for-dollar? That's an extremely narrow definition.

Was grabbing the mid-range with Polaris a failure to compete? Was getting the consoles a failure to compete?

Seems like you selectively grabbed onto the "common wisdom" from a few years ago, and became dogmatic about it. Almost like you feel it's your sacred duty to be a naysayer.

Ultimately, we'll see what happens when the hardware gets reviewed.
 

InvalidError

Titan
Moderator
Joseph_138 said:
AMD is also late to the ray tracing party, so who knows if their implementation of it is compatible with Nvidia's?
No need to wonder: Nvidia's implementation of RT is proprietary to Nvidia, just like the inner workings of AMD's RT are proprietary to AMD; rinse and repeat for every other company with RT hardware ambitions. The only thing that matters is how well DXR and other RT APIs can be mapped to each company's hardware, just like it is for regular 3D acceleration.
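For what it's worth, here's a minimal, untested sketch (standard D3D12/DXGI headers on Windows, linked against d3d12.lib and dxgi.lib) showing that a game queries ray tracing support through a vendor-neutral DXR capability bit, not anything Nvidia- or AMD-specific:

```cpp
// Minimal sketch (untested): the DXR capability query is the same call
// on AMD, Nvidia, or Intel hardware -- the API is vendor-neutral.
// Build on Windows; link d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory6> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<ID3D12Device5> device;  // ID3D12Device5 carries the DXR entry points
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_12_0,
                                     IID_PPV_ARGS(&device))))
            continue;  // adapter has no D3D12 support; skip it

        // One vendor-neutral capability struct, not a per-vendor extension.
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                    &opts5, sizeof(opts5));

        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        std::printf("%ls: DXR tier %d\n", desc.Description,
                    static_cast<int>(opts5.RaytracingTier));
    }
    return 0;
}
```

Whether that tier comes back non-zero on RDNA 2 or on Turing, the game-side code path is identical; only the driver underneath differs.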
 

ajr1775

Distinguished
At this point, whoever is the first to get physical cards on physical shelves at my local physical store gets my money. Simple as.

This. Nvidia did a good job of amping up demand. Now that void is going to be filled by AMD, assuming their supply is on point.
 

raycrayz

Reputable
Joseph_138 said:
I'm not optimistic. AMD hasn't been competitive for a few years, ever since Nvidia dropped Pascal on us. ... So you can play in 4K using a cheaper card than what you would require on the AMD side, making Nvidia an even better value proposition.

Anyone in that situation needs a new motherboard, new RAM, a new CPU, and a new video card.
 

ajr1775

Distinguished
I'm going to guess that, between all of the new stuff AMD is making on 7nm, its TSMC wafer agreement is going to get stretched thin and supplies will be relatively limited for many months to come.

If so, that's OK, so long as they address it up front instead of dodging the issue altogether the way Nvidia did.
 
There is just so much wrong with this post. I'll have to break it down bit by bit.
I'm not optimistic. AMD hasn't been competitive for a few years, ever since Nvidia dropped Pascal on us. Turing was 'meh', but Nvidia got away with it because AMD wasn't there to compete with them tier for tier, and dollar for dollar.
100% TRUE: Because of AMD's near-bankruptcy, ATi had ZERO money for creating a new architecture and was forced to use GCN for far longer than it should have. Fiji should have been the last iteration of GCN, because it was the last time that GCN was really competitive.
The rumors I have been seeing online all seem to point to Big Navi being equal to, or faster than Ampere, but at a cost of greater power consumption.
100% FALSE: It has been shown that it's about the same. I don't know what you've been reading, but:
"So we see that the actual power consumption of the Navi21 XT should be about the same level as that of the GeForce RTX 3080!"
- Igor's Lab
Link to Igor's Lab article

Igor's Lab has been the go-to for hardware leaks lately. Where have you been getting your information?
EDIT: As we all saw, AMD actually claims lower power numbers than nVidia. They've been right about everything else so far, so I see no reason to disbelieve them now.
If power consumption isn't an issue for you, then it's awesome
If power consumption IS an issue for you, then you won't be interested in Ampere either, especially since (as shown) they both have the same power draw.
AMD is also late to the ray tracing party
No, nVidia was just early. Ray tracing is not a major feature in games but a frill at best; most games do not implement it. Having ray tracing performance that matches the RTX 2080 Ti would be more than enough.
, so who knows if their implementation of it is compatible with Nvidia's?
Who cares about ray tracing, a technology so much in its infancy that it's not even truly relevant yet? I can tell you right now that it won't be compatible with nVidia's tech, because nVidia makes all of its tech proprietary to soak people more, while ATi doesn't. The tech that ATi WILL be using is DXR, the ray tracing API in DX12 (which AMD helped bring about with Mantle), a technology that nVidia had better get on board with, because DX12 is the standard; RTX is not and never was.

Considering that ALL console games will be using AMD and ATi hardware, whatever tech nVidia is using will be completely irrelevant. Do you really think that Microsoft or Sony will give a rat's posterior how a game runs on a GeForce card when their consoles are using Radeon GPUs? Here's a hint: they don't.

I think that you severely overestimate nVidia's power over a market that is actually owned by Microsoft (because Windows and DirectX).

Remember that nVidia adapts to support DirectX, DirectX does NOT adapt to support nVidia.
Nvidia also has technologies unique to them that AMD won't be able to use like GDDR6X memory
I missed your post where you were crying that the RTX 3070 also doesn't use GDDR6X. Oh, that's right, you didn't make one. Why is that?
and DLSS. DLSS is particularly impressive because it allows a less expensive card to run at higher resolutions than it would normally be capable of in native mode.
I'm quite certain that everyone here is well aware of what DLSS 2.x is capable of. We are, after all, tech enthusiasts. However, these are uber high-end cards being unveiled today. Something like DLSS would be more useful at the mid-to-lower end, for cards that can't handle resolutions like 1440p and 2160p natively to begin with. If a card performs well at 4K natively, why bother with upsampling? Upsampling is irrelevant at this level.
You can play in 4K using DLSS rendering at 1080p and the quality is impressive, judging by the stills and videos that are being shared. There are losses, yes, but they are not so noticeable that they would completely ruin the 4K experience. So you can play in 4K using a cheaper card than what you would require on the AMD side, making Nvidia an even better value proposition.
Paul at RedGamingTech has reported that ATi is implementing a feature that is more or less the same as DLSS (under a different name). From what Paul reports, the picture quality will be somewhat inferior (he doesn't know to what degree) but the resulting frame rate will be higher (again, he doesn't know to what degree). He also said that, unlike DLSS, which only works on games that support it, ATi's technology should work on ALL games. I'm guessing that, again, ATi's tech will probably use a DirectX implementation for its upsampling, and ALL games support DirectX.

You know, you're really projecting a bit of a frantic state of mind over this. You're also throwing out misinformation, wrong impressions, and a seeming belief that nVidia is king and rules all. You're also talking about ray tracing the way Jensen Huang would, parroting nVidia's marketing while completely ignoring the reality of the situation.

You're one of three things: an nVidia employee, an nVidia fanboy, or a noob who really doesn't know better. Regardless, all of your "concerns" have been rendered moot now. You can be optimistic again.
 

spongiemaster

Admirable
Hats off to AMD on the performance front. Looks like they nailed it. It's almost eerie how close the AMD and Nvidia cards ended up, given where both sides were coming from. Now we have to wait and see how the software team did.
 
Takeaways:

Surprised that AMD says the RX 6800 XT competes with the RTX 3080 and that the RX 6900 XT competes with the RTX 3090. Well played on their part. There were some early non-official statements/reviews that said the RX 6800 XT would be more competitive with the RTX 3070.
Price is around what I expected: cheaper than the Nvidia counterpart in each case, and where the 6900 XT is concerned, much cheaper.
I'm not sure about 'Rage Mode' and 'Smart Access Memory', though. Rage Mode is just an overclock and, as most of us know, there are a lot of variables in that. SAM, which lets the CPU address the GPU's entire VRAM at once instead of through a small window, looks interesting though. I guess we'll see.

Of course, since this is AMD's announcement I will take all of this with a grain of salt. Can't wait for full, independent reviews.
 

nofanneeded

Respectable
$999 for a 3090 competitor should help bring down the price of those GPUs from Nvidia. I feel another "depends on the game who wins" type thing coming soon, and that's a good thing.

Overall, it looks like a good day for buyers either way :)

I don't think so; Nvidia would release an RTX 3080 Ti with 20GB of VRAM for $999 to answer AMD.
 
How do we know RTX has vastly superior ray tracing? DLSS is a nice feature but isn't used much right now. How is $999 for the 6900XT a joke?

I saw the early benchmark leaks. Combined with DLSS, it really isn't a comparison. And in all those presentations, not once did they compare DLSS and RT performance. That's the typical sleight of hand that ALL marketing companies pull. So it wasn't an apples-to-apples comparison.

People who buy top tier expect it to be the best at everything. The 6900 does not deliver this. It's just a tad slower without DLSS, and again, RT isn't competitive. 24GB is also more useful for content creators.

AMD is giving you 8 more CUs and a small clock bump for $350 more. That's over 50% more money.
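Checking that arithmetic against the announced launch MSRPs ($649 for the RX 6800 XT, $999 for the RX 6900 XT, 72 vs. 80 CUs):

$$\frac{999 - 649}{649} \approx 54\%\ \text{more money for}\ \frac{80 - 72}{72} \approx 11\%\ \text{more CUs}$$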

Again AMD, close but so far off the mark.
 

spongiemaster

Admirable
How do we know RTX has vastly superior ray tracing? DLSS is a nice feature but isn't used much right now. How is $999 for the 6900XT a joke?
I had the stream running in the background and wasn't listening to every word said. Did they even mention ray tracing during this announcement? I don't remember seeing any benchmark charts. That's not a good sign for ray tracing performance.