Fake Rumors Suck: Will the Real RX 6000 Please Stand Up?

The thing I hate about rumors and "leaks" with websites that report news is that if the rumor turns out to be true, they give themselves a pat on the back and try to claim points; if the rumor turns out to be false... well, it was a rumor, so it shouldn't have been taken with more than a grain of salt anyway (even if the site was hyping it up as if it were true).

Always remain skeptical until the product actually exists and has been verified in the hands of someone other than the company making it.
 
  • Like
Reactions: thisisaname
There are plenty of sites that actively retcon after a launch as well, going back and deleting all old references to incorrect specs, or at least redirecting to the real data. We do that to some extent too, because that's just the way things work with web publishing these days. Still, I try to be very clear in our speculative / rumor posts about when we do or don't know actual specs.
 
  • Like
Reactions: gg83

hannibal

Distinguished
Well, it happens every time... it can be Nvidia, Intel, AMD, Apple... there are always rumours that are pure...
This article was not very well done, I have to say.
Rumours said that Nvidia Ampere would have amazing ray-tracing results... did not happen. Rumours said that AMD Zen 2 would run at 5.x GHz... did not happen. Rumours say that Intel 7nm will save us all very soon... did not happen (yet, at least). Rumours said that the iPhone 12 would have a 120Hz screen... did not happen.
It is a normal thing that always happens and has always happened.

That is why you should always wait for test results, no matter what company it is. This article was just clickbait, no matter how hard you try to explain it...
 
Nope! Click bait would have been actually reporting on yet another RX 6900 XT or whatever benchmark leak. We chose to skip most of those because the 'sources' were so tenuous. This is me explaining why, and basically just saying wait until next week and we'll know for sure what the specs are, and pricing. Then we actually have something concrete to argue about, rather than some almost certainly fake rumors.

If I had written a clickbait headline, it would have been "AMD RX 6800 XT Performance Leaks Point to Better Than RTX 3080 Performance" and it would have gotten two or three times as much traffic because of the keywords and phrasing.
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,280
810
20,060
JarredWalton, are we still up for the Root Beer bet based on those performance numbers Lisa Su showed?

You claim the performance numbers Lisa Su showed were from the top of the GPU stack.

I'm betting it's not the top of the GPU stack, that it's a lower-tier card.
 

spongiemaster

Admirable
Dec 12, 2019
2,273
1,277
7,560
Even if they showed the 72 CU GPU, the top-of-the-line 80 CU GPU has only 11% more CUs, which means it still isn't going to catch the 3090, which is on average about 10-15% faster than the 3080. I don't really see the point in holding back such a small difference in performance either. You're proposing that AMD gave a sneak peek of a card that about equals the 3080, but at the grand unveiling of RDNA 2, they're going to "shock the world" and reveal a card that is only single-digit percentages faster than the sneak peek card and still slower than the 3090? Why would they do that?
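For anyone who wants to check the arithmetic in the post above, here is a minimal sketch in Python. The 72 and 80 CU counts and the 10-15% 3080-to-3090 gap are the rumored/assumed figures quoted in this thread, not confirmed specs, and CU count rarely scales perfectly into frame rate.

```python
# Rough arithmetic behind the CU comparison above.
# Assumed (rumored) configurations -- not confirmed specs.
navi21_cut_cus = 72   # rumored second-tier Navi 21
navi21_top_cus = 80   # rumored full Navi 21

cu_uplift = navi21_top_cus / navi21_cut_cus - 1
print(f"Full die has {cu_uplift:.1%} more CUs")  # ~11.1%

# If the teased card roughly matches an RTX 3080, and the 3090 is assumed
# to average 10-15% faster than the 3080, an ~11% CU bump (which won't
# scale perfectly into frame rate) leaves the full die at or below the 3090.
assumed_3090_lead_low, assumed_3090_lead_high = 0.10, 0.15
print(f"Assumed 3090 lead over 3080: {assumed_3090_lead_low:.0%}-{assumed_3090_lead_high:.0%}")
```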
 
JarredWalton, are we still up for the Root Beer bet based on those performance numbers Lisa Su showed?

You claim the performance numbers Lisa Su showed were from the top of the GPU stack.

I'm betting it's not the top of the GPU stack, that it's a lower-tier card.
Yup. Loser has to Venmo the winner $2.50 -- get a quality root beer or two for that. :beercheers:
If AMD showed the second tier card, I lose. If AMD showed the top card that's launching in presumably November, I win.

Let's throw in a clause, though: If AMD showed the not-fastest Navi 21 card but the fastest card doesn't launch this year, it's a draw. So yeah, I'm aware of rumors that there will be Navi 21 XTX, Navi 21 XT, and Navi 21 XL (6900 XT, 6800 XT, 6800). If it showed 6800 XT but the 6900 XT is "coming later" (sort of like how with Zen 2, the Ryzen 9 3950X didn't launch until several months later), then we'll call it quits. (And if this happens, I will be severely disappointed in AMD.)
 
  • Like
Reactions: gg83

Conahl

Commendable
Apr 24, 2020
243
82
1,660

RDNA 2 doesn't have to catch the 3090; it just has to be fast enough that it makes the 3090, and the price Nvidia is charging for it, look even more stupid. Think about it, spongiemaster: IF AMD did show its top card in a TEASER, what would they have to show on the 28th, and in a way, what would even be the point of that event? Sorry to say this to you, but some of your posts seem to indicate you hate, or strongly dislike, AMD for some reason, and you also seem, in other posts, to praise Intel and Nvidia. AMD has done a really good job with its Zen line, and looks to be doing very well with RDNA and RDNA 2.
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,280
810
20,060
Yup. Loser has to Venmo the winner $2.50 -- get a quality root beer or two for that. :beercheers:
I don't use Venmo; I was hoping to meet up IRL. Are you opposed to meeting up in Los Angeles somewhere?
 

CerianK

Distinguished
Nov 7, 2008
260
50
18,870
...(sort of like how with Zen 2, the Ryzen 9 3950X didn't launch until several months later)
I had assumed that they were still waiting on TSMC to tweak the 7nm node to be able to differentiate boost for the 3950X, and to be able to provide sufficient quantities. If true, then TSMC now has the node under decent control, so I don't expect any delays in any part of the GPU (or CPU) product stack (other than the expected refresh down the road). However, there are potentially pleasant ways that I could be wrong, so no worries really.
 
Pretty sure it was delays in getting enough chips binned for use as the 3950X. Plus, AMD could basically sell two 3700X or 3800X CPUs instead of one 3950X (whereas a 3900X was basically just two 3600X/3600 CPUs). Supply was definitely constrained at the Zen 2 launch for a while.
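As a rough illustration of the binning argument above, here is a minimal sketch of the known Zen 2 chiplet configurations; the trade-off reading (two fully-enabled 8-core chiplets per 3950X vs. two separate 3700X/3800X parts) is the poster's reasoning, not an AMD statement.

```python
# Known Zen 2 desktop configurations: (chiplets, cores enabled per chiplet).
zen2_configs = {
    "Ryzen 5 3600/3600X":  (1, 6),
    "Ryzen 7 3700X/3800X": (1, 8),
    "Ryzen 9 3900X":       (2, 6),
    "Ryzen 9 3950X":       (2, 8),   # needs two fully-enabled, well-binned chiplets
}

for cpu, (chiplets, cores_each) in zen2_configs.items():
    print(f"{cpu}: {chiplets} chiplet(s) x {cores_each} cores = {chiplets * cores_each} cores")

# Every 3950X consumes two 8-core chiplets that could otherwise have shipped
# as two 3700X/3800X CPUs, which is consistent with the supply constraints
# mentioned above.
```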
 

gg83

Distinguished
Jul 10, 2015
639
293
19,260
That is why you should always wait for test results, no matter what company it is. This article was just clickbait, no matter how hard you try to explain it...
The title starts with "fake rumors suck". How do you see it as click-bait?
 

Jim90

Distinguished
The vast majority of your readers are not kindergarten age - be careful of treating them as such. We're also fully aware of the current facts: that Lisa demo'd a 6000 series card which very likely gets 'close enough' to the 3080 (we also know that's a hell of an achievement in speed of tech advancement for AMD's graphics department). We also know that this might indeed have been their top card... or might not. We won't be disappointed if it is, since its performance is clearly excellent. We also know, for obvious reasons, that the 6000 series will almost certainly be more power efficient than Ampere.
AMD merely needs to price competitively and be able to deliver in sufficient numbers. With Zen 2 we all know they were 'aggressive' in pricing, and we all know the negative effect that had on Intel: extreme downward pricing adjustments and market share loss, a loss which will clearly accelerate with Zen 3.
The vast majority also know that every leak is subject to a truth test, and that sites such as yours will help here with those reviews. Understand also that we NEVER base our purchase choice on one review site... benchmarks from one site are always compared against multiple sites, for obvious 'paid' and other reasons.

However, please do allow us to sit back and enjoy our leaks. We can all do with a good laugh.
 

D_wiz_kid

Honorable
Feb 21, 2014
2
0
10,510
There are tons of leaks and rumors surrounding RX 6000 right now, but we remain skeptical.

Fake Rumors Suck: Will the Real RX 6000 Please Stand Up?
Meh, it's looking like the 6800 XT is going to crap on the 3080 in everything but ray tracing, which, quite frankly, who cares about? I bet the 6900 XT beats the 3090 in most practical benchmarks, maybe not at 8K and once again maybe not in ray tracing, but in terms of clock rate and VRAM... also better performance per watt. Book it.
 

spongiemaster

Admirable
Dec 12, 2019
2,273
1,277
7,560
RDNA 2 doesn't have to catch the 3090; it just has to be fast enough that it makes the 3090, and the price Nvidia is charging for it, look even more stupid. Think about it, spongiemaster: IF AMD did show its top card in a TEASER, what would they have to show on the 28th, and in a way, what would even be the point of that event? Sorry to say this to you, but some of your posts seem to indicate you hate, or strongly dislike, AMD for some reason, and you also seem, in other posts, to praise Intel and Nvidia. AMD has done a really good job with its Zen line, and looks to be doing very well with RDNA and RDNA 2.
AMD doesn't have to beat a 3090. That isn't the point I'm making. If you're going to hold something back, you'd better have something worthwhile to show. Something that's 7-8% faster than what you teased, which isn't even a full performance tier higher, and still slower than the competition's top card, isn't worth holding back. In the end, price/performance and features are what's going to matter, not absolute performance in rasterized graphics. And you are clearly pro AMD, anti-anyone competing against AMD, like about 75-80% of every message board and comment section. Nothing wrong with that; I don't really care. I'm not pro-Intel/Nvidia. I just don't believe everything they do is terrible like the AMD side does. On the flip side, I will not give AMD any handicap or pass where they are inferior just because they are the perceived underdog, as I know they are just as morally bankrupt as Intel or Nvidia. There are no good guys among these companies, so picking sides is stupid. The company that makes the product that fits my needs the closest will get my money, regardless of the politics behind the scenes, which I don't care about.
 
  • Like
Reactions: JarredWaltonGPU

Conahl

Commendable
Apr 24, 2020
243
82
1,660
In the end, price/performance and features are what's going to matter, not absolute performance in rasterized graphics
And if the games one plays don't support RT or any of the features Ampere offers, then rasterized graphics performance does matter. Right now, I couldn't care less about RT or DLSS, as I more than likely won't use them. A few friends I work with also don't care about RT. They are waiting to see what RDNA 2 is like, and if it is on par with, or faster than, Ampere in the games they play, they will get that.

If you're going to hold something back, you'd better have something worthwhile to show. Something that's 7-8% faster than what you teased, which isn't even a full performance tier higher, and still slower than the competition's top card, isn't worth holding back
Who's to say AMD doesn't? As I said, WHY would AMD show their best card in a teaser? If they did, then practically anything they do on Oct 28 would be a waste of time.

And you are clearly pro AMD
Yeah, that's why the last AMD CPU I bought was a Phenom II X4, I think; after that it has been Intel. Up until April this year, all 6 computers I have here were Intel-based; 3 of them are now AMD. I wouldn't consider that pro-Intel or pro-AMD; I bought what I did because the performance and price were there.

I'm not pro-Intel/Nvidia. I just don't believe everything they do is terrible like the AMD side does

Yeah, OK spongie, anything you say. That sounds like you are a little biased against AMD to me :)
 
And if the games one plays don't support RT or any of the features Ampere offers, then rasterized graphics performance does matter. Right now, I couldn't care less about RT or DLSS, as I more than likely won't use them. A few friends I work with also don't care about RT. They are waiting to see what RDNA 2 is like, and if it is on par with, or faster than, Ampere in the games they play, they will get that.
Of course you don't care about RT and DLSS, since if you're not running an Nvidia card, you can't possibly use either one right now. AMD fans will only start caring about RT once RX 6000 series arrives and not before. They've been saying, repeatedly, that ray tracing doesn't matter because AMD GPUs can't do it. However, there are now multiple games where ray tracing actually does matter. Control absolutely looks better with ray tracing. Fortnite looks better with ray tracing, so if you're playing in creative mode, or just not ultra concerned with being competitive, turn it on. Some of the other games with RT effects aren't getting as much benefit from it (BFV, CoD:MW, SotTR, Metro Exodus). Others I just personally don't care about too much (Minecraft RTX, Quake II RTX, Justice, WoW, Ring of Elysium, Wolfenstein Youngblood). But many more games are in the works, and quite a few look like they'll benefit quite a bit from RT. And with the next generation consoles supporting the feature, the number is only going to grow.

RT is important, and AMD will have that with Big Navi. DLSS, though, may be just as important and it's a serious problem for AMD. It's proprietary Nvidia tech, and it's becoming much easier for games to implement. Unreal Engine supports it, and I think Unity does as well -- you basically just have to flip a switch in the code and add a UI to enable/disable DLSS. Even worse (for AMD) is that DLSS 2.x actually works. There were plenty of issues with DLSS 1.0, but DLSS 2.x is pretty dang awesome.

So, we can run benchmarks comparing AMD vs. Nvidia running the same settings without DLSS, but as soon as you enable DLSS, Nvidia is going to win in probably every game that supports it. Unless AMD can come up with a compelling alternative that works just as well (hint: FidelityFX isn't it, at least not in the current implementation), DLSS is potentially more of a game changer than RT.

Think about it: Nvidia can run 1080p with DLSS upscaling to 4K, end up looking better in many games than native 4K (because TAA sucks, but that's another matter), and with performance that's probably better than 1440p native. And 4K native normally runs 40-45% slower than 1440p. So, if your choice is:

  1. AMD, at 1440p native, getting 75 fps
  2. Nvidia, at 4K DLSS, getting 85 fps -- or 1440p DLSS getting 120+ fps
That's clearly an Nvidia win for anyone that cares about the gaming experience rather than pure apples-to-apples benchmarks.
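To make the comparison above concrete, here is a minimal sketch of the scaling math; the 75 fps and 85 fps figures, the 40-45% 1440p-to-4K scaling, and the DLSS internal-resolution assumption are the hypothetical numbers from this post, not benchmark results.

```python
# Hypothetical numbers from the comparison above -- not measurements.
fps_1440p_native = 75.0   # assumed AMD card at native 1440p

# 4K native is assumed to run ~40-45% slower than 1440p native:
fps_4k_native_low  = fps_1440p_native * (1 - 0.45)
fps_4k_native_high = fps_1440p_native * (1 - 0.40)

# DLSS renders internally at a lower resolution (roughly 1080p-1440p) and
# upscales to 4K, so its frame rate lands near or above the 1440p-native
# figure rather than the 4K-native one -- hence the assumed 85 fps.
fps_4k_dlss_assumed = 85.0

print(f"1440p native:      {fps_1440p_native:.0f} fps")
print(f"4K native (est.):  {fps_4k_native_low:.0f}-{fps_4k_native_high:.0f} fps")
print(f"4K DLSS (assumed): {fps_4k_dlss_assumed:.0f} fps")
```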

I'm not really looking forward to the flame wars that are going to come up when we have to factor image quality into relative performance comparisons for future games that support DLSS. Because already, the writing on the wall says Nvidia will win those comparisons. Even if image quality ends up being equivalent for whatever AMD does vs. DLSS, unless AMD can do it while rendering a lower resolution, performance will likely favor Nvidia. I remember saying back at the original RTX launch, "You know, as cool as ray tracing sounds, DLSS and the Tensor cores could end up being more important." That's proving to be the case. Just like so many other things related to deep learning, the potential gains are incredible.

Cyberpunk 2077 is going to be an important test vehicle for both companies. If the ray tracing effects really help the overall look and feel of the game, and then DLSS means some of the modest Nvidia GPUs can keep up with Big Navi, the better GPUs for Cyberpunk will end up being from Nvidia. We'll find out in about a month exactly how the various GPUs from both companies stack up in various ray tracing games and non-ray tracing games, though. Watch Dogs Legion, Cyberpunk, Call of Duty, and several more are coming this year -- and there will probably be 20 or more RT + DLSS games released in 2021.
 

Conahl

Commendable
Apr 24, 2020
243
82
1,660
Of course you don't care about RT and DLSS, since if you're not running an Nvidia card, you can't possibly use either one right now. AMD fans will only start caring about RT once RX 6000 series arrives and not before
That's not entirely true. I have friends who DO have RTX 20-series cards and do play some of the games you mentioned, and they still are not that interested in RT (don't ask me why, they won't give me a straight answer). So to say it's because of one being an AMD fan is a little closed-minded, for lack of a better word.

Control absolutely looks better with ray tracing. Fortnite looks better with ray tracing, so if you're playing in creative mode, or just not ultra concerned with being competitive, turn it on. Some of the other games with RT effects aren't getting as much benefit from it (BFV, CoD:MW, SotTR, Metro Exodus). Others I just personally don't care about too much (Minecraft RTX, Quake II RTX, Justice, WoW, Ring of Elysium, Wolfenstein Youngblood).
If you listed all the games that currently support RT, well, other than WoW, I play none of those :) That's why I am not interested in RT. TBH Jarred, the price of admission for RT was, and still is, just too expensive on the Nvidia side, and that's the other side of the coin. Maybe this new gen of video cards could change this, but it's still too early to tell.

DLSS, IF you are running a resolution that can take advantage of it, sure, but if not? Is there any benefit to running it vs. native resolution? I know no one with a 4K monitor, a few with 2K and a few more with 1440p, but most still use 1080p. Which resolution would benefit the most from DLSS?

As I said, if a video card has a feature that one will not use right now, or won't use in upcoming games they play, is there a need to put much weight on that feature when it comes to choosing which card to get? Especially if the card that has the feature is too expensive to begin with? E.g. the prices of the RTX 20 series: all but the 2060 were priced (in the Canadian market) out of reach of a lot of people that I know, and RT on the 20 series wasn't that good :) The 3080 starts at $950 CAD, again out of reach. The 3070? IF it starts at $550 CAD, maybe, but I think it is $499 US? So that could put it at at least $650 CAD to start.
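As a quick sanity check of the Canadian pricing estimate above, here is a minimal sketch; the exchange rate is an assumed ballpark figure (roughly the late-2020 USD-to-CAD rate), not a quoted price.

```python
# Converting the US MSRP mentioned above into Canadian dollars.
usd_msrp_rtx_3070 = 499        # US MSRP cited in the post
usd_to_cad = 1.31              # assumed approximate exchange rate

cad_estimate = usd_msrp_rtx_3070 * usd_to_cad
print(f"${usd_msrp_rtx_3070} US is roughly ${cad_estimate:.0f} CAD before tax,")
print("so a starting price of at least $650 CAD is plausible.")
```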
 

BILL1957

Commendable
Sep 8, 2020
59
17
1,535
RDNA 2 doesn't have to catch the 3090; it just has to be fast enough that it makes the 3090, and the price Nvidia is charging for it, look even more stupid.
But what you are saying just keeps AMD as the slower card compared to Nvidia, albeit the better bang-for-the-buck option, which is the reputation they have held for quite some time in the GPU market.
I am sure both companies actually want to be the producer of the "fastest" GPU available.
So for AMD to beat Nvidia it does need to be faster; it also needs to offer a competitive alternative to DLSS and be at least close in RT performance.
And in addition to that, it needs drivers that actually perform well at launch, not months after launch.
AMD has a lot of boxes to check to actually beat Nvidia as the best-performing GPU available to buy.