News Newegg Insider Reportedly Reveals Radeon RX 6900 XT, RX 6800XT, RX 6700 XT Specifications

Page 2 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Does it really matter? Their drivers will suck and your system will crash 10 times a day. I have a new system with a 3700X and a 5700 XT, and it crashes all the time with GPU driver issues. Stick a different GPU in and it runs fine. I am going for a 3080 for the first time in 10 years.

I have had a 6950 HD (good card), an R9 390X (good card, but hot), and now a Red Devil 5700 XT. It is fast, but I do a clean install with every new driver and it still crashes up to 10 times a day.

You've mentioned this before, but I don't think you've ever said whether you've done any basic troubleshooting. Back off whatever overclocks you've applied; check your power supply, RAM, and disk. You could even do a fresh install of Windows, or RMA the card if you're sure it's the one giving you issues. But just sitting there complaining that your PC is crashing 10 times a day and doing nothing about it sounds like a personal problem.
 

Thretosix

Honorable
Apr 27, 2017
28
8
10,535
If AMD can at least deliver good results at a good price point (remember, the Steam hardware survey listed the GTX 1060 as the most popular GPU), then it will be fine. That, coupled with all the gaming console business, will keep them going. Also, they shouldn't have the availability issues at launch that Nvidia has encountered, due mostly to unprecedented demand. https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
Steam is used by many gamers and non-gamers; software such as Wallpaper Engine runs from Steam. The survey also includes the many people who play games like Roblox, i.e., casual gamers who don't need hefty GPUs. When we're comparing the latest from team red and team green, I don't think that is the demographic the GTX 1060 represents; the fact that a two-generation-old Nvidia GPU is the majority in this survey actually runs contrary to your suggestion of value. If it were about value, why isn't it an AMD GPU? Dig down the list and there isn't an AMD card until #9, there are only two AMD GPUs in the top 20, and #20 is integrated Intel graphics. All of AMD takes up just 10.89% of the entire survey. AMD truly offers great value for consumers, and the Ampere lineup is showing great value right away compared to what the Turing architecture provided. It's one of those things where we have to wait and see the price range AMD sets before anyone can draw any real conclusions. I just don't think this Steam survey supports the point you're trying to make, though I do agree with your stance on AMD GPUs.
 

bigdragon

Distinguished
Oct 19, 2011
1,142
609
20,160
I hope AMD delivers the performance this time around. I had an AMD 7950 in my PC prior to getting a 1070. These specs are looking good, but I'm still leery of GPUs that break 250W.
 
Steam is used by many gamers and non-gamers; software such as Wallpaper Engine runs from Steam. The survey also includes the many people who play games like Roblox, i.e., casual gamers who don't need hefty GPUs. When we're comparing the latest from team red and team green, I don't think that is the demographic the GTX 1060 represents; the fact that a two-generation-old Nvidia GPU is the majority in this survey actually runs contrary to your suggestion of value. If it were about value, why isn't it an AMD GPU? Dig down the list and there isn't an AMD card until #9, there are only two AMD GPUs in the top 20, and #20 is integrated Intel graphics. All of AMD takes up just 10.89% of the entire survey. AMD truly offers great value for consumers, and the Ampere lineup is showing great value right away compared to what the Turing architecture provided. It's one of those things where we have to wait and see the price range AMD sets before anyone can draw any real conclusions. I just don't think this Steam survey supports the point you're trying to make, though I do agree with your stance on AMD GPUs.

The RX 580s were great mining cards. I saw them sell for as high as $500. When you undervolted them and ran them on Linux, their efficiency was hard to beat, except maybe by Vega. That's why you never saw them on Steam, and why there was a sudden glut of 8GB RX 580s for $130 after the mining crash.

The problem with GCN was that it had a lot of circuitry that regular gamers never used, which made the cards run hot. And to keep up with Pascal, AMD really had to push the 480 outside its efficiency envelope.


IMHO, AMD started dropping the ball after the 7970, which was a great card for its time. They were resource-strapped after that. Now they have the resources, and they are a viable alternative: they are hand-tuning architectures for maximum performance and borrowing technologies from the CPU division. Decent improvements, and I think they will compete nicely with the 3080 in non-RT games.

Some, I'm sure, will gripe that they didn't beat the 3090 and that this makes Nvidia still the market leader. They will then point to market cap. Market cap is purely based on what investors think you have, and as Nvidia had an undisputed lead for so long, investors flocked to them. But the P/E ratios are way off now that there are valid competitors in the market.

Nothing personal, but if you buy the 3090 card for anything but production work, then you have more money than brains.
 
I suspect the 6GB is a typo and the 6700 XT will have 8GB instead, which would fit the progression so far (16, 12, 8).

A probable 6600/6500 might be 6GB and 4GB, depending on how capable the Zen 3 APU graphics are.
That's what I was thinking as well. I think the only card with 6GB would be the base-model RX 6500, because the RX 5600 XT already has 6GB. Since the RX 5500 XT has only 4GB, it would make sense to add 2GB of VRAM to that card, if only to make it more attractive than the previous generation.
 
Unless the performance-per-CU significantly improved, it seems a bit unlikely that the successor to the 5700 XT would have the same core count as its predecessor, especially with Nvidia being more competitive with pricing this generation. The RTX 3070 will supposedly be somewhere around 50% faster than a 5700 XT for as little as $500, and I suspect there will be a 3060-class card for around $400 or less with perhaps around 25% more performance than a 5700 XT. More likely, a 40 CU card would be priced closer to the 5600 XT's price slot at under $300, and at that level, 6GB of VRAM might make a little more sense. That would leave the 60 CU card as more of a successor to the 5700 XT, with performance close to that of the 3070, while the 80 CU card might compete more with the 3080. Again, that's assuming performance-per-CU hasn't changed dramatically.
I don't see AMD pricing this generation of Radeons higher than the last. I fully expect the RX 6700 XT to be no more expensive than the RX 5700 XT. This is how it's supposed to be: new generations offer more performance for the same money. It's been like this since the dawn of video cards (the ATi EGA Wonder was $700 in 1987 and the VGA Wonder was $700 in 1988). If prices went up every generation, cards would be $20,000 each by now. I believe any predictions of prices going up will prove false.

The only outlier to this was nVidia, who had their top Ti card at $650 with the GTX 980 Ti, moved to $700 with the GTX 1080 Ti, and then $1,200 for the RTX 2080 Ti. They sneakily moved their x80 card (RTX 2080) to $800, and priced the RTX 3080 at $700. People say "oh, but the performance-per-dollar is better," which has been true for EVERY generation of GPU since 1988, so it's not like it's some new thing. Big gains in performance and performance-per-dollar are the reason new generations come out, not some special feature that only applies to this generation.

It's called "manipulating the narrative" and it works very well on people who are relatively new to buying video cards. For guys like me who started with the EGA Wonder back in 1988 (given to me by my dad for my first build when he got the VGA Wonder), this price manipulation by nVidia is VERY easy to see. I don't see AMD following suit here.
 

tamalero

Distinguished
Oct 25, 2006
1,226
242
19,670
This argument is pretty ironic, considering that Nvidia is now facing similar black-screen and crash issues.

Does it really matter? Their drivers will suck and your system will crash 10 times a day. I have a new system with a 3700X and a 5700 XT, and it crashes all the time with GPU driver issues. Stick a different GPU in and it runs fine. I am going for a 3080 for the first time in 10 years.

I have had a 6950 HD (good card), an R9 390X (good card, but hot), and now a Red Devil 5700 XT. It is fast, but I do a clean install with every new driver and it still crashes up to 10 times a day.
 

Spj76

Prominent
Dec 7, 2019
11
2
525
Does it really matter? Their drivers will suck and your system will crash 10 times a day. I have a new system with a 3700X and a 5700 XT, and it crashes all the time with GPU driver issues. Stick a different GPU in and it runs fine. I am going for a 3080 for the first time in 10 years.

I have had a 6950 HD (good card), an R9 390X (good card, but hot), and now a Red Devil 5700 XT. It is fast, but I do a clean install with every new driver and it still crashes up to 10 times a day.
If you have to do that every day, you are doing something wrong.
PSU issues?
Are you using a daisy-chained power cable? Try using two separate cables; mine would not work with a single daisy-chained cable.
Mine is 8 months old with zero issues.
 
I don't see AMD pricing this generation of Radeons higher than the last. I fully expect the RX 6700 XT to be no more expensive than the RX 5700 XT.
I didn't say the 6700XT would be priced higher, just that we might see more cores at a similar price level, while the card with the same number of cores as the 5700XT could potentially be moved down into the sub-$300 range. A lot depends on whether or not there have been changes that significantly affect the performance of those cores though. Another card could be positioned above those as well.

As for increases in pricing for similar product names, we did arguably see that with the 5000-series, though AMD went from "70" naming on prior generations to "700" naming there. I think they were mainly just aligning more with Nvidia's new model numbers though.
 

hannibal

Distinguished
I suspect the 6GB is a typo and the 6700 XT will have 8GB instead, which would fit the progression so far (16, 12, 8).

A probable 6600/6500 might be 6GB and 4GB, depending on how capable the Zen 3 APU graphics are.
192-bit memory does not allow 8GB; the options are 3, 6, or 12GB.
But all in all, that info was wrong in many ways, so we shouldn't worry about it. A 6500 or 6300 will use that 20 CU / 192-bit configuration, or something similar, if there were any bits of truth in this "leak".
RDNA2 is good, but not 100% faster than RDNA1!
In other words, 20 CUs with a 192-bit memory bus would be slower than the old 40 CU, 256-bit-bus 5700 XT.
And from a marketing point of view, it's hard to see them making the 6700 XT slower than the 5700 XT. People should read these leaked fake specs with a grain of salt.
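The bus-width math above comes down to simple multiples: GDDR6 hangs one chip off each 32-bit channel, so total VRAM is (channel count) × (chip density). A quick sketch, where the chip densities (4Gb/8Gb/16Gb, i.e. 0.5/1/2 GB per chip) are my assumption about available parts, not a quote from any spec:

```python
# Sketch of the GDDR6 capacity math: one chip per 32-bit channel,
# so total VRAM is a multiple of the channel count. Chip densities
# here are an assumption (4Gb parts are older and less common).
def possible_capacities(bus_width_bits, chip_gb=(0.5, 1, 2)):
    channels = bus_width_bits // 32  # one 32-bit channel per chip
    return sorted(channels * d for d in chip_gb)

print(possible_capacities(192))  # [3.0, 6.0, 12.0] -> no 8GB option
print(possible_capacities(256))  # [4.0, 8.0, 16.0] -> 8GB fits here
```

This is why a 192-bit card lands on 3/6/12GB while 8GB implies a 256-bit (or 128-bit, clamshell) configuration.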
 

hannibal

Distinguished
If you have to do that every day, you are doing something wrong.
PSU issues?
Are you using a daisy-chained power cable? Try using two separate cables; mine would not work with a single daisy-chained cable.
Mine is 8 months old with zero issues.

Same here: 5700 XT with zero issues. Rock solid, with a Platinum-rated PSU feeding it.
My X570 motherboard, on the other hand, was really buggy at release. It has also been rock solid since, but the first two months were quite bad.

All in all, rushed releases are bad, no matter whether it's Intel, AMD, or Nvidia! You should always take the time to really test the system, with all the bells and whistles; with a rushed release you destroy the image of a good product!
My X570 was bad at release but rock solid now: a rushed product.
My 5700 XT has been solid for me the whole time, but some users did have problems. Rushed release.
Nvidia's 2080 Ti had problems at release (rushed release), and now the 3080 has problems at release again. Don't manufacturers learn, or is using customers as lab rats so much easier and cheaper that they do this on purpose? Just wondering...
 

Trantor2020

Reputable
Sep 29, 2020
2
1
4,515
Does it really matter? Their drivers will suck and your system will crash 10 times a day. I have a new system with a 3700X and a 5700 XT, and it crashes all the time with GPU driver issues. Stick a different GPU in and it runs fine. I am going for a 3080 for the first time in 10 years.

I have had a 6950 HD (good card), an R9 390X (good card, but hot), and now a Red Devil 5700 XT. It is fast, but I do a clean install with every new driver and it still crashes up to 10 times a day.
Strange! I also had the 390X, upgraded last year to the 5700 XT (AMD reference design), and have not had one crash. Maybe once or twice some odd behaviour after a driver update, but a clean uninstall with DDU fixed it. Maybe too much overclocking?
 
I didn't say the 6700XT would be priced higher, just that we might see more cores at a similar price level, while the card with the same number of cores as the 5700XT could potentially be moved down into the sub-$300 range. A lot depends on whether or not there have been changes that significantly affect the performance of those cores though. Another card could be positioned above those as well.

As for increases in pricing for similar product names, we did arguably see that with the 5000-series, though AMD went from "70" naming on prior generations to "700" naming there. I think they were mainly just aligning more with Nvidia's new model numbers though.
Ah, OK, I must have misunderstood you; my bad. I agree that AMD was aligning more with nVidia's pricing, and they shouldn't do that, because that's basically admitting that it's nVidia's market, which is a very weak position to take. AMD needs to bend nVidia to their own pricing rather than bending to nVidia's.
 
I'm not so sure, given the bus size. Putting 8GB on a 192-bit bus is a little difficult. :D
You're right about that, but for all we know, every leak has been inaccurate in one way or another. I can't fathom how something like a 128MB cache can offset 128 bits of missing VRAM bus width (compared to a 384-bit bus). What exactly is 128MB supposed to store when the memory system's speed is measured in gigabits per second?
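For what it's worth, the usual hand-wavy answer is that the cache doesn't need to hold whole frames, only the hot working set (render targets, recently touched texture tiles); if enough accesses hit on-die, the average bandwidth the shaders see goes way up. A toy model, with every GB/s figure and hit rate being an illustrative assumption rather than a real Navi number:

```python
# Toy effective-bandwidth model: blend on-die cache bandwidth and
# DRAM bandwidth by hit rate. All numbers are illustrative guesses.
def effective_bw(hit_rate, cache_bw_gbs, dram_bw_gbs):
    return hit_rate * cache_bw_gbs + (1 - hit_rate) * dram_bw_gbs

dram_256bit = 256 / 8 * 16  # 256-bit bus at 16 Gbps/pin = 512 GB/s
dram_384bit = 384 / 8 * 16  # 768 GB/s, the plain "wide bus" baseline

# With a hypothetical 2000 GB/s cache and a 50% hit rate, the
# narrow bus plus cache beats the wide bus on average:
print(effective_bw(0.5, 2000, dram_256bit))  # 1256.0
print(dram_384bit)                           # 768.0
```

The whole bet rides on the hit rate: at 0% hits the narrow bus is just a narrow bus.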
 

spongiemaster

Admirable
Dec 12, 2019
2,346
1,325
7,560
Ah, OK, I must have misunderstood you; my bad. I agree that AMD was aligning more with nVidia's pricing, and they shouldn't do that, because that's basically admitting that it's nVidia's market, which is a very weak position to take. AMD needs to bend nVidia to their own pricing rather than bending to nVidia's.
AMD is doing the smart thing for once, piggybacking off Nvidia's pricing. Nvidia's pricing is what is allowing AMD to charge more. AMD has learned to undercut Nvidia just slightly, which provokes no response from Nvidia and so doesn't start a price war that neither side will win. If Nvidia charges $500 and AMD comes in at $300, prompting Nvidia to respond at $400, then you just lost half your price advantage. If the end result is a $100 difference either way, then AMD should have charged $400 to begin with and left Nvidia at $500, which is the strategy AMD is using now. AMD's job is to maximize their own revenue, not minimize Nvidia's.
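The arithmetic in that scenario can be spelled out (all prices hypothetical, straight from the example above):

```python
# Price-war sketch using the hypothetical prices from the post.
nvidia_start, amd_deep_undercut = 500, 300
nvidia_response = 400  # Nvidia cuts price to answer the deep undercut
gap_after_war = nvidia_response - amd_deep_undercut   # 100

# Undercutting only slightly yields the same final gap with no war,
# but at a higher AMD price point.
amd_slight_undercut = 400
gap_without_war = nvidia_start - amd_slight_undercut  # also 100

print(amd_slight_undercut - amd_deep_undercut)  # 100: extra revenue per card
```

Either way the buyer sees a $100 gap; the only difference is who pockets the other $100.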
 
AMD is doing the smart thing for once, piggybacking off Nvidia's pricing. Nvidia's pricing is what is allowing AMD to charge more. AMD has learned to undercut Nvidia just slightly, which provokes no response from Nvidia and so doesn't start a price war that neither side will win. If Nvidia charges $500 and AMD comes in at $300, prompting Nvidia to respond at $400, then you just lost half your price advantage. If the end result is a $100 difference either way, then AMD should have charged $400 to begin with and left Nvidia at $500, which is the strategy AMD is using now. AMD's job is to maximize their own revenue, not minimize Nvidia's.
I see what you're saying, and from a purely profit-based analysis I completely agree with you. The thing is, right now it's not so much short-term profit that AMD needs, because they've finally managed to pay off their debts, and EPYC is starting to make them a killing in the server space.

What AMD needs is to gain back marketshare and in effect, mindshare. The way I see it, they have to do two things in order for that to happen and they did one of these two things with Ryzen.

First, they severely undercut Intel's pricing, which made people willing to give them a try. They had to do this because people don't like to switch away from something that works, and Intel, for all its faults as a corporation, made absolutely fantastic, stable products that people loved to use, and who can blame them? They had great performance and great stability, and people were always willing to pay extra for those things. AMD has to do the same to nVidia, because nVidia also makes great products that perform extremely well.

The second thing isn't really applicable to Ryzen but is VERY applicable to Radeon: fix the software side of the company, because Radeon drivers have historically been gawd-awful. This is doubly bad when nVidia drivers have historically been rock solid. More than anything, I believe this has been the secret to nVidia's success. Like Jensen says, "It just works," and that's what people want, so the driver developers at ATi have got to get their act together. If they don't, they'll never be thought of as highly as nVidia, no matter how well their cards perform.

Maximizing profit per item when you have a tiny customer base doesn't accomplish much. However, if ATi makes a fast and stable architecture, all AMD has to do is accept less profit per GPU and increase volume. This is the only way they can recover from the aftermath of the mining craze. Polaris was a perfect storm: RX 4xx and 5xx cards were cheap and had a fantastic hash rate per watt for Ethereum, which made Polaris expensive and hard to find for gamers. It's probably a major reason the GTX 1060 is the most popular card on Steam by a country mile; at the price point most gamers shop at, the GTX 1060 was basically the only game in town.
 

ajr1775

Distinguished
Jun 1, 2014
54
18
18,535
Steam is used by many gamers and non gamers e.g. software such as Wallpaper Engine are run from Steam. That survey includes the many people who play games such as Roblox essentially inclusive of casual gamers not requiring hefty GPUs. When we are comparing the latest from team red and team green I don't think this is the demographic that the GTX 1060 is representing, the fact that a 2 generation old Nvidia GPU as the majority in this survey is actually contrary to your suggestion of value. For instance why isn't it an AMD GPU, if you dig down the list as there isn't an AMD card represented until #9 in the list, only 2 AMD GPUs in the top 20 and #20 is an integrated graphics GPU from Intel. All of AMD only takes up 10.89% of the entire survey. There is truly great value that AMD offers for consumers, though the Ampere lineup in comparison to what the Turing architecture provided is showing great value right away. It's one of those things where we have to wait and see the price range AMD sets before anyone can draw any real conclusions. I just don't think this Steam survey represents the point you are trying to make. I do agree with your stance on AMD GPUs though.

The point was that a lot of folks get by with less.
Unless you have money to throw away, or it's for business, there is no point. To each their own, but a 100%+ cost increase for a 15% performance boost in gaming? You could buy two cards for what that costs.

What's funny is that if you could SLI two RTX 3080s, you would get more than a 15% boost from the second card, but they took SLI out.