News AMD Can't Beat Ada, So Brags About Old Ampere Comparisons

Page 2 - Tom's Hardware community discussion.
Hey, that's just a downside of mass market hardware. We like high-volume manufacturing when it works in our favor, this is just one of its downsides.
That is the downside of having a duopoly in a market with high entry price. Makes it nearly impossible for anybody new to become a credible threat, which enables them to ignore whatever market segment they feel like since nobody will snatch it from them.

I'd hope Intel will go for serious market share with the B750 at sub-$300 for twice the A750's performance, though my expectation is that Intel will be its usual greedy self, designing and pricing itself out of the new challenger / early adopter price range.
 
That is the downside of having a duopoly in a market with high entry price.
I don't really know how you think competition fixes a situation like this any more than it has. GPU prices have come down, and competition isn't forcing new products to launch before old ones sell out.

The same thing is happening in the NAND and DRAM markets. You have new products getting put on hold while manufacturers try to control costs and cut down on existing inventory.

In general, having more players in the GPU market would put a squeeze on margins, but I don't see them forcing new product launches before existing inventory is gone. The exception would be a product so weak it's completely nonviable at any price, but a company making such bad GPUs would be unable to exert much competitive influence on the others anyway.
 
Ahhhh…that shiny new 4090 in the case was me paying for the privilege of not having to worry about GPUs for the next few years.

Overpriced? I dunno. The best should come at a premium… As for AMD cards… never owned one. Ever.

Nvidia has always been the best. If AMD made a better card I’d buy it.
If you buy a GPU to keep for several years, going for the top of the range is usually a bad move: as soon as future games rely on a feature your very pricey GPU doesn't have, it'll suffer tremendously against even mid-range cards. See: programmable shaders, tessellation, async compute, this kind of stuff. That's before even mentioning the cost of keeping your system cool and fed with juicy electrons. I've been building gaming PCs for 30 years now, both professionally and for family, so I do know about trouble-free systems and durable bang for buck.

If you've never owned an AMD card, then you don't know how good or bad they actually are. And since you seem to want to keep a card for years, that's surprising, because AMD cards, ever since they were called "ATI", have been all about a forward-thinking feature set.

Cases in point:
  • The R300 generation (Radeon 9500/9700) was the first DirectX 9 card ever - it was DX9 compliant before DX9 was even out, leaving Nvidia's GeForce FX "dustbuster" chewing on dust. The GeForce 6 series barely caught up feature-wise and only beat ATI... on price. Owning a Radeon 9700 Pro meant at least 3 years at the top of the world; Nvidia only managed to really beat it with the legendary GeForce 8800 series.
  • Second case was shortly after AMD bought ATI: the Radeon HD 4850 wasn't particularly full of features, but there's a reason it stayed on Tom's recommended GPU list for almost 3 years - its performance/price ratio was unbeatable for the duration. You could throw anything you wanted at it, and it would chew it up and spit it out.
  • Third and fourth cases were GCN 2 (Radeon R9 270) and GCN 4 "Polaris", especially the RX 480 8GB: 1440p gaming was possible from its launch in 2016 up until 2021. And now it has kicked out the GeForce GTX 1060 6GB as the minimum card to have - with better DX12/Vulkan performance, async compute support, variable refresh rate support, and much larger VRAM, it's still serviceable today, almost 7 years after it came out.
Yes, either I or someone in my family owned one of these cards. And I had to support them.
 
If you buy a GPU to keep for several years, going for the top of the range is usually a bad move ...

Hey that's cool... but given the current market the term "bad move" doesn't really apply to the 4090. It does a ridiculously better job at everything the 3090 did and it's not even a fair comparison.

https://www.3dmark.com/compare/pr/750873/pr/2328716

The numbers don't lie... the 4090 is such a huge generational leap over the 3090... the likes of which hasn't been seen for as far back as I can remember... and that's why I bought it.

When I said I expect to be good to go for a few years that's because I game in 4K Ultra and the 4090 will do 60 fps without much trouble so I just don't see myself needing an upgrade anytime soon... at least until UE5 is in widespread use.

AMD doesn't have anything that is remotely close to this performance.
 
When I said I expect to be good to go for a few years that's because I game in 4K Ultra and the 4090 will do 60 fps without much trouble so I just don't see myself needing an upgrade anytime soon... at least until UE5 is in widespread use.
That's the problem right there: you will NOT be able to stay at 4K Ultra for years - games get ever more hungry for performance as time goes by. Also, on gaming at 4K vs gaming at 1440p: having tried both, I did feel the resolution jump going from 1080p to 1440p on a 27" display, but I didn't feel it going to 4K on a 32" display. For desktop use, maybe - but not in games, not for now.
 
This is not how to sell your cards. AMD needs to have an identity for their GPUs to win mindshare
You mean an identity like her?
[image attachment: 8761823_ra.jpg]
 
I hope AMD cards sell well; then we could get cheaper Nvidia cards.

My GTX 1060 can barely maintain 60 fps at mid/low settings nowadays.

I don't want to grab either the RX 6700 XT or the RTX 3060 Ti at current prices.
I wouldn't hold my breath. AMD's campaign smacks of desperation, and Nvidia's previous-gen GPUs are holding their price - they sell close to the original (by now symbolic) MSRP, despite the fact that Nvidia isn't refraining from releasing new GPUs.
 
Both AMD and nVidia still have plenty of old cards available, so I don't know why some people think this is one company's problem and not the other's. BOTH had (have?) a problem with old stock still on shelves, and that explains why nVidia and AMD have been very slow on the rollout of the new gen at the "mid" (ugh; the new midrange sucks), with not even news for the "low" (again: ugh...).

As for the comparison itself: what's wrong with it? I mean, marketing be doing marketing things and keeping the conversation on their brand going. Are they lying in the comparison? Would it sway any nVidia fanboi from getting nVidia? No. Would it sway an AMD fanboi from getting an AMD card? No. It just gets their name on people's minds and, while I hate it, I can't argue it isn't true: as long as it gets your name/brand on people's minds, it's good publicity.

Regards.
 
100% agree. The best value is buying from the cutting-edge gamer who needs the best now and sells what was the best a year or less ago, used, at a discount. One of my best GPUs was a 1080 Ti (Asus Strix, I think) I got for nearly 40% less than new when people were jumping on board the RTX 2000 series at launch. I ran that for a little over 3 years. I think its larger VRAM helped it run so long.
 
Both AMD and nVidia still have plenty of old cards available, so I don't know why some people think this is one company's problem and not the other's. ...
Anything from a 3080 on up has been gone for months on the Nvidia side. While available at other retailers, even 3070 Tis are all sold out at Best Buy. You can still currently buy any RDNA2 card, all the way up to the top 6950 XT.
 
The same thing is happening, in the NAND and DRAM markets. You have new products getting put on hold, while manufacturers try to control costs and cut down on existing inventory.
In the DRAM and NAND space, there is still enough competition left that manufacturers actually do have to cut prices sometimes to the point of making losses just to shift enough inventory to keep fabs going and avoid the full stop-restart costs. We wouldn't have ~10% price cuts every quarter to help move inventory if there were only two viable manufacturers left.

When you have actual competition, you cannot afford to hold inventory to jack up prices and let your competitors steal your customers - which is what happens with GPUs.
 
Anything from a 3080 on up has been gone for months on the NVidia side. While available at other retailers, even 3070Ti's are all sold at Best Buy. You can still currently buy any RDNA2 card all the way up to the top 6950 XT.
These are global companies, so it's not just about the USA. But you're actually justifying AMD's marketing here, as the 6800 siblings are battling what is effectively the 3070 family and under. Much like you, I can't find any 6800 XTs anywhere, and 6800s are even harder to find over here in Europe - the UK, more specifically. I do wonder how many 6950 XTs AMD has left, though; to your point, they still seem to be trickling down retail channels everywhere, so I'm not sure there. I can see plenty, and I mean PLENTY, of 3060s and 3070s still on websites all around.

But in any case, I doubt either of them still has warehouses with loads of pallets of GPUs. I'm sure they're now trying to liquidate the last remaining ones (hence my uncertainty about whether the glut is fully behind us). This is backed up by the fact that they'll launch their respective low/mid cards in May.

Regards
 
can't find any 6800XTs anywhere and 6800's are even harder to find over here in Europe and UK
Amazon UK has a number of different RX 6800 XTs in stock right now.

Further to what others are saying, if you don't play the latest AAA games and/or don't play any demanding games at 4K and higher, there is absolutely nothing wrong with buying a last generation GPU at a good price.
 
Amazon UK has a number of different RX 6800 XTs in stock right now. ...
Thanks for the correction. I only see two models on Amazon, though.
After doing a new, wider search, I did find more models, so my information may have been outdated. I could even find some RX 6800s, but most are a tad overpriced. Only one model, at OcUK, looks decently priced.

Regards.
 
Sasa Marinkovic needs to shut his mouth. He has already managed to alienate god-only-knows how many people with his antics, and AMD's recent product launches have been absolute trash. Zen 4 had motherboards that made its CPUs next to impossible to recommend, and cynically renaming the RX 7800 XT as the RX 7900 XT has had a negative effect on the entire Radeon RX 7000 product stack. There's nothing more annoying than a person acting arrogantly when they have nothing to be proud of.

Sure, the RX 6800 is a great buy on the market today, but it's also selling far below its MSRP, so what the hell is he taking credit for? Every time he opens his mouth and says something publicly, he sticks his foot in it. I honestly don't know how he got where he is, because all he does is annoy people.

I say this as someone who has had all-AMD builds since 2008.
 
Further to what others are saying, if you don't play the latest AAA games and/or don't play any demanding games at 4K and higher, there is absolutely nothing wrong with buying a last generation GPU at a good price.
Or don't care about RT Ultra.

I'm still using a GTX 1050 because there is basically nothing worth upgrading to under $300, including taxes, that doesn't look like a complete rip-off.
 
Not many people care about RT performance, especially in mid-range cards. The same happened with the introduction of anti-aliasing: at the time, cards couldn't handle it, and it took years for it to become the norm.
 
"Always" is a very dangerous word to throw around... it oft makes a point quite inaccurate.

Ok... maybe "always" was the wrong word... but I'll just say that the GeForce 3 in 2001 was my first Nvidia card (Chameleon demo, anyone?) and every time I've gone to upgrade, I've never seen anything better from ATI/AMD than Nvidia's current flagship.
 
Ok... maybe "Always" was the wrong adjective... but I'll just say that the GeForce 3 in 2001 was my first Nvidia card (Chameleon demo anyone?) and every time I've went to upgrade I've never seen anything better from ATI/AMD than Nvidia's current flagship.

I guess that depends on when you upgrade. I went from a GeForce 3 to a Radeon 9800 Pro that embarrassed the GeForce FX that was "competing" against it at the time. The Radeon HD 4000/5000 series come to mind as well.