News Lisa Su says Radeon RX 9070-series GPU sales are 10X higher than its predecessors — for the first week of availability

Still not sure there will be a B770. There is talk of a 2025 launch for Celestial, which may mean skipping the rest of Battlemage and pushing a full lineup of C cards on Intel 18A.

Since we have seen no hints of Battlemage mobile or anything smaller in the lineup, it does raise a lot of questions. (No Battlemage Pro cards either.)

Battlemage cores look to be slightly larger, even after the shrink from N6 to N5. (Unless the B580 is not a fully enabled chip)

A 30/32 Xe core Battlemage would still land around the promised RTX 4070-like performance, which, if sold cheaply enough, could still have a decent market impact. With $500-550 being the floor for that class of card, coming in at $500 would be quite good today. Priced the way the B580 was, it would be more like $400, and could earn them more goodwill.

I agree a 56-64 core Battlemage is not an economic winner.

Hard data on 18A (with or without PowerVia) vs. TSMC N5 or smaller would make the fortune-telling a little easier when it comes to Celestial. Tom's article points out rumors that Intel may have a clock speed advantage over TSMC N2, while TSMC retains higher density.

But if today's news about Nvidia looking at 18A for gaming GPUs is any indication, Intel is somewhat competitive in that regard.

Edit: Raw math (a crude measure, admittedly) shows about a 70% density increase from N5 to 18A (or even a little better with Samsung or TSMC), which would put a 400mm² die like the one used for the A770 at 54 or so Xe cores. So your 56 Xe core idea isn't too far-fetched with a node shrink.
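For what it's worth, that back-of-envelope scaling fits in a few lines. This is only a sketch of the "raw math" above: the die size, core count, and 1.7x density factor are assumptions from the discussion, not measured data.

```python
# Rough die-scaling estimate: how many Xe cores would fit in an
# A770-sized die after a ~1.7x logic-density uplift (the rumored
# N5 -> 18A figure)? All inputs are assumptions from the thread.

a770_die_mm2 = 406     # ACM-G10 (A770) die, roughly 400 mm^2
a770_xe_cores = 32     # fully enabled Alchemist part
density_uplift = 1.7   # assumed ~70% density gain

# Assuming Xe core area shrinks in proportion to logic density:
scaled_cores = int(a770_xe_cores * density_uplift)
print(scaled_cores)  # -> 54
```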
 
Anyway, if AMD could effectively match the 5080 and 5090 on performance, I think right now there would be plenty of uptake if the price were reasonable.

And therein lies the problem. Pricing will not be reasonable, not right now.

Exactly.

And if somehow AMD managed to create a better GPU than the likes of 4090 or 5080/5090, we can be sure they'd price it just as high as Nvidia does.

Otherwise, there wouldn't be any decent profit margin.

Unfortunately, when it comes to cutting-edge PC gaming... you have to pay the premium. A cutting-edge GPU should be expensive.
 
Battlemage, while a very good improvement over Alchemist, appears to be basically a lost generation. Intel was undoubtedly cautious about committing to large wafer buys given how many A-series cards sat on shelves despite being decent deals. It was also never going to beat AMD/Nvidia in terms of cost to produce vs. performance. The architecture was only used in LNL as an iGPU, and there's no sign it will be used in anything else. There also haven't been leaks regarding other Battlemage GPU dies since before the B580 launched. Typically, if another card launch were coming, we'd have seen something by now.

Then there's, of course, the elephant in the room: Celestial. It debuts later this year in PTL, which should mark a bit over a year between new architectures appearing in actual products. If Intel is really looking for an opportunity to break in, this should be a good time. We already know AMD/Nvidia aren't dramatically shifting performance, and by the time Intel brought dGPUs to market it would be the middle of the current generation.

If Celestial delivers a similar perf/area improvement to Battlemage's, that should improve profitability quite a bit. And if Intel can make a Celestial dGPU on, say, Intel 3, it should have much more volume flexibility than it does using TSMC. This is of course the best-case scenario, but I think seeing Celestial dGPUs next year is still more likely than seeing mid/high-end Battlemage this year.
 
My take is: considering how much hatred there is toward AMD GPU products, and how many "3 years later" reviews of those same products end with "actually, these were great!", you should look at reviews that report cost per frame and buy the best ratio in the best tier you can afford.
Surprisingly, they end up being AMD quite often.
I am not anti-AMD.
I'm actually planning to go team red this gen, as I dislike Nvidia's pricing for mediocre performance on anything but their 80/90 tier.

My point was that AMD not selling their own MSRP version, when it's widely known that even AIBs now scalp consumers, meant there was zero shot at an MSRP model this generation. It was never about hating AMD or the performance.
 
Well, I just got a 9070 XT for £650, but it should have been ~£580 instead. It's a Sapphire Pulse, just like the 7900 GRE it's replacing. I also took the opportunity to get a 9950X3D to replace my 8500G; now I'll have all the NVMe drives working and a proper x16 slot working as intended, LOL 😀

I still think the price is too high for what AMD is offering, but at least when you compare it to Nvidia's pricing, for people who just want to play games, it's less terrible. Slightly less.

Also, I've been playing CP2077 with just reflections on, using the 8500G and the 7900 GRE, with everything else on "high" (except shadows; those are medium) at native 1440p, and it feels like a solid experience. And this is with the 8500G, which is a bottom-of-the-barrel AM5 APU. I think it would be an even better experience if the obscenely bad TAA implementation could be turned off.

Point is: I was running visual comparisons, and now I can say from proper first-hand experience that path tracing does not justify the performance drop. Just turning on RT reflections makes the game look amazing. Personal preference, but I thought I'd toss that out there.

Regards.
 
Well, I just got a 9070 XT for £650, but it should have been ~£580 instead. It's a Sapphire Pulse, just like the 7900 GRE it's replacing. I also took the opportunity to get a 9950X3D to replace my 8500G; now I'll have all the NVMe drives working and a proper x16 slot working as intended, LOL 😀

Hey, congrats on the refreshed rig! 😎

Sounds pretty impressive!

With the 9950X3D, you're getting the best of both worlds: gaming AND productivity!

May i suggest you also try Black Myth: Wukong?

On this latest generation build of yours, it should be an amazing experience!
 
You apparently don't know about a lot of AMD GPUs. During the past decade or so, AMD has had "Made By AMD" (MBA) reference cards for:

R9 290/290X
R9 Fury X
R9 Nano
RX 480
RX Vega 56
RX Vega 64
Radeon VII
RX 5700
RX 5700 XT
RX 6700 XT
RX 6800
RX 6800 XT
RX 6900 XT
RX 7600
RX 7700 XT
RX 7800 XT
RX 7900 XT
RX 7900 XTX

So yeah, it has also skipped plenty of GPUs, but the halo parts have had a reference design for every generation going back to at least the 290X. (390/390X didn't get a reference design, but those were effectively just a refresh of the existing Hawaii architecture — same goes for RX 6950 XT.)
Oh, sorry - I only got a Mach64, a Rage 3D, a Radeon 9500 Pro, a Radeon 9600, a Radeon HD 4850 (reference design by Sapphire - got blasted for running hot, as it was one of the few single-slot GPUs out there), a Radeon 4870, a Radeon HD 7700 1GB, a Radeon R7 270X, a Radeon RX 480 8GB (reference design by Sapphire, got blasted because it used a blower design), a Radeon Vega 56, a Radeon RX 6600 XT... So no, I hardly ever owned a Radeon card in my 30 years of building PCs. /sarcasm
Reference designs were made according to AMD's specifications so as to sell at MSRP, but AMD didn't MAKE them.
And they were never called Founders Edition, so my comment stands.
 
3 years later they get a good review... could that be down to it taking them 3 years to deliver a driver that does the card justice?
No, only that it takes 3 years for the hate to wind down and someone to test the card and say "oh - actually, they're quite nice" without getting blasted off the Internet by haters.
I've owned many ATI/AMD cards. And pretty much every time, they were good from the get-go and sometimes got better with driver updates, but the features they sported only got adopted late, so by the time those features could be fully used, the cards had left the spotlight.
Stuff like :
  • 32-bit colour 3D acceleration (yeah, that was a looong time ago)
  • programmable shaders (r300 : DX9 support before DX9 was out)
  • DX10 for cheap (HD4850 : reference on Tom's for 2 years 7 months)
  • accessible FHD gaming (7700 GHz edition)
  • Mantle then Vulkan
  • async compute
  • ReBAR
As for driver troubles... I actually had more of those with Nvidia cards: juggling Detonator releases with my TNT, same with my GeForce 4200 8X and my 6600, again with a 9600, some more with the 730 and 750, again with an RTX 3060... Frankly, I never understood the AMD driver bashing: yes, they usually had teething pains for up to 6 months after release, but considering how long an architecture could stay current, that's nothing. On the other hand, Nvidia cards usually start to bug out after a couple of years because Nvidia introduces regressions that are hardly ever fixed, meaning you have to juggle drivers.
 
Oh, sorry - I only got a Mach64, a Rage 3D, a Radeon 9500 Pro, a Radeon 9600, a Radeon HD 4850 (reference design by Sapphire - got blasted for running hot, as it was one of the few single-slot GPUs out there), a Radeon 4870, a Radeon HD 7700 1GB, a Radeon R7 270X, a Radeon RX 480 8GB (reference design by Sapphire, got blasted because it used a blower design), a Radeon Vega 56, a Radeon RX 6600 XT... So no, I hardly ever owned a Radeon card in my 30 years of building PCs. /sarcasm
Reference designs were made according to AMD's specifications so as to sell at MSRP, but AMD didn't MAKE them.
And they were never called Founders Edition, so my comment stands.
Your comment was being intentionally obtuse. When someone says "the moment they said they weren't doing a founders was [the] day I knew MSRP for [AMD GPUs] wouldn't happen" and you responded with "AMD not making a 'founders edition,'" you even put it in quotes. That suggests you already knew it wasn't actually called a "Founders Edition" and were talking about reference cards.

Then you go on with "they were merely base-spec models made by Sapphire," which is factually wrong. I gave you a full list of reference MBA models that AMD has released. (That's a link discussing it as well, on Reddit.) Were they actually "Made By AMD" or by some company AMD contracted? It doesn't matter, because it was AMD's design — the coolers on the Gigabyte, MSI, and yes, Sapphire models that use the MBA reference design are in fact identical.

You're right. AMD doesn't make Founders Edition cards. No one other than you suggested it did. You're also wrong that AMD doesn't do reference designs, because it has done more than a dozen such designs in the past decade. It doesn't matter if you owned a bunch of AMD GPUs, it's obvious that you didn't know about "a bunch of AMD GPUs" that were in fact reference designs, and you again perpetuated the incorrect claim that Sapphire makes the reference cards.

Who actually makes the reference MBA designs? Not Sapphire — it's PC Partner Group.
 
This, tbh... the moment they said they weren't doing a founders was the day I knew MSRP for theirs wouldn't happen.

If the maker no longer has a founders-style card, then board partners can YOLO on what they list it at, because what else are you going to buy? Intel???
You're right. AMD doesn't make Founders Edition cards. No one other than you suggested it did.
See above. You even upvoted his comment.
So, maybe I confused PC Partner-made cards for cards sold by Sapphire according to AMD's reference designs (of which I owned a couple, as mentioned in my previous comment), but before you started hassling me, all I did was correct someone calling an AMD reference design a Founders Edition, which is a trademarked term for something that's as old as PC graphics: reference designs, such as CGA card clones made by someone other than IBM.
 
Once again, my "reference" AMD RX 7900 XT runs cool, quiet, and fast. And it even still has that nice little USB-C/VirtualLink port that I can plug my PSVR2 into. 😊