News AMD Radeon RX 9070 series prices leak on Micro Center — starting at nearly $700 for XT versions

Intel GPUs? Intel's GPU division has been bleeding red since it launched. Maybe it will do better with Battlemage, but Intel was so far behind on GPUs it basically didn't have a choice in how to attack the market. Even though Battlemage is selling out (for B580, not B570), I don't know that Intel has actually shipped that many GPUs.

AMD has to price the cards appropriately. Losing money on GPU sales isn't going to work. What AMD really needs to do is price the cards as high as it can get away with — that's a business decision. If chopping $50 off the price makes them more attractive to buyers and they sell out... then AMD wonders if they would have sold out anyway. And the answer right now is yes.
I think this is unfair. No one should care about the company (except perhaps for wanting success for U.S. markets, or whichever your home nation is). We should only care about getting our money's worth.
 
As stated by a certain Steve, Intel literally needs to wrap its GPUs in packages made out of dollar bills to sell them. It's a whole different game there, with Intel just trying to enter the market at all.

AMD, on the other hand, has been running this gig for 20+ years: they have a loyal customer base, a well-hated green big bad, AND they are going to release these GPUs right at the point where there is a massive shortage of alternatives.

There is no reason whatsoever for them to price it lower than $700, especially since they're not really in a position to do so given the losses.

And as I said, you won't even be able to buy it at $700 for more than 30 minutes. Let's go crazy: they could toss it out at $500 and it would correct itself to $900 anyway, with the extra margin going to AIBs and retail chains.
There is a rumor that Dell wanted to sell AMD Epyc servers.

AMD basically said... you have to sell our GPUs and desktop CPUs as well; we already have HP and other vendors happily selling Epyc.

Dell will be selling desktop AMD for the first time....


I think with a lot of the vendors, AMD may have the power to control prices... sell X volume at Y price, or we withhold CPU sales...

Won't affect PowerColor or Sapphire, but those are already seen as premium brands in the Radeon market.
 
That remains to be seen.

I personally do not share that optimism. I bet that for a good few months you will be seeing these sell for $900, give or take $50.

Unless AMD has some insanely massive stock that will somehow end the worldwide GPU shortage, prices will creep up above MSRP, especially given it's a pure board-partner launch.
... I see you didn't bother to read my entire post...

I said IF. A whoooole lot of IF.

I hope AMD does the impossible, actually takes this opportunity by the horns, and shocks us all.

But... historically, they never miss an opportunity to miss an opportunity.
 
If AMD's prices are the ACTUAL prices rather than imaginary ones... this would make it about $300 to $400 less than the 5070 Ti.

Some leaked reviews show the XT should beat the Ti in just about everything, including ray tracing (take with a whole ocean of salt until multiple reviews go live).

IF both of those are true, only Nvidia fanboys will still buy the RTX over the Radeon...

IF
whoooole lot of IF there.
Spoiler: It's not going to beat the 5070 Ti in nearly all tests. No way. I think it might win some, lose some, tie in others. And if AMD matches Nvidia on performance, it has to cost less. That's just a simple fact. Nvidia is the top GPU maker, and many people would rather have an Nvidia card — all other things being roughly equal. But all things are not equal, because Nvidia has a much larger software ecosystem.

CUDA basically works everywhere, with caveats. You can run the same code on a data center GPU, an RTX 50 GPU, or even a GTX 10-series part. ROCm by comparison only has limited support. I don't think anything below the 7700 supports it, and you need a 6800 or higher on the RDNA2 series. 6700 and below, and everything prior to RDNA2, is completely unsupported.
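
To make that ecosystem gap concrete, here's a minimal sketch (my illustration, not anything from AMD or Nvidia docs). PyTorch's ROCm builds reuse the torch.cuda API, so the same script runs on either vendor, but only if the Radeon card is on ROCm's short support list, whereas nearly any Nvidia card from the GTX 10 series onward just works:

```python
import torch

def pick_device() -> torch.device:
    """Return a GPU device if the installed backend supports this card."""
    if torch.cuda.is_available():
        # torch.version.hip is a string on ROCm builds, None on CUDA builds
        backend = "ROCm" if torch.version.hip else "CUDA"
        print(f"{backend} device: {torch.cuda.get_device_name(0)}")
        return torch.device("cuda")
    print("No supported GPU found; falling back to CPU")
    return torch.device("cpu")

x = torch.randn(1024, 1024, device=pick_device())
y = x @ x  # identical code path regardless of vendor
```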

Then you factor in DLSS, Broadcast, superior video encoding quality, lower power consumption / higher efficiency.... Basically, AMD can't sell GPUs at the same price per performance ratio as Nvidia. Neither can Intel.

RTX 5070 Ti has 16GB GDDR7 memory.
RX 9070 XT has 16GB GDDR6 memory.

That right there tells you that AMD isn't planning to beat Nvidia in performance, by all indications. Nvidia has 40% more memory bandwidth. Could AMD come close? Sure. Will it win in some rasterization games? Undoubtedly. Will it offer better RT performance than the prior generation? It better! Will the RT performance match Nvidia's equivalent? I'd be shocked if it does.
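
If you want to sanity-check that 40% figure, the napkin math is simple: bandwidth = bus width × per-pin data rate / 8. Both cards use a 256-bit bus; the per-pin rates below (28Gbps GDDR7, 20Gbps GDDR6) are the commonly reported specs, assumed here for illustration:

```python
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s from bus width and per-pin rate."""
    return bus_bits * gbps_per_pin / 8  # bits -> bytes

rtx_5070_ti = bandwidth_gb_s(256, 28)  # 896 GB/s (GDDR7)
rx_9070_xt = bandwidth_gb_s(256, 20)   # 640 GB/s (GDDR6)
print(f"{rtx_5070_ti:.0f} vs {rx_9070_xt:.0f} GB/s: "
      f"{rtx_5070_ti / rx_9070_xt - 1:.0%} advantage for Nvidia")  # ~40%
```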
 
... I see you didn't bother to read my entire post...

I said IF. A whoooole lot of IF.

I hope AMD does the impossible, actually takes this opportunity by the horns, and shocks us all.

But... historically, they never miss an opportunity to miss an opportunity.
There is absolutely zero financial incentive for them to do this. Every dollar they reduce the price below what scalpers will get is lost money. If the market says the cards are worth $900+, then trying to sell them for $600 is just a terrible business decision that will have no positive aspects to counterbalance the lost revenue.
 
AMD, on the other hand, has been running this gig for 20+ years: they have a loyal customer base, a well-hated green big bad, AND they are going to release these GPUs right at the point where there is a massive shortage of alternatives.
Well, they don't have that loyal customer base; they lost market share from 44.5% in 2014 to barely 10% now. The past 2-3 generations of basically Nvidia-minus-$50 pricing have already cost them something like 10% of the market...

Of course, it's their business. People like myself had quite some hope they'd roll out something really attractive this gen. If it's $700, I'd personally wonder why they didn't just launch it at CES; they skipped a major event and still ended up going Nvidia minus $50.
 
I think this is unfair. No one should care about the company (except perhaps for wanting success for U.S. markets, or whichever your home nation is). We should only care about getting our money's worth.
Unfair in what sense? I'm talking about Business 101 here, not wishes and hopes and dreams. As a company, you want to sell your product at the price that gives you the best return.

Let's say a product costs $100 to make. If you sell it for $500, you make $400 per unit in profit, but maybe you only sell 50,000 units ($20 million total). If you sell it for $300, you make $200 per unit, but you might sell 500,000 units ($100 million total). Obviously the latter is the better choice. That's just an example to illustrate the point, not tied to GPUs or anything.

These are not discrete steps, and it's why market analysis departments exist. AMD wants to plot price vs cost vs expected sales, then choose the spot on the curves that delivers the best overall returns. Dropping the price a bit for some additional market share and customer goodwill is a viable tactic, but we're talking maybe 5~10 percent, not 50% or whatever.
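
For the curious, here's a toy sketch of that curve-picking exercise. Every number is invented purely for illustration (a made-up linear demand curve, not market data), but it shows why the sweet spot sits between max-margin and max-volume pricing:

```python
UNIT_COST = 100  # hypothetical cost to build one unit

def expected_units(price: float) -> float:
    """Made-up demand curve: higher price, fewer buyers."""
    return max(0.0, 600_000 - 1_100 * price)

def profit(price: float) -> float:
    return (price - UNIT_COST) * expected_units(price)

best = max(range(150, 550, 10), key=profit)  # scan candidate prices
print(f"${best}: {expected_units(best):,.0f} units, "
      f"${profit(best) / 1e6:.0f}M profit")  # peaks around $320 here
```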

So, let's say that everything that goes into RX 9070 XT ends up costing $300. That's at least a moderately reasonable estimate. AMD, the AIBs, the distributors, and the retailers all need to make money, and the street price generally ends up being double the BOM (bill of materials) in most cases. So, $600 minimum then.

If AMD pushes that up to $650, everyone makes a bit more money, and in a supply constrained situation where you sell out, that's the best move. If it pushes the price to $700 and everything still keeps selling out, then that's also the right move. And AIBs will then push things higher, maybe $750 to $800 for "nicer" cards with RGB lighting and such, and up to $900 for "extreme" models. But the BOM isn't more than maybe $20 higher on the upscale models, so this is really just an excuse to charge more because the companies expect the cards to sell out.

My best guess, right now (before AMD announces actual pricing during tomorrow's event):
$649 for the 9070 XT, $549 for the 9070. And AMD expects the 9070 to beat the 5070 on performance (remember, it also has 16GB according to rumors), while the 9070 XT will probably come up slightly behind the 5070 Ti. We might get the 9070 at $499, if AMD wants to be aggressive / generous. But the early Micro Center prices don't indicate that's likely.
 
Well, they don't have that loyal customer base; they lost market share from 44.5% in 2014 to barely 10% now. The past 2-3 generations of basically Nvidia-minus-$50 pricing have already cost them something like 10% of the market...
Yes, but do you know why that happened?

Nvidia had a better product. There is no changing that, at least until UDNA, which is a year+ off.

And there is only so much they can undercut anyway; their whole 2024 is in the red by several hundred million bucks. Do you expect them to make that even worse?

The point is - these GPUs will sell out anyway. So, what is the point?
 
Spoiler: It's not going to beat the 5070 Ti in nearly all tests. No way. I think it might win some, lose some, tie in others. And if AMD matches Nvidia on performance, it has to cost less. That's just a simple fact. Nvidia is the top GPU maker, and many people would rather have an Nvidia card — all other things being roughly equal. But all things are not equal, because Nvidia has a much larger software ecosystem.

CUDA basically works everywhere, with caveats. You can run the same code on a data center GPU, an RTX 50 GPU, or even a GTX 10-series part. ROCm by comparison only has limited support. I don't think anything below the 7700 supports it, and you need a 6800 or higher on the RDNA2 series. 6700 and below, and everything prior to RDNA2, is completely unsupported.

Then you factor in DLSS, Broadcast, superior video encoding quality, lower power consumption / higher efficiency.... Basically, AMD can't sell GPUs at the same price per performance ratio as Nvidia. Neither can Intel.

RTX 5070 Ti has 16GB GDDR7 memory.
RX 9070 XT has 16GB GDDR6 memory.

That right there tells you that AMD isn't planning to beat Nvidia in performance, by all indications. Nvidia has 40% more memory bandwidth. Could AMD come close? Sure. Will it win in some rasterization games? Undoubtedly. Will it offer better RT performance than the prior generation? It better! Will the RT performance match Nvidia's equivalent? I'd be shocked if it does.
(Take with a whole ocean of salt)...

You literally used more words to say the same thing I was insinuating.

Regardless, the biggest factor here is whether the MSRP is real and AMD somehow prevents stores and OEMs from inflating it... to borrow from GN: if AMD's price = Nvidia minus $50, then AMD has failed and this will be the most disappointing launch ever.

Also, the memory bandwidth... I don't think GDDR7 is really having that much effect on the performance of the Blackwell cards. The 4070 Ti Super had 672.3 GB/s of memory bandwidth. The 5070 Ti has 896 GB/s... that's about 33% more, so if memory were a direct uplift, the 5070 Ti should be at minimum 25% faster than the 4070 Ti Super... and it is not.

GDDR7 has the future potential of 3GB modules, so once those are out, a 256-bit bus will support 24GB of memory... that will have an impact if they do a refresh, but currently it does not look as though the Blackwell cards are using all of that supposed speed.
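
The module arithmetic, spelled out (assuming the standard 32-bit interface per memory chip):

```python
BUS_BITS = 256
CHIP_BITS = 32                       # each GDDR chip uses a 32-bit slice
chips = BUS_BITS // CHIP_BITS        # 8 chips on a 256-bit bus
print(f"2GB chips: {chips * 2} GB")  # 16 GB -- today's configuration
print(f"3GB chips: {chips * 3} GB")  # 24 GB -- a possible GDDR7 refresh
```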

All that aside... I'm not expecting a rabbit-out-of-the-hat situation here. In its Radeon department, AMD has a culture that is reactive rather than proactive. I highly doubt that will change with this launch.
 
Food for thought.

[Attached chart: Nvidia revenue by segment over time]

That little purple blob in the top-right is gamers now. From over half of all Nvidia income to maybe 10%. We're not going to get a bunch of cheaper and faster GPUs any time soon. We are the afterthought right now. And AMD? It will probably follow the same playbook and increase data center CPU and GPU production.
 
>Spoiler: It's not going to beat the 5070 Ti in nearly all tests. No way. I think it might win some, lose some, tie in others.

Hope you got your asbestos suit on, come the review. I can already see AMD fans lighting their torches and sharpening pitchforks. Your star rating will be the star of the show, as usual.
It will come down to the price and performance, and whether or not the MSRP is "real." It's the only thing we officially have to go by. But I'm not expecting the RX 9070 XT to beat the 7900 XTX in performance, as AMD has said nothing to suggest that's where it's going. Less memory, less compute, fewer cores, smaller chip, lower price. How low remains to be seen, and how fast the RDNA4 architecture ends up being remains to be seen. Probably a 4-star card, 3.5-star if the price is particularly bad (high), 4.5-star if it's particularly good (low). LOL
 
AMD had been touting cost savings in manufacturing with RDNA4, so I guess with Gen4 lots of folks hoped to see:

A high range of Radeon RX 9950XTX / Radeon RX 9900XT / Radeon RX 9900 combo (which they'd name Radeon RX 9095XTX / Radeon RX 9090XT / Radeon RX 9090 in today's parlance) for $1200/$950/$800 USD.

Then maybe we thought a high midrange Radeon RX 9800XT / Radeon RX 9800 combo (which would be Radeon RX 9080XT / Radeon RX 9080) for $700/$650 USD.

Then perhaps a true midrange Radeon RX 9700XT / Radeon RX 9700 (which is Radeon RX 9070XT / Radeon RX 9070 in today's parlance) for $550/$500 USD.

But no Gen4 cost savings are being passed down. Basically they're pricing the midrange like the previous high midrange... and leaving off the true high midrange and high-end offerings. We were told this was coming, but it's just now sinking in... Some of us were just setting ourselves up for disappointment waiting on that Radeon RX 7950XTX successor...

However, the thought of paying $500 for a future Radeon RX 9060 (a low-range card) just isn't sitting well with a lot of folks, I guess. Everybody is still having trouble adapting to the new pricing that's been the norm for a few years now.

Oh well. RDNA5 will be a long way off, so this is the reality for at least the next 2+ years...
 
Food for thought.

[Attached chart: Nvidia revenue by segment over time]

That little purple blob in the top-right is gamers now. From over half of all Nvidia income to maybe 10%. We're not going to get a bunch of cheaper and faster GPUs any time soon. We are the afterthought right now. And AMD? It will probably follow the same playbook and increase data center CPU and GPU production.

Yep, this is why I keep telling people that nVidia isn't a gaming GPU company anymore. They make so much money from their datacenter AI GPU segment that gaming seems to be Jensen's vanity project (by comparison).

I'm really hoping the 9070 XT is $650 or less (preferably less), with the 9070 being around $500. That would put them well under the nVidia 70-class models while providing comparable performance. Make sure there is sufficient inventory, and gamers will flock to them.
 
Food for thought.

[Attached chart: Nvidia revenue by segment over time]

That little purple blob in the top-right is gamers now. From over half of all Nvidia income to maybe 10%. We're not going to get a bunch of cheaper and faster GPUs any time soon. We are the afterthought right now. And AMD? It will probably follow the same playbook and increase data center CPU and GPU production.
Nvidia announced Q4 results yesterday. Gaming was 6.4% of revenue for the quarter and 8.7% for the year ($11.4 billion out of $130.5 billion).
 
Yeah, at that price, unless you need the VRAM, buy the 5070, as we all know RT is better on team green.

AMD had a chance to steal a large chunk of the gaming market if they had priced aggressively at a time when Nvidia is at its most vulnerable (insane pricing, the problematic connector, and gimped VRAM).
 
>Spoiler: It's not going to beat the 5070 Ti in nearly all tests. No way. I think it might win some, lose some, tie in others.

Hope you got your asbestos suit on, come the review. I can already see AMD fans lighting their torches and sharpening pitchforks. Your star rating will be the star of the show, as usual.

I don't expect it to either. As leaks have suggested 7900 XTX-like performance in raster, that would put it probably about even with the 5070 Ti, give or take 5%. Now AMD just needs to have enough supply and the right price, as the 5070 Ti fails in those areas right now. If I could afford it, I would consider one to replace my RX 6800, so the RX 6800 can go back into my AMD rig and I can finally get rid of the flaky 980 Ti that is in it right now.
 
I don't expect it to either. As leaks have suggested 7900 XTX-like performance in raster

Ehh, it was RT where it got close, not raster. The XTX was significantly ahead.

The leaked article was careful to present a very specific view of first-party (AMD-run) benchmarks to generate hype. Most of the increase came from the RT enhancements; raster was only marginally better.

The 9070 XT will have a 256-bit memory bus with 20Gbps GDDR6: 640GB/s.
The 7900 XTX has a 384-bit memory bus, also with 20Gbps GDDR6: 960GB/s.

This is the reason the XTX was entirely left out of the discussion; you don't overcome 50% higher memory bandwidth with architectural improvements alone. AMD decided simply not to compete with the 80-class models this time around, instead focusing on the 60- and 70-class models.
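
Putting numbers on that gap with the same bus-width math (width × per-pin rate / 8, per the specs above):

```python
xtx_7900 = 384 * 20 / 8  # 960 GB/s for the 7900 XTX
xt_9070 = 256 * 20 / 8   # 640 GB/s for the 9070 XT
print(f"{xtx_7900 / xt_9070 - 1:.0%} more bandwidth for the XTX")  # 50%
```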

https://www.tomshardware.com/pc-components/gpus/nvidia-geforce-rtx-5080-review/4

The 7000 series didn't have very good RT support, though, and this showed with the 40 series and recently the 50 series beating them badly in RT titles. This is what AMD was addressing with the 9000 series; it gets complicated, but several key RT instructions can now handle twice as many calculations as last generation. The XTX is generally around the same performance as a 4080 and slightly below a 5080 in rasterization.

Expecting the 9070 XT to match the 7900 XTX in rasterization would be like expecting it to match the 4080/4080 Super and 5080 in rasterization.