News AMD RX 9070 XT allegedly tested in Black Myth Wukong and Cyberpunk 2077 — RDNA 4 flagship purportedly lands 4% faster than the RTX 4070 Ti Super pe...

You are a fool if you believe AMD doesn't know the price they are going to choose yet. They knew the pricing after Nvidia's CES presentation. The room is screaming $480 for the 9070 XT, but if they go $450, that would be a game changer.

I believe AMD saw Intel's modest success with their new Arc GPU launch, and I think they got the message, which Frank also acknowledged to PCWorld, where he stated that pricing would please customers this time.
Only a fool listens to anything Frank Azor says. You better hope you're wrong. If AMD knew the price when Nvidia announced theirs without seeing any benchmarks, that doesn't bode well for AMD's performance. They must know it's going to be slower than the 5070.
 
It'll be great if the 9070 XT slots in between the 5070 Ti and 5080 in performance, depending on the game, settings, and frame gen, at a ~$500 price tag. Also, remember that GPU performance generally gets better after launch as drivers are optimized further for the architecture and for individual games.

AMD has messed up soooo many times in the past by overpromising and underdelivering. They've surely learned their lesson by now, right...right...?😉
 
Intel needs to fix their driver overhead issues. Battlemage has great performance for the price, but it requires a fast CPU to overcome its issues. If they get that fixed, it will be a great choice for many.
For sure!

I was actually sad when the street price was nowhere near the announced MSRP (at least in the UK), and then the strange driver overhead problem appeared. And this is not even counting the still-ongoing shenanigans in older games and some current ones that just do not work. I have friends with Alchemist cards who are still struggling with some games. It's very polarising in its nature: your game either works "fine" or it's flat out broken/unplayable.

I hope they fix all the gremlins in due time, since the promised price point is excellent.

Only a fool listens to anything Frank Azor says. You better hope you're wrong. If AMD knew the price when Nvidia announced theirs without seeing any benchmarks, that doesn't bode well for AMD's performance. They must know it's going to be slower than the 5070.
I agree with you, but it's not a matter of "being a fool". Mr Azor is a spokesperson, just like Mr Petersen is, and they're just communicating what the rest of the marketing division has agreed to let be known at this time. This is not to say what he is saying is a flat-out lie, but there's a huge need to read between the lines of what he's saying. I think the 9070 XT will be competitive against the 5070, but they were definitely waiting on nVidia to announce pricing; and he, kind of, let us know as much.

Regards.
 
Wouldn't the relevant comparison be the upcoming 5070?
Ultimately, yes, it'll compete with the 5070, maybe even the 5070 Ti. For now, there's no way to properly compare it to another unreleased product, though we're obviously going to compare specs even if it's only an academic exercise at this point.
 
The RTX 5070 12GB is $550.
The RX 9070XT 16GB would be a solid choice at $450, but only if the RT performance is decent.
If the RT performance is poor, then it should be a $400 card.
No real gamer cares about the marketing gimmick called RT, and yes, the 9070 XT will perform equal to or better than the 7900 XTX in ray tracing, which means that even if you do believe the marketing, you will be able to play any ray-traced game just fine.

Btw, trading blows with the RTX 4080 16GB makes it a solid buy even at $50 less than this Nvidia GPU. At $500 it's legit a game changer.
 
Maybe I'm missing something here, but comparing it to the previous gen Nvidia and getting mediocre results seems a bit disheartening. By AMD's own word, they changed the naming convention in order to help consumers make logical comparisons between brands - which means AMD drew a line in the sand when they went with the "70" on the end of the nomenclature. People are naturally going to compare the 9070 and the 5070... which means that in order to be successful, AMD will have to beat the 5070 either in performance or price. Based on the numbers shown here, I'm not sure if they can do either.

On another note - I care about Raytracing... I want to be able to run a game with the absolute best visual experience possible, therefore it is a big consideration for my future purchase.
 
Maybe I'm missing something here, but comparing it to the previous gen Nvidia and getting mediocre results seems a bit disheartening. By AMD's own word, they changed the naming convention in order to help consumers make logical comparisons between brands - which means AMD drew a line in the sand when they went with the "70" on the end of the nomenclature. People are naturally going to compare the 9070 and the 5070... which means that in order to be successful, AMD will have to beat the 5070 either in performance or price. Based on the numbers shown here, I'm not sure if they can do either.

On another note - I care about Raytracing... I want to be able to run a game with the absolute best visual experience possible, therefore it is a big consideration for my future purchase.
I mean, it's not like they can compare it to the 5000 series from Nvidia; it's not even out yet...
Maybe AMD can request a review card from Nvidia :)
 
No real gamer cares about the marketing gimmick called RT, and yes, the 9070 XT will perform equal to or better than the 7900 XTX in ray tracing, which means that even if you do believe the marketing, you will be able to play any ray-traced game just fine.

Btw, trading blows with the RTX 4080 16GB makes it a solid buy even at $50 less than this Nvidia GPU. At $500 it's legit a game changer.

1. Nope, you are in the minority if you think gamers don't care about RT.
2. You're contradicting yourself in the second half of your run-on sentence. If it's useless, why are they adding more RT cores?
3. 7900XTX performed like dog turds with RT enabled, despite having very good raster.
4. Okay, well if you want to pay more for less, then $500 it is, and get decimated in sales, again.
 
1. Nope, you are in the minority if you think gamers don't care about RT.
2. You're contradicting yourself in the second half of your run-on sentence. If it's useless, why are they adding more RT cores?
3. 7900XTX performed like dog turds with RT enabled, despite having very good raster.
4. Okay, well if you want to pay more for less, then $500 it is, and get decimated in sales, again.

It is rare that I ever see anyone mention wanting to use RT. It's still a gimmick, and a long way from being the norm. Nvidia is adding more RT cores because they are trying to push the technology.

Look into the past on CPUs. AMD introduced 64-bit to the consumer market, yet nobody was using 64-bit. They introduced dual core when most software wasn't ready for it. It was to push a technology in hopes that it would give them an edge over the competition in the future.

Nvidia is doing the same with RT cores. Also, their obsession with AI is fueling the push to add more RT cores.

Most people lack a graphics card powerful enough to do actual RT gaming.
 
The numbers look pretty good, especially for games that favor Nvidia's architecture.
However, we all know that the base SKUs are only one puzzle piece. The AIBs crank up performance, price (and styling). Then there may also be bundling to factor in. When AMD's last generation of CPUs and GPUs came out almost concurrently, they were bundling some featured games with them, which further complicated the calculus.
All I want is for crypto and AI to get the hell away from gaming/workstation GPUs so the hoarding and flipping of these cards doesn't distort the market. The market is still screwed up today.
 
Given how far behind the 4070 Ti Super the 7800 XT was in ray tracing, it looks like the 9070 XT has indeed made large strides to be line-ball with it. Raster looks good too and could indeed be better than the 7900 XT. Price, though, will depend on the 5070 IMO; if that trades blows with the 9070 XT, then people will still buy it despite having 4GB less VRAM, unless the 9070 XT is under $500, which is not likely. If the 9070 XT is stronger, though, $500 would be a good price.

For me, I want to know what the TOPS of the RDNA 4 cards is too. Blackwell cards are insanely strong, and I use my GPU for a lot of AI photo upscaling etc. RDNA 2 is miserably bad. I know RDNA 3 is better, but it's hard to find the numbers for those cards, unlike Nvidia's. My old 1080 Ti has much, much higher TOPS than my 6800 XT, like 5x better.
 
I mean, it's not like they can compare it to the 5000 series from Nvidia; it's not even out yet...
Because of the stupid branding redo, AMD is intentionally trying to make its new '70' series compare to Nvidia's newest '70' series in the mind of the consumer. They even stated that was the objective. Neither brand has released its newest '70s' yet, so the correct comparison, available benchmarks or no, is the newest gen from AMD to the newest gen from Nvidia.

As for the ray tracing topic (and this is not directed at anyone in particular), I can't help but chuckle every time someone claims it's a "gimmick." That's like saying stars are just dots in the sky because you don't have the right telescope.
 
The h… you say. Someone tried to sell you …t? Even if it was free, would you take it? I guess you could be paid to take it.
Medieval gongfarmers sold literal human s*** for money, and its modern equivalent, the "biosolids" industry, is worth several billion dollars worldwide.

Of course, I'm no farmer, so *to me* human excrement (raw or treated) has negative value ;-)
 
1. Nope, you are in the minority if you think gamers don't care about RT.
2. You're contradicting yourself in the second half of your run-on sentence. If it's useless, why are they adding more RT cores?
3. 7900XTX performed like dog turds with RT enabled, despite having very good raster.
4. Okay, well if you want to pay more for less, then $500 it is, and get decimated in sales, again.
Lol, what a bunch of nonsense. Ray tracing is a gimmick, and paying more for less is the exact definition of buying an Nvidia GPU. You apparently never played on a 7900 XT vs anything Nvidia. At $500, virtually matching a $1000 Nvidia GPU, only the f.. blind would choose to pay the Nvidia price.
 
Medieval gongfarmers sold literal human s*** for money, and its modern equivalent, the "biosolids" industry, is worth several billion dollars worldwide.

Of course, I'm no farmer, so *to me* human excrement (raw or treated) has negative value ;-)
So I guess the question isn't price, it's market. Everything has value to someone … so the issue is finding that someone and getting the product to them.
 
Well, given how RDNA 3 is not a bad architecture, it's all a matter of price.

Outside of the stupidity Jensen said on stage, the 5070 is going to be an OK card at ~$580, but we all really know it'll shoot up to over $650 for the first weeks, or even months, I'm sure.

AMD just needs to read the room. In the interviews, Mr Azor at least seems to be aware this is not going to be a tech challenge, but a pricing one. Or so I hope.

All in all, I don't see RDNA 4 being a bad product, but it'll definitely live or die on day-one reviews based on its pricing.

Let's see how AMD fumbles the ball this time around 😀

Regards.
I'm interested in how AMD, given more time to develop a coherent marketing strategy after deciding to delay until after the bombshell Nvidia presentation, will market RDNA 4 against the "fake frames" performance metrics Nvidia used. Do they ignore Nvidia, show that RDNA 4 is more powerful in raw raster/RT than 4000-series SKUs, and risk losing the layman's market to Nvidia's "5070 is faster than a 4090"? Or does AMD match Nvidia and directly compare frame-gen-to-frame-gen metrics? Or does AMD invent 6x frame gen just to win the already ridiculous "fake frames" duel?

At any rate, AMD needs to get the marketing right to have any hope of gaining market share.
 
Maybe I'm missing something here, but comparing it to the previous gen Nvidia and getting mediocre results seems a bit disheartening. By AMD's own word, they changed the naming convention in order to help consumers make logical comparisons between brands - which means AMD drew a line in the sand when they went with the "70" on the end of the nomenclature. People are naturally going to compare the 9070 and the 5070... which means that in order to be successful, AMD will have to beat the 5070 either in performance or price. Based on the numbers shown here, I'm not sure if they can do either.

On another note - I care about Raytracing... I want to be able to run a game with the absolute best visual experience possible, therefore it is a big consideration for my future purchase.
The problem AMD has right now is that there are no real raw performance numbers on the 5000 series until third-party testers can publish their findings after the review embargo, so unless AMD wants to market simply on "fake frames" numbers against the 5000 series, they have to compare to the 4000 series.
 
I think the 5070 results are just OK, but its AI frame gen is really promising. These 9070 results are not impressive to me and just seem like more of the same. If FSR4 matches or exceeds DLSS4, I would give it a hard look. I would have scooped up a B580 already if I could find one, but now maybe Intel will give us a taste of the B770 right around the time more about the 9070 is revealed.
 
Lol, what a bunch of nonsense. Ray tracing is a gimmick, and paying more for less is the exact definition of buying an Nvidia GPU. You apparently never played on a 7900 XT vs anything Nvidia. At $500, virtually matching a $1000 Nvidia GPU, only the f.. blind would choose to pay the Nvidia price.
My fellow, if you can look at RT on/off comparison images for the Indiana Jones game and not see an obvious difference, then you shouldn't be the one accusing others of being blind. 😉
 