News AMD RX 9070 XT allegedly tested in Black Myth Wukong and Cyberpunk 2077 — RDNA 4 flagship purportedly lands 4% faster than the RTX 4070 Ti Super pe...

unless AMD wants to market simply on “fake frames” numbers against the 5000 series, they have to compare to the 4000 series.
Valid point, but it seems like a bad look for AMD to be merely on par with the competition's previous gen.

These 9070 results are not impressive to me and just seem like more of the same.
I think this sums up my thoughts exactly.
 
My fellow, if you can look at RT on/off comparison images for the Indiana Jones game and not see an obvious difference, then you shouldn't be the one accusing others of being blind. 😉
Can you turn off RT in the game? I thought one of the annoying things about the game was the "always on" requirement for RT? At least for the shadows, no?

Also, the complaint is not so much "I don't see a difference", but more like "when I see a difference, it's usually worse visuals". HUB's Tim did a very good in-depth look at current-gen games with RT, with a subjective analysis of the differences and their impact on visuals weighed against the performance trade-offs.

From that perspective, and considering that specific investigation, the conclusion is that there are at most three games where RT really makes a somewhat positive impact: the monkey game (Black Myth: Wukong), CP2077, and one more I can't remember.

I like the idea of RT, but I have to side with the "it's still a gimmick" crowd for this one. I mean, look at Cities: Skylines 2. It has given RT a very bad name.

EDIT: To add a bit more to this. When talking about "visuals" in a more holistic manner, there's one area where nVidia "just realized" they can improve, which is textures. That's why VRAM and texture "AI filtering" were topics in their presentation. Any game will look way, way better as soon as you improve its textures, even with a lower poly count (complexity) and simpler lights/shadows (to an extent, and when not to its detriment).
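To put some very rough numbers on why textures and VRAM go hand in hand, here's a sketch with purely illustrative figures (real engines stream and compress assets, and the formats below are just common examples, not anything nVidia announced):

```python
# Rough texture-memory math (illustrative only; real engines stream and
# compress assets, so actual numbers vary widely).

MIB = 1024 * 1024

def texture_mib(size_px, bytes_per_texel, mipmaps=True):
    base = size_px * size_px * bytes_per_texel
    # A full mip chain adds roughly 1/3 on top of the base level.
    return base * (4 / 3 if mipmaps else 1) / MIB

# 2K vs 4K albedo map: uncompressed RGBA8 (4 bytes/texel) vs BC7 (~1 byte/texel).
for size in (2048, 4096):
    print(f"{size}x{size}: RGBA8 ~{texture_mib(size, 4):.0f} MiB, "
          f"BC7 ~{texture_mib(size, 1):.0f} MiB")
```

A single 4K material ends up in the tens of MiB even when block-compressed, which is why higher-quality textures and more aggressive compression/filtering go straight to the VRAM conversation.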

Regards.
 
Can you turn off RT in the game? I thought one of the annoying things about the game was the "always on" requirement for RT? At least for the shadows, no?

Honestly, I don't know... I don't own the game; I've just seen the comparison images and footage on YouTube. It seems like an odd comparison for them to make if you can't turn off certain RT settings.

Also, the complaint is not so much "I don't see a difference", but more like "when I see a difference, it's usually worse visuals"

Makes me think of my dad... He thinks any image looks "bad" if it doesn't have major contrast. He and I would disagree 90% of the time on which comparison photo looked better. We like what we like, and it's an advantage to have less expensive taste.

I think ray tracing has its place, just like PhysX had its place. Remember in the mid-2000s when everything was "PhysX this and PhysX that"... well, we still get the same effects today, but only because the hardware advanced to the point where talking about it is no longer relevant. The same will eventually happen for RT.
 
The double standard against AMD will never not make me shake my head.

AMD says the 7900 XTX will be 50% faster, it ends up 20% faster, and suddenly RDNA3 is the worst generation of cards ever.

Meanwhile, Nvidia said the 4080 would be 4x faster than the 3080. HUB, GN, and several other reviewers found that even when using frame gen the 4080 was no more than 80% faster, and when frame gen was not enabled it was usually around 25-35% faster.
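To illustrate how frame generation skews those headline multipliers, here's a quick sketch with hypothetical numbers (not taken from any review):

```python
# Rough sketch (illustrative numbers, not measured data): how frame generation
# can inflate a generational "X times faster" marketing claim.

def apparent_uplift(old_fps, raster_gain, fg_multiplier=1.0):
    """Presented-FPS uplift of a new card vs. an old card.

    old_fps       - baseline frame rate of the previous-gen card
    raster_gain   - genuine rendering speedup, e.g. 1.30 for +30%
    fg_multiplier - extra presented frames from frame generation
                    (~2.0 for one interpolated frame per rendered frame,
                     ignoring overhead)
    """
    new_fps = old_fps * raster_gain * fg_multiplier
    return new_fps / old_fps

baseline = 60                                # hypothetical previous-gen FPS
print(apparent_uplift(baseline, 1.30))       # 1.3x -> raster-only uplift
print(apparent_uplift(baseline, 1.30, 2.0))  # 2.6x -> with frame gen vs. no frame gen
print(apparent_uplift(baseline, 1.80, 2.0))  # 3.6x -> best-case RT title, approaching "4x"
```

Same silicon, very different-looking slide, depending on whether the new card gets frame gen and the old one doesn't.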

Both companies lied horribly, but only AMD gets raked over the coals.

Having owned many GeForce and Radeon cards, I've never understood the rampant fanboyism. I've seen just as many issues with GeForce as with Radeon. Heck, multi-monitor support has always been better on AMD. Drivers have always been easier to use, and I don't need a 3rd-party cleaning tool to remove them when stuff goes wrong... I've never seen that with Nvidia. Yes, more games have DLSS than AMD's not-really-good-for-anything FSR, and in some games it is noticeable how much better the game feels with DLSS, but then I probably don't have a good enough monitor to truly take advantage of the full scope.

But then that's true of most of us.
The vast majority of gamers do not have the top-end monitor, or CPU, or GPU...

If AMD actually releases these at the $480 price point then that is a huge win for these sorts of consumers.

Whether consumers are smart enough to take advantage of it or not remains to be seen, as it seems the vast majority of you are sold on shiny labels and a perception of superiority rather than actual experience using the hardware.
 
The double standard against AMD will never not make me shake my head.

AMD says the 7900 XTX will be 50% faster, it ends up 20% faster, and suddenly RDNA3 is the worst generation of cards ever.
The AMD reference edition of the 7900 XTX had cooling issues initially and was underpowered. AIB partner cards get at least an additional 10% (sometimes 15%) on top of the 20%.
You're really cooking your TIM though. Many of these cards will start to have issues or need underclocking after 1-2 years, unless you repaste. AMD hotspot and memory junction temp issues are real. I kinda like to think of the 7900 XTX as the Vega 64 II. Can it perform great? Yes, if you stick a waterblock on it!
 
The AMD reference edition of the 7900 XTX had cooling issues initially and was underpowered. AIB partner cards get at least an additional 10% (sometimes 15%) on top of the 20%.
You're really cooking your TIM though. Many of these cards will start to have issues or need underclocking after 1-2 years, unless you repaste. AMD hotspot and memory junction temp issues are real. I kinda like to think of the 7900 XTX as the Vega 64 II. Can it perform great? Yes, if you stick a waterblock on it!
Is that more or less detrimental to the life of the card than the melting connector? (I've personally seen a connector melt soooooo I may be a bit biased on that point... heh and I didn't even put that in my initial post as an Nvidia negative....)

That said I think there are several reasons why AMD went back to monolithic this generation.
 
Is that more or less detrimental to the life of the card than the melting connector? (I've personally seen a connector melt soooooo I may be a bit biased on that point... heh and I didn't even put that in my initial post as an Nvidia negative....)

That said I think there are several reasons why AMD went back to monolithic this generation.
If your GPU's power connector melted then that is obviously worse.
The 16-pin 12VHPWR connector issue NVIDIA has been dealing with is due to a combination of factors:
- Faulty design
- Severely lowered safety margins (rough per-pin numbers sketched below)
- User error
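On the safety-margin point, here's a back-of-the-envelope sketch using commonly cited pin ratings (the exact ratings depend on the terminals a given cable uses, so treat these as approximations rather than official spec figures):

```python
# Back-of-the-envelope per-pin current for the two connector types.
# Pin ratings here are commonly cited figures, not official worst-case specs.

def per_pin_current(watts, volts, power_pins):
    return watts / volts / power_pins

# Classic 8-pin PCIe: 150 W over 3 x 12V pins; pins often rated ~7-9 A.
pcie_8pin = per_pin_current(150, 12, 3)   # ~4.2 A per pin
margin_8pin = 7 / pcie_8pin               # ~1.7x headroom even at a conservative 7 A rating

# 12VHPWR: up to 600 W over 6 x 12V pins; pins commonly rated ~9.5 A.
hpwr = per_pin_current(600, 12, 6)        # ~8.3 A per pin
margin_hpwr = 9.5 / hpwr                  # ~1.1x headroom

print(f"8-pin PCIe: {pcie_8pin:.1f} A/pin, ~{margin_8pin:.1f}x margin")
print(f"12VHPWR:    {hpwr:.1f} A/pin, ~{margin_hpwr:.1f}x margin")
```

With that little headroom per pin, one poorly seated or slightly out-of-spec contact pushes the remaining pins much closer to their limit than the old connector ever did.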

NVIDIA's blasé, "You're using it wrong," response fueled anger even more. (rightly so, in my book)

For any GPUs (AMD included), constantly bumping up against the thermal limits could eventually cause performance degradation and possibly failure. Only the magic 8-ball knows when though.
 
Except when you factor in the power draw after adding a second monitor...

That said, as you pointed out, neither company is perfect. I think it's fair to say there's some reasonable cynicism regarding this upcoming generation.
True. Some of the people posting are reasonable. Others are just mad that it won't compete at the high end... as though they would have bought it if such a card existed.

People need to stop expecting AMD to keep NVIDIA honest. The only thing that can do that is our own wallets.

Which would have been easier if there were full, proper competition, but alas... AMD barely survived the 2010s, and they are still digging themselves out of the huge hole that their inability to fund development created.

I often wonder what the world would be like if the initial Nvidia-AMD merger had actually happened... if Hector had put his pride aside and let Jensen become the CEO. But alas, not in this end of the multiverse.
 
So this card is about 4% faster than a 4070 Ti Super, which in turn is maybe 5% faster than my original 4070 Ti, and maybe equal in RT to that one. And that's impressive how, exactly? It doesn't even come close to their previous flagship, and it still cannot scratch the current Nvidia gen in RT, let alone the upcoming one, which might be quite a bit faster, or at least equal, in rasterizing too. This thing won't run against the 5070; it will run against the 5070 Ti, considering it is the XT variant and there will be a regular RX 9070, too. If the 25% raster improvement is real, this card won't come close. Even a 20% or 15% increase would be enough. "AMD dropped the ball" sounds more like it...
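For what it's worth, chaining those rough percentages compounds multiplicatively; here's a quick sketch with the same ballpark figures (all hypothetical, based on the leak and the rumored gen-on-gen gains):

```python
# Compounding the rough figures above (illustrative only):
# "~4% over a 4070 Ti Super, which is ~5% over a 4070 Ti".
vs_4070ti_super = 1.04
super_vs_4070ti = 1.05

vs_4070ti = vs_4070ti_super * super_vs_4070ti
print(f"~{(vs_4070ti - 1) * 100:.0f}% over a 4070 Ti")   # ~9%

# And why a hypothetical +15-25% gen-on-gen raster gain for the 5070 Ti
# would leave the 9070 XT behind its direct rival:
for gain in (1.15, 1.20, 1.25):
    print(f"5070 Ti at +{(gain - 1) * 100:.0f}% vs 4070 Ti Super -> "
          f"~{(gain / vs_4070ti_super - 1) * 100:.0f}% ahead of the 9070 XT")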
 
Because of the stupid branding redo, AMD is intentionally trying to make its new '70' series compare to Nvidia's newest '70' series in the mind of the consumer. They even stated that was the objective. Neither brand has released its newest '70' cards yet - so the correct comparison, available benchmarks or no, is the newest gen from AMD to the newest gen from Nvidia.

As for the ray tracing topic (and this isn't directed at anyone in particular), I can't help but chuckle every time someone claims it's a "gimmick." That's like saying stars are just dots in the sky because you don't have the right telescope.
How is AMD supposed to compare their cards against unreleased hardware? Of course they will be competing against each other, newest gen vs newest gen, and reviewers will give us benchmarks once they are released. But it's completely normal for any company, when releasing a new product, to compare against the previous generation of products, especially when the NEXT GEN ISN'T RELEASED YET. That goes for Nvidia, AMD, and Intel.