News: AMD announces RX 9000 event on February 28, shoots down rumors of 32GB model

The most recent "decent" Nvidia rival was the 6900 XT (and the 6950 XT refresh, I guess). It was the fastest GPU at 1080p, so reviewers used it for many CPU benchmarks; it was very competitive at 1440p, and at 4K it only lost on average to the 3090 but was still competitive unless heavy ray tracing was on. As drivers got better it started to beat the 3090 at 4K more often, which prompted Nvidia to release the 3090 Ti. If Radeon weren't a threat to or competitive with the 3090, the Ti model would never have come out. Jensen doesn't like to lose.
That's great for those with deep pockets, but how many folks bought a 3090 Ti? Not saying I don't want to see competition at the high end, but more important right now is more competition in the midrange.

I agree -- Jensen REALLY doesn't like to lose, lol.
 
I am once again seriously considering an AMD GPU for my next build. I don't find any value in RT, so there's no need for Nvidia; their prices are out of control and their cards require massive amounts of power. Now we are reading that burned GPU cables are apparently once again a thing.
Same here. I don't need a self-igniting video card. I have been happy with my Nvidia RTX 3060, but am looking to upgrade.
 
The only 20 GB card available is the 7900 XT, and that's an "entry-level" high-end card roughly equivalent to the 3090 Ti, which was $2,000 and the top enthusiast card 2.5 years ago. 4K cards aren't midrange. Anyway, midrange cards these days have anywhere from 8 to 16 GB of VRAM (4060 Ti to 7800 XT).
I was originally going to say 8-16, but figured I'd get the complaint that 8GB is entry-level now.

Then again, disappointment over the lack of a 32GB mid-level card just seems bizarre to me. Not even the top halo cards from Nvidia, which people seem to think AMD MUST match, had that until the 5090.
 
What are the chances the RX 9070 and RX 9070 XT will be available at their MSRP on launch?

If that's unlikely, how long until prices return to MSRP or close to it?

Is it just a matter of waiting for demand to drop below supply?

I have an EVGA GeForce GTX 1070 SC Gaming ACX 3.0 8GB and have been considering upgrading for the last few years, but I'm in Australia and would rather wait for prices to return to MSRP or near it.
 
Then again, disappointment over the lack of a 32GB mid-level card just seems bizarre to me. Not even the top halo cards from Nvidia, which people seem to think AMD MUST match, had that until the 5090.

Maybe that's exactly where the disappointment lies.

A mid-range GPU, combining a mid-range price with the VRAM of a flagship, would be a rare thing to see.

A lot of people would be hoping to get their hands on such a card.
 
Apparently you forgot about the HD4770, HD5870, HD7970, and R9 290/290X eras, where AMD's halo GPUs were as fast as or faster than the Nvidia competition. And the 9700/9800 series from 2002/2003 were AMAZING GPUs for their time. I personally had a 9800 Pro with 128MB of RAM and it was great.
Agreed, the HD7970 was faster than the GTX680 but received little fanfare. Maybe because people were conscious of power consumption. Nowadays 600W FTW, power bill be damned, climate change whaat?
 
People think way too small for their own good.

These "mid-rangers" are roffl-stomping on flagship cards of old for one.

Secondly, this hardware is used for way more than just playing games. Always has been.

And when it does come to games, a metric boatload of us play modded. And mods love VRAM!

Bet you anything, this mysterious 32 gig variant is a workstation card, so indeed technically not a gaming card.
 
Maybe that's exactly where the disappointment lies.

A mid-range GPU, combining a mid-range price with the VRAM of a flagship, would be a rare thing to see.

A lot of people would be hoping to get their hands on such a card.
But would it be worthwhile? I mean, look at the performance difference, or rather the lack thereof, between the 8GB and 16GB versions of the 4060 Ti.
 
A decade ago the top GPU was the GTX 980. It had 4GB of 7Gbps GDDR5 on a 256-bit bus for 224GB/sec of memory bandwidth. The RTX 3050 8GB (the 6GB version came out later and was severely neutered) has 8GB of GDDR6 on a 128-bit bus for the same 224GB/sec. That was the lowest-end desktop card released in the 3000 series, and it came out in 2022. Going by the current 4000 series or RX 7000 series, we have the same amount of RAM (8GB) in the lowest tier but increased bandwidth. Basically, a low-end card today has at least double the RAM and at least equal bandwidth compared to the top tier of a decade ago. Compare halo cards and it is VASTLY different: the 4090 has 6x the RAM and over 4x the bandwidth of the 980.
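For anyone who wants to sanity-check those figures, here's a minimal sketch of the arithmetic in Python: peak memory bandwidth is just the per-pin data rate times the bus width, divided by 8 to convert bits to bytes. The bus widths and the 4090's 21Gbps GDDR6X rate are public spec-sheet numbers I've filled in, not figures from the post above.

```python
# Peak memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8.
# Card specs below are the publicly listed figures; treat them as illustrative.

def mem_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

cards = {
    "GTX 980 (7 Gbps GDDR5, 256-bit)":    (7.0, 256),
    "RTX 3050 (14 Gbps GDDR6, 128-bit)":  (14.0, 128),
    "RTX 4090 (21 Gbps GDDR6X, 384-bit)": (21.0, 384),
}

for name, (rate, bus) in cards.items():
    print(f"{name}: {mem_bandwidth_gb_s(rate, bus):.0f} GB/s")
# -> 224 GB/s, 224 GB/s, and 1008 GB/s: equal bandwidth at the low end,
#    roughly 4.5x at the halo tier, matching the comparison above.
```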
I had an AMD Vega card 8 years ago that had 16GB at 483.8 GB/s.
 
But would it be worthwhile? I mean, look at the performance difference, or rather the lack thereof, between the 8GB and 16GB versions of the 4060 Ti.
Uh, Hardware Unboxed's Steve did a video on just that (and there are others, like one by Tech Notice on how much the 16GB matters for creators). 8GB slows the 4060 Ti down in quite a few games, and even when the frame rate is fine you have to deal with missing or inconsistent textures.
https://youtu.be/2_Y3E631ro8?si=utHAwroYNDoS_3f8


On average the 8GB 4060 Ti looks like a rebadged 3060 Ti, and it even gets beaten by the 12GB 3060 sometimes. If you're OK with using it at 1080p medium, you'll be fine in pretty much all games, but why would you do that with a $400 MSRP card? Up to 15% faster at 1440p Ultra in The Last of Us Part I, with much better frame times, is a big difference for just VRAM; that game also used 10.5GB on the 16GB card. At 1080p max settings in Resident Evil 4, the 8GB card stuttered a lot and dropped frames badly, with a 1% low of just 4 fps compared to 80 fps in the same section on the 16GB card. That's not a playable experience. And of course there's how badly it performs in Hogwarts Legacy, but you can see for yourself. Maybe you didn't see these results when these cards launched? I've known about it since day one.
 
That's great for those with deep pockets, but how many folks bought a 3090 Ti? Not saying I don't want to see competition at the high end, but more important right now is more competition in the midrange.

I agree -- Jensen REALLY doesn't like to lose,
I was just responding to someone saying AMD hasn't taken it to Nvidia at all in 22 years, smh. That's the only reason I mentioned high-end cards: that's what they were talking about. Midrange is obviously a different story and much more competitive on a regular basis.
 
Agreed, the HD7970 was faster than the GTX680 but received little fanfare. Maybe because people were conscious of power consumption. Nowadays 600W FTW, power bill be damned, climate change whaat?
It received little fanfare because it was significantly slower and less power efficient than the 680 at launch. It wasn't until the 7970 GHz Edition and many months of driver improvements that GCN 1 became competitive. It was a forward-looking architecture for its time, though, which meant that after a few years it managed to pull ahead of the 680 in newer games.