News AMD announces RX 9000 event on February 28, shoots down rumors of 32GB model

I think it would be silly anyway. All the games are made for 16GB or less, since that's the highest amount of VRAM with any serious market share. The only consumer GPUs over 16GB are Nvidia's 90-class cards and the 7900 XT and XTX, which hold too little market share to be worth optimizing games for.
 
"32GB might be too much for a mid-range consumer GPU"

Well of course it's too much for a mid-range consumer GPU. I don't want mid-range, I want a flagship! C'mon AMD, take it to Nvidia!
 
This is not a terrible thing. Prices are already bananas at the current levels. Let's just keep trying to bring these prices down to the earth realm. 😬
 
Memory size and speed on video cards has essentially stagnated for 8yrs now.
They do this because the consumers tolerate it.
The 5090 has roughly quadruple the speed and triple the capacity of the 1080 Ti. Accounting for inflation, the price has doubled, though. And considering they can't keep it in stock, there's no incentive to lower the price.
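Those ratios are easy to sanity-check against the commonly cited launch specs. A quick sketch (the spec figures below are assumed from public launch data — 1080 Ti at 11 GB / 484 GB/s / $699, RTX 5090 at 32 GB / 1792 GB/s / $1999 — not taken from the post itself):

```python
# Assumed launch specs (widely reported, not from the post):
gtx_1080_ti = {"vram_gb": 11, "bandwidth_gbs": 484, "msrp_usd": 699}    # 2017
rtx_5090    = {"vram_gb": 32, "bandwidth_gbs": 1792, "msrp_usd": 1999}  # 2025

for metric in ("bandwidth_gbs", "vram_gb", "msrp_usd"):
    ratio = rtx_5090[metric] / gtx_1080_ti[metric]
    print(f"{metric}: {ratio:.1f}x")
```

Bandwidth comes out around 3.7x, VRAM around 2.9x, and nominal price around 2.9x — which lands near "doubled" once you deflate the 2017 dollars.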
 
they want you to buy the Instinct card for that.
AMD makes the cards that professionals want hard to get, and if you somehow obtain one, there's no support to get it working. They won't even outright disclose whether a card supports SR-IOV; instead they give it some marketing name that comes with no further elaboration.

I would love to have an AMD card, but the software support for my use cases is not really there. PCIe pass-through to virtual machines is one of them, and it hasn't been pretty for the last couple of generations.
 
But what about 24GB?
24GB is also gonna be wasted on the performance they are targeting, and it would just add to the costs, which would be passed on to the consumer. The 9070 XT needs to at least match the 5070 Ti in raster and be $100+ cheaper to attract those keen on Nvidia's offering, and that won't happen if they go for needlessly expensive features.
 
That's because they're saving 32GB for the RX 9080XTXTX. :nan:
Yup, they got a little too specific with the denial, the rumors are here to STAY.

Yeah, 32GB for a 70XT would be a lot of wasted VRAM for no particular gain.

Unless it's running an AI model, but I think they want you to buy the Instinct card for that.
99% of people don't need more than 16 GB of VRAM at the moment, and whatever premium they would charge for a 32 GB model would be a waste of your gaming-PC money. That said, this is a card with performance likely between the 7900 XT and 7900 XTX while carrying 4-8 GB less VRAM than those cards, so I can see the attraction for some.

As for AI, Strix Halo with even more memory (64-128 GB) will be better for LLMs than the Radeon PRO W9000 version of the 9070 XT would be. And for something like image generation, it's unlikely you really need more than 16 GB.
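To put rough numbers on the LLM point: a common back-of-the-envelope rule is that model weights take about parameter count × bytes per parameter, with KV cache and runtime overhead on top. A minimal sketch (the function and figures are illustrative assumptions, not from the post):

```python
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    # One billion parameters at 1 byte each is roughly 1 GB of weights;
    # real usage is higher once KV cache and activations are counted.
    return params_billions * bytes_per_param

print(weights_gb(8, 2.0))   # 8B model at FP16: ~16 GB, barely fits a 16 GB card
print(weights_gb(70, 0.5))  # 70B model at ~4-bit: ~35 GB, needs Strix Halo-class memory
```

Which is why 64-128 GB of unified memory is the more interesting budget for big local models, while 16 GB remains plenty for image generation.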
 
"32GB might be too much for a mid-range consumer GPU"

Well of course it's too much for a mid-range consumer GPU. I don't want mid-range, I want a flagship! C'mon AMD, take it to Nvidia!

Too bad the last time they really took it to Nvidia was 22 years ago, with the Radeon 9800 Pro.

That was where ATi/AMD established its hardware as being clearly superior in every meaningful metric. It wasn't just the value alternative, it was the leader.

We've been looking for a decent Nvidia rival ever since.
 
I am once again seriously considering an AMD GPU for my next build. I don't find any value in RT, so there's no need for Nvidia; their prices are out of control and their cards require massive amounts of power. Now we're reading that burned GPU power cables are apparently once again a thing.
 
I am once again seriously considering an AMD GPU for my next build. I don't find any value in RT, so there's no need for Nvidia; their prices are out of control and their cards require massive amounts of power. Now we're reading that burned GPU power cables are apparently once again a thing.
Maybe you should consider 7900 XTX. It’s the cheapest 24 GB card you’re gonna find out there.
 
Memory size and speed on video cards has essentially stagnated for 8yrs now.
They do this because the consumers tolerate it.
Go back to when the GTX 980 was the top GPU (it launched in 2014, so over a decade ago now): it had 4GB of 7Gbps GDDR5 on a 256-bit bus for 224GB/sec of bandwidth. The RTX 3050 8GB (the 6GB model came out later and was severely neutered) has 8GB of GDDR6 on a 128-bit bus for the same 224GB/sec. That was the lowest-end desktop card released in the 3000 series, and it came out in 2022. Going by the current 4000 series or RX 7000 series, we have the same amount of RAM (8GB) at the lowest tier but increased bandwidth. Basically, a low-end card today has at least double the RAM and at least equal bandwidth compared to the top tier back then. Compare halo cards and it is VASTLY different: the 4090 has 6x the RAM and roughly 4.5x the bandwidth of the 980.
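The bandwidth figures above all follow from the standard formula: per-pin data rate (Gbit/s) × bus width (bits) ÷ 8 bits per byte. A minimal sketch (per-pin speeds are the commonly cited effective rates for each card):

```python
def bandwidth_gb_s(pin_speed_gbps: float, bus_width_bits: int) -> float:
    # Memory bandwidth = effective data rate per pin * bus width / 8 bits per byte
    return pin_speed_gbps * bus_width_bits / 8

print(bandwidth_gb_s(7, 256))   # GTX 980 (7 Gbps GDDR5, 256-bit): 224.0 GB/s
print(bandwidth_gb_s(14, 128))  # RTX 3050 8GB (14 Gbps GDDR6, 128-bit): 224.0 GB/s
print(bandwidth_gb_s(21, 384))  # RTX 4090 (21 Gbps GDDR6X, 384-bit): 1008.0 GB/s
```

Note how halving the bus width while doubling the per-pin speed lands the 3050 exactly on the 980's number.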
 
Too bad the last time they really took it to Nvidia was 22 years ago, with the Radeon 9800 Pro.

That was where ATi/AMD established its hardware as being clearly superior in every meaningful metric. It wasn't just the value alternative, it was the leader.

We've been looking for a decent Nvidia rival ever since.
Apparently you forgot about the HD 4870, HD 5870, HD 7970, and R9 290/290X eras, where their halo GPUs were as fast as or faster than the Nvidia competition. Now, the 9700/9800 series from 2002/3 were AMAZING GPUs for their time. I personally had a 9800 Pro with 128MB of RAM and it was great.
 
What exactly would mid-range be? I mean, so far, haven't mid-range cards recently been in the 10-20 GB range?
The only 20 GB card available is the 7900 XT, and that's an "entry-level" high-end card roughly equivalent to the 3090 Ti, which cost $2,000 and was the top enthusiast card 2.5 years ago. 4K cards aren't midrange. Anyway, midrange cards these days have anywhere from 8-16 GB of VRAM (4060 Ti to 7800 XT).
 
Too bad the last time they really took it to Nvidia was 22 years ago, with the Radeon 9800 Pro.

That was where ATi/AMD established its hardware as being clearly superior in every meaningful metric. It wasn't just the value alternative, it was the leader.

We've been looking for a decent Nvidia rival ever since.
The most recent "decent" Nvidia rival was the 6900 XT (and the 6950 XT version, I guess). It was the fastest GPU at 1080p, so reviewers used it for many CPU benchmarks, and it was very competitive at 1440p. At 4K it lost on average to the 3090, but it stayed competitive unless heavy ray tracing was on. As drivers got better it started to beat the 3090 at 4K more often, which prompted Nvidia to release the 3090 Ti. If Radeon hadn't been a threat to the 3090, the Ti model would never have come out. Jensen doesn't like to lose.
 
This was an easy rumor to shoot down as dead in the water. There's no reason AT ALL for 32 GB of VRAM on 256-bit GDDR6 GPUs; 32 GB is top-end gaming GPU territory, which AMD has clearly and officially stated it isn't competing in this generation. Doubling the VRAM past 16 GB would really only benefit AI and other professional & creative workloads, which is Radeon Pro W-series territory. In fact, if anything, this rumor is really about the RDNA4 successor to the W7800 32 GB workstation card or thereabouts:
https://www.amd.com/en/products/graphics/workstations/radeon-pro/w7800.html
 