News AMD deprioritizing flagship gaming GPUs: Jack Hyunh talks new strategy against Nvidia in gaming market

Avoiding competition at the high end of consumer GPUs has been a terrible decision on AMD's part.

You’d think several consecutive years of declining market share would’ve tipped them off.

But it turns out no.
While true... if they can make better mid-tier cards, that may get them the share they want long term.

Nvidia has effectively removed its mid tier for anything under roughly $600.

And history shows they are going to keep cranking prices for the low/mid tier, far outpacing the generational improvements.


The most popular GPUs are the 60- and 70-tier cards (outside AI/server), because for the majority of people they are affordable and give you the performance to play what you want without having to sell an organ.
Nvidia basically took that away with the 40 series, however.
If AMD can fill that gap, they will gain the market share.


High-end benchmark wins establish the brand,
and AMD hasn't won those in many generations.
I think the last time they actually contested the high end was the Radeon VII with its HBM2 memory... and that thing flopped because the HBM2 made its cost insane.

And as more of the dev space for gaming utilizes DLSS, we move further away from rasterized performance, which is the only place AMD has won recently.

High-end buyers aren't price sensitive. For AMD to go high-end, it has to beat the 5090 on performance. Bang/buck is not a thing at the high end.

If AMD needs to stop chasing the high end for a few years, that is fine. Look at their CPUs: they were awful for many years, then they took time to focus and we got Zen. And while Zen 1 wasn't the best, they kept improving on what they had built and gained market share, and now AMD CPUs are some of the best.

And honestly, so long as AMD can have great drivers and features with performance at the right cost for mid-tier people? That's a win. Let the minority (yes, they are the minority overall) of people with no budget limit go Nvidia; they were already doing that anyway, so that doesn't change anything for AMD.
 

acadia11

There was a time a top-of-the-line GPU was $200; it's insane that a top-of-the-line GPU now costs the price of an old beater car. If this is AMD's decision, a top-end consumer GPU costing anywhere from $3-$5 grand is not far behind. AMD took over the data center market by simply having a better product: Opterons that were clearly better than competing Xeons. It's never been the case that AMD has been clearly better in the consumer desktop or GPU market, except perhaps in the early Athlon days and the early days of the ATI acquisition. Maybe it's just a bottom-line approach, as you mentioned, putting money into the breadwinner data center/server business in terms of silicon acquisition, since I've never seen being behind in performance work out as a way to gain market share in this space. It didn't work for Bulldozer in desktop mindshare or actual market share. So I'm curious where this strategy will lead. Where will "outright conceding performance, the halo product, or the ability to compete" put them in the mindshare department, when Nvidia is clearly making products that are better up and down the pricing tiers in the consumer GPU space?
 

AndrewJacksonZA

Here is the easy way to win market share: launch a GOOD and CHEAP GPU!!! Remember when the GTX 1070 launched at $350, actually beating the previous flagship GTX 980 Ti that launched for almost double the price? Just release a 7900 XTX at the same $350 price point!! Who in their sane mind is going to pay $600 (or $650/$700) for the RTX 5070, which will probably reach RTX 4080 performance, when you can get the same raster at half the price?? AMD is NOT going to win market share underpricing NV by $50 or $100.
Agree, but nothing in life worth having is "easy." It won't be easy for the shareholders who think short-term. For the shareholders who have longer-term thinking, it won't be easy either, but it might just be worth it.
 

Papusan

AMD is afraid of Intel's advancements with their graphics cards. No one beats Nvidia, and AMD is the first to see this, hence AMD's main focus now will be competing with Intel's upcoming graphics cards. Good luck, AMD. You will need it if max profit in low/mid-end GPUs is your goal.
 

usertests

"Nvidia gave up on gaming GPUs, let's do that too"

Classic AMD
Read the room. They competed near the top with the 6900/6950 XT and 7900 XTX. Their market share remains atrocious.

The RDNA4 lineup likely includes something around 7800/7900. That's more of a gaming GPU than most people on the planet need.

I'd still like to see a 32 GB variant of the flagship for AI. Something they can probably do if they're using cheaper GDDR6 again.
 

usertests

Unfortunately, halo products (e.g. the 4090) do matter for sales of mid-range and budget cards. To wit, Steam's GPU survey for Aug '24. Calling it Nvidia dominance would be a drastic understatement.

https://store.steampowered.com/hwsurvey/videocard
You might be right about the halo effect. But it works better if you actually have the winning halo. And a perceived software lead is part of it, unfortunately for AMD.

As we all should know, AMD will switch strategies and naming schemes after just one generation. So if they think they can do better with some RDNA5 chiplet monster, they might try to grab the halo again. Until then, focusing on cheaper products when people are grumbling about high prices could be a good idea. If they can even allocate wafers to it.
 

magbarn

Apart from the unsaid low-priority issue (AI being more profitable), also left unsaid is that AMD likely doesn't have the capability to compete against the 5090/4090 at the high end. So the question isn't "should AMD compete at the high end," as posed by THW, but rather "should AMD try anyway, even if it can't compete."

Per the 7900XTX's lackluster reception, the answer would be a big fat NO.

If anything, the 7900 XTX's weak performance vs. the 4090 did more to hurt the Radeon brand than to help it. If you can't win a fight, it's better to avoid the fight in the first place than to let the world know you're a loser.

The response is so obvious that I'm surprised the question is even asked. But I suppose it must be, since the whole crux of the "enthusiast PC" revolves around gaming these days. The GPU is now considered more important than the CPU: it costs multiple times more, uses much more power, and takes up much more space. The CPU's role is second banana, there to not bottleneck the GPU.

So, it's understandable but at the same time somewhat amusing that J.Huynh had to repeat his "don't worry" three times. It's like a politician handing out campaign promises to constituents. You have to talk nice to everyone, even if you don't have much substance to offer.

Huynh's "I'm for scale" is of course a plausible rationale, just as THW's "halo products matter." But rationales tend to be after-the-fact excuses, and the fact is that AMD has neither the capability nor the motivation to compete on high-end GPU. No need to overthink it.
The 7900 XTX failed because it was really a 7900 XT, with the 7900 XT being the true 7800 XT. Only Nvidia gets away with things like that, because they own the market.
 

acadia11

Yeah, I got that. It's just that I don't quite understand what exactly do they need the developers for? Is it the game engines themselves so it's the game engine developers we're talking about here or the game developers?

My naïve ass was assuming that DirectX is some standard interface between GPU and the game.
I would expect the optimizations are for specific AMD GPU hardware vs. Nvidia GPU hardware and their associated API calls, not to mention drivers. For example, while both Intel and AMD offer x86 processors that understand the same language, more or less, they have differing implementations and additional functions the other may lack, so you optimize for both the differing implementations and the difference in available functions, for game engine developers and the games themselves to take advantage of. But this isn't my area, so probably some game developers can answer.
 

acadia11

You might be right about the halo effect. But it works better if you actually have the winning halo. And a perceived software lead is part of it, unfortunately for AMD.

As we all should know, AMD will switch strategies and naming schemes after just one generation. So if they think they can do better with some RDNA5 chiplet monster, they might try to grab the halo again. Until then, focusing on cheaper products when people are grumbling about high prices could be a good idea. If they can even allocate wafers to it.
Yup.
AMD is afraid of Intel's advancements with their graphics cards. No one beats Nvidia, and AMD is the first to see this, hence AMD's main focus now will be competing with Intel's upcoming graphics cards. Good luck, AMD. You will need it if max profit in low/mid-end GPUs is your goal.
Intel has other problems. GPUs aren't going to matter for them for some time. What's interesting is that they are getting their lunch eaten in the server/data center space by AMD, where the big bucks are; ARM is becoming a stronger player in that space; and the PC/laptop market, where Intel is strongest, is shrinking as consumers consume their compute in various new ways and on new devices. They missed the boat on the crypto craze, and because they are lacking strong GPU tech, they will be lacking in the AI department. I'm not sure anyone is worried about Intel in the GPU space, for many, many other reasons. But Intel has a ton of IP and still a very, very large business and brand mindshare, so betting on Intel could be a big winner or a heck of a bust. But I won't bet on them in the GPU space anytime soon. Maybe they'll do something in the NPU space, but I think Intel is the Ford/GM of the microprocessor industry at the moment.
 
>so long as AMD can have great drivers & features w/ performance at a cost for the mid tier ppl? thats a win.

Unfortunately, halo products (e.g. the 4090) do matter for sales of mid-range and budget cards. To wit, Steam's GPU survey for Aug '24. Calling it Nvidia dominance would be a drastic understatement.

https://store.steampowered.com/hwsurvey/videocard
I really don't think the halo effect is as big a deal as people make it out to be. Since the 5700 XT, AMD has launched price-competitive with Nvidia, and Nvidia has the mindshare. If AMD had launched the 7600 as a $200-225 part and the 7900 XT as a $700 part, this entire generation would look different. Instead, both launched merely price-competitive, and people kept buying Nvidia because the raster performance/$ was close enough and Nvidia had the feature advantage. Launch pricing dictates so much of the narrative and how reviews come across, which in turn dictates the generation.
 

vijosef

It was the software that gave Nvidia the edge.
Nvidia is always introducing the new stuff, like G-Sync, AI, and DLSS. AMD plays catch-up, and is always late and lower quality.
 

mhmarefat

They competed near the top with the 6900/6950 XT and 7900 XTX. Their market share remains atrocious.
The success of the 6900 XT was insane. It actually drove Nvidia so crazy that they created a GPU consuming 450 watts of power, fearing they might lose to AMD's next generation if they didn't resort to such methods.

Yet AMD was not wrong to make high-end cards such as the 6900 XT/7900 XTX. They only priced them moronically, blindly following in Nvidia's footsteps.

Nvidia is always introducing the new stuff, like gsync, AI and DLSS.
DLSS has done nothing but degrade modern gaming quality. Seven years since DLSS was introduced, even the best-implemented versions of it have many visual bugs and quality degradations. You may say it has increased gaming FPS greatly, but that is what developers should have accomplished through optimization. Instead they think "the user will turn on the upscaler anyway, no need for optimization!"
Upscalers have resulted in the almost complete death of gaming optimization.
 
I'm not sure there is any overcoming the obscene (at times unwarranted) Nvidia mindshare at this point. This strategy of abandoning the high end, while the right and worthwhile thing to do on paper, didn't really pay dividends for AMD when they tried it before, from the RX 480 and its rebrands through the RX 5000 series, and the 'technology' discrepancy wasn't even that extreme at the time (except perhaps the power consumption).

It'd be nice to be proven wrong, because I'm sick of the state of the consumer GPU market lately, but unlike Intel, Nvidia haven't taken their foot off the pedal. I just can't see it happening within a reasonable number of generations unless AMD somehow manage to produce a product that's absolutely stunning from a price/performance perspective, and even then I rather doubt it. Mindshare is a powerful thing. Not to mention AMD are likely still heavily focussed on directing wafers to the nonsense that is AI.


AMD needs to prioritize drivers and better technology in terms of image quality, and they desperately need to go back and make better hair rendering tech, as Nvidia's hair tech is visually better. They don't need to litter the market with loads of GPUs; they need to make a good solid four GPUs with better driver support.

AMD has been associated with bad drivers and with pushing power consumption just to compete with Nvidia.

While it doesn't matter in some countries, a 100 W or 150 W difference matters in countries like the UK.

Developers also aren't developing games around a 4090; they target more mid-tier hardware.

E.g., more games in 2025 will probably need 12 GB of VRAM, because we have cards like the 3060 12GB that are quite cheap.
 

qwertymac93

AMD is afraid of Intel's advancements with their graphics cards. No one beats Nvidia, and AMD is the first to see this, hence AMD's main focus now will be competing with Intel's upcoming graphics cards. Good luck, AMD. You will need it if max profit in low/mid-end GPUs is your goal.
Yeah, I can't believe nobody else is bringing up Intel. AMD doesn't have the resources to compete with Nvidia at the top and Intel at the bottom at the same time. Keeping Intel down by targeting the one place AMD has a hope of competing (mid-range) is the smart long-term strategy. If AMD tries and fails to beat Nvidia at the high end (which is pretty much a given at this point), they give Intel a chance to get a foothold in the market, where Intel could push their powerful developer relationships and, frankly, superior feature set to decimate AMD's already meager market share.

It's going to be a really tough fight if AMD keeps trying to maintain ASP (average selling price) while Intel is still in "gain market share at any cost" mode.

On another note, I realize this article is about gaming, but more and more owners of mid-range gaming cards are dabbling in local AI (image generation, for example), and Nvidia is WAY ahead in that space. Getting AI running on AMD in Windows is more difficult, and when you manage it, it's slower. ONNX is helping, but that's not an AMD initiative; it's benefiting Intel just as much (if not more). Even now, all the latest AI models are available in CUDA first and in ONNX some time later. Not to mention all the scientific papers and optimized formats Nvidia is "donating" to the AI field, which only further fortifies their position.
 
Yeah, I got that. It's just that I don't quite understand what exactly do they need the developers for? Is it the game engines themselves so it's the game engine developers we're talking about here or the game developers?

My naïve ass was assuming that DirectX is some standard interface between GPU and the game.


The issue is quite simple: developers of a game are not going to focus on high-end cards to run their games; they want a big audience to buy their games.

So they will devote their time and resources to making sure the game runs smoothly on common hardware, like what we had with, say, Hogwarts Legacy.

I had a 2060 Super 8GB and a 3060 12GB; both ran it well.

Game developers need to devote time to getting the engine running smoothly on real hardware, so they look at something like the Steam hardware survey to get an idea of what they will most likely have to program for, since all PCs are different, unlike consoles, which have maybe two hardware versions to program for.
 

rocketchatb

I don't normally post on Tom's Hardware, but this is a good interview. ATi/AMD/Radeon is in a unique position: if they play their cards right with this strategy, they can gain valuable market share, which can allow them to formulate a larger strategy for future generations.

Now that their GPUs also have L3 cache, I wonder if 3D stacking them on the die is the plan for the future. Imagine X3D CPU + X3D GPU.
 

usertests

Now that their GPUs also have L3 cache, I wonder if 3D stacking them on the die is the plan for the future. Imagine X3D CPU + X3D GPU.
The rumor mill thought this was doable with RDNA3, double-stacking the L3 from 96 to 192 MiB. That didn't happen, and the Infinity Cache amount actually went down for everything above the 7600 XT, without much fanfare. I guess 32 MiB is the minimum you want for a 1080p/1440p card.

https://www.notebookcheck.net/Repor...vidia-is-going-crazy-with-power.640545.0.html

In RDNA3 Navi 31/32, 16 MiB of Infinity Cache is paired with each 64-bit memory controller (4 GiB of VRAM). I think each pair sits on its own MCD, but I'm not sure. If it's not all in one place, maybe there would be a clear benefit from doubling it wherever it sits. Not enough of a benefit for AMD to do it yet, though.
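The scaling the poster describes can be sketched as a quick back-of-the-envelope calculation. This is only a sketch of the commonly reported RDNA3 figures (16 MiB of cache, a 64-bit controller, and 4 GiB of GDDR6 per MCD); treat the per-MCD numbers and MCD counts as assumptions, not official specs:

```python
# Sketch: Infinity Cache, bus width, and VRAM all scale with MCD count.
# Per-MCD figures below are the commonly reported RDNA3 values (assumptions).
CACHE_PER_MCD_MIB = 16   # 16 MiB Infinity Cache per MCD
BUS_PER_MCD_BITS = 64    # each MCD carries a 64-bit memory controller
VRAM_PER_MCD_GIB = 4     # each 64-bit controller feeds 4 GiB of GDDR6

def config(mcds: int) -> dict:
    """Derive cache/bus/VRAM for a hypothetical part with `mcds` MCDs."""
    return {
        "infinity_cache_mib": mcds * CACHE_PER_MCD_MIB,
        "bus_width_bits": mcds * BUS_PER_MCD_BITS,
        "vram_gib": mcds * VRAM_PER_MCD_GIB,
    }

navi31_full = config(6)  # e.g. 7900 XTX: 96 MiB cache, 384-bit, 24 GiB
navi32_full = config(4)  # e.g. 7800 XT: 64 MiB cache, 256-bit, 16 GiB
```

Under these assumptions, the 7900 XT's 80 MiB / 320-bit / 20 GiB configuration falls out naturally as a 5-MCD part, which is why the cache amount can't be tuned independently of bus width without stacking.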
 

YSCCC

IMO, for recent generations they aren't that bad compared to Nvidia. I was about to pull the trigger on a 7900 XTX to replace my 3070 Ti, but hey: you sell the card only slightly below the equivalent Nvidia part, at equal raster performance but much worse RT, and no one except die-hard fans will pay that bill. And I don't believe the BS that wafer costs almost tripled after the COVID scalping. To gain market share they need to offer a better cost/performance ratio, not just edge out the big player.
 

shady28

It makes sense, but in order to beat Nvidia even in midrange, they're going to have to do better than 5-10% lower cost for similar performance.

That's what they've been doing for years now, and it doesn't work. The buyer just looks and says, 5-10% is the premium for better support and better drivers.

They need to undercut Nvidia by 25%+, otherwise nothing changes.
 

Heat_Fan89

Reading some of the comments, my take is that AMD doesn't really care about high-end GPUs or taking the fight to Nvidia.

If anything, I would bet that AMD is much happier designing and making parts for Sony's and Microsoft's gaming consoles. That is much higher volume, into the tens of millions.

Nvidia has burned its bridges with most companies, including Apple, Microsoft, and Sony, to name a few, although rumors suggest that Nintendo might re-up with Nvidia for their next gaming console.

What Nvidia needs to understand is that there will be pushback from gamers as prices continue to climb and power consumption continues to increase. Just like inflation at the grocery store, expect gamers to opt for mid-tier solutions and developers to design their games around lower-spec machines. Most of my gaming is done on the Xbox Series X and PlayStation 5. The games on both look good enough that I no longer feel the need to shell out $2-3K to build a rig myself or buy a prebuilt.
 

rluker5

AMD is afraid of Intel's advancements with their graphics cards. No one beats Nvidia, and AMD is the first to see this, hence AMD's main focus now will be competing with Intel's upcoming graphics cards. Good luck, AMD. You will need it if max profit in low/mid-end GPUs is your goal.
AMD will get more rasterization frames without upscaling than Intel, even with Intel's significant improvements, but Intel is sneaking in a victory on the features side with smoother fixed frame delivery, better upscaling, and better ray tracing. If AMD is looking to win on rasterization/$ alone, they may face the same outcome with Intel that they have had with Nvidia. To a vocal minority, max-framerate raster/$ is most important, but that is a minority. It just seems like Intel has snuck ahead on other things.
 
>so long as AMD can have great drivers & features w/ performance at a cost for the mid tier ppl? thats a win.

Unfortunately, halo products (e.g. the 4090) do matter for sales of mid-range and budget cards. To wit, Steam's GPU survey for Aug '24. Calling it Nvidia dominance would be a drastic understatement.

https://store.steampowered.com/hwsurvey/videocard
Halo products don't matter to the people who don't spend that much on them.

And your link proves my point:
you have to go 16 places down the list before you encounter an 80-tier GPU, and 13 more from there until the next one (which is a 90-tier).

Most people don't buy halo products.

And if you don't make a halo product (and thus don't rely on selling a high-priced part to a small customer base), you can make more lower-tier cards and sell more of them, even at lower prices, and potentially earn more than you would by spending the resources and development on a halo product that may never sell well.