News AMD deprioritizing flagship gaming GPUs: Jack Huynh talks new strategy against Nvidia in gaming market


acadia11

Distinguished
Jan 31, 2010
965
26
19,010
Halo products don't matter to the people who don't spend that much on them.

And your link proves my point: you have to go 16 places down the list before you encounter an 80-tier GPU, and 13 more down from there until the next one (which is a 90-tier).

Most people don't buy halo products.

And if you don't make a halo product (and thus rely on selling high to a small customer base), you can make more lower-tier cards and sell more of them, even if they're cheaper, and potentially make more than you would by spending the resources/dev on a halo product that may never sell well.
The halo effect is about brand cachet and the quality of the products the manufacturer produces, not the actual top-end product itself, or buying it. Because Nvidia produces the best GPU, any GPU from them must be good. It's not necessary to buy their top-end product, because their brand recognition is synonymous with high performance and quality. It drives consumption of that company's lower-tier products. Which, as you say, means someone who would never buy their top-end product would buy their lower-end product. That is the halo effect in a nutshell.

Right now it looks like AMD can't produce a better GPU, as a whole, than Nvidia in the consumer GPU space, regardless of tier. True or not, that's the message some may take from conceding the top-end tier.
 
Last edited:
  • Like
Reactions: TesseractOrion

Giroro

Splendid
Read the room. They competed near the top with the 6900/6950 XT and 7900 XTX. Their market share remains atrocious.

The RDNA4 lineup likely includes something around 7800/7900. That's more of a gaming GPU than most people on the planet need.

I'd still like to see a 32 GB variant of the flagship for AI. Something they can probably do if they're using cheaper GDDR6 again.

Well, let me put it this way: all AMD knows how to do with their GPUs is copy Nvidia. That's why they always announce their prices last.
AMD thinks they can gain market share by giving a very slight discount over Nvidia on perf/$ for raster, despite the fact that Nvidia has conclusively beaten them in every other category and routinely invents new categories in which to beat AMD. They want to drive demand, but they also want to share in Nvidia's laughably high prices.
Even a child understands this doesn't work, because that child is saving up for Nvidia.

At least that's how it's been, but now the market is about to change dramatically, and I think AMD is still unprepared to capitalize on that.
Nvidia is in a position where they have little to no justifiable business case to waste silicon on gaming cards when it's 100x more valuable in their backordered AI products. Nvidia is competing with itself, and they do not want gaming cards to cannibalize sales of their higher end products. It would be foolish for Nvidia to release any new gaming cards until the AI bubble pops.

Maybe Nvidia will release a $2,000-ish product called the RTX 5090 at some point: an "entry level" AI-card upsell to their 4090 customers, offering something like 15% more performance for 25% more money, probably still with 24GB. Importantly, this RTX 5090 would absolutely not be a product targeted at gamers. Arguably neither was the 4090, but Nvidia paid to buy off a bunch of shill gamerz influencers for the 4090. I don't think Nvidia will have a reason to bother pretending that a hypothetical 5090 is a halo gaming card.
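(Napkin math on that, with hypothetical numbers: a roughly 4090-style $1,600 base price plus the +15%/+25% guesses above. The perf-per-dollar actually gets worse:)

```python
# Hypothetical upsell math: +15% performance for +25% price.
# All numbers are guesses from the paragraph above, not real specs.
base_price, base_perf = 1600.0, 1.00          # ~4090 launch MSRP, normalized perf
new_price, new_perf = base_price * 1.25, base_perf * 1.15

change = (new_perf / new_price) / (base_perf / base_price) - 1
print(f"perf/$ change: {change:+.1%}")        # -> perf/$ change: -8.0%
```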
But will their next generation go as low as an RTX 5070?
I sincerely doubt it, and definitely not this year.

AMD is set up for their top end card to be the gaming performance leader by default, due to a lack of any "gaming" competition from Nvidia whatsoever. AMD is going to have absolutely no idea how to price their cards. Will they ship first, for once, or just wait forever for a generation that never starts?

Not that a first-mover advantage would matter for AMD. Nobody's going to release an AAA game good enough to convince people to upgrade their GPU in the next couple of years. The open source community isn't going to suddenly volunteer to help AMD develop ROCm into a functional CUDA alternative.
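(To be fair, some of the plumbing already exists: PyTorch's ROCm builds reuse the torch.cuda namespace, so "CUDA" code can run unmodified on AMD cards. A minimal sketch, assuming a ROCm- or CUDA-enabled PyTorch install; the point is how much of the CUDA surface ROCm has to impersonate:)

```python
# Minimal check of which GPU backend PyTorch is actually using.
# Assumes a PyTorch build with either CUDA or ROCm support installed.
import torch

def describe_backend() -> str:
    if not torch.cuda.is_available():
        return "no GPU backend available"
    # torch.version.hip is a string only on ROCm builds, None on CUDA builds
    if getattr(torch.version, "hip", None):
        return f"ROCm/HIP {torch.version.hip} on {torch.cuda.get_device_name(0)}"
    return f"CUDA {torch.version.cuda} on {torch.cuda.get_device_name(0)}"

if __name__ == "__main__":
    print(describe_backend())
    if torch.cuda.is_available():
        # The same "cuda" device string works on both vendors' hardware:
        # this matmul goes through rocBLAS on ROCm, cuBLAS on CUDA.
        x = torch.randn(1024, 1024, device="cuda")
        print((x @ x).norm().item())
```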
If the only choice is between a shiny new $500 ""midrange"" AMD card, or an old Nvidia card that plays the same boring derivative games... I think a lot of people are just going to stick with their old card.
Maybe a lot of people would go for AMD if they brought the midrange back down to $200, but AMD could have done that to gain market share at any point since the RTX 2000 series, and so far they've refused.
 
  • Like
Reactions: acadia11

parkerthon

Distinguished
Jan 3, 2011
109
125
18,760
This is spin, making excuses for why they can't compete this generation. It's not as though they can't make a generational architecture that scales and competes both high and low to win market share. AMD's issue is that they can't beat Nvidia to market with each successive generation. They put up good-value GPUs, but they are always running a distant second. It just happens that mid-range GPUs are where people care about value, so sure, AMD sells better there. Maybe they are saving some resources by only manufacturing affordable chips, but people buy base-model luxury cars all the time just so they can say they own/drive a certain brand. You can't overstate how much having a high-end reputation drives consumer lust for a brand. Many PC gamers out there buy Nvidia in prebuilts with no clue how much they are overpaying. They just know Nvidia makes the fastest GPUs and rules the world at the moment. That's the big volume they are chasing. If, hypothetically, PS5s were sold with a choice of Nvidia or AMD GPU models, the Nvidia version would sell better even if it performed similarly or slightly worse overall.
 

8086

Distinguished
Mar 19, 2009
112
46
18,610
The solution to this problem should have been making flagship GPUs but undercutting Nvidia by 40-50%. So a 7900 XTX would have cost $600 vs. an NV 4080/4090.
 
  • Like
Reactions: eichwana

acadia11

Distinguished
Jan 31, 2010
965
26
19,010
Well, let me put it this way: all AMD knows how to do with their GPUs is copy Nvidia. That's why they always announce their prices last.
AMD thinks they can gain market share by giving a very slight discount over Nvidia on perf/$ for raster, despite the fact that Nvidia has conclusively beaten them in every other category and routinely invents new categories in which to beat AMD. They want to drive demand, but they also want to share in Nvidia's laughably high prices.
Even a child understands this doesn't work, because that child is saving up for Nvidia.

At least that's how it's been, but now the market is about to change dramatically, and I think AMD is still unprepared to capitalize on that.
Nvidia is in a position where they have little to no justifiable business case to waste silicon on gaming cards when it's 100x more valuable in their backordered AI products. Nvidia is competing with itself, and they do not want gaming cards to cannibalize sales of their higher end products. It would be foolish for Nvidia to release any new gaming cards until the AI bubble pops.

Maybe Nvidia will release a $2,000-ish product called the RTX 5090 at some point: an "entry level" AI-card upsell to their 4090 customers, offering something like 15% more performance for 25% more money, probably still with 24GB. Importantly, this RTX 5090 would absolutely not be a product targeted at gamers. Arguably neither was the 4090, but Nvidia paid to buy off a bunch of shill gamerz influencers for the 4090. I don't think Nvidia will have a reason to bother pretending that a hypothetical 5090 is a halo gaming card.
But will their next generation go as low as an RTX 5070?
I sincerely doubt it, and definitely not this year.

AMD is set up for their top end card to be the gaming performance leader by default, due to a lack of any "gaming" competition from Nvidia whatsoever. AMD is going to have absolutely no idea how to price their cards. Will they ship first, for once, or just wait forever for a generation that never starts?

Not that a first-mover advantage would matter for AMD. Nobody's going to release an AAA game good enough to convince people to upgrade their GPU in the next couple of years. The open source community isn't going to suddenly volunteer to help AMD develop ROCm into a functional CUDA alternative.
If the only choice is between a shiny new $500 ""midrange"" AMD card, or an old Nvidia card that plays the same boring derivative games... I think a lot of people are just going to stick with their old card.
Maybe a lot of people would go for AMD if they brought the midrange back down to $200, but AMD could have done that to gain market share at any point since the RTX 2000 series, and so far they've refused.
This guy gets it. The consumer GPU market is pretty irrelevant in the wider context of where the money is going to be made. Microprocessor capacity has a global limit, and both AMD and Nvidia will likely make decisions accordingly. AMD may just be saying their resources are going toward their money makers: the far more lucrative data center and AI market. All that said, it's a curious strategy to have no top-end GPU and believe one can gain market share in the GPU space; it might be misdirection. He did mention that they would take a chiplet approach, and that may be more telling: designs that can be scaled, much like Athlon/Opteron and Zen/Epyc, to allow capture of a more lucrative market?
 
  • Like
Reactions: TesseractOrion

umeng2002_2

Respectable
Jan 10, 2022
265
246
2,070
lol. They never prioritized them before. AMD's GPU division is run by clowns.

And it's not just about the high end. Their RT is a generation behind nVidia. They still don't have AI hardware acceleration for features like DLSS. Always a day late and a dollar short with them.
 

vinay2070

Distinguished
Nov 27, 2011
294
85
18,870
All bull. They can't achieve Nvidia's level of performance and they know it. And their card manufacturing cost was higher than Nvidia's. If they price their cards low, Nvidia can lowball them and still make more profit than AMD. They also need to work on their software front: bring FSR up to DLSS's level, add CUDA-like features, and fix the hardware RT performance.

And yet again, with all these talks, they will once again shoot themselves in the gems next gen by pricing their cards 5% below Nvidia and keep wondering why nobody is buying them.
 
  • Like
Reactions: KyaraM

usertests

Distinguished
Mar 8, 2013
928
839
19,760
AMD, kick this guy out ASAP. The moment he said AMD will not fight to be #1, the company will start to decline and never return.
AMD did the same thing during the RDNA1 generation, and we've already known about RDNA4 targeting somewhere below 7900 XTX (raster) for months.

AMD needs to focus on improving their software, and maybe then they can go for the crown and get good results. But in truth they need to chase enterprise AI sales immediately while there's still big bucks to be made.
 

ccoonansr

Honorable
Jan 10, 2018
20
7
10,515
Random thoughts on this...

The 7900 XTX was never meant to compete against the 4090. It was aiming for the 4080.

AMD tried very hard on a chiplet GPU strategy that failed. No word on whether they will still chase this. This would be the fastest route to a halo product without breaking the bank on R&D, which would take too much from AI; that's why they tried it to begin with. The fact that it didn't work is a testament to how little they care about the consumer GPU space right now.

Huynh is lying through his teeth. With AMD in consoles, they don't have to beg devs to optimize for AMD. They already are.

AMD, Intel, and Nvidia all participated in what I like to call "The Great Stack Pushdown" while raising prices: adding 9-series CPUs and GPUs where their 7-series used to sit, to drag more money out of people without having to innovate. The bottom line is the bottom line. Gamers are the very last thing any of these companies care about. Just like it used to be before the first gaming generation grew up.

They are using profits to chase AI, not consumer GPUs. It's so obvious it's ridiculous.

AMD has deals with both MS and Sony, so they are not going to produce much outside the mid-range volume segment. There is simply not enough fab space to waste silicon on halo products even if they could make them, which they cannot, as consumer GPU R&D has been radically slashed due to the AI focus.

AMD is not trying to claw back market share in consumer GPUs. They have no need to so long as they are in consoles. It's a nice-to-have as far as they are concerned, but not necessary from a financial standpoint. They can keep putting out mediocre GPUs without killing their fab space and still slowly see their market share rise. Not a great long-term strategy, but for the time being it works.

Given limited fab space, they will not mass-produce a halo product that would never beat Nvidia. It is a waste of resources, with the current volume being eaten up by the chips going into the pro consoles and the rest of their product stack as is.

We may not see another halo GPU from AMD for a while if they can't fix chiplet GPUs. If they end up in next-gen consoles again, it may be longer.

This last one is purely theoretical...

Each time a CPU has been used in a console, it was at the very end of its foreseeable life cycle in mainstream products. The 8-bit 6502 and derivatives, and the Z80, were last seen in consoles before being put out to pasture.

The 16-bit and 32-bit M68k and derivatives were last seen in consoles.

PowerPC was last seen in consoles.

When a product fully matures along with its ecosystem, it is then dropped for the next fresh new thing. When this happens, the price bottoms out and the console makers grab them up at a discount. The people programming for them and those making them are all too happy to cooperate, as it extends the life of an otherwise dead product line.

The exception to this has been ARM RISC designs. They have found their way into everything since their inception, and they will be coming to consoles and gaming devices as the primary source of power soon enough. Barring older PCs, the 3DO used one, the Switch does, and the Switch 2 will as well. Nintendo has used these chips in most of their handhelds since day one. The mobile gaming market, the largest market by install base, already uses them. Since they still have not hit their zenith in performance per watt at a high performance level, they will be around a while longer.

I expect to see x86 and co. in the trash heap after the console makers are done with them, and I fully expect custom ARM and/or RISC-V designs taking their places. The majority of people on this planet find the performance of their smartphones, tablets, and Mac products to be more than enough when software is optimized.

x86 has always had a "more power" attitude, and the instruction set is not customizable at this time. This will hurt it long term. Even if Intel's 18A is a smash hit and AMD finds similar success at TSMC, they still do not allow fully customizable designs. AMD has let the console makers dictate needs, but that's as close as it's gotten. If x86 does not open up while shrinking enough to outperform ARM and RISC-V designs in both power AND performance, it will die.

I shouldn't have to point out that HPC and data center models of these chips are already in use, both customized and optimized.
ARM is slowly taking over the world while gamers chat and/or argue over AMD GPUs.

I've already had people telling me this won't happen and that x86 is forever. But then the same attitudes were applied to steam locomotives, fossil-fuel vehicles, and of course the Betamax. All three were the most powerful of their kind. Steam is more powerful and cheaper than diesel, yet diesel won. Gas vehicles beat diesel but are going to fall to electric. Betamax had higher resolution, a smaller form factor, and could record a similar amount of material per tape, yet it lost to VHS. x86 is no different. If you don't believe me, ask one of the most prolific x86 designers in the industry, Jim Keller, and he will tell you. Even his new AI company with Raja Koduri is using custom RISC-V for its CPU needs. If that, plus the fact that Nvidia is using custom ARM as well, doesn't wake you up, nothing will.

Why is this important to AMD GPUs, you wonder? If x86 dies, there will be no more money for R&D on future AMD GPUs; AMD will be filing for bankruptcy if that happens. This is why Gelsinger is betting the farm on node superiority. They have to.
 
Last edited:

Penzi

Prominent
Nov 22, 2022
5
4
515
I’m torn. For only the second time, I don’t have a top-end GPU in my gaming Win PC, but I still went with nVidia despite cheering for the underdog. Why? Because I value RT and DLSS… and Intel’s XeSS. Unless AMD joins that fray properly, I cannot see myself getting an AMD GPU (I did have a Vega 56 eGPU on my Mac moons ago). I am quietly optimistic about Arc going forward, but since it seems only nVidia is doing high-end GPUs, I may be stuck with them… might pick up a Battlemage eGPU for the NUC, though… or perhaps AMD has something interesting up its sleeve…
 
Jul 19, 2024
1
0
10
No 8900 XTX? Then prepare for 10% market share. Crazy. I had the 7900 XTX and it was a monster! (Sold my whole PC; didn't swap to Nvidia.) If they had brought the 7900 XTX out at £899 on day one, it would have sold loads!
 
AMD will target the AI dream and give the middle finger to desktop users.
In those environments they don't need RTX, and AMD cannot compete with Nvidia anyway.
In the near future I see Nvidia ARM CPUs flooding the desktop/mobile market. Intel will survive on military contracts, fabs, and the other high-end tech they have.
AMD's traction will fall year over year.
 
Last edited:
  • Like
Reactions: KyaraM
So the main rumour that has been circulating since... almost two years ago now... seems to have been true, but for different purported reasons.

The rumour mill back then said it was because AMD had issues with RDNA3, to the point where they just couldn't see how fixing the issues in RDNA4 would make sense, financially or commercially, against nVidia, especially considering the TSMC allocation they need to fill.

Reading between the lines, I conclude the following:
1- AMD can't develop a high-end generation right now because CDNA needs more oomph.
2- TSMC has limited capacity to give AMD, no matter how good the relationship between them is. TSMC is making nVidia pay dearly for its allocations, so money speaks first and AMD accepts it. There's also Apple fighting for allocation.
3- RDNA4 could not be fixed, and they're leaving that for RDNA5, or just scrapping bigger multi-die GPUs in favour of multi-die CDNA variants instead, which would make more financial sense given the above two points.
4- For a true high-end multi-die GPU they'd have basically no margin until nVidia decides to go that route as well, which should be after Blackwell. Stacking the cache was reported to be too expensive to make sense, so there's also that.

As a regular consumer, while I like to see people discussing which Ferrari or Porsche or Lambo is best, I'm still happy with my Focus. I want Ford to make the Focus line the best it can be, because that's what I want to buy and that's the price range I want to pay in. I don't want anything higher than that, and I don't want to be forced upmarket either; that would suck. So, cringe as it is, I have to say I like AMD putting the pressure on nVidia and Intel in the segment that matters to me.

I hope AMD can repeat the HD 4870 and RX 480 (not the rebrands) competition days, because that is where they sell the most, and this fella is 100% spot on: OEMs won't care if they can't move inventory with AMD GPUs in their laptops or PCs. AMD needs to compete on price, since brand perception is in the dirt.

Regards.
 

Papusan

Distinguished
Jul 26, 2016
59
48
18,560
Now that their GPUs also have L3 cache, I wonder if 3D stacking them on the die is the plan for the future. Imagine X3D CPU + X3D GPU.

Hmmm. Maybe re-read the article. AMD wants to make cheaper cards to maximize their profits. They hope cheaper cards in the low-to-mid-range segment will boost their sales. You don't do that with more expensive features. There's a reason X3D CPUs cost more than the vanilla Ryzen chips.
 
  • Like
Reactions: KyaraM

DS426

Upstanding
May 15, 2024
254
189
360
AMD is right that they need to improve on go-to-market and developer support; when devs are optimizing not just for nVidia hardware but for AMD as well, that's where AMD can climb ahead on perf-per-$. Otherwise it's just AMD having to kill margin to move GPUs, and that just doesn't make good business sense.

AMD almost had the jump on nVidia with high-end RDNA3 having a chiplet architecture, but as with most first-generation things, the teething pains were worse than expected, and the 4090 was larger than anticipated. I also think about Fury X and others... indeed, AMD takes the leap every now and then, and as others have said, there's just no meaningful way to steal marketshare from nVidia at the top end.

I think similar to AMD's struggles against Intel in the past regarding market share and adoption, AMD needs to have better outreach and partnership programs.
 
  • Like
Reactions: usertests
Not exactly shocking news. But at least AMD isn't so delusional as to think they are competing on the high end, at least for the next generation, and probably the one after that as well.

They should pour their R&D money into getting the cost per unit down so they can undercut Nvidia on price while still maintaining a healthy profit margin. That's the only viable long-term strategy to win on price. Just cutting the price without changing anything else could create a vicious downward spiral of not having enough money to fund R&D for the next-gen cards.

Intel waiting in the wings, possibly coming after the low/mid tier as well, is a serious threat. But Intel has its own major, MAJOR problems going on. They may drop GPUs entirely again, even though they absolutely need them as a long-term strategy. At a certain point, Nvidia isn't going to be able to squeeze any more money out of the GPU market and will do the logical thing and invade the CPU market.
 
  • Like
Reactions: hannibal

AndrewJacksonZA

Distinguished
Aug 11, 2011
596
106
19,160
OT:
"Last edited by a moderator: Today at 1:53 AM" Sorry for the OT, but I don't know who to direct this question to. What was wrong with the post that needed editing, please?


Agree, but nothing in life worth having is "easy." It won't be easy for the shareholders who think short-term. For the shareholders who have longer-term thinking, it won't be easy either, but it might just be worth it.
 

hannibal

Distinguished
5090=$2999 pre-scalped price.
It will sell like hotcakes!

The reality is that Nvidia can, or at least could, sell GPUs cheaper than AMD can. Nvidia sells so many more GPUs that they can divide the development cost among far more units. Nvidia's profit margins are also thicker, so they can put more money into development...
AMD has no chance in this competition in the high end... not even in the mid-range or low end, if Nvidia chose to compete at low margins while AMD is forced to sell its GPUs. So as long as AMD does not price their GPUs too low, Nvidia is happy. And if AMD drops prices too much... it will lose to Nvidia totally... just because Nvidia can easily sell their GPUs cheaper... while AMD cannot go as low.
By dropping the high end, AMD saves some development money, so they can get slightly higher margins on their mid- and low-end cards when the high end is not inflating their development costs. Let's see if it pays off... but AMD doesn't have many other options... They are battling uphill, and unless Nvidia makes some huge mistake... the situation is not getting better.
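(Back-of-envelope version of that volume argument; every number below is made up, it's just the amortization mechanic:)

```python
# Per-unit cost = fixed R&D spread over units sold + per-card build cost.
# The higher-volume vendor can match any price at better margins.
def unit_cost(rd_cost: float, build_cost: float, units: int) -> float:
    return rd_cost / units + build_cost

RD = 2_000_000_000        # hypothetical R&D bill for one GPU generation, $
BOM = 300                 # hypothetical per-card manufacturing cost, $

for vendor, units in [("high-volume vendor", 30_000_000),
                      ("low-volume vendor", 5_000_000)]:
    print(f"{vendor}: ${unit_cost(RD, BOM, units):,.0f} per card")
# high-volume vendor: $367 per card
# low-volume vendor: $700 per card
```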

If the 5060 is $500+, the 5070 $899, the 5080 $1,500, and the 5090 $2,500... there is room for an AMD 8800 XT at $700, an 8700 XT at $550, and an 8600 XT at $450...
That way AMD can make some money... let's see if it affects market share... most likely nothing will. If Intel drops out of the GPU market (again), there is some room for AMD to grow. Once Intel's share has been divided between Nvidia and AMD... the rest is much more difficult. AMD needs a miracle or a total Nvidia meltdown, and neither seems to be happening. Nvidia GPUs sell better even when AMD is the same price and 30% faster in the low end. Hard to see what could change that.
Unless people start buying the best bang-for-the-buck GPUs... and in the past, that has not happened. So it's hard to see why it would happen this time.
Let's predict that the 8800 XT is super good! At the level of the 4080 Super... and AMD starts selling it at $600... Nvidia just drops the 4080 Super to $800, and AMD gains nothing... because people would buy the "cheap" 4080 instead. For AMD it's a lose-lose situation. AMD drops the price more... Nvidia drops the price more... AMD starts losing money on GPUs, like Intel, while Nvidia still makes a profit... and people would still buy the 4080 Super more... so lose, lose, unless Nvidia does something super stupid... which does not seem to be happening.
They can easily price the 5080 and 5090 high. No competition. And put the 5070 $200 higher than the 8800 XT and they are just fine!
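(The price-war loop above, as a toy; the costs and the brand premium are invented, the point is who runs out of margin first:)

```python
# Each round: AMD cuts price chasing share; Nvidia matches plus a brand
# premium. Nvidia's assumed lower unit cost keeps it profitable longer.
AMD_COST, NV_COST = 550, 450      # hypothetical per-card costs, $
BRAND_PREMIUM = 100               # assumed extra buyers pay for Nvidia, $

amd_price = 700
while amd_price > AMD_COST:
    nv_price = amd_price + BRAND_PREMIUM
    print(f"AMD ${amd_price} (margin ${amd_price - AMD_COST}) vs "
          f"Nvidia ${nv_price} (margin ${nv_price - NV_COST})")
    amd_price -= 50               # AMD cuts again
# AMD reaches zero margin while Nvidia is still comfortably in the black.
```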
 

vijosef

Upstanding
Feb 26, 2024
111
113
260
DLSS has done nothing but degrade modern gaming quality. It's been 7 years since DLSS was introduced, and even the best-implemented versions of it have plenty of visual bugs and quality degradation. You may say it has increased gaming FPS greatly, but that is what developers would otherwise have had to pursue through optimization. Instead they figure "users will turn on the upscaler anyway, no need for optimization!"
Upscalers have resulted in the almost complete death of game optimization.
That's irrelevant to my argument. It's as if you complained that Python became the most-used language once CPUs became so powerful that they allowed languages to waste power.
It's true, but still, the best CPU is the best CPU, and the best GPU is the best GPU.
 
  • Like
Reactions: KyaraM

mikey100tv

Distinguished
Jan 18, 2014
34
1
18,530
It'd be nice to be proven wrong because I'm sick of the state of the consumer GPU market lately, but unlike Intel, Nvidia haven't taken their foot off the pedal
Oh, that don't take much figuring out.

Jensen's 2 years younger than me. This puts him the wrong side of 60. He's not THAT far off retirement, right? He simply wants to feather his nest as far as he possibly can before calling it a day and handing over the reins to the younger generation. Can you blame him?

He's already one of the richest men in the world. As of May this year, somewhere in the region of US$90 billion. Yah, it's an obscene amount of money by anyone's standards... but for a relatively poor immigrant to the States back in the early-to-mid '70s, there's no denying the guy's done damn well for himself!

~~~~~~~~~~~~~~~~~~~~~~~~~​

Bear this in mind: Asian culture prioritizes "hard work" above ALL ELSE. When they DO retire, they are revered and enormously respected by the younger generations. This is "blanket" behaviour across the whole of the Asia-Pacific basin, and is endemic to Far Eastern culture.

It's got precisely nowt to do with GPUs, or datacenters (or AI), or even computing in general. It's all about the race to his personal "finish line" now.

You do the math. And as to whether this has anything to do with the head of Nvidia being related to the head of AMD, well....... :LOL:


Mike. ;)
 
Last edited:
  • Like
Reactions: KyaraM

marcus_br_

Distinguished
Mar 3, 2010
3
0
18,510
Actually, if they can make a more efficient mid-range, they could just as well have the best flagship.
If they can't compete on the flagship, their mid-range will be just crap and they will never be able to win on price x performance.

Mid-range is always the stripped-down version of something bigger and better... and efficiency counts a lot.
I guess we will see.
 
Actually, if they can make a more efficient mid-range, they could just as well have the best flagship.
If they can't compete on the flagship, their mid-range will be just crap and they will never be able to win on price x performance.

Mid-range is always the stripped-down version of something bigger and better... and efficiency counts a lot.
I guess we will see.
That is factually wrong.

A bigger chip carries a non-linearly higher cost, due to defect spread and to how transistor counts scale up. And GPUs, much like CPUs, don't scale linearly with transistor count anyway.
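To put rough numbers on it, here's the classic Poisson yield model; the wafer cost and defect density below are assumptions in a plausible range, not any foundry's real figures:

```python
# Cost per good die grows much faster than die area, because yield
# falls exponentially with area (Poisson defect model) while the
# number of candidate dies per wafer falls only linearly.
import math

WAFER_COST = 15_000.0    # $ per 300 mm wafer (assumption)
WAFER_AREA = 70_685.0    # mm^2 of a 300 mm wafer (pi * 150^2, ignoring edge loss)
DEFECT_DENSITY = 0.001   # defects per mm^2 (assumption)

def cost_per_good_die(die_area_mm2: float) -> float:
    yield_rate = math.exp(-DEFECT_DENSITY * die_area_mm2)
    dies_per_wafer = WAFER_AREA / die_area_mm2
    return WAFER_COST / (dies_per_wafer * yield_rate)

for area in (200, 400, 600):
    print(f"{area} mm^2 die: ~${cost_per_good_die(area):,.0f} per good die")
# 200 mm^2: ~$52   400 mm^2: ~$127   600 mm^2: ~$232  (3x area -> ~4.5x cost)
```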

A card of any architecture you choose requires specific designs and tuning for different power and performance envelopes. Otherwise, nVidia, AMD, and Intel (with Alchemist) would just design one die and cut it down accordingly. It doesn't make economic sense.

Regards.