News: Intel Claimed 6 Percent of Discrete GPU Market in Q4

The fact Intel grabbed 6% of the consumer dGPU market is still insane.

The fact it came at the cost of AMD GPU sales shows what a weak position AMD is in. AMD has always had issues with poor drivers, poor encoding performance, poor raytracing, etc.

Intel seems to be aggressively improving drivers, raytracing and encoding support.
 
Intel is doing better than expected, considering they put Koduri in charge and let him permanently ruin the Arc brand with gamers.
But maybe the brand will have a fighting chance with a bit of corporate restructuring and advertising focusing on their strengths for media creation/streaming and business use.
As it stands, good luck finding accurate information about whether you can use Arc graphics to run a Plex server, which is something it should be very good at.
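For what it's worth, one quick way to sanity-check this yourself is a throwaway transcode through Quick Sync, the path Plex uses for hardware transcoding on Intel GPUs. Below is a minimal sketch, assuming an ffmpeg build with QSV support and an input file named sample.mkv (both are assumptions on my part):

```python
# Hypothetical sanity check: can this machine hardware-transcode via
# Intel Quick Sync (what Plex uses on Arc)? Assumes ffmpeg is on PATH,
# built with QSV support, and that sample.mkv exists.
import subprocess

def qsv_transcode_works(src: str = "sample.mkv") -> bool:
    """Try a short transcode through Quick Sync and report success."""
    cmd = [
        "ffmpeg", "-v", "error",
        "-hwaccel", "qsv",     # decode on the GPU
        "-i", src,
        "-t", "5",             # only transcode the first 5 seconds
        "-c:v", "h264_qsv",    # encode on the GPU too
        "-f", "null", "-",     # discard the output
    ]
    return subprocess.run(cmd).returncode == 0

if __name__ == "__main__":
    print("QSV transcode OK" if qsv_transcode_works() else "QSV transcode failed")
```

If that exits cleanly, the same Quick Sync path Plex leans on is at least functional on the card.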
 
I mean, that's still a heck of a turnout for a first run at it. It can only get better from here.
The fact Intel grabbed 6% of the consumer dGPU market is still insane.
As noted in our previous report, JPR data is based on dGPU sell-in to the channel and says nothing about dGPU sell-out to consumers and businesses that are actually using the dGPUs. I would still wager (heavily) that there are a lot of Arc GPUs in warehouses sitting in large containers rather than in PCs and laptops. Those will inevitably get moved out to consumers over the coming year, and I suspect Intel isn't ordering more Arc wafers from TSMC at this stage. Intel grabbed 6% of the sell-in market last quarter. That is not the same as 6% of the consumer GPU market.
 
Intel aside, it does show again that while AMD essentially positions itself as a "value alternative" to nVidia, people are still not flocking to buy their cards, something we see year after year. It could be exclusive features or it could be software stability (for me it's the latter), but AMD needs to finally recognize that performance isn't what's holding them back.
 
it does show again that while AMD essentially positions itself as a "value alternative" to nVidia,
Is that AMD's positioning, or just how we're used to thinking of them?

Nvidia is definitely seen as the premium brand. So, it makes sense to me that when availability returned and prices dropped, people who might've "settled" for an AMD GPU were instead opting for Nvidia - even at a price premium.

It could be exclusive features or it could be software stability (for me it's the latter),
People expecting to hold onto their GPU for a while might also really be starting to weight ray tracing performance more heavily.
 
Glad Peddie caught it, because it certainly was not believable. But it does spotlight the problem with these companies that do estimates, like Peddie. Lots of stuff gets mixed up and things can be misunderstood or overstated easily. All such estimates should be taken with a grain of salt, imo.
 
Is that AMD's positioning, or just how we're used to thinking of them?

Nvidia is definitely seen as the premium brand. So, it makes sense to me that when availability returned and prices dropped, people who might've "settled" for an AMD GPU were instead opting for Nvidia - even at a price premium.


People expecting to hold onto their GPU for a while might also really be starting to weight ray tracing performance more heavily.

Yes, nVidia is definitely the premium-priced brand, no question...😉
 
It could be exclusive features or it could be software stability (for me it's the latter), but AMD needs to finally recognize that performance isn't what's holding them back.

Right.

I use Adobe Rush for small video projects. I use DaVinci for larger projects.

Both support Nvidia CUDA perfectly. On an AMD GPU they are a stuttering nightmare; the supposed OpenCL support is a mess.

I cannot even trust AMD to give me a bug-free render, let alone do it fast. With an AMD GPU I would have to rewatch every single video to make sure the render is clean. With Nvidia's CUDA I know it's clean.

In reality, I don't give a damn about either of these billion-dollar companies. I buy what works best; I'm brand-agnostic about this kind of stuff. So, I would be more than willing to buy an AMD GPU if they show they care about more than pure FPS. Driver stability and hardware acceleration in video programs are just as important to me as FPS.

But every single AMD presentation is a stupid slide about having 2 or 3 fps more than some Nvidia GPU in some select benchmark.

If you want to do video editing, AMD argues you should just buy one of their $4,000+ Threadrippers. I'm sorry, get lost, AMD; video streaming and editing went mainstream with NVENC and CUDA, not with a $4,000 64-core CPU.

I am editing videos on my $300 Nvidia RTX 3050 thanks to CUDA, and it is faster than any CPU.
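For anyone curious, the NVENC side can be smoke-tested the same way; a minimal sketch, assuming an NVENC-capable ffmpeg build and an input file named clip.mp4 (both assumptions):

```python
# Hypothetical NVENC smoke test. Assumes ffmpeg is on PATH, built with
# NVENC support, and that clip.mp4 exists.
import subprocess

result = subprocess.run([
    "ffmpeg", "-v", "error",
    "-i", "clip.mp4",
    "-t", "5",               # only encode the first 5 seconds
    "-c:v", "h264_nvenc",    # Nvidia's hardware H.264 encoder
    "-f", "null", "-",       # discard the output
])
print("NVENC OK" if result.returncode == 0 else "NVENC failed")
```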
 
I use Adobe Rush for small video projects. I use DaVinci for larger projects.

Both support Nvidia CUDA perfectly. On an AMD GPU they are a stuttering nightmare.

I cannot even trust AMD to give me a bug-free render, let alone do it fast. With an AMD GPU I would have to rewatch every single video to make sure the render is clean. With Nvidia's CUDA I know it's clean.
Seems like a chicken-and-egg problem. As long as AMD has such issues, few people will use it with this software and that means fewer bug reports and less incentive for the software vendor to focus on doing their part to make sure the AMD backend works well.

I don't think the blame lies 100% on AMD's side, but AMD will probably have to do the lion's share of the work to rectify the situation. It's like with AAA games, where AMD will probably need to loan out its own developers to help the software vendor optimize their software to run on its GPUs.
 
Aha, calling Jon Peddie Research reputable is like calling a censored reputable. Sorry, but it is simply amusing. And the same goes for the Steam HW survey.

And I'm not saying that any of these are in collusion with nVidia. It is just due to methodology. AFAIK JPR follows shipments of a few select (mostly nVidia-only) OEMs. Steam, on top of all the recurring weirdness (honestly, I've seen instances where no-longer-produced CPUs suddenly appeared with 2%+ of the install base), "multiple-dips" users of internet cafes, so the same system is counted several times.

Look at the financial statements of both companies and you'll get a better insight over how the market is going.
 
Whenever a company such as Intel manipulates its own stock price by exaggerating fake sales (and then employees sell at the rise), they should be forced to pay tax, shipping, etc. on every single sale they CLAIM they made on items that don't exist.

Suddenly, manipulation by doubling/tripling claimed sales wouldn't be in Intel's best interests.

Followed by a full audit of every single company employee that bought OR sold stock, since that's market manipulation AND insider trading.
 
Whenever a company such as Intel manipulates its own stock price by exaggerating fake sales (and then employees sell at the rise), they should be forced to pay tax, shipping, etc. on every single sale they CLAIM they made on items that don't exist.

Suddenly, manipulation by doubling/tripling claimed sales wouldn't be in Intel's best interests.

Followed by a full audit of every single company employee that bought OR sold stock, since that's market manipulation AND insider trading.
They did state all the sales they made, no more and no less.
The statistical organization just mixed consumer and compute GPUs together: everybody else reports it one way, so they didn't bother checking whether Intel does it the same way before releasing their results.
 
Steam, on top of all the recurring weirdness (honestly, I've seen instances where no-longer-produced CPUs suddenly appeared with 2%+ of the install base), "multiple-dips" users of internet cafes, so the same system is counted several times.
Steam HW Survey has had serious miscounting issues in the past, but AFAIK those are corrected now. Internet cafes were a big one, but that was fixed in May 2018: https://steamcommunity.com/discussions/forum/0/1696046342855998716/ I follow the Steam data basically every month, and for GPUs I look at the API pages so that I can see more details on the hardware: the main video cards page cuts off anything below ~0.15%, while the API page goes down to ~0.01%. There have not been any massive, unexplainable fluctuations in a long time. Again, internet cafes were a big issue for a while, because they were all being counted multiple times (every different user that got surveyed), but now Steam has a way to detect that it's a shared PC and it only gets counted once.

As for your assertion that, "I've seen instances where no-longer-produced CPUs suddenly appeared with 2%+ of the install base," I'm going to call bunk. Sorry, that's just blatantly false. Show me one case of this happening in the past two years, please. Because, oh, wait! Steam doesn't even report what CPUs people are using. You've got this page that shows manufacturers sorted by clocks, and this page sorted by numbers of cores. There's no way to see, for example, how many people are using a Core i7-4770K or a Core i9-13900K, or a Ryzen 9 5900X versus a Threadripper 1920X. And based on clocks topping out at "3.7 GHz and above," if that's boost clock then just about any modern CPU falls into that category. And if it's not boost clock (which seems likely), it's even more idiotic. Intel has a lot of chips that will routinely stay above 4 GHz, even on laptops, but with official base clocks in the low 2 GHz range (and sometimes even below 2 GHz). It's why the most popular category of Intel CPUs (in Windows) shows up at 2.3 to 2.69 GHz: even though most far exceed those clocks, what Steam polls is the often meaningless base clock.

The most we can say for certain from the CPU pages is that CPUs with six physical cores (i7-8700K, i5-9600K, Ryzen 5 1600, etc. etc. etc.) are the most popular category, 4-core CPUs are the second most popular, and 8-core are third. Those three categories currently account for 81.78% of the surveyed PCs. What about something like the 12900K? Well, it's a 16-core part and is in a category with month-over-month growth, joined by the 13700K and seven other Raptor Lake CPUs, plus 11 additional Alder Lake CPUs, plus the 5950X and 7950X. The 12-core category includes the 5900X/7900X from AMD, but on the Intel side there are 19 Alder Lake and 10 Raptor Lake CPUs that have 12 physical cores. Most of the Intel stuff is laptop parts, but like I said, there's no way to pull those out of the equation since Steam doesn't list CPU model numbers anywhere.

I'd love for Steam to have a breakdown that shows laptop versus desktop use. That would really help clarify a lot of things! And if we had pages showing just desktops, it would be great to see how the GPU numbers look. Steam has a lot of things it chooses not to reveal for a variety of reasons, but it's still the best source of data on what PC components people are using. Even if it obscures a lot of stuff (again, mostly because laptops are probably at least half of all PCs these days and often have far weaker hardware).
 
780k units sounds more realistic to me; now, if someone wanted to argue that only 280k of those units are actual sales, I could at least buy the argument. A modern-day company using last-century manufacturing methods to stuff the channel with 1.1 million units, without demand for them, didn't add up with how the manufacturing world works today.
 
Again, internet cafes were a big issue for a while, because they were all being counted multiple times (every different user that got surveyed), but now Steam has a way to detect that it's a shared PC and it only gets counted once.
What's funny about that is I'm not even sure it's the most relevant metric. What seems more important isn't the number of actual hardware units that exist in the wild, but how many hours of gameplay they end up seeing. Or, from a publisher's perspective, perhaps how many dollars are spent by the users using them.

If big daddy logs a couple hours per month with his $5k gaming rig, or li'l grandma games for two 15-minute sessions with her ancient iGPU, should they count the same as the 70-hour-per-week, basement-dwelling man-child? I'd argue that if man-child spends more money on Steam, he should count for more. Publishers should want to cater to who's actually paying their bills.

A flip side of this is someone who wants to play the latest AAA but won't spend the cheddar, because they've read the reviews and know it'll play like garbage on their machine. That's why I'd think it better to index by hours played rather than $ spent. Maybe publishers feel somewhat differently.
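To make the idea concrete, here's a toy sketch of the same survey data indexed three different ways; every record is made up, and the (gpu, hours, dollars) tuples are purely illustrative:

```python
# Toy illustration: the same GPU survey indexed by units, hours, and spend.
# All records are invented; hours and dollars are per surveyed user per month.
from collections import defaultdict

records = [
    # (gpu_model, hours_played, dollars_spent)
    ("RTX 4090",     2,  5.0),   # big daddy's $5k showpiece
    ("Intel iGPU",   0.5, 0.0),  # li'l grandma's two short sessions
    ("RTX 3060",   280, 60.0),   # the 70-hour-per-week man-child
    ("RX 6600",     30, 10.0),
]

def share(weight):
    """Percentage share per GPU under a given weighting function."""
    totals = defaultdict(float)
    for gpu, hours, dollars in records:
        totals[gpu] += weight(hours, dollars)
    grand = sum(totals.values())
    return {gpu: round(100 * w / grand, 1) for gpu, w in totals.items()}

print("By unit count:  ", share(lambda h, d: 1))  # what a hardware survey reports
print("By hours played:", share(lambda h, d: h))  # engagement-weighted
print("By $ spent:     ", share(lambda h, d: d))  # the publisher's view
```

By unit count all four systems weigh equally; weight by hours or dollars and the man-child's rig dominates, which is the whole point.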

Steam doesn't even report what CPUs people are using. You've got this page that shows manufacturers sorted by clocks, and this page sorted by numbers of cores. There's no way to see, for example, how many people are using a Core i7-4770K or a Core i9-13900K, or a Ryzen 9 5900X versus a Threadripper 1920X.
Heh, Dhrystone MIPS or bust.
; )

Seriously, I'd want to know:
  • Single-threaded performance
  • Multi-threaded performance
  • Total # of hardware threads
The main problem is that the way CPUs boost these days, you'd need a long-duration test to properly measure those metrics in the way most relevant to games. So, you can't really do it without being intrusive.

Perhaps the next best thing would be for them to maintain a database of how different CPU models typically score, and just do a quick sanity-check to make sure your machine isn't misreporting itself.
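Something like this, as a rough sketch; the reference scores and model matching are entirely hypothetical, and the timing loop is deliberately short and crude (a real survey would need the long-duration test described above):

```python
# Rough sketch of the sanity-check idea: time a short single-threaded
# workload, then compare against a (hypothetical) table of typical scores
# per CPU model. Model names and reference numbers here are made up.
import os
import platform
import time

REFERENCE_SCORES = {           # hypothetical "typical score" database
    "Ryzen 9 5900X": 1.9,      # seconds for the workload below
    "Core i9-13900K": 1.4,
}

def quick_single_thread_score(n: int = 5_000_000) -> float:
    """Crude busy-loop timing; far too short to capture boost behavior."""
    start = time.perf_counter()
    total = 0
    for i in range(n):
        total += i * i
    return time.perf_counter() - start

model = platform.processor() or "unknown"
threads = os.cpu_count()       # total hardware threads
score = quick_single_thread_score()
expected = next((v for k, v in REFERENCE_SCORES.items() if k in model), None)

print(f"CPU: {model}, hardware threads: {threads}, score: {score:.2f}s")
if expected and not (0.5 * expected <= score <= 2.0 * expected):
    print("Score is far from typical for this model; it may be misreporting.")
```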
 
A modern-day company using last-century manufacturing methods to stuff the channel with 1.1 million units, without demand for them, didn't add up with how the manufacturing world works today.
Except that how crazy supply chains have gotten, plus the whole "holiday rush" thing, might indeed mean that retailers, wholesalers, and PC builders were over-ordering in the 4th quarter of last year.
 
Except that how crazy supply chains have gotten, plus the whole "holiday rush" thing, might indeed mean that retailers, wholesalers, and PC builders were over-ordering in the 4th quarter of last year.

Yes, I agree that the retail side could have accounted for that. However, the original argument, from the article that stated Intel had 9% of the market, was that Intel was doing it from the supply side in order to make its numbers look better, which in modern manufacturing didn't add up.

At any rate, 1.1 million sounded off and it turned out it was.
 
At any rate, 1.1 million sounded off and it turned out it was.
So 1.1 million is unacceptably high and sounds "off" but 780K ("0.8 million" if we're rounding) isn't a problem? I mean, sure, it's enough for a 3% overall difference in channel market share, but I'm still wondering if Intel is actually anywhere near 3% total, never mind the JPR 6% figure.

Granted, a lot of Arc GPUs could be going into laptops in other markets around the world, but at least in the US, the number of people using Arc GPUs seems to be extremely small. I'd really love for Steam Hardware Survey to break Arc GPUs out from the "other" category to see if any of them are above 0.00%, but of course there's still the issue with knowing WTF Steam HW survey is sampling.
 
So 1.1 million is unacceptably high and sounds "off" but 780K ("0.8 million" if we're rounding) isn't a problem? I mean, sure, it's enough for a 3% overall difference in channel market share, but I'm still wondering if Intel is actually anywhere near 3% total, never mind the JPR 6% figure.

The number still seems relatively high to me, but 1.1 million would be 320k more units, which is 41% more than the 780k quoted here, all with presumably very few buyers; that's a big difference. The original argument from one of the posters was that Intel was doing this from the supply side, and producing 1.1 million units to pre-stuff channels from the supply side would make very little sense (which was the point of my last post). Around 500k for fingers-crossed initial demand and filling regional channels I could buy; it still seems high, but I could buy it (though who knows if these numbers are even accurate this time around either). If we were talking about a retail-pulled supply glut, that's a different story; those would be orders for units, and thus not Intel deliberately stuffing supply channels in order to brighten up its units-shipped numbers.

Granted, a lot of Arc GPUs could be going into laptops in other markets around the world, but at least in the US, the number of people using Arc GPUs seems to be extremely small. I'd really love for Steam Hardware Survey to break Arc GPUs out from the "other" category to see if any of them are above 0.00%, but of course there's still the issue with knowing WTF Steam HW survey is sampling.

Agreed. I doubt Intel has sold very many of these units to actual customers interested in gaming; my best guess aligns with your thoughts on it being largely OEMs, whatever the actual number of sold units is. Having data from some sources would be nice, to try to paint some kind of picture of what is actually happening. Steam would be great for seeing whether any are being used in the gaming space. My guess is probably 0.00% too, but at this point it's just a guess.
 