News Intel's Raja Koduri Shrugs Off Rumors of Arc Demise

The only way to keep a TechTube channel going is to violently ally yourself with your current roster of rotating-door sponsors. Any tech channel can be infinitely replaced, for free. It's a constant war to see which channel can make their corporate advertisers happiest by pushing the most product. It's like HSN, if HSN decided every infomercial segment should be a cutthroat reality show structured like "The Apprentice". A YouTube pitchman must feel pretty threatened if he resorts to ripping his clickbait straight out of a Taboola feed.

Which is a long way of saying that Intel is, apparently, not buying enough ad space from MLID.
 
  • Like
Reactions: KyaraM
It's such a strange idea (shared by at least a few other folks online, not just MLID) that they will, or should, give up because they won't match the other vendors' high end as soon as their first gen launches. It's like saying AMD should have given up on CPUs after their push to Zen 1 didn't catch them up to Intel.

I am curious if they price this gen so it sells despite the perf limits, which would probably get better reviews and more love from consumers, or if they price it as if it were awesome to avoid getting pigeonholed as the budget pick, which I feel like would amount to sitting out a gen while they continue to catch up.
 
Is Intel's executive team listening to their investors or long-term business planners? The investors want anything costly and underperforming to be cut ASAP. The long-term thinkers realize it takes money to make money and that all new product lines have rough starts. I don't think Intel should wind down or spin off their graphics business -- way too early for that. However, I can believe that the investors are screaming for Arc to be cut.
 
I am curious if they price this gen so it sells despite the perf limits, which would probably get better reviews and more love from consumers, or if they price it as if it were awesome to avoid getting pigeonholed as the budget pick, which I feel like would amount to sitting out a gen while they continue to catch up.
Intel already said they will price the Arc GPUs according to their performance in older games, since those are the lowest-performing ones.
So for newer games, price/performance should be pretty good compared to the competition.
 
  • Like
Reactions: KyaraM
Consumer-targeted GPUs for gaming may very well go away; it seems they missed every deadline imaginable. With any luck there were no contractual obligations Intel had to back out on, but they probably burned a lot of bridges with board partners by not launching products.

They could shift focus to the things that Arc does well, such as encoding and professional workloads. (Would have been a somewhat logical approach if they had started that way, and then added gaming after they had everything worked out.)
 
(Would have been a somewhat logical approach if they had started that way, and then added gaming after they had everything worked out.)
With the infinite possible permutations of hardware and software configurations, it is doubtful "everything would be worked out" even if you gave Intel 10 more years, especially without end-user feedback to help identify missed edge cases earlier on and add them to the regression testing coverage.

If you want to do real R&D, you need real field data and the best way to acquire it is to start low-key with low expectations instead of aiming for the moon and smashing into the boardroom's ceiling.
 
  • Like
Reactions: KyaraM
They certainly could have put more effort into some of their technologies and software before making them public, Smooth Sync being one interesting example. Making only Arc Pro cards in the first generation seems like it would have benefited them more. You know media and diehards would have used them for gaming anyway, just to see. And then you would have gotten much the same effect without having put yourself up as the third option for discrete gaming graphics.

Then my question would be: where are the higher-performing cards? There shouldn't be any reason to hold them back if they want field testing and feedback.

No, it seems to me they knew they had problems with gaming and were forced to launch anyway, and they are holding the big cards back because they don't want to ruin what goodwill they have left. And if the investors don't see it the same way as the engineers, they might shift focus to datacenter only.
 
Then my question would be: where are the higher-performing cards? There shouldn't be any reason to hold them back if they want field testing and feedback.
Rushing into reputational suicide wouldn't do them any good.

The A380 failed to impress and revealed more teething issues than previously expected. You don't want to rush into failing even more spectacularly at a higher, more visible price point with much higher expectations and far less tolerance for a sub-par experience. The low end, where people who are grossly under-served if not outright ignored are somewhat desperate for something better than the current status quo, is where you test the waters.
 
  • Like
Reactions: cyrusfox and KyaraM
I don't disagree about their tactics now; I think they shouldn't have bothered releasing the A380 as it is. If they had released the A380 Pro to test the waters and stated outright from the start that only newer APIs would be supported, skipping all the technologies that weren't ready for prime time, that would have looked better. News outlets would have correctly reported that the cards weren't suited to gaming but offered promise. Instead they have over-promised and under-delivered.
 
I don't disagree about their tactics now; I think they shouldn't have bothered releasing the A380 as it is. If they had released the A380 Pro to test the waters and stated outright from the start that only newer APIs would be supported, skipping all the technologies that weren't ready for prime time, that would have looked better.
Releasing a 'Pro' version that promises even less than the A380 delivered sounds like a horrible idea. Remove the over-promising and the announcements of missing or otherwise unusable launch-day features, and the A380 would have been a perfect experimental vehicle as-is.
 
  • Like
Reactions: KyaraM
My hangup is the fundamental concept of these GPUs as an alternative for gaming. Even priced aggressively they make a poor choice. They just launched too late to make any impact, and it is only getting worse.

And I still kind of want one, but my target system actually wouldn't support Resizable BAR, so it might be a non-starter.
 
You know which other card was touted as Intel's comeback and never materialized in the consumer segment? Larrabee. The only differences between ARC and Larrabee are that they actually hired GPU people, didn't try to shoehorn x86 into a GPU, and have an actual product (one, performing like crap) out there for consumers.

Tom from MLID doesn't want this to happen, and he's been leaking about ARC since before the ARC name was even officially announced. He wants to be wrong on this one and, if you read what Raja said, he did not refute the claim directly. Even if these rumors do not help (I agree there), he didn't do anything to really cut the issue off at the root and say, with full confidence and no doubts, that ARC will not be removed from the consumer space and that they will launch all consumer cards as originally planned.

No one really wants Intel to exit the GPU market without even having made a dent in it, but the reality may be cruel: the consumer market may not be big enough for Intel to justify the R&D and marketing expenses for much longer. The question here really is: where is the tipping point?

Regards.
 
Good, the rumor mill is in full swing. Exactly what Arc needs right now. Until I see an official review of any Arc GPU, rumors are exactly that - rumors. I am hopeful about the Arc A380. It shows pretty good performance at 1080p, and that's exactly what I need. I am not going to pay the ridiculous prices for a 6400 or a 1660 Super. When it becomes available in my country at a reasonable price, I'm buying it. I know the hardware will be decent; the software will get better with time.
 
My hangup is the fundamental concept of these GPUs as an alternative for gaming. Even priced aggressively they make a poor choice. They just launched too late to make any impact, and it is only getting worse.
In the context of competing options at $140, the A380 is actually decent and provides a handful of unique features for people willing to take the leap of faith on driver maturation in their reBAR-capable systems.

If I had to buy a new ~$150 GPU, I'd be very tempted to get an A380. Worst case, it should still be about twice as fast as the GTX1050 I'm using right now.
 
"but... let's just say MLID doesn't have the best track record."

What a pile of utter pathetic bulls**t... 😑 Tom/MLID's track record is arguably better than that of any other single person in the scene. Nobody's right 100% of the time, but Tom is right at LEAST 95 times out of 100 and has the respect and reputation to match.

If you REALLY think he "doesn't have the best track record" then I know that you've either never set foot on his channel until today or are a pathetically petty person with dogs**t technical analysis capabilities, Mark. Neither of which is good.

It's good to know to immediately discount anything you post going forward though! Saves me some time avoiding the idiocy.
 
Tom's Hardware had an article recently stating that Intel's A770 board provides 138 TFLOPS of FP16 in the matrix units alone. Their whole lineup of discrete GPUs provides 4x the matrix compute vs the non-matrix units. How does this compare to Nvidia's GPUs?
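(For what it's worth, the 138 TF figure does line up with Intel's published A770 specs. Here's a rough back-of-the-envelope sketch; the 32 Xe-cores, 16 XMX engines per core, 128 FP16 ops per XMX per clock, and ~2.1 GHz clock are assumptions pulled from public spec sheets rather than from the article, and plugging Nvidia's tensor-core numbers into the same formula would give the comparison.)

```python
# Back-of-the-envelope check of the 138 TFLOPS FP16 matrix figure quoted above.
# Assumed (public spec-sheet numbers, not from the article): 32 Xe-cores,
# 16 XMX engines per Xe-core, 128 FP16 ops (64 MACs) per XMX per clock, ~2.1 GHz.

xe_cores = 32
xmx_per_core = 16
fp16_ops_per_xmx_per_clock = 128  # 64 multiply-accumulates, counted as 2 ops each
clock_ghz = 2.1

xmx_engines = xe_cores * xmx_per_core  # 512 XMX engines total
matrix_tflops = xmx_engines * fp16_ops_per_xmx_per_clock * clock_ghz / 1000
print(f"FP16 matrix throughput: ~{matrix_tflops:.0f} TFLOPS")  # ~138 TFLOPS

# The 4x matrix-vs-vector ratio mentioned above implies the plain FP16 rate:
vector_tflops = matrix_tflops / 4
print(f"FP16 vector throughput: ~{vector_tflops:.0f} TFLOPS")  # ~34 TFLOPS
```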
 
I am curious if they price this gen so it sells despite the perf limits, which would probably get better reviews and more love from consumers, or if they price it as if it were awesome to avoid getting pigeonholed as the budget pick, which I feel like would amount to sitting out a gen while they continue to catch up.
There was a leaked promotion page indicating that they may bundle the cards with games and software when bought alongside higher-end Alder Lake CPUs. So it's possible the pricing might not be quite as attractive as it could be, with the hope being that game bundles will help sell the cards.

It's unlikely they would price the A770 above $400 though, and even that might be pushing it, considering it doesn't look like the card will perform much faster than a 3060 in many games, even if the hardware can potentially outperform a 3060 Ti in other titles. Between the variable performance and the uncertainty surrounding drivers and software features, they may need to price the cards more attractively than that, perhaps in the sub-$350 range, to increase their odds of a positive reception. And of course, they also need to price the cards with the expectation that their competitors will be launching faster cards around this price range within a few months or so. Even an A770 for $350 might leave people questioning its value if they expect Nvidia to launch an all-around faster 4060 with mature drivers at a similar price point in a matter of months.
 
"but... let's just say MLID doesn't have the best track record."

What a pile of utter pathetic bulls**t... 😑 Tom/MLID's track record is arguably better than that of any other single person in the scene. Nobody's right 100% of the time, but Tom is right at LEAST 95 times out of 100 and has the respect and reputation to match.

If you REALLY think he "doesn't have the best track record" then I know that you've either never set foot on his channel until today or are a pathetically petty person with dogs**t technical analysis capabilities, Mark. Neither of which is good.

It's good to know to immediately discount anything you post going forward though! Saves me some time avoiding the idiocy.
The only 'reputation' MLID has is the same as wccftech, videocardz, etc.: throw enough crap against the wall, and eventually some will stick.
 
  • Like
Reactions: KyaraM and cyrusfox
The Arc situation reminds me of the S3 Savage from many, many years ago. Compared to the competition from nVidia and ATI, Savage was "promising", "if they could only improve the drivers". As I recall, S3 did fix many of the issues and the Savage managed to do decent business in the OEM market. Without high-end, high-margin products to bring in the cash, though, S3 had no chance of funding the necessary R&D to keep up with the big two.

Back then, 3D graphics was still in its infancy and innovation was happening at a breakneck pace. At this point, 3D graphics is at a plateau of sorts: not much room left for improvement aside from improving ray-tracing performance and upping the resolution. Software APIs are relatively stable too. Even if Intel significantly scales back on GPU R&D, relying on process improvements alone for future products, those products would still be respectable and profitable. The existence of the consoles really helps the cause here, I think. Six or seven years from now, AAA developers will still be targeting PS5-level hardware.

From a financial standpoint, shutting down Arc doesn't make sense. If VIA managed to milk the Savage for ten years, I don't see how Intel could do worse given all its advantages.

One thing that baffles me is the lack of an Arc equivalent of Kaby Lake-G. What was the point of that exercise if they aren't going to make use of the lessons learned?
 
The Arc situation reminds me of the S3 Savage from many, many years ago. Compared to the competition from nVidia and ATI, Savage was "promising", "if they could only improve the drivers". As I recall, S3 did fix many of the issues and the Savage managed to do decent business in the OEM market. Without high-end, high-margin products to bring in the cash, though, S3 had no chance of funding the necessary R&D to keep up with the big two.
Matrox had some GPUs with decent paper specs too, and those were also plagued with dodgy drivers as far as gaming is concerned.

I disagree about GPU designers needing high-end models: GPU designers make the bulk of their profit in the mid-range, where decent per-unit margins get boosted by high volume and economies of scale. If you are strapped for cash and need to bet the farm on a single die, you need to design something you can price to sell, like AMD did with the RX 470-580.

In retrospect, Intel should probably have focused on a single Alchemist die about twice as powerful as the A380 and made the full-die version of it an 8GB/128-bit A580 for ~$200 aimed at the RTX 3050.
 
  • Like
Reactions: KyaraM
In retrospect, Intel should probably have focused on a single Alchemist die about twice as powerful as the A380 and made the full-die version of it an 8GB/128-bit A580 for ~$200 aimed at the RTX 3050.

While there might be consumer interest in such a card, I don't think card makers are too keen to sell you one. All that would do is cannibalize sales from their nVidia/AMD models. In return, they'd incur costs from a higher number of customer service calls and RMAs. Not a good deal, maybe not even if Intel were to hand them the GPUs for free. If you recall, S3 ran into this problem. To get their chips onto cards, the company had to acquire card manufacturers, first Number Nine and then Diamond. 3dfx made a similar move, if memory serves.
 
While there might be consumer interest in such a card, I don't think card makers are too keen to sell you one. All that would do is cannibalize sales from their nVidia/AMD models.
And the current A580-770 wouldn't? If Intel successfully launched those, they'd ruin the value of far more valuable AMD and Nvidia cards than a doubled-up A380 at ~$200 would. The A580 as it currently exists is aimed somewhere between the RTX 3050 and RTX 3060 and may be as cheap as $200 too. With the RX 6600 sometimes available for as low as $220, Intel may have no choice but to low-ball the A580-770 lineup as well.

Had Intel focused exclusively on the $200-or-less market with a ~220 mm² die with twice the ACM-G11's grunt, the only sales AMD and Nvidia would have lost are on products they don't want to supply anyway, since the net margins on decent sub-$200 GPUs are slim to nonexistent. That single die would have been the perfect test vehicle to serve the $140-200 range - at least far more so than the current G11, which is underwhelming in too many ways, and the G10, which grossly under-performs for how large the die is.