News AMD Announces 4Q19 Earnings: Second-Gen RDNA in 2020, Quarterly Revenue up 50 Percent

escksu

Looks very good. Debt is down by almost two-thirds compared to a year ago! Things are looking very good for AMD right now.

But it's not yet time for celebrations. We've all seen how a single bad product (Bulldozer) can destroy all the hard work, so AMD must maintain the momentum.
 

bit_user

But it's not yet time for celebrations. We've all seen how a single bad product (Bulldozer) can destroy all the hard work, so AMD must maintain the momentum.
I think the more likely reversal of fortunes will come on the back of Intel's 7 nm EUV process, and then as China stops buying outside tech and the cloud shifts towards RISC-V. That could all happen over the course of the next 3-5 years.

Of course, AMD could certainly jump on the RISC-V bandwagon as well, but they won't have quite the advantages they enjoy in the x86 world.

I also wonder if AMD will ever give up on AI, since they've always been a day late and a dollar short.
 
Looks very good. Debt is down by almost two-thirds compared to a year ago! Things are looking very good for AMD right now.

But it's not yet time for celebrations. We've all seen how a single bad product (Bulldozer) can destroy all the hard work, so AMD must maintain the momentum.

It wasn't just one bad product. K10 (Phenom) had a bad launch: the product didn't meet performance or clock-speed expectations, and the TLB bug affected servers more than consumers, which mattered because servers were where AMD made its money at the time. It also didn't help that Intel was swinging hard with better-performing, better-value products like Core 2. They also overpaid for ATI and, at the time, had only a single fab to produce products, which hurt their ability to keep supply channels full.

Phenom II corrected a lot of Phenom's issues, but it still was not competitive with Intel's offerings.

Bulldozer was just the massive kick to AMD while they were down.

I think the more likely reversal of fortunes will come on the back of Intel's 7 nm EUV process, and then as China stops buying outside tech and the cloud shifts towards RISC-V. That could all happen over the course of the next 3-5 years.

Of course, AMD could certainly jump on the RISC-V bandwagon as well, but they won't have quite the advantages they enjoy in the x86 world.

I also wonder if AMD will ever give up on AI, since they've always been a day late and a dollar short.

I would think they wouldn't give up on AI and would likely embrace it, as it's a pretty big emerging market. Even Intel has been pushing it, and their 10th gen is supposed to have built-in acceleration for AI.
 

bit_user

Also, RISC-V would allow Nvidia to enter the CPU market.

Wouldn't surprise me if they already had working silicon. After a quick google:

https://www.linkedin.com/jobs/view/...neer-at-nvidia-1411541499/?trk=jobs_job_title
They make embedded SoCs with ARM cores, currently, and are using RISC-V as low-level controllers, in their GPUs. So, if they're not just continuing work on the controllers, then maybe they're trying to replace the ARM cores with it.

I doubt Nvidia will try to enter the fairly cut-throat CPU market. Be careful about what conclusions you draw from "a quick google".
 

bit_user

I would think they wouldn't give up on AI and would likely embrace it, as it's a pretty big emerging market. Even Intel has been pushing it, and their 10th gen is supposed to have built-in acceleration for AI.
It's a tempting market, which is why everyone is going after it. AMD just lacks the focus, and for a few years now has been building products that seemed to compete with Nvidia's previous generation. Yet, we might already be moving beyond the era where GPUs can really compete on AI.

Intel is going after AI with more than just GPUs. They have bought at least three companies that make purpose-built AI accelerators in the past four years, not counting Mobileye.
 

calken

Be careful about what conclusions you draw from "a quick google".

I never drew any conclusions; I merely provided a single data point and expressed that Nvidia's attitude to markets in the semiconductor space wouldn't cause me any shock.
 
It's a tempting market, which is why everyone is going after it. AMD just lacks the focus, and for a few years now has been building products that seemed to compete with Nvidia's previous generation. Yet, we might already be moving beyond the era where GPUs can really compete on AI.

Intel is going after AI with more than just GPUs. They have bought at least three companies that make purpose-built AI accelerators in the past four years, not counting Mobileye.

I am sure FPGAs will do to GPUs what ASICs did to GPUs for Bitcoin mining: make them useless.

I do agree AMD doesn't have the focus, but they also don't have the funds to put into it. Even with their sudden reemergence as a profitable company, they still make very little in comparison to Intel. Nvidia is not a massive profit monster like Intel, but they have one focus, GPUs, unlike AMD, which is attacking on two fronts.

In all honesty, it might help AMD to split off or sell RTG and focus only on the CPU and HPC markets, letting RTG compete with Nvidia and, soon, Intel.
 

bit_user

I am sure FPGAs will do to GPUs what ASICs did to GPUs for Bitcoin mining: make them useless.
I'm not. If a GPU is a good fit for what you want to do, then an FPGA can actually be worse. This is evident from recent top-end FPGAs not even reaching the performance of Nvidia's Turing GPUs, much less their power efficiency, on certain AI tasks.

In all honesty, it might help AMD to split off or sell RTG and focus only on the CPU and HPC markets, letting RTG compete with Nvidia and, soon, Intel.
This makes no sense. Without GPUs, how would they get console design wins? How would they compete in the thin, light, and low-cost laptop markets (i.e. where you don't normally have a dGPU)? How would they compete in the SFF desktop market, which is also dominated by iGPUs?

If you look at Google's Stadia platform, it seems clear they're using both EPYC CPUs and Vega GPUs. If AMD didn't have both to offer as a bundle, maybe they wouldn't have won the contract. GPU compute is instrumental in HPC, and AMD's GPUs were therefore probably an indispensable part of their bid for the upcoming DoE supercomputer (I forget what it's called).

Finally, if you step back and look at how Intel is rushing into the GPU market, it makes even less sense for AMD to step back from it.

All I said was they should quit burning die area and valuable engineering and QA time on putting AI in their GPUs. They were always second-tier, not least because their software support always lagged. Now, they look set to slip to third tier as an AI provider.
 

Something that is specifically designed for a task can always eventually replace the current solution. It may take time, but GPGPU has limits on performance and power efficiency. ASICs came out and destroyed GPUs in every way possible. Give it time; I am sure we will see a shift away from GPGPU for AI tasks.

And the only reason why it may benefit AMD is that they do not have the funds to compete with Intel and Nvidia at the same time. Intel has the money to start making GPUs and it won't affect their CPUs. AMD does not.

If Intel is successful with their GPUs and is able to push into second place, then AMD will be a third-place product. Not saying it will happen, but Intel has the money to push harder than either one.

AI is a good market to try to get into. Emerging markets tend to be good margin markets.

And AMD could spin it off with an agreement to continue to work together. It doesn't have to be a complete spin-off. I was just thinking that, in terms of financials, trying to compete with Intel and Nvidia at the same time is very hard when you are only just turning the company around from near disaster.
 

bit_user

Something that is specifically designed for a task can always eventually replace the current solution. It may take time, but GPGPU has limits on performance and power efficiency. ASICs came out and destroyed GPUs in every way possible. Give it time; I am sure we will see a shift away from GPGPU for AI tasks.
You didn't say ASICs, you said FPGAs.

Something to know about AI is that it's different from cryptocurrencies, where there's a set algorithm that you can just implement in hardware. For AI, people are still developing new layer types, and programmability is key for that.

However, it's still the case that the most intensive parts of deep learning tend to be fairly consistent, and those parts have been virtually hardwired in Nvidia GPUs, not unlike how they have cast key portions of the 3D graphics pipeline in hardware blocks.

So, the story around AI really looks a lot like graphics, in that both programmability and hardware acceleration are key. Memory bandwidth, as well. Really, the key deficit of GPUs relative to purpose-built AI chips is that they tend to have less on-die memory.
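
To make that concrete, here's a minimal sketch (assuming PyTorch and a Turing-class or newer Nvidia GPU; the matrix sizes are purely illustrative): the hot loop of most deep-learning layers reduces to a half-precision matrix multiply, which is exactly the kind of building block the Tensor Cores accelerate, while everything around it stays programmable.

```python
# Illustrative sketch: the expensive part of most deep-learning layers is a
# dense matrix multiply. In fp16 on a Turing-class Nvidia GPU, this matmul
# goes down the hard-wired Tensor Core path, while the surrounding model code
# (new layer types, custom activations) stays fully programmable.
import torch

# Use fp16 on an Nvidia GPU (Tensor Core path); fall back to fp32 on CPU.
if torch.cuda.is_available():
    device, dtype = "cuda", torch.float16
else:
    device, dtype = "cpu", torch.float32

activations = torch.randn(4096, 4096, device=device, dtype=dtype)
weights = torch.randn(4096, 4096, device=device, dtype=dtype)

outputs = activations @ weights  # dispatched to Tensor Cores when available
print(outputs.shape)             # torch.Size([4096, 4096])
```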

I'm not saying purpose-built AI chips won't eventually win the day, but it's important to understand why GPUs have staying power.

Also, I really don't buy the idea that FPGAs are going to displace GPUs, in AI. They don't have the same level of flexibility as GPUs, nor do they have the level of efficiency and compute-density of purpose-built AI chips.

the only reason why it may benefit AMD is that they do not have the funds to compete with Intel and Nvidia at the same time. Intel has the money to start making GPUs and it won't affect their CPUs. AMD does not.
Did you ever hear of a CPU called Zen 2, or a GPU called Navi? Those were both developed while AMD was building both CPUs and GPUs. Not only that, but AMD was in weaker financial shape at the time. So, please tell me again how they don't have the money to do both.

The only thing I agree with is that they haven't had resources to properly focus on AI. But, GPUs are obviously about much more than that. A large amount of GPU compute still doesn't involve AI, even if AI is starting to work its way into HPC.

If Intel is successful with their GPUs and is able to push into second place, then AMD will be a third-place product.
There are a lot of assumptions in that. And yet, you expect AMD to just shut down their successful and strategically vital GPU division and run for the hills? Because Intel is coming, and they might be more successful than they were with Larrabee, Xeon Phi, or cellphone SoCs, or than they currently are with their 10 nm manufacturing process? With that kind of attitude, we would never have gotten Zen!

Not saying it will happen, but Intel has the money to push harder than either one.
Not saying it will happen, but you are saying that AMD should kill off their GPU business, just in case? As I showed above, the mere fact of it being Intel doesn't mean they will automatically be successful. And even if they are, there's still a lot of room for AMD to play, not to mention their partnership with Samsung to bring RDNA to mobile.

AI is a good market to try to get into. Emerging markets tend to be good margin markets.
It was a good market like 5 years ago. Now, the competition is so fierce that it's like the GPU business back in 1998, when there were like 20 companies all jockeying for position. AMD has no real presence in this market. I wish it weren't the case, but they're barely even a footnote.

AMD could spin it off with an agreement to continue to work together. It doesn't have to be a complete spin-off.
How is a smaller GPU player going to be more competitive? If AMD wanted to raise more money for it, they'd have a much easier time selling shares than a newly-spun-off, standalone GPU maker. And you didn't answer the questions I posed about all of AMD's products and business deals that involve both their CPU cores and iGPUs or dGPUs.

I was just thinking that, in terms of financials, trying to compete with Intel and Nvidia at the same time is very hard when you are only just turning the company around from near disaster.
The actual turnaround came in like 2017. AMD is now in fine shape. Companies don't need to have zero debt. And the debt is not a problem, as long as AMD's gross margins are growing, while its debt is shrinking.

Last year, AMD had GPU inventory overhang from the crypto bust, their GPUs were years out of date (prior to the Navi launch), and most of their CPU sales were on Zen+, which was still struggling to be competitive. In both CPUs and GPUs, their product stack is now much more competitive, and they're getting lots more OEMs on board, especially in the laptop and server segments, both among the highest-growth PC markets. 2020 looks like it's going to be a really good year for AMD. Maybe some storm clouds towards the end, but I'm glad they have someone steering their ship with more guts than what you showed in that comment.
 

I did specify ASICs in my first post. You even quoted it:

I am sure FPGAs will do to GPUs what ASICs did to GPUs for Bitcoin mining: make them useless.

I was stating what could happen.

And I am not assuming anything or saying AMD should do it, just that it might not be a bad idea. ATI competed against Nvidia just fine for many years and was in no need of being bought by AMD at the time. In fact, I would dare say ATI had more winning designs against Nvidia than AMD has had thus far.

I am only stating the facts, though. Intel has the capital to shove itself into existing markets. I don't know how their GPUs will turn out, but so far it's looking like we will have a third player.

Yes, I know about Zen and Navi. I would not say Navi is as competitive as it should be in higher-margin markets. Zen has its ups, more than Navi for sure, but it also has its work cut out to truly affect Intel. We all know Intel is in a better market position; their price cuts to their HEDT line show they have room to wage a price war. I doubt AMD has as much room to do so.

Again, it was just an idea. I'm not saying they should, just that it wouldn't hurt to be able to really focus on one product line and push harder.

Having guts is one thing. Having the financials to fully compete is another. They are doing better, but they still lag far behind Intel since they are not yet pushing into markets that have vastly higher margins. Laptops are one thing, but that's where they will have the hardest competition, besides servers, since Intel is intent on focusing there, as is obvious from them pushing Ice Lake and Tiger Lake along with the new laptop designs.
 

bit_user

I did specify ASICs in my first post. You even quoted it:
But you said that FPGAs would beat GPUs at AI, even if you mentioned ASICs in the context of crypto. FPGAs and ASICs are not interchangeable terms.

ATI competed against Nvidia just fine for many years and was in no need of being bought by AMD at the time. In fact, I would dare say ATI had more winning designs against Nvidia than AMD has had thus far.
It's hard to run the counterfactuals on that acquisition, without at least having some insider info on what effects the acquisition had on ATI.

I am only stating the facts, though. Intel has the capital to shove itself into existing markets. I don't know how their GPUs will turn out, but so far it's looking like we will have a third player.
Yeah, and I'm only stating facts, too. You can't deny that, in spite of their massive resources, Intel has had at least its share of high-profile failures.

Therefore, you really can't assume that it's going to dominate the GPU market, simply due to its deep pockets. That logic sure didn't apply for mobile, didn't work for Xeon Phi, and hasn't worked out for their 10 nm node.

their price cuts to their HEDT line show they have room to wage a price war. I doubt AMD has as much room to do so.
Looking back at both generations of Zen CPUs, AMD has done aggressive price-cutting, later in the product cycle. And the price-cutting has already started for the 3000-series.

With their ThreadRippers, the situation could be a little different, since they're so heavily dependent on high-binned chips. Still, they beat Intel's HEDT chips in perf/$, even after Intel's price cuts.

Again, it was just an idea. I'm not saying they should, just that it wouldn't hurt to be able to really focus on one product line and push harder.
Maybe think about your ideas a little harder, before giving voice to them. However, you're certainly not the only poster to suggest this, which is why I made such a long post to dismantle it.

And, yes, it would hurt them, as I clearly outlined. Your argument hinges on the idea that the benefits to their CPU products would more than offset all the various ways they'd suffer by not having graphics. I think that's highly dubious, as I explained.

Finally, your contention assumes that the CPU team is resource-starved, which we don't have evidence to support.

Having guts is one thing. Having the financials to fully compete is another. They are doing better, but they still lag far behind Intel since they are not yet pushing into markets that have vastly higher margins. Laptops are one thing, but that's where they will have the hardest competition, besides servers, since Intel is intent on focusing there, as is obvious from them pushing Ice Lake and Tiger Lake along with the new laptop designs.
First, I'm not sure their penetration of those markets would be helped by a bit more money, assuming that were a consequence of killing/spinning off their graphics division (which is dubious, since it doesn't seem to be losing money).

Second, lack of integrated graphics would seriously damage their efforts to compete in the laptop market.

Third, you're writing the epitaph of Zen 2 too soon. Give it a chance and we'll see. They keep racking up more server design wins, which is a prerequisite to deeper penetration of that market. Similarly, we're seeing more announcements of AMD-based laptops. In the end, the success of Zen 2 could ultimately be limited by TSMC, and that's not something that a bit more money could really help.
 

You need to re-read what I said. I was making a basic comparison that FPGAs have the ability to do what ASICs did to GPUs. I even re-quoted it for you.

And again, spinning off RTG does not mean they won't have an iGPU. They could still work with RTG, just not be directly involved with or reliant on them, for good or bad. Besides, short of mobile and low-end desktops, AMD doesn't utilize iGPUs as heavily as Intel, which only omits them in HEDT and special mainstream cases.

As for Intel's GPUs, I am not saying they will dominate. I am saying they have the funds to push into a market and they can absorb a failure. AMD doesn't have that room. I personally think if Intel gets their idea of having a dGPU work with an iGPU going, it will make them a more competitive GPU rival to Nvidia than AMD is.

I haven't written anything off either.

Oh well. I am good with this discussion for now. No use in going back and forth if you continue to misread a sentence.
 

bit_user

You need to re-read what I said. I was making a basic comparison that FPGAs have the ability to do what ASICs did to GPUs. I even re-quoted it for you.
Yes, I saw the quote and your re-quote of it. I don't know why you keep coming back to that, because I clearly stated that what I disagree with is precisely encapsulated in that quote.

I'm not sure you really understand the fundamental differences between FPGAs and ASICs. If FPGAs were as good as ASICs, then nobody would build ASICs. FPGAs are like a middle ground between general-purpose programmable cores and more hard-wired ASICs, although virtually everything called an "ASIC" these days has some amount of programmability, and has for a long time, actually.

Anyway, my point is that the flexibility afforded by FPGAs doesn't come for free. If you actually cared to support your argument, you could go dig up some benchmarks of FPGAs on deep learning workloads. Except, then you'd find you're wrong.

In deep learning, FPGAs have failed to pose a credible threat to Nvidia's GPUs, and I don't expect that to change. Because, like I said, Nvidia basically hard-wired the most fundamental building blocks of deep learning, giving them the performance of an ASIC and the flexibility of a fully programmable solution - just like what they did for graphics.

It's only in corner cases with low-latency, hard-realtime requirements that FPGAs can even challenge Nvidia. That is not the main use case for AI, and Nvidia is addressing it with purpose-built solutions, like their AGX platforms.


And again, spinning off RTG does not mean they won't have an iGPU. They could still work with RTG, just not be directly involved with or reliant on them, for good or bad.
It just adds an extra layer between the two, which would probably result in more overhead, bugs, and an extra generation of lag between the latest GPU design and what actually makes it into the current APU generation. And, for what benefit? You haven't provided any evidence of a benefit, at all.

short of mobile and low-end desktops, AMD doesn't utilize iGPUs as heavily as Intel, which only omits them in HEDT and special mainstream cases.
You're talking about how most Ryzen models lack an iGPU? Two questions about that:
  1. Where's the volume? Especially, when you factor in laptops, I think APUs will comprise most of AMD's volume, outside of servers.
  2. When Intel has dGPUs, how will that affect their product mix? Might they start excluding iGPUs from the dies of their largest consumer CPUs, in order to better compete with higher-end Ryzen models?
I expect we'll see AMD having more reliance on their iGPUs (as they penetrate laptop and SFF markets), and Intel having less (as they start offering competitive dGPUs), in the coming years.

I am saying they have the funds to push into a market and they can absorb a failure. AMD doesn't have that room.
Hmmm... Bulldozer and (arguably) Vega could be considered failures, and they didn't kill AMD.

I don't foresee AMD's leadership repeating those missteps. However, I'm still not sure you fundamentally understand how business works. People don't say "Gee, a failure would really hurt the company, so we'd better not try". Silicon Valley is built on risk-takers and everyone understands that there's a greater risk taken on by growth-oriented companies (especially in tech), like AMD.

What's more, if AMD keeps GPUs and CPUs under the same roof, success in one area can help the other through a rough patch. In a very real way, it's the safer path. Investors sometimes like to see spin-offs, because if one is wildly successful, its profits aren't watered down by the other division, but that's really a risk-tolerant, profit-maximizing view, rather than the "safe route" that you're characterizing it as.

I personally think if Intel gets their idea of having a dGPU work with an iGPU going, it will make them a more competitive GPU rival to Nvidia than AMD is.
You're missing one thing: memory bandwidth. That, and iGPUs are vastly weaker than dGPUs, so it only makes sense for a very limited use case of someone adding in a low-end dGPU. And even then, maybe the iGPU contributes like 30% more to what the dGPU could manage, alone. But, it's not going to matter to the mid-range dGPUs or above.

And if I were a game developer, I'd only use the iGPU for things like AI or physics, if the system also had a dGPU. In a case like that, it doesn't even matter if the iGPU is the same brand as the dGPU. Trying to coordinate them to both perform graphics tasks is just going to be a pain to debug and optimize.
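
Just as a rough sketch of how that split could look (assuming the pyopencl bindings; the unified-memory check is only a crude heuristic for "integrated"), an engine could pick out the iGPU for that side work and leave the dGPU, whatever brand it is, to render:

```python
# Minimal sketch, assuming pyopencl is installed: pick an integrated GPU
# (one that shares system memory with the CPU) for side work like physics
# or AI, leaving the discrete card free to render. The dGPU's brand is
# irrelevant to this selection.
import pyopencl as cl

compute_device = None
for platform in cl.get_platforms():
    for dev in platform.get_devices(device_type=cl.device_type.GPU):
        if dev.host_unified_memory:  # crude heuristic for "integrated"
            compute_device = dev
            break
    if compute_device is not None:
        break

if compute_device is not None:
    ctx = cl.Context(devices=[compute_device])
    queue = cl.CommandQueue(ctx)
    print("Dispatching physics/AI kernels to:", compute_device.name)
else:
    print("No integrated GPU found; fall back to the dGPU or CPU.")
```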

No use in going back and forth if you continue to misread a sentence.
Ah, so you blame me for our disagreement? That's disappointing. You're usually better than that.

Take your toys and leave, if you want to. Though I feel I've made my case, I will continue to answer your responses, to the extent that I can (hopefully) introduce more information & insight into the discussion.
 