The latest report claims that Apple is readying its 5nm GPUs for its iMacs in the second half of next year.
Apple Reportedly Preparing 5nm GPUs For 2H 2021 : Read more
Quote: I wonder if Apple will get out of the desktop computer business altogether. Mac represents less than 10% of the company's revenue, most of which is probably laptop sales. Developing workstation-class CPUs and GPUs for use in the Mac Pro can't be economical.
Everybody's getting out of the desktop business - it's just a matter of time, and Apple is usually first.
Quote: I won't call it the last step to self-sufficiency here, considering that they still need to buy a lot of components off the shelf and depend heavily on their suppliers for a complete product. A company like Samsung is probably more self-sufficient, since they produce almost everything and run their own fab business. Of course, they don't have an in-house designed GPU, and no OS.
Samsung uses their own OS for TVs. Just can't get it into cellphones, though.
Quote: Yay! Now Apple users can pay Titan prices for 2060 performance!
Nobody buys Apple products for the purpose of gaming, so whatever Apple comes up with will be good enough for the overwhelming majority of their users. You don't need a 2080 Ti for Adobe products or other content creation. Mac Pros are the only systems Apple sells that need any GPU horsepower, and they can still outsource that; it's a niche of a niche in their product stack, and developing anything beyond IGP-level performance for so few systems would be a waste of money.
We'll see how it performs. But I'm skeptical about the performance of a first-gen hardware product against companies that have been making GPUs for decades. If anything, Apple's proprietary software, written specifically for their hardware, could perform on par with current-gen hardware. But the more likely scenario is that the product gets delayed for years because it is not competitive, like Intel's Larrabee was not competitive.
Quote: Yay! Now Apple users can pay Titan prices for 2060 performance!
Not likely. The GPU is highly parallel in nature and something that scales easily, depending on how many cores you want to add to it. Apple can scale performance to exactly whatever they feel their needs are. Early concerns were that Apple was going to try to do everything on a single SoC. The fact that they're doing a discrete GPU effectively raises the performance ceiling considerably.
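To illustrate the scaling point above, here is a minimal Swift sketch of an embarrassingly parallel, per-element workload split across a configurable number of lanes. The lane counts, element count, and toy arithmetic are arbitrary stand-ins (nothing here is Apple-specific); the only point is that adding lanes shrinks each lane's share of the work, which is the property that lets a GPU designer dial performance by adding cores.

```swift
import Dispatch
import Foundation

// Purely illustrative: a GPU-style, embarrassingly parallel workload.
// Each "lane" gets a disjoint slice of the data, so adding lanes (cores)
// just shrinks each slice - the scaling property described in the post above.
let elementCount = 4_000_000
let input = (0..<elementCount).map { Float($0) }

func run(lanes: Int) {
    var partialSums = [Double](repeating: 0, count: lanes)
    let chunk = elementCount / lanes
    let start = Date()
    partialSums.withUnsafeMutableBufferPointer { sums in
        DispatchQueue.concurrentPerform(iterations: lanes) { lane in
            let lo = lane * chunk
            let hi = (lane == lanes - 1) ? elementCount : lo + chunk
            var acc = 0.0
            for i in lo..<hi { acc += Double(input[i] * input[i]) }  // per-element math
            sums[lane] = acc  // each lane writes its own slot, so no contention
        }
    }
    let total = partialSums.reduce(0, +)
    print("lanes: \(lanes)  sum: \(total)  seconds: \(Date().timeIntervalSince(start))")
}

// More lanes -> smaller slices -> shorter wall-clock time,
// up to the number of physical cores actually available.
run(lanes: 1)
run(lanes: 4)
run(lanes: 8)
```

On an actual GPU the same trick is applied at a far finer grain, across thousands of threads, but the economics are the same: a vendor can hit a higher performance target largely by adding more of the same cores.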
Quote: First sentence: "If you think Apple was going to stop at producing its own CPUs, then you have another thing coming." It's actually, "another think coming."
Not necessarily...
The thrust of that argument seems to be "people have been saying it the wrong way and that's gaining traction." Why encourage the spread of the wrong usage?
It's sort of like the GIF "Giff / Jiff" pronunciation. Sometimes it doesn't matter what the "official" usage is if it doesn't match up with the popular usage. In this case, "think" may well be historically correct and even semantically more correct. But it's hard to argue that it "sounds" better. Either way, I think we should agree that calling out such nonsense is rather pedantic and not really on topic for this thread.
Quote: When Adobe updated Premiere Pro earlier this year to do GPU-accelerated encoding, I noticed a dramatic decrease in export time. After Effects and Premiere Pro also rely on the GPU for various effects. So while it may not be needed, a good GPU will definitely get the job done significantly faster.
I'm not sure what your comment has to do with my post. Yes, I agree that GPU acceleration is valuable and helps pro applications. Apple knows this too. Apple isn't going to all of a sudden ignore the GPU because they have their own silicon. Just the opposite.
Quote: I heard this prediction, what, almost a decade ago? Everything was going to be tablet/phone. Microsoft even released Windows 8, something that looked designed for mobile and touch screens far more than a desktop PC. Then they backtracked because people hated it. I'm not betting on any sort of desktop exodus.
Yeah, but remember, some things are in the right place at the wrong time. The Xbox One's always-on DRM and banning of used games was one generation too soon. Looking at the all-digital PS5, used games are gone. People have warmed up to the concept of PC-like digital ownership on consoles, and I wouldn't be surprised if the PS6 removes the disc entirely.
Quote: What I'm wondering is how they got round Imagination's IP.
They didn't entirely. Some things, like tile-based deferred rendering, are unavoidable - that is, if they still want to be fast and efficient. So they negotiated a broad multi-year licensing deal.
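For anyone unfamiliar with the term, below is a small, purely illustrative Swift sketch of the tile-based deferred rendering idea: primitives are binned per screen tile, visibility is resolved one tile at a time in small local buffers (standing in for on-chip tile memory), and shading is spent only on the surviving fragment. The toy triangles, constant per-triangle depth, and bounding-box binning are simplifications for the example; this is not Imagination's or Apple's actual pipeline.

```swift
import Foundation

// Illustrative sketch of tile-based deferred rendering (TBDR), heavily simplified.
struct Triangle {
    let minX: Int, minY: Int, maxX: Int, maxY: Int   // screen-space bounding box
    let depth: Float                                  // constant depth, for simplicity
    let color: UInt32
}

let screenW = 64, screenH = 64, tileSize = 16
let tilesX = screenW / tileSize, tilesY = screenH / tileSize

let triangles = [
    Triangle(minX: 0,  minY: 0,  maxX: 40, maxY: 40, depth: 0.8, color: 0xFF0000),
    Triangle(minX: 20, minY: 20, maxX: 60, maxY: 60, depth: 0.3, color: 0x00FF00),
]

// Pass 1: bin each triangle into every tile its bounding box touches.
var bins = Array(repeating: [Int](), count: tilesX * tilesY)
for (i, t) in triangles.enumerated() {
    for ty in (t.minY / tileSize)...(t.maxY / tileSize) where ty < tilesY {
        for tx in (t.minX / tileSize)...(t.maxX / tileSize) where tx < tilesX {
            bins[ty * tilesX + tx].append(i)
        }
    }
}

// Pass 2: process one tile at a time; the tiny depth/color buffers stand in for
// on-chip tile memory. Shading is "deferred" until visibility is settled per pixel.
var framebuffer = [UInt32](repeating: 0, count: screenW * screenH)
for ty in 0..<tilesY {
    for tx in 0..<tilesX {
        var tileDepth = [Float](repeating: .greatestFiniteMagnitude, count: tileSize * tileSize)
        var tileColor = [UInt32](repeating: 0, count: tileSize * tileSize)
        for i in bins[ty * tilesX + tx] {
            let t = triangles[i]
            for y in max(t.minY, ty * tileSize)..<min(t.maxY + 1, (ty + 1) * tileSize) {
                for x in max(t.minX, tx * tileSize)..<min(t.maxX + 1, (tx + 1) * tileSize) {
                    let p = (y % tileSize) * tileSize + (x % tileSize)
                    if t.depth < tileDepth[p] { tileDepth[p] = t.depth; tileColor[p] = t.color }
                }
            }
        }
        // Write the finished tile back to the framebuffer in one go.
        for y in 0..<tileSize {
            for x in 0..<tileSize {
                framebuffer[(ty * tileSize + y) * screenW + (tx * tileSize + x)] = tileColor[y * tileSize + x]
            }
        }
    }
}
print("green pixels (front triangle):", framebuffer.filter { $0 == 0x00FF00 }.count)
```

The payoff is that per-pixel depth and color traffic stays in that small tile-sized buffer instead of going out to main memory for every fragment, which is a big part of why the technique is so power-efficient and so hard to design around.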
Quote: They didn't entirely. Some things, like tile-based deferred rendering, are unavoidable - that is, if they still want to be fast and efficient. So they negotiated a broad multi-year licensing deal.
I see. Poached the staff, then crushed the share price, so they couldn't fight it out in the courts. Reminds me of 2000s Intel or early Microsoft. Still, Canyon Bridge got a good deal. With SoftBank and Arm, I wonder if Korea will join Japan and China in buying up British tech crown jewels?
https://www.bbc.com/news/business-50976759