News Apple Reportedly Preparing 5nm GPUs For 2H 2021

watzupken

Reputable
Mar 16, 2020
1,019
514
6,070
I wouldn't call it the last step to self-sufficiency here, considering that they still need to buy a lot of components off the shelf and depend heavily on their suppliers for a complete product. A company like Samsung is probably more self-sufficient, since they produce almost everything and run their own fab business. Of course, they don't have an in-house designed GPU, or their own OS.
 
  • Like
Reactions: gg83 and Rdslw

Chung Leong

Reputable
Dec 6, 2019
493
193
4,860
I wonder if Apple will get out of the desktop computer business altogether. Mac represents less than 10% of the company's revenue, most of which is probably laptop sales. Developing workstation-class CPUs and GPUs for use in the Mac Pro can't be economical.
 

jasonkaler

Distinguished
I wonder if Apple will get out of the desktop computer business altogether. Mac represents less than 10% of the company's revenue, most of which is probably laptop sales. Developing workstation-class CPUs and GPUs for use in the Mac Pro can't be economical.
Everybody's getting out of the desktop business; it's just a matter of time, and Apple is usually first.
 

danlw

Distinguished
Feb 22, 2009
137
22
18,695
Yay! Now Apple users can pay Titan prices for 2060 performance!

We'll see how it performs. But I'm skeptical about the performance of a first-gen hardware product against companies that have been making GPUs for decades. If anything, Apple's proprietary software, written specifically for their hardware, could perform on par with current-gen hardware. But the more likely scenario is that the product gets delayed for years because it isn't competitive, much like Intel's Larrabee wasn't.
 

gg83

Distinguished
Jul 10, 2015
649
300
19,260
I wouldn't call it the last step to self-sufficiency here, considering that they still need to buy a lot of components off the shelf and depend heavily on their suppliers for a complete product. A company like Samsung is probably more self-sufficient, since they produce almost everything and run their own fab business. Of course, they don't have an in-house designed GPU, or their own OS.
Samsung uses their own OS for TVs. They just can't get it into cellphones, though.
 

spongiemaster

Admirable
Dec 12, 2019
2,276
1,280
7,560
Yay! Now Apple users can pay Titan prices for 2060 performance!

We'll see how it performs. But I'm skeptical about the performance of a first-gen hardware product against companies that have been making GPUs for decades. If anything, Apple's proprietary software, written specifically for their hardware, could perform on par with current-gen hardware. But the more likely scenario is that the product gets delayed for years because it isn't competitive, much like Intel's Larrabee wasn't.
Nobody buys Apple products for gaming, so whatever Apple comes up with will be good enough for the overwhelming majority of their users. You don't need a 2080 Ti for Adobe products or other content creation. Mac Pros are the only systems Apple sells that need any GPU horsepower, and they can still outsource that; it's a niche of a niche in their product stack, and developing anything beyond IGP-level performance for so few systems would be a waste of money.
 

King_V

Illustrious
Ambassador
Everybody's getting out of the desktop business; it's just a matter of time, and Apple is usually first.

I heard this prediction about, what, almost a decade ago? Everything was going to be tablet/phone. Microsoft even released Windows 8, something that looked designed for mobile and touch screens far more than for a desktop PC.

Then they backtracked because people hated it. I'm not betting on any sort of desktop exodus.
 
I wouldn't call it the last step to self-sufficiency here, considering that they still need to buy a lot of components off the shelf and depend heavily on their suppliers for a complete product. A company like Samsung is probably more self-sufficient, since they produce almost everything and run their own fab business. Of course, they don't have an in-house designed GPU, or their own OS.

While I don't disagree, Apple now designs all the extremely complex parts in-house, which is a huge leap from a decade ago.
 

nofanneeded

Respectable
Sep 29, 2019
1,541
251
2,090
Well, this could increase Apple software speed tenfold in the next desktops. If they design their GPU to accelerate their exclusive software at the hardware level, it won't be as fast for third-party software houses, but for exclusives? Tenfold. And that's what matters most to the people who will buy an iMac or a Mac Pro...

Aperture, Logic Pro, Final Cut... and i(stuff)
 

techconc

Honorable
Nov 3, 2017
24
9
10,515
I wonder if Apple will get out of the desktop computer business altogether. Mac represents less than 10% of the company's revenue, most of which is probably laptop sales. Developing workstation-class CPUs and GPUs for use in the Mac Pro can't be economical.

Just to add some context: 2019 Mac sales were $25.7 billion. Even though that's a relatively small amount for Apple, it's still a fairly large and profitable business. Last I checked, Apple had something like 90% market share for PCs costing more than $1,000. Either way, the Mac is still a strategic part of Apple's business.
 

techconc

Honorable
Nov 3, 2017
24
9
10,515
Yay! Now Apple users can pay Titan prices for 2060 performance!

We'll see how it performs. But I'm skeptical about the performance of a first-gen hardware product against companies that have been making GPUs for decades. If anything, Apple's proprietary software, written specifically for their hardware, could perform on par with current-gen hardware. But the more likely scenario is that the product gets delayed for years because it isn't competitive, much like Intel's Larrabee wasn't.
Not likely. The GPU is highly parallel in nature and something that easily scales, depending on how many cores you want to add to it. Apple can scale performance exactly to whatever they feel their needs are. Early concerns were that Apple was going to try to do everything on a single SoC. The fact that they're doing a discrete GPU effectively raises the performance ceiling considerably.
 

techconc

Honorable
Nov 3, 2017
24
9
10,515
The thrust of that argument seems to be "people have been saying it the wrong way and that's gaining traction." Why encourage the spread of the wrong usage?
It's sort of like the GIF "Giff / Jiff" pronunciation. Sometimes it doesn't matter what the "official" usage is if it doesn't match up with the popular usage. In this case, "think" may well be historically correct and even semantically more correct. But, it's hard to argue that it "sounds" better. Either way, I think we should agree that calling out such nonsense is rather pedantic and not really on topic with this thread.
 
  • Like
Reactions: JarredWaltonGPU

King_V

Illustrious
Ambassador
I will admit, the "another think" vs "another thing" had me searching the origin of the phrase as well.

It turns out that the "eggcorn" (another thing I just learned today) of using thing is "recent" - well, if you consider four decades ago to be recent. And, that's just using Judas Priest as a reference, who, by the way, are English, not American. Also, I'm disinclined to disagree with Rob Halford. :LOL:

I'm going back to childhood here, but I'm certain the use of "thing" predates the song in question.


Still, part of the problem is that the word "coming" is after "think".

Try to say the whole phrase casually/fluidly with "think" without the final consonant of "think" being diminished/blended into the preceding n and flowing into the hard c of "coming." It's just the nature of human speech.

The n before the k and the hard-c after it pretty much dooms it into sounding like a g, whether or not that was intended.
 
  • Like
Reactions: JarredWaltonGPU

Chung Leong

Reputable
Dec 6, 2019
493
193
4,860
Mac Pros are the only systems Apple sells that need any GPU horsepower, and they can still outsource that; it's a niche of a niche in their product stack, and developing anything beyond IGP-level performance for so few systems would be a waste of money.

I suspect that Apple will try to satisfy the needs of content creators using the cloud. Not everything can be done in the cloud, but for something like 3D rendering, a cloud setup can easily outpace a PC workstation equipped with Nvidia's best. Content creators are clustered in areas with good 5G and fiber coverage, so I think it's something Apple can pull off.
 

danlw

Distinguished
Feb 22, 2009
137
22
18,695
Not likely. The GPU is highly parallel in nature and something that easily scales, depending on how many cores you want to add to it. Apple can scale performance exactly to whatever they feel their needs are. Early concerns were that Apple was going to try to do everything on a single SoC. The fact that they're doing a discrete GPU effectively raises the performance ceiling considerably.

When Adobe updated Premiere Pro earlier this year to do GPU-accelerated encoding, I noticed a dramatic decrease in export time. After Effects and Premiere Pro also rely on the GPU for various effects. So while it may not be strictly needed, a good GPU will definitely get the job done significantly faster.
 

techconc

Honorable
Nov 3, 2017
24
9
10,515
When Adobe updated Premiere Pro earlier this year to do GPU-accelerated encoding, I noticed a dramatic decrease in export time. After Effects and Premiere Pro also rely on the GPU for various effects. So while it may not be strictly needed, a good GPU will definitely get the job done significantly faster.
I'm not sure what your comment has to do with my post. Yes, I agree that GPU acceleration is valuable and helps pro applications. Apple knows this too. Apple isn't going to all of a sudden ignore the GPU because they have their own silicon. Just the opposite.

Your comment stated: "But the more likely scenario is that the product gets delayed for years because it isn't competitive, much like Intel's Larrabee wasn't."

That's nonsense and there is nothing to support an assumption like that. Apple provided a timeline and there is no reason to suggest they won't hit their own targets.
 

attacus

Distinguished
Aug 28, 2011
184
2
18,685
I heard this prediction about, what, almost a decade ago? Everything was going to be tablet/phone. Microsoft even released Windows 8, something that looked designed for mobile and touch screens far more than for a desktop PC.

Then they backtracked because people hated it. I'm not betting on any sort of desktop exodus.
Yeah, but remember, some things are in the right place at the wrong time. The Xbox One's always-on DRM and banning of used games were one generation too soon. Looking at the all-digital PS5, used games are gone. People have warmed up to the concept of PC-like digital ownership on consoles, and I wouldn't be surprised if the PS6 drops the disc entirely.

What I'm wondering is how they got around Imagination's IP. They said years ago they wouldn't use them, to try and tank their share price. But I don't think Apple found a way to make GPUs without their IP. I thought it was the patents Nvidia and AMD sat on that prevented companies like Intel from making competent GPUs, given that Ampere and Nuvia can make competitive CPUs.
 
Apr 10, 2020
75
13
35
Now Apple could launch a new litho node every year or so and use it as a selling argument.
What will it end up on, 5 nm or 3 nm?
 

attacus

Distinguished
Aug 28, 2011
184
2
18,685
They didn't, entirely. Some things, like tile-based deferred rendering, are unavoidable, at least if they still want to be fast and efficient. So they negotiated a broad multi-year licensing deal.
https://www.bbc.com/news/business-50976759
I see. Poached the staff, then crushed the share price so they couldn't fight it out in the courts. Reminds me of 2000s Intel or early Microsoft. Still, Canyon Bridge got a good deal. With SoftBank and Arm, I wonder if Korea will join Japan and China in buying up British tech crown jewels?