Aug 15, 2024
Read an article stating:

"Technically, 14% of PCs shipped globally in the second quarter contained a neural processing unit (NPU), formally making them AI PCs, reports Canalys"

Asking myself, what does "AI capable" really mean in layman's terms? As best I know, AI currently (ChatGPT-4, whatever) is trained for language and search. I don't mind having better search-engine capabilities, but what else does this do for me? I'm not about to start training a deep-learning routine, nor do I have tremendous datasets to worry about.

Are all of the 14% of PCs being shipped used by the financial industry to "predict the future" just to make 0.1% more money than the other electronic trading algorithms?

Really would like to understand who / what / where / when, if anyone is interested in sharing some insights with a barely technical reader . . .
 

Eximo

Titan
Ambassador
An NPU is just an optimized section of the CPU that can perform heavy-duty matrix calculations efficiently. The CPU and GPU are capable of doing the same work; they just use more power to do so.
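Those "heavy-duty matrix calculations" boil down to huge numbers of multiply-accumulate (MAC) operations. A toy sketch in plain Python of the kind of work an NPU parallelizes in dedicated hardware (the numbers here are made up for illustration):

```python
# Toy illustration of the multiply-accumulate (MAC) work an NPU parallelizes:
# multiplying a weight matrix by an input vector, the core operation of
# neural-network inference.
def matvec(weights, x):
    # Each output element is a dot product: a row of multiply-accumulates.
    return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

weights = [[1, 2], [3, 4]]   # 2x2 weight matrix
x = [5, 6]                   # input vector
print(matvec(weights, x))    # [17, 39]
```

A CPU does these MACs a few at a time; an NPU does thousands in parallel at low power, which is the whole pitch.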

14% is just the share of chips shipped; there is very little use of these chips in everyday computing yet. Software developers will have to sit down and come up with uses for the NPU, implement them in their programs, and people will need to want to use those features.

ChatGPT and other OpenAI solutions mostly run in the cloud on large GPU-accelerated servers. You could set up a local one, but it would use a smaller model and wouldn't be entirely like the ones available online.

The idea behind the local chip would be something like using a large language model to help with predictive text, doing image manipulation, and other light end-user tasks without immediately tanking the battery.
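To make "predictive text" concrete, here is a minimal sketch of the idea in plain Python, using a simple bigram model rather than a real language model (the training sentence is made up; an actual on-device model would be a small neural network accelerated by the NPU, but the interface — given recent words, suggest the next — is the same):

```python
from collections import Counter, defaultdict

# Minimal bigram predictive-text sketch: a crude stand-in for the small
# on-device language model an NPU would accelerate.
def train(corpus):
    model = defaultdict(Counter)
    words = corpus.lower().split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def suggest(model, word):
    # Most frequent word seen after `word`, or None if unseen.
    counts = model.get(word.lower())
    return counts.most_common(1)[0][0] if counts else None

model = train("the cat sat on the mat and the cat slept")
print(suggest(model, "the"))  # cat
```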

Suffice it to say it is not a mature product; the reason you don't know anything about it is that it really isn't a thing yet.
 

Ralston18

Titan
Moderator
Regarding:

"Technically, 14% of PCs shipped globally in the second quarter contained a neural processing unit (NPU), formally making them AI PCs, reports Canalys"

Curious:

How do we look for and confirm that a NPU is present?
 

USAFRet

Titan
Moderator
How do we look for and confirm that a NPU is present?
https://www.techradar.com/computing/cpu/what-is-an-npu

https://www.backblaze.com/blog/ai-101-gpu-vs-tpu-vs-npu/

How to Check If My Intel® Processor Has an Integrated Neural Processing Unit (NPU)
https://www.intel.com/content/www/us/en/support/articles/000097597/processors.html
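Besides the links above: there's no single cross-platform API for this, but one practical approach is to list your devices (Device Manager / `Get-PnpDevice` on Windows, `lspci` on Linux) and scan the names for NPU-related strings. A hedged sketch — the keyword list below is my own guess, not an official registry, and the sample device names are illustrative:

```python
# Heuristic check: scan device-name strings for NPU-related keywords.
# Keyword list is illustrative, not authoritative. You might feed this the
# device names reported by Device Manager / Get-PnpDevice on Windows,
# or lspci on Linux.
NPU_KEYWORDS = ("npu", "neural", "ai boost", "xdna", "hexagon")

def looks_like_npu(device_name):
    name = device_name.lower()
    return any(keyword in name for keyword in NPU_KEYWORDS)

devices = [
    "Intel(R) AI Boost",          # how a Meteor Lake NPU can appear
    "NVIDIA GeForce RTX 4070",    # a GPU, not an NPU
]
print([d for d in devices if looks_like_npu(d)])
```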


 
An NPU is just an optimized section of the CPU that can perform heavy-duty matrix calculations efficiently. The CPU and GPU are capable of doing the same work; they just use more power to do so.

14% is just the share of chips shipped; there is very little use of these chips in everyday computing yet. Software developers will have to sit down and come up with uses for the NPU, implement them in their programs, and people will need to want to use those features.

ChatGPT and other OpenAI solutions mostly run in the cloud on large GPU-accelerated servers. You could set up a local one, but it would use a smaller model and wouldn't be entirely like the ones available online.

The idea behind the local chip would be something like using a large language model to help with predictive text, doing image manipulation, and other light end-user tasks without immediately tanking the battery.

Suffice it to say it is not a mature product; the reason you don't know anything about it is that it really isn't a thing yet.
Hmmm, wasn't the Intel 8087 coprocessor supposed to speed up floating-point math, and didn't all of that get integrated into the Pentium CPU (which included lookup-table errors)? And aren't graphics processing units (GPUs) designed to do that same hardware function of rotating and translating three-dimensional objects in space to create "realistic" motion of persons and objects in video "virtual realities"? So what is new and different about the functionality of an NPU, other than the name, and maybe the ability to say "We have new stuff!"?
 

Eximo

Titan
Ambassador
I believe the bandwidth is significantly more than what was used in the past. But the key is to do it efficiently.

GPUs can be efficient, but they are also multipurpose these days. Powering up a whole GPU to access only part of it is going to use more power.

Additionally, it is better to think of modern GPUs as GPGPUs (General-Purpose Graphics Processing Units), since they have sub-functions like encode/decode, ray tracing, etc. That concept was fleshed out around the time CUDA and OpenCL were taking off, making it possible to use the GPU for more than graphics.
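Part of where the NPU's efficiency comes from (beyond dedicated MAC hardware) is computing in low-precision integers such as INT8 instead of 32-bit floats. A rough sketch of the idea in plain Python, with made-up values and scale factors:

```python
# Rough sketch of INT8 quantization, one trick NPUs lean on for efficiency:
# do the multiply-accumulates in cheap 8-bit integers, then rescale to float.
def quantize(values, scale):
    # Map floats to the signed 8-bit range [-128, 127].
    return [max(-128, min(127, round(v / scale))) for v in values]

def int8_dot(a, b, scale_a, scale_b):
    qa, qb = quantize(a, scale_a), quantize(b, scale_b)
    acc = sum(x * y for x, y in zip(qa, qb))  # integer MACs, cheap in hardware
    return acc * scale_a * scale_b            # rescale back to float

a = [0.5, -1.0, 0.25]
b = [1.0, 0.5, -0.75]
approx = int8_dot(a, b, scale_a=0.01, scale_b=0.01)
exact = sum(x * y for x, y in zip(a, b))
print(approx, exact)  # close, despite the 8-bit arithmetic
```

With well-chosen scales the result is nearly identical to the float computation, but the hardware only had to do small-integer multiplies — that's the power saving.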
 
Okay, I think some of this makes sense to me. Per this description, the optimization targets a subset of general-purpose GPU functionality and is referred to as a Neural Processing Unit, to make it sound much more usable for AI, if anything is ever programmed to use the NPU. Otherwise, it's more wasted capability . . .
 

Eximo

Titan
Ambassador
Okay, I think some of this makes sense to me. Per this description, the optimization targets a subset of general-purpose GPU functionality and is referred to as a Neural Processing Unit, to make it sound much more usable for AI, if anything is ever programmed to use the NPU. Otherwise, it's more wasted capability . . .

Yep, we really have to wait for software developers to start taking advantage of it. Right now it would just be enthusiasts and developers installing things specifically to make use of it.