Aug 15, 2024
I read an article stating:

"Technically, 14% of PCs shipped globally in the second quarter contained a neural processing unit (NPU), formally making them AI PCs, reports Canalys"

Asking myself, what does "AI capable" really mean in layman's terms? As best I know, AI is currently (ChatGPT-4, whatever) trained for language and searching. I don't mind having better search engine capabilities, but what else does this do for me? I'm not about to start training a deep learning routine, nor do I have tremendous datasets to worry about.

Are all of the 14% of PCs being shipped used by the financial industry to "predict the future" just to make 0.1% more money than the other electronic trading algorithms?

I'd really like to understand the who / what / where / when, if anyone is interested in sharing some insights with a barely technical reader . . .
 

Eximo

An NPU is just a specialized section of the CPU built to do heavy-duty matrix calculations efficiently. The CPU and GPU are capable of doing the same math; they just use more power to do so.
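To picture what "heavy-duty matrix calculations" means in practice, here is a minimal Python/NumPy sketch of the kind of workload involved. The sizes are made up, but one pass like this is roughly the shape of a single neural-network layer, and the CPU, GPU, or NPU all end up doing the same arithmetic with different power costs:

```python
# Minimal sketch: the multiply-accumulate workload an NPU is designed for.
# NumPy runs this on the CPU; an NPU does the same math more efficiently.
import numpy as np

# Hypothetical sizes, roughly one neural-network layer.
activations = np.random.rand(1, 1024).astype(np.float32)  # input vector
weights = np.random.rand(1024, 1024).astype(np.float32)   # layer weights

# One matrix multiply here is about a million multiply-accumulate operations.
output = activations @ weights
print(output.shape)  # (1, 1024)
```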

That 14% is just the number of chips shipped; there is very little use for these NPUs in everyday software so far. Software developers will have to sit down, come up with uses for the NPU, implement them in their programs, and people will need to want to use those features.
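As a rough sketch of what "implement it in their programs" looks like, a developer might ask a runtime which hardware backends are available and prefer an NPU-backed one when present, falling back to the CPU otherwise. The example below assumes the onnxruntime package and a placeholder model.onnx file; which provider names actually show up (e.g. QNNExecutionProvider on Qualcomm hardware, DmlExecutionProvider via DirectML) depends on the platform and how onnxruntime was built:

```python
# Rough sketch: pick an NPU-backed execution provider if one is available.
# "model.onnx" is a placeholder path, not a real file in this thread.
import onnxruntime as ort

available = ort.get_available_providers()
print("Available providers:", available)

# Provider names vary by platform/build; these are examples, not guarantees.
preferred = [p for p in ("QNNExecutionProvider", "DmlExecutionProvider")
             if p in available]
preferred.append("CPUExecutionProvider")  # always keep a CPU fallback

session = ort.InferenceSession("model.onnx", providers=preferred)
print("Using:", session.get_providers())
```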

ChatGPT and other OpenAI offerings mostly run in the cloud on large GPU-accelerated servers. You could set up a local model, but it would be a much smaller one and wouldn't be anything like the ones available online.
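For the "local one" idea, here is a hedged sketch using the Hugging Face transformers library with a deliberately tiny model (distilgpt2, chosen only because it is small); the output quality is nowhere near the hosted services, which is exactly the point about local models being a smaller subset:

```python
# Minimal sketch of running a small language model locally instead of in the
# cloud. Requires: pip install transformers torch
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
result = generator("An AI PC is a computer that",
                   max_new_tokens=30, num_return_sequences=1)
print(result[0]["generated_text"])
```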

The idea behind the local chip is to do things like running a large language model for predictive text, doing image manipulation, and handling other light end-user tasks without immediately tanking the battery.

Suffice it to say it is not a mature product, so the reason you don't know anything about it is that it really isn't a thing yet.