Qualcomm Snapdragon 820 Deep Learning SDK Nudges Local Neural Nets Closer To Mainstream

Status
Not open for further replies.
I was under the impression one needed some serious hardware to effectively implement such systems. I can understand Nvidia GPUs being up to the task, but not my mobile phone.
 
Part of the reason such heavy GPUs are used when mobile chips can do similar things is versatility. Much of the speed gained in these neural-net chips comes from using half precision (16 bits). While this works really well for neural nets (the error comes more from small training samples than from floating-point precision), it isn't very useful for graphics. Basically, you are throwing very specialized hardware at a task to make it much more efficient. As a side note, these chips are probably meant to start with a pre-trained base model and incorporate new data in real time, which means the device is never building a full model itself.
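A minimal sketch of that half-precision trade-off, using NumPy (not anything from the Snapdragon SDK itself): converting typical small neural-net weights to float16 halves the memory footprint while introducing only a tiny rounding error, which is why 16-bit math is acceptable for inference but too coarse for graphics pipelines.

```python
import numpy as np

# Neural-net weights are typically small values near zero, where
# float16 still keeps roughly 3 decimal digits of precision.
weights = np.random.default_rng(0).normal(0, 0.1, 10_000).astype(np.float32)

half = weights.astype(np.float16)                 # 16-bit copy: half the memory
err = np.abs(half.astype(np.float32) - weights)   # per-weight rounding error

print(weights.nbytes, half.nbytes)  # float16 uses half the bytes of float32
print(float(err.max()))             # worst-case rounding error, on the order of 1e-4
```

The memory saving is exactly 2x, and the bandwidth saving is what buys most of the speed on dedicated hardware; the rounding error stays well below the noise already present in a trained model.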
 
I'm far more excited about not needing the cloud and "hopefully benevolent" services for voice recognition/natural language features. If that can be done on-device... well, a lot of voice-activated tech sounds much more private and appealing to me.
 
Deep Learning and Neural Nets? Seriously? Didn't anyone watch "The Terminator"?
You obviously don't know what neural nets are.

https://en.wikipedia.org/wiki/Artificial_neural_network

Hilarious. I read the Wikipedia article, and it sounds eerily similar to Skynet. You obviously have never seen Terminator or Battlestar Galactica.

I'm with LordConrad. This stuff scares me a little.
 
The prospect of decoupling natural language interaction from the cloud by itself is extremely exciting to me... I'd much rather not have to depend on network signal, low latency, and "hopefully benevolent" (not harvesting personal data) cloud services for something that can be done on-device.
 