The Rise Of Client-Side Deep Learning


I'd say it is still a few decades too soon to be scared since such an AI using foreseeable future tech would still be several times larger than IBM's Watson.
 

Nuff said. :)

It will be interesting to see how this all plays out. Who knows, maybe in the very near future we might have an AI that is really close to thinking on its own with all these deep learning networks. That's a scary thought.

I feel like an open source project by AMD and NVIDIA could replace the US Senate and the House without leaving us in too bad of shape lol
 
I'm glad they're recognizing that you need computing muscle to train these things... I remember 1997-98, when I was an exchange student in Spain taking a course in 'Redes neurales' (neural networks), trying to train a very small neural network to control an inverted pendulum. I had the assignment run training nightly on a PII 200 I bought to get through the course (among others).

I don't think I ever got a good network 'configuration' out of the training. Might have been due to bad input or the Italian guy I was working with...
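
For the curious, here's roughly what that kind of assignment boils down to: a minimal sketch of training a tiny network to imitate a stabilizing pendulum controller. Everything here is assumed for illustration - the linear 'teacher' gains, the 2-8-1 network, the physics constants - none of it is the original coursework.

import numpy as np

rng = np.random.default_rng(0)

# "Teacher" controller: linear state feedback u = -K @ [angle, angular_velocity].
# The gains are assumed, roughly what an LQR design might give.
K = np.array([20.0, 4.0])
U_SCALE = 25.0  # normalize torques to roughly [-1, 1] so training stays tame

# Training data: random states around upright, labeled with the teacher's torque.
X = rng.uniform(-1.0, 1.0, size=(2000, 2))
y = -(X @ K)[:, None] / U_SCALE

# Tiny 2-8-1 MLP with a tanh hidden layer, trained by full-batch gradient descent.
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1

for _ in range(3000):
    h = np.tanh(X @ W1 + b1)            # forward pass
    err = (h @ W2 + b2) - y             # MSE gradient, up to a constant factor
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1.0 - h**2)    # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Roll out the learned controller on simple inverted-pendulum dynamics.
g_over_L, dt = 9.81, 0.02               # assumed physics constants
theta, omega = 0.3, 0.0                 # start 0.3 rad off upright
for _ in range(500):
    h = np.tanh(np.array([theta, omega]) @ W1 + b1)
    u = (h @ W2 + b2).item() * U_SCALE  # network's torque command
    omega += (g_over_L * np.sin(theta) + u) * dt
    theta += omega * dt
print(f"final angle: {theta:+.4f} rad")  # should settle near 0 (upright)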
 

I'd say it is still a few decades too soon to be scared since such an AI using foreseeable future tech would still be several times larger than IBM's Watson.
Yes, especially since current µP technology seems to be nudging against a physical barrier that keeps us from getting smaller and faster as quickly as before. Unfortunately, that doesn't feel very comforting, since you don't know what the mad doctor would do...
 
This was rather silly marketing propaganda, not surprisingly. They compared the TX-1's GPU to the Skylake CPU cores. Had they used the i7-6700K's GPU, it would've stomped the TX-1 in raw performance, and had more comparable efficiency numbers.

The only real takeaway from that comparison is that deep learning is almost perfect for GPUs. But that's pretty obvious to anyone who knows anything about it.
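
A toy illustration of why (numpy standing in for the GPU kernel; the layer sizes are made up): one layer of inference is a single dense matrix multiply, i.e. a million-odd independent dot products of identical, branch-free arithmetic, which is exactly what GPUs are built for.

import numpy as np

batch, n_in, n_out = 256, 4096, 4096
x = np.random.randn(batch, n_in).astype(np.float32)  # activations
W = np.random.randn(n_in, n_out).astype(np.float32)  # layer weights

# One forward pass through the layer: batch * n_out independent dot
# products (~1M of them, ~8.6 GFLOPs) with no branching between them.
y = np.maximum(x @ W, 0.0)  # matmul + ReLU
print(y.shape, f"{2 * batch * n_in * n_out / 1e9:.1f} GFLOPs")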
 
Did you actually read the article?

He's not talking about machines performing abstract reasoning or carrying out deep thought processes, or anything like that. The whole article is basically about client devices becoming powerful enough to run complex pattern-recognition networks that have been generated in the cloud. This really has nothing to do with Skynet.
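
In code terms, the split the article describes looks roughly like this - a sketch only, with made-up layer sizes, a made-up file name, and random weights standing in for a real cloud training run:

import numpy as np

# --- Cloud side: train once on big iron, then ship only the weights. ---
# (Random weights stand in for a real training run here.)
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(64, 32)), np.zeros(32)
W2, b2 = rng.normal(size=(32, 10)), np.zeros(10)
np.savez("model.npz", W1=W1, b1=b1, W2=W2, b2=b2)

# --- Client side: load the frozen network, run only the forward pass. ---
params = np.load("model.npz")

def classify(features):
    """Pure inference: two small matmuls, no gradients, no training loop."""
    h = np.tanh(features @ params["W1"] + params["b1"])
    scores = h @ params["W2"] + params["b2"]
    return int(np.argmax(scores))  # index of the recognized class

print(classify(rng.normal(size=64)))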

And there's not even any real news here. It's just a background piece that's specifically intended to educate people about the new capabilities that client devices are gaining. I'd hope people would be less likely to fear neural networks running in their phones after reading this.

I'm starting to get annoyed by people reflexively trotting out that refrain every time the subject of machine learning comes up. If we're to avoid a Skynet scenario, then people will need to become more sophisticated in distinguishing the enabling technologies for true cognition from the run-of-the-mill machine learning that is the bread and butter of companies like Facebook and Google.
 
There is already a consortium that's developing invisible NPU/VPU SoCs that are embedded in M-glass. The main developer behind this has worked some IP magic with Qualcomm and Intel, as well as some bigwigs across the pond. You'd be amazed at the things they're doing with 500 MHz scaled products.
 
Link, pls. If there's a consortium, there should be some sort of public announcement, at least.

What are the intended applications - windows, eyeglasses, automotive, etc.?

IMO, it sounds awfully expensive, and I'm not clear on the point of embedding it in glass. I also wonder how vulnerable modern ICs are to sunlight. Heat dissipation would also be a problem. And then there's power.
 
As AI becomes more refined and more practicable, you can be assured that DARPA is following these developments very, very closely.
 