Syntiant's NDP200 chip has learned to play Doom using just a thousandth of a watt of power.
Neural Chip Plays Doom Using a Thousandth of a Watt : Read more
"It's not surprising that it just uses 1 milliwatt; it's a 5-line algorithm with 3 inputs. It's turning left and right and randomly firing until it hits something."

Did you not read the article? It's not doing that at all. It had to learn to scan a video frame, detect a "demon" within the frame, and kill it while conserving ammunition, using a neural network of approximately 600,000 parameters.
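Just for a sense of scale, here's a minimal PyTorch sketch of what a detector in that parameter range can look like; the layer sizes and input resolution are my own illustrative guesses, not Syntiant's actual architecture:

```python
# Illustrative only: layer sizes are guesses, not Syntiant's design.
import torch.nn as nn

class TinyDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),  # 16x16 -> 8x8
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
            nn.Linear(128, 6),  # e.g. objectness, box (x, y, w, h), "shoot" score
        )

    def forward(self, x):
        return self.head(self.features(x))

model = TinyDetector()
print(sum(p.numel() for p in model.parameters()))  # ~548K: same ballpark as ~600K
```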
"Ugh. I want the home of the past. No gizmos, pls."

You should avoid those horseless carriages too. They can kill people outright!
The more they push AI, the more it reminds me of Battlestar Galactica: we're gonna need to revert to the old ways if we're not careful.
I don't want anything in my house having Internet capabilities unless I choose what it is, like my modem, PC, and 📱.
The devices are one thing. The home of the far future might have hundreds of thousands of such devices in it, some small and light enough to literally float in the air.
"Did you not read the article? It's not doing that at all."

He doesn't care. His brand of trolling is to dismiss everything as derivative, unremarkable, or downright bad.
"a neural network of approximately 600,000 parameters"

Which is actually quite small, for object detection.
"Low-power chips like this are the future of computing. Once you get into the sub-mW range, you can harvest ambient energy from the environment, allowing IoT devices to operate without batteries or wires."

Yeah, the key point is that it's low-power enough to embed object detectors in everyday electronics. That's a game-changer, since it means you could potentially have something like a doggie door that unlocks only for your dog and not raccoons, squirrels, or even a neighbor's nosy cat that tries to enter. Just to give one example.
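To put the sub-milliwatt point in perspective, some quick back-of-envelope math (my numbers, not the article's):

```python
# Back-of-envelope battery math; assumes a CR2032 coin cell (~225 mAh at 3 V).
# Real capacity varies with load and temperature.
capacity_wh = 0.225 * 3.0                     # ~0.675 Wh stored energy
for draw_mw in (1.0, 0.1):
    hours = capacity_wh / (draw_mw / 1000.0)  # energy / power = runtime
    print(f"{draw_mw} mW -> {hours:,.0f} h (~{hours / 24:.0f} days)")
```

A month per coin cell at 1 mW, and once the draw falls into the tens of microwatts, indoor light or RF harvesting plausibly covers it outright, which is the battery-free scenario you're describing.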
"all of which were squeezed into the NDP200's 640Kb of RAM"

😬 ...struggling not to make a 640K joke. (And was that KB, or really Kb?)
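For what it's worth, the ~600K parameter figure makes kilobytes the plausible reading; a quick check, assuming 8-bit quantized weights (which the article doesn't specify):

```python
# Sanity check on the KB-vs-Kb question, assuming the ~600K parameters are
# stored as 8-bit quantized weights (the article doesn't say).
params = 600_000
print(params * 4 / 1024)  # float32: ~2344 KB -- far too big for 640 KB
print(params * 1 / 1024)  # int8:    ~586 KB  -- just fits in 640 KB
print(640 / 8)            # 640 kilobits would be only 80 KB -- nothing fits
```

At 640 kilobits (80 KB), even 8-bit weights wouldn't come close to fitting.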
"He doesn't care. His brand of trolling is to dismiss everything as derivative, unremarkable, or downright bad."

For comparison, when I boot up Stable Diffusion, it shows as using 859.52 million parameters.
He talks like he's been there & done that, but he's obviously never attempted anything like this. If he had, he might actually appreciate some of the challenges.
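Using the two figures quoted in this thread, the gap works out to roughly three orders of magnitude:

```python
# Scale comparison using the two figures quoted in this thread.
sd_params = 859_520_000   # Stable Diffusion, as reported above
ndp_params = 600_000      # NDP200 Doom model, per the article
print(f"{sd_params / ndp_params:,.0f}x")  # ~1,433x more parameters
```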
"I'm assuming if Elvis were to ride a tricycle through vizDoom, he wouldn't get shot by their AI, since that model probably isn't in its training."

Right. I was really wondering just how sensitive the model is, and how well it can discriminate. If they haven't shown its raw output on real-world video samples, then I'd be suspicious.
Or even scarier: maybe it does shoot Elvis riding a tricycle, deciding that an object so suspicious and out of place may pose an unknown threat.
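The usual guard against that is a confidence threshold on the detector's output. A minimal sketch of the idea, with made-up labels and thresholds (nothing here is Syntiant's actual design):

```python
# Minimal sketch of confidence gating; the label names, score format, and
# threshold are all made up here -- nothing below is Syntiant's actual design.
def should_fire(detections, threshold=0.8):
    """detections: list of (label, confidence) pairs from a detector."""
    return any(label == "demon" and conf >= threshold
               for label, conf in detections)

print(should_fire([("demon", 0.92)]))    # True: confident demon detection
print(should_fire([("unknown", 0.55)]))  # False: unfamiliar, low-confidence object
```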
"I am amazed that certain people throughout history acted the same way: they said the same thing about every invention, be it the steam train, the car, the airplane, electricity, machinery, you name it; they found something wrong to complain about and deemed it useless!! And here we are in 2023, on a tech website, and a 2023 version of those people are trolling in the same way as their ancestors!!"

I'm just skeptical of the people who look at the current AI implementations and call them a finished, perfect product.
"I'm just skeptical of the people who look at the current AI implementations and call them a finished, perfect product."

Are you referring to all applications of deep learning, or just specific ones? Because we can go down a litany of applications where deep learning has been very successful, if that's what's in contention.
"Yes, in its current state it's not good, and that's not to say that it won't get good when we have ultra-complex systems, most likely quantum systems, that will be able to learn."

It's not hyperbole to say that deep learning has completely revolutionized the field of computer vision. Starting about 10 years ago, it began to outperform classical methods on various computer vision problems. Today, if you wanted to do object detection, classification, matching, or recognition using anything but deep learning technology, your product would be utterly noncompetitive.
"Are you referring to all applications of deep learning, or just specific ones? Because we can go down a litany of applications where deep learning has been very successful, if that's what's in contention."

No.
"...But also, the 'answer' is frequently wrong. And now we have an OP that is even more confused, and still has a broken PC."

But that answer is wrong -less- often than the answer from an average person would be. I'm frankly flabbergasted that you don't see what an enormous stride forward that is. And in 5-10 years, that ChatGPT answer will be wrong less often than not just an 'average' human's answer, but even one from a tech site SME.
"But that answer is wrong -less- often than the answer from an average person would be. I'm frankly flabbergasted that you don't see what an enormous stride forward that is. And in 5-10 years, that ChatGPT answer will be wrong less often than not just an 'average' human's answer, but even one from a tech site SME."

'Less wrong' than from the average person? Sure. But 'the average person' would not be replying to a problem they have no clue about.
"No."

Well, that's not what this article is about, so you can perhaps understand my confusion, and my concern that people seem to be casting doubt on the abilities of deep learning even in computer vision applications like the one this product is (mostly) focused on.
I'm referring to the current public-facing incarnations of ChatGPT and whatever the Google/Bing thing is.
"Well, that's not what this article is about, so you can perhaps understand my confusion, and my concern that people seem to be casting doubt on the abilities of deep learning even in computer vision applications like the one this product is (mostly) focused on."

My reply was to this comment, not the main article:
"My reply was to this comment, not the main article:"

Understood, but it seems some people are so triggered by the term "AI" that every thread even loosely related gets pulled off onto the same tangents. I think there are enough threads about ChatGPT and related technologies that we can hopefully litigate them there, instead of having the same repetitive debates on like half the articles.
https://forums.tomshardware.com/thr...a-thousandth-of-a-watt.3799177/#post-22948626