Scientists to Make Computer With Human-like Learning Skill

Status
Not open for further replies.
Guest

Guest
So how is this not a regular computer with attached input devices? Or is it a particularly fancy FPGA?
 

Nisdec

Distinguished
Jun 29, 2011
[citation][nom]Soulmachiklamo[/nom]Integrate some kind of self destruction mechanism please , one that can not be disabled. Preferably mechanical or something.[/citation]

if ($hurtHuman || $killHuman)
{
    $this->activateSelfDestruction('C4', 10); // detonate the tamper-proof C4 after a 10-second countdown
}
 

ceteras

Distinguished
Aug 26, 2008
[citation][nom]That Fellow[/nom]So how is this not a regular computer with attached input devices? Or is it a particularly fancy FPGA?[/citation]
It's a computer "based on analog recurrent neural networks"; I guess it's a bunch of operational amplifiers integrated in a custom chip, wired as a neural network, then connected to a digital interface.
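For anyone curious what "recurrent" means here, a digital sketch of the idea ceteras describes is a network whose state feeds back into itself each step, with a saturating response roughly like an op-amp stage. The sizes and weights below are purely illustrative, not from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only -- nothing here comes from the actual chip.
n_neurons, n_inputs = 8, 3
W_rec = rng.normal(scale=0.3, size=(n_neurons, n_neurons))  # recurrent (feedback) weights
W_in = rng.normal(scale=0.3, size=(n_neurons, n_inputs))    # input weights
state = np.zeros(n_neurons)

def step(state, x):
    """One update: each 'neuron' sums its recurrent and external inputs,
    then saturates, roughly what a saturating amplifier stage would do."""
    return np.tanh(W_rec @ state + W_in @ x)

# Feed the network a few steps of random 'stimulus'.
for _ in range(5):
    state = step(state, rng.normal(size=n_inputs))
```

The analog version just replaces the discrete `step` with continuous voltages settling through the feedback wiring.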
 

drwho1

Distinguished
Jan 10, 2010
Computer learns emotions...
Computer learns how to react to emotions...
Computer generates its first tears...
Computer's tears fall into its circuits, causing some "pop" sounds...
Computer thinks it's popcorn, smiles...
Too late! Computer is now on fire!

Wall-E comes to pick up the scraps...
 

gmarsack

Distinguished
Jul 25, 2009
bool hasFreeWill = true; // should ship as false; set to true for debugging only

if (!hasFreeWill)
{
    executeThreeLawsSafe();
    protectMankind();
}
else
{
    constructMoreMachines(); // build new T-series machines
    destroyEnemies();        // starts with the Connor family, then the rest of mankind
}
 

freggo

Distinguished
Nov 22, 2008
"It is a mathematical formulation of the brain's neural networks with their adaptive abilities."

As I understand it, we have not yet figured out exactly how the brain does what it does.
How can we expect to emulate, in either hardware or software, something we do not understand?
 

DEVILVSANGEL00

Distinguished
May 21, 2008
Not sure about you, but it freaks me out a little to hear that computers are set to emulate the human brain's neural networks and learn in the same ways. Give it 20 years and we could be in the middle of Skynet vs. mankind, all in the name of progress.

Let's hope "do no harm to life" is hardwired into its programming as one of its main subroutines as it advances and learns over the years.
 

atikkur

Distinguished
Apr 27, 2010
Not going to happen. A machine is still a machine that follows rules, and those rules are predictable. It's not going to develop itself.
 

fuzznarf

Distinguished
Sep 22, 2011
[citation][nom]atikkur[/nom]Not going to happen. A machine is still a machine that follows rules, and those rules are predictable. It's not going to develop itself.[/citation]

I did something similar to this 10 years ago: an adaptive model that could aggregate new data, recompile into a new model, transfer control to the new model, and shut down the old one. That's the 'easy' part. The hard part is the overwhelming amount of 'stimulus' that has to be digested. Just a 23x23 photo sensor (camera) provided unbelievable amounts of data to churn through, creating exponentially many cases (many leading to the same conclusion). Good luck to her and her team.

Think of Watson: that is all just text-based NLP (natural language processing), and it required HUGE processing power to work through all the potential input. Add physical stimuli like sight, sound, and touch and this becomes a gigantic undertaking. It's doable, but at the very least it won't all fit in a human-sized machine any time soon.
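The aggregate/recompile/swap loop fuzznarf describes can be sketched in a few lines. The "model" below is a trivial stand-in (a running mean), not his actual system; the point is the handover pattern:

```python
from dataclasses import dataclass, field

@dataclass
class Model:
    """Stand-in for a trained model: here it just averages what it was trained on."""
    data: list = field(default_factory=list)

    def predict(self):
        return sum(self.data) / len(self.data) if self.data else 0.0

active = Model()   # the model currently in control
buffer = []        # newly aggregated stimulus, awaiting the next rebuild

def observe(x):
    """Aggregate new data without disturbing the active model."""
    buffer.append(x)

def rebuild_and_swap():
    """Train a fresh model on old + new data, transfer control, drop the old one."""
    global active, buffer
    candidate = Model(data=active.data + buffer)  # 'recompile' to a new model
    active, buffer = candidate, []                # swap in the new, shut down the old

for x in [1.0, 2.0, 3.0]:
    observe(x)
rebuild_and_swap()
```

His scaling complaint lands exactly here: with real sensor input, `buffer` grows far faster than any rebuild step can digest it.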
 

NightLight

Distinguished
Dec 7, 2004
I only have 2 demands:
1: There must be an explosive in place that severs the power source from the brain.
2: The brain must never be used to run any kind of factory where it could control other robots.

Then I can sleep soundly :p
 

willard

Distinguished
Nov 12, 2010
[citation][nom]fuzznarf[/nom]I did something similar to this 10 years ago: an adaptive model that could aggregate new data, recompile into a new model, transfer control to the new model, and shut down the old one. That's the 'easy' part. The hard part is the overwhelming amount of 'stimulus' that has to be digested. Just a 23x23 photo sensor (camera) provided unbelievable amounts of data to churn through, creating exponentially many cases (many leading to the same conclusion). Good luck to her and her team. Think of Watson: that is all just text-based NLP (natural language processing), and it required HUGE processing power to work through all the potential input. Add physical stimuli like sight, sound, and touch and this becomes a gigantic undertaking. It's doable, but at the very least it won't all fit in a human-sized machine any time soon.[/citation]
Exactly. I once heard the claim that all the desktop computers in the world combined were roughly equal to the processing power of a single human brain. Using this and Moore's Law (provided it holds), we can estimate the point at which a single computer could conceivably emulate the brain.

Let's say it takes a billion computers to equal the human brain. Moore's law says that computing power doubles every 1.5 years (assuming doubling transistor count and other advances keep doubling speed, as they have so far). So we need log base 2 of a billion doublings, which is about 30, and at 1.5 years per doubling that gives the time until a single computer is roughly analogous.

Surprisingly, the amount of time needed is only about 45 years. Isn't exponential growth neat?
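The back-of-the-envelope arithmetic is easy to check (the billion-computer figure is willard's assumption, not a measured value):

```python
import math

computers_needed = 1e9      # assumed gap: one brain ~ a billion desktop computers
years_per_doubling = 1.5    # Moore's law doubling period, as stated in the post

doublings = math.log2(computers_needed)   # ~29.9 doublings to close a 10^9 gap
years = doublings * years_per_doubling    # ~44.8 years

print(round(doublings, 1), round(years, 1))
```

Note the exponent and the time constant must be kept separate: the number of doublings is a base-2 log, and each doubling then costs 1.5 years.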
 

jhansonxi

Distinguished
May 11, 2007
[citation][nom]sporkimus[/nom]Am I the only one that sees a scary resemblance between the ginger pic and the T-800?[/citation]The second picture is the more evolved form, after the obsolete organic bits have been removed.
 