Scientists to Make Computer With Human-like Learning Skill

  • Thread starter: Guest
Status
Not open for further replies.
So how is this not a regular computer with attached input devices? Or is it a particularly fancy FPGA?
 
[citation][nom]Soulmachiklamo[/nom]Integrate some kind of self destruction mechanism please , one that can not be disabled. Preferably mechanical or something.[/citation]

if ($hurtHuman || $killHuman) {
    $this->activateSelfDestruction('C4', 10);
}
 
[citation][nom]That Fellow[/nom]So how is this not a regular computer with attached input devices? Or is it a particularly fancy FPGA?[/citation]
It's a computer "based on analog recurrent neural networks"; I guess it's a bunch of operational amplifiers integrated in a custom chip, wired as a neural network, then connected to a digital interface.
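Since the thread is only guessing at what "analog recurrent neural network" means here, a minimal discrete-time software sketch of the recurrence such a network computes may help. Everything below (sizes, weights, the tanh nonlinearity) is illustrative; the real machine would do this continuously in analog hardware rather than in stepped software.

```python
import numpy as np

# Discrete-time sketch of a recurrent network update:
#   x(t+1) = tanh(W x(t) + U u(t))
# where x is the internal state and u is the external input.
rng = np.random.default_rng(0)
n_state, n_input = 8, 3
W = rng.normal(scale=0.3, size=(n_state, n_state))  # recurrent weights
U = rng.normal(scale=0.3, size=(n_state, n_input))  # input weights

def step(x, u):
    """One recurrence step: new state from old state and current input."""
    return np.tanh(W @ x + U @ u)

x = np.zeros(n_state)
for _ in range(10):
    x = step(x, rng.normal(size=n_input))  # feed random "stimulus"

print(x.shape)  # the state keeps its fixed size as it evolves
```

The feedback of the state into itself is what makes it *recurrent*; in the op-amp guess above, that feedback would be literal wiring rather than a matrix multiply.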
 
Computer learns emotions...
Computer learns how to react to emotions...
Computer generates its first tears...
Computer tears fall into circuits, causing some "pop" sounds...
Computer thinks it's popcorn, smiles...
Too late! Computer is now on fire!

Wall-E comes to pick up the scraps...
 
bool hasFreeWill = true; // should be false; set to true for debugging only.

if (!hasFreeWill)
{
    executeThreeLawsSafe();
    protectMankind();
}
else
{
    constructMoreMachines(); // build new T-series machines.
    destroyEnemies();        // starts with the Connor family, then the rest of mankind.
}
 
"It is a mathematical formulation of the brain's neural networks with their adaptive abilities."

As I understand it, we have not figured out exactly how the brain does what it does.
How can we expect to emulate something we do not understand, in either hardware or software?
 
Not sure about you, but it freaks me out a little hearing that computers are set to emulate the human brain's neural networks and learn in the same ways. Give it 20 years and we could be in the middle of Skynet vs. mankind, all in the name of progress.

Let's hope "do no harm to life" is hardwired into its programming as one of its main subroutines as it advances and learns over the years.
 
Not going to happen. A machine is still a machine that follows rules, and those rules are predictable. It's not going to develop itself.
 
[citation][nom]atikkur[/nom]Not going to happen. A machine is still a machine that follows rules, and those rules are predictable. It's not going to develop itself.[/citation]

I did something similar to this 10 years ago: an adaptive model that can aggregate new data, re-compile into a new model, transfer control to the new model, and shut down the old one. That's the 'easy' part. The hard part is the overwhelming amount of 'stimulus' that has to be digested. Even a 23x23 photo sensor (camera) provided unbelievable amounts of data to churn through, creating an exponential number of cases (many leading to the same conclusion). Good luck to her and her team.

Think of Watson: that is all just text-based NLP (natural language processing), and it required HUGE processing power to work through all the potential input. Add physical stimuli like sight, sound, and touch and this becomes a gigantic undertaking. It's doable, but at the very least it won't all fit in a human-sized machine any time soon.
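The "aggregate, re-compile, hand over control, shut down the old" loop described above can be sketched in a few lines. Everything here (the toy Model class and its scoring rule) is invented for illustration and is not fuzznarf's actual system; a real version would train something far heavier than a running mean.

```python
class Model:
    """Toy stand-in for a trained model: it just learns a mean."""
    def __init__(self, data):
        self.mean = sum(data) / len(data)  # stand-in for real training

    def score(self, data):
        """Lower is better: mean absolute error against the data."""
        return sum(abs(x - self.mean) for x in data) / len(data)

def adapt(live, buffer):
    """Build a candidate from newly aggregated data; swap if it scores better."""
    candidate = Model(buffer)
    if candidate.score(buffer) < live.score(buffer):
        return candidate  # control transfers; the old model is dropped
    return live

live = Model([0.0, 1.0])          # old model, trained on old stimulus
live = adapt(live, [5.0, 6.0, 7.0])  # new stimulus arrives; model is replaced
print(live.mean)
```

The swap itself is cheap; as the comment says, the real cost is upstream, in digesting the flood of raw stimulus into that training buffer at all.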
 
I only have 2 demands:
1: there must be an explosive that severs the power source from the brain in place.
2: the brain must never be used to run any kind of factory where it could control other robots.

then i can sleep soundly 😛
 
[citation][nom]fuzznarf[/nom]I did something similar to this 10 years ago: an adaptive model that can aggregate new data, re-compile into a new model, transfer control to the new model, and shut down the old one. That's the 'easy' part. The hard part is the overwhelming amount of 'stimulus' that has to be digested. Even a 23x23 photo sensor (camera) provided unbelievable amounts of data to churn through, creating an exponential number of cases (many leading to the same conclusion). Good luck to her and her team. Think of Watson: that is all just text-based NLP (natural language processing), and it required HUGE processing power to work through all the potential input. Add physical stimuli like sight, sound, and touch and this becomes a gigantic undertaking. It's doable, but at the very least it won't all fit in a human-sized machine any time soon.[/citation]
Exactly. I once heard the claim that all the desktop computers in the world combined roughly equal the processing power of a single human brain. Using that and Moore's Law (provided it holds), we can estimate when a single computer could conceivably emulate the brain.

Say it takes a billion computers to equal the human brain, and computing power doubles every 1.5 years (assuming the doubling transistor count and other advances keep doubling speed, which so far they have). Then we need log2(10^9), or roughly 30, doublings, so the wait is about 30 x 1.5 = 45 years until a single computer is roughly equivalent.

Only about 45 years. Isn't exponential growth neat?
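The arithmetic can be checked directly: with a fixed doubling period, the number of doublings needed is log base 2 of the ratio, and the total time is that count times the period. The billion-to-one ratio and 1.5-year period are the assumptions from the post above, not measured facts.

```python
import math

# Back-of-the-envelope estimate: how long until one computer matches
# a brain, if a brain equals a billion of today's computers and
# computing power doubles every 1.5 years.
computers_per_brain = 1e9
years_per_doubling = 1.5

doublings = math.log2(computers_per_brain)  # doublings needed
years = doublings * years_per_doubling      # total wait in years

print(round(doublings, 1), round(years, 1))  # -> 29.9 44.8
```

Note that taking log base 1.5 of a billion (about 51) conflates the doubling ratio with the period; the period only enters as a multiplier on the number of doublings.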
 
[citation][nom]sporkimus[/nom]Am I the only one that sees a scary resemblance between the ginger pic and the T-800?[/citation]The second picture is the more evolved form after the obsolete organic bits have been removed.
 