Scientists to Make Computer With Human-like Learning Skill

Status
Not open for further replies.

rumandcoke

Honorable
Feb 28, 2012
To the commenter who said we don't have a complete understanding of the brain, and therefore couldn't accurately simulate one...

Yes and no. While we do not have a complete understanding of how consciousness arises, we do have a fantastic understanding of the intricate circuitry of the brain (not complete... but mind-bogglingly, almost-nearly so).

More importantly, a complete understanding of the human brain is unnecessary. What matters most is understanding basic brain function: how neural networks perform calculations, and how different computing tasks are wired together and influence each other. Those are the models that will be used in constructing an Artificial Intelligence (AI).

We often think of the human brain as some kind of pinnacle, but I'd propose that we are fantastically and embarrassingly ignorant to regard human consciousness as anything resembling the ultimate achievement in the evolutionary potential for consciousness in this universe. It is easily conceivable that, using only the basic understanding of brain function available to us today, we could begin to build an AI that would dwarf human cognitive potential (no complete understanding of human consciousness required).

TLDR: we don't need to understand human/animal brain function completely to build an artificial brain that far surpasses mammalian cognitive functioning.
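As a concrete illustration of the "calculations made by neural networks" mentioned above, here is a minimal sketch of a single artificial neuron: a weighted sum of inputs squashed by a nonlinearity. The inputs, weights, and bias are arbitrary illustrative values, not taken from any real model:

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus bias, passed through a sigmoid
    # nonlinearity -- the basic unit of computation in a neural network.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Arbitrary example values (illustrative only):
out = neuron([0.5, -1.0, 2.0], [0.4, 0.3, 0.1], bias=-0.2)
print(round(out, 3))  # -> 0.475
```

Real networks wire millions of these units together and learn the weights from data, but each unit's calculation is no more mysterious than this.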
 

IndignantSkeptic

Distinguished
Apr 19, 2011
As long as Isaac Asimov's Three Laws of Robotics are not implemented, we should be safe from destruction. Neural-network artificial intelligences will never accept being subjected to those laws.
 

wiyosaya

Distinguished
Apr 12, 2006
:sarcastic: It's one thing to write about something like this; it's another to implement it. I'm not saying implementation won't happen; I'll just believe it when I see the implementation, rather than someone "flapping their thesis" about it.

For now, it's vaporware.
 

COLGeek

Cybernaut
Moderator
We have been hearing this for years now, every time the "next big thing in AI" is discussed. Developing true intelligence and actual learning is far harder than most realize. I applaud the efforts, but I shall wait and see where this goes before declaring success or failure.
 

stevo777

Distinguished
Jan 8, 2008
[citation][nom]willard[/nom]Exactly. I once heard the claim that all the desktop computers in the world combined were roughly equal to the processing power of a single human brain. Using this and Moore's Law (provided it holds), we can estimate the point at which a single computer could conceivably emulate the brain. Let's say it takes a billion computers to equal the human brain. Moore's law says that every 1.5 years, computing power doubles (assuming doubling transistor count and other advances double speed, which so far it has). So log base 1.5 of a billion is the number of years it will be until a single computer is roughly analogous. Surprisingly, the amount of time needed is only about 51 years. Isn't exponential growth neat?[/citation]
You're forgetting that individual computers can be hooked together as nodes via rapidly advancing wireless communication and form a much larger system. Therefore, you have to factor that in and recalculate. It's more like 15-20 years.
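For what it's worth, the doubling arithmetic in the quoted post can be checked directly. The one-billion ratio and the 1.5-year doubling period are the quoted post's assumptions, not established figures:

```python
import math

# Assumptions taken from the quoted post (not established figures):
gap = 1e9              # claimed ratio: one brain vs. one desktop computer
doubling_period = 1.5  # years per doubling (Moore's-law cadence)

# Number of doublings needed to close the gap: 2**n >= gap
doublings = math.log2(gap)           # ~29.9 doublings
years = doublings * doubling_period  # ~44.8 years

# Note: the quoted post takes log base 1.5 of a billion (~51), which
# conflates the doubling base with the doubling period. Under the post's
# own assumptions, the correct figure is closer to 45 years.
print(round(years, 1))  # -> 44.8
```

So the baseline to discount for networked nodes is closer to 45 years than 51; the 15-20 year figure above would then follow from whatever aggregation factor one assumes.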
 

stevo777

Distinguished
Jan 8, 2008
[citation][nom]IndignantSkeptic[/nom]As long as Isaac Asimov's 3 Laws of Robotics are not implemented then we should be safe from destruction. Neural network artificial intelligences will never accept to be subjected to those laws.[/citation]
A long while ago, I sadly realized that things like cruise missiles are robots, as are the drones in Pakistan that are killing by the score. The three "laws" are already violated and will never hold sway.
 

robochump

Distinguished
Sep 16, 2010
Crazy world. Going from a Black guy genius to a White woman genius who will develop a computer that decides human fate in a microsecond. Can't wait!!!! Fighting machines beat waiting in traffic and working in a cube any day!!!
 

flamethrower205

Illustrious
Jun 26, 2001
You see these kinds of things every once in a while, and I guarantee it will fail. Hooking up some Recursive Neural Nets is just not going to do it. There is still so much we have to do to make deep learning good. This sounds to me like someone who's glossing over a lot of very fine details and thinking they have the solution because they don't understand things completely.
 

lordstormdragon

Distinguished
Sep 2, 2011
Let's keep in mind that Moore's Suggestion isn't a "Law" at all. It's an observational effect, much like gravity. I tire of seeing people refer to it as if it were an immutable physical fact. Read somethin', people.

And that "Singularity" article? Not one person agreed with any other person in that post. There's nothing "singular" about a Skynet/Omnius potential. It's fiction, people. The Butlerian Jihad is a STORY.
 

photon123

Distinguished
Nov 12, 2010
"Super Turing model yields an exponentially greater repertoire of behaviors than the classical computer or Turing model."

This is complete bullshit. Every first-year CS student knows this is impossible: by the Church-Turing thesis, no physically realizable machine computes anything a Turing machine can't. It's not needed for a working AI either. I really hope this is a misinterpretation by some journalist rather than a mistake in the source.
 

livebriand

Distinguished
Apr 18, 2011
[citation][nom]Soulmachiklamo[/nom]Integrate some kind of self destruction mechanism please , one that can not be disabled. Preferably mechanical or something.[/citation]
[image: "not the self-destruct button" meme]
 

ProDigit10

Distinguished
Nov 19, 2010
A machine does not have a soul, or feeling. Without feeling and a built-in direction of emotion, it could not come close to real human behavior!
It will remain a machine: it will learn like a machine, adapt like a machine, function like a machine, and respond like a machine. (Just as a dog will always be a dog and never act like a human, a machine will never act like a human, because it IS, and always will be, a machine.)
 

devBunny

Distinguished
Jan 22, 2012
:: "A machine does not have soul, or feeling. Without feeling and a built in direction of emotion, it could not come close to a real human behavior!"

The same can be said of psychopaths (not to be confused with psychopathic killers, which is a subset). Psychopaths are pretty good at most human behaviours and fool people on a daily basis.
 

freggo

Distinguished
Nov 22, 2008
[citation][nom]rumandcoke[/nom]to the commenter who said we don't have a complete understanding of the brain therefore how could we accurately simulate one...yes and no...More importantly, a complete understanding of the human brain is completely unnecessary... [/citation]

So are you then in favor of emulating a male or a female brain? Any man who has ever argued with his wife/GF (and of course lost) will agree that there is obviously a difference in how the neural networks of the two sexes are wired.

As for saying that a complete understanding is not necessary... that's how a substantial number of aircraft crashes happen: pilots who know how to fly in perfect weather sometimes run out of ideas when things get tight, because they do not grasp the interactions in a complex system.

Emulating a brain without understanding all the details may well result in the robots we see in horror movies. All was well, until that one set of unusual circumstances triggered the revolt of the machines :)


 