Is the Human Brain the Fastest CPU?

Ok, just getting back to the memory thing for the brain: what you're saying is that the human brain has no hard disk, and a finite but large RAM capacity, but the cache is monstrous, and the closer you get to the neurons/CPU, the higher the memory capacity?

The other thing I wanted to say was that to create an AI (it would be software) you would need to grow it/update it while it was running, with more and more complex programming, until it became self-aware, and I don't know where the threshold of self-awareness would be.

If you want an example of that: what is the earliest memory you remember? For me it was about when I was four years old, so three and a half years of accumulating experience before I remember being self-aware.

Flame away (been up for about 35 hours!)
 
No. You see, the brain has pathways that have arranged themselves naturally based on stimulus. Each cell in the brain reacts differently to a stimulus. The right stimulus to the right cell type creates the signal necessary to complete the task. Example: you move your arm; the entire brain receives that command, therefore you remember your arm moving, your ears prepare to hear it, your eyes watch it, and your muscles move it.

What he means by the brain having no memory is that neural pathways are always being adapted; as the pathways move around, shrink, or grow, the "reception" to the surrounding areas of the brain reduces. The less you recall a memory, the faster its place in the brain is reapplied to another memory. The brain is dynamic in that it never stops changing: neurons are made every moment, and each one participates in a thought or brain activity, or it dies. So if you run a "process" all the time, it becomes embedded, often to the point of being autonomic; breathing is done so often you don't have to remember to inhale or exhale. Calculating 2+2 does not require a pencil to figure out; the naturally created pattern of that thought has already been made. Every memory is a pattern, and the pattern of neurons creates a signal that the brain understands as "that picture was taken when I was...".
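A loose software analogy for a "process becoming embedded" is caching: once an answer has been worked out, it can be recalled instead of recomputed. A minimal Python sketch of that idea (only an analogy, not a claim about how neurons store anything):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def slow_add(a, b):
    # Pretend this is the slow, deliberate "count it out on paper" route.
    print(f"working out {a} + {b} the long way...")
    return a + b

print(slow_add(2, 2))  # first time: the slow path actually runs
print(slow_add(2, 2))  # second time: the cached answer comes back instantly
```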

To further explain how this works: if you make a pile of sand by pouring it from a bucket, you get a mound. That mound is a pattern, a naturally formed structure based on the medium used. Now imagine that an impulse from your frontal lobe sends a signal through the brain; the medium that carries that signal creates a neural pathway in a natural way. The pulse itself creates the neural pathway, and the different "isotopes" of neural configurations react differently to each signal they receive. You see, the original brain development happens in the womb; the parts that are formed first involve survival and maintaining the body. All of the things the brain does without us noticing are pathways formed in the womb, prior to birth.

Time for work. I'll edit this and add more when I get a chance.
 
If you want an example of that: what is the earliest memory you remember? For me it was about when I was four years old.

My first memories begin at 4 years old as well.
Strange?
Judging by that, I would think that the brain is developed enough around the age of 4 to finally store long-term memories.
Self-awareness is an interesting concept, though.
 
Wow. People need to stop. There are too many people who really just don't know what they're talking about in this thread.

Allow me to summarize:

1) The human brain is NOT the fastest CPU.
This is because the human brain is not a CPU. It's a human brain. It's like saying the 747 is the fastest car you can buy.


People need to stop discussing HOW the brain works, because most people are basing what they say on something they read and (mis)interpreted online. There are so many blatant errors in this forum it's almost sickening to read.

Perhaps I digress. Perhaps I should post another topic, so as not to hijack this one, entitled "Beef is the healthiest fruit." :roll:
 
I think this would be the best place to discuss this.

In another thread the point was raised that the fastest CPU is the human brain; not wanting to hijack that thread, I thought I would post a new one.

The Human Brain Is the Fastest CPU

Now that's open to debate, seeing as a brain acts as both a hard drive and a CPU.

Also, what is 928346*98236?
I'm sure the computer can do that faster than the human brain. Hell, try doing what SuperPi does in 30 seconds in your brain.
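For a sense of scale, here is a quick Python timing sketch of that exact multiplication. Exact timings vary by machine, but a single integer multiply finishes in nanoseconds:

```python
import timeit

# Use variables so the interpreter can't pre-compute the product at compile time.
setup = "a, b = 928346, 98236"
total = timeit.timeit("a * b", setup=setup, number=1_000_000)

print("928346 * 98236 =", 928346 * 98236)
print(f"average time per multiplication: {total / 1_000_000 * 1e9:.1f} ns")
```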

I would suggest that a human brain is more like a 100-core processor, with a lot of the cores dedicated to individual functions that wouldn't be much good at doing other things (a CPU can't act as a GPU).

What makes something the fastest CPU?


Well, the CPUs of today are nowhere close to the power of the brain (except for high-level calculation involving numbers, etc.). I am pretty sure that the power of the human brain will be achieved about 50 years from now at the current pace. It is mostly not dependent on the hardware but rather the software. The current hardware is probably worth 10% of the brain, but the software is about 2% in comparison with the brain. There are some things that the PC is still better at than we are, such as playing chess. Just look at Deep Blue. The hardware was fully capable of doing everything the software told it to do, but it was not until the software was perfected that Deep Blue won. How many chess games can you win against the PC? Usually most of us will lose more to the PC than we win. 🙁 The hardware usually advances much faster than most of the software (with certain exceptions). It also depends on the humans who create the hardware and software. In conclusion, the hardware and the software are NOT YET capable of achieving the power of the brain, but just give it time.
 
I think it is true that the CPU is faster, and that is the only reason it can "seemingly" compete at all with the human brain. A computer can calculate Pi to many more places, far faster than a human, but a CPU cannot conceptualize Pi. A CPU might beat a human at chess only because it can iterate through EVERY possible option, while a human might reason out a solution much quicker than having to look at every option. The human is effectively faster; the CPU is sequentially faster.
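To make the "iterate through every possible option" point concrete, here is a bare-bones Python sketch of the exhaustive game-tree search that chess programs build on. The game here is a toy Nim variant, included only so the example runs; Deep Blue's real search added alpha-beta pruning, opening books, and hand-tuned evaluation on top of this basic idea.

```python
class Nim:
    """Tiny toy game so the search has something to chew on: players
    alternately take 1 or 2 stones; whoever takes the last stone wins."""
    def __init__(self, stones):
        self.stones = stones

    def legal_moves(self):
        return [n for n in (1, 2) if n <= self.stones]

    def make_move(self, n):
        return Nim(self.stones - n)

    def evaluate(self):
        # No stones left: the previous player took the last one and won,
        # so the side to move has lost.
        return -1 if self.stones == 0 else 0

def negamax(position, depth):
    # Exhaustive search: try every legal move down to a fixed depth.
    if depth == 0 or not position.legal_moves():
        return position.evaluate()
    best = float("-inf")
    for move in position.legal_moves():
        # The opponent's best result from the next position is negated for us.
        best = max(best, -negamax(position.make_move(move), depth - 1))
    return best

print(negamax(Nim(5), depth=10))  # prints 1: the side to move can force a win
```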

Another point: the brain creates circuits as necessary and can change processing methods on the fly.

The human brain can do many things without interruption: breathing, heartbeat, digestion, sticking your foot in your mouth, etc. This is all done concurrently. A CPU never really does more than one thing at a time. I guess dual/quad core is the exception, but a CPU works through sequences: check for keystroke, wait for mouse move, update screen, loop, loop. The CPU just appears to do things in parallel because it is running through the sequences so fast.
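Here is a tiny Python sketch of that "appears parallel because it switches so fast" idea, with generators standing in for the keyboard, mouse, and screen loops (purely illustrative):

```python
def task(name, steps):
    # Each generator represents one job the CPU keeps coming back to.
    for i in range(steps):
        yield f"{name}: step {i}"

# One worker (the "CPU") interleaving several jobs in tiny slices.
queue = [task("check keyboard", 3), task("poll mouse", 3), task("update screen", 3)]
while queue:
    job = queue.pop(0)
    try:
        print(next(job))   # do one small slice of work...
        queue.append(job)  # ...then put the job back at the end of the line
    except StopIteration:
        pass               # job finished; drop it
```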
 
You've got it right. There is no comparison between the brain and a CPU. To even discuss what the brain is or does requires a doctor, a brain surgeon, or someone who has worked in the field for years and can offer a glimpse of what the brain is capable of. I'm not going to try to explain what the brain is responsible for, because I don't have the expertise, nor does anyone here in this thread. Just move on to something else you think you know and can talk about.
 
Wow. People need to stop. There are too many people who really just don't know what they're talking about in this thread. [...] People need to stop discussing HOW the brain works.

O great one, please save us from our inferior thoughts and posts. Please enlighten us and correct all of our mistakes, as yours is the superior CPU. :roll:
 
Go to Artificial Life... again. There are artificial snakes learning to hunt and constantly improving their technique (and so is the mouse). There are emerging cooperations between artificial-life creatures. ...You just sound like the first guy who was told that man descends from the chimp and just said NO because he looked in the mirror. There are ANNs constantly adapting, memory-like behavior being simulated, etc. Man, go to a philosophy thread (no offense, I suppose you're great at what you do, just don't tell us what we are supposed to be doing... just sit there and... do whatever you do on a philosophy PhD)...
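For anyone curious what "ANNs constantly adapting" looks like at its absolute smallest, here is a toy single-neuron (perceptron) Python sketch that learns the logical AND function from examples. It is a deliberately trivial illustration, not a claim about real brains or about the artificial-life work mentioned above:

```python
# Training data: inputs and the desired output of logical AND.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # connection weights
bias = 0.0
rate = 0.1       # how big a nudge each mistake produces

for _ in range(20):                      # a few passes over the examples
    for (x1, x2), target in samples:
        out = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
        err = target - out               # how wrong was the guess?
        w[0] += rate * err * x1          # nudge weights toward the right answer
        w[1] += rate * err * x2
        bias += rate * err

for (x1, x2), target in samples:
    out = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
    print(f"{x1} AND {x2} -> {out} (expected {target})")
```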
 
I would say the brain is not only the faster but also the better CPU.

Can a CPU do 1+1 quicker than a human brain? No. Can it do 5x5 quicker? No. Can a CPU do 424+882 quicker? Again no (in a few cases at least).
The reason for all that is that we KNOW what 1+1 is; not because it's a simple calculation, but because we don't have to calculate it. We just know it; a CPU doesn't. Run the Windows calculator and you can do 1+1 as often as you want, but the CPU will never learn that 1+1 is 2; our brain does.
It's this process of evolving itself that makes our brain so vastly superior.

I do bet that human brains could beat an X6800 at SuperPi; the thing is that our mind doesn't know how to use the resources of our brain effectively (we only use about 0.2% of our brain).
 
From the beginning I thought it was not worth posting in this thread, but now I see it growing to 7-8 pages and got tempted. All I have to say is: do you guys realize that this is too much? Even comparing Core 2s with K8s is like comparing apples with grapes, but human brain vs. a CPU is like comparing an apple with a microwave oven.
 
Just so we can all quit saying it: we use more than 10% of our brains. As far as science knows now, we use just about all of it. This is not an idea; it's where neuroscience is now.
 
...but human brain vs. a CPU is like comparing an apple with a microwave oven.

I assume by Apple you mean the fruit and not the computer? And by microwave you mean the oven and not the actual electromagnetic wave?

You should be more clear in your post!

Actually, if you look at the microwave, it is not nearly as complex as the apple. A microwave simply cooks at the power and for the time it is instructed to, and flashes 12:00 on the front like its chronological and soon-to-be-defunct brother, the VCR. Unlike the VCR, the microwave usually has the right time on it.

An apple is a self-replicating and nourishing object and can be used for more than just a single task, unlike the microwave oven.

okay M25, see what one little reply can start? 😀
 
Don't forget too that the brain is not just a CPU; it is also a hard drive, GPU, GPS, LAN, etc., etc., etc.

The brain is not a CPU.

It's not a GPU, GPS or LAN.

I do not like green eggs and ham.

I do not like them, SAM I AM.
 
...human brain vs. a CPU is like comparing an apple with a microwave oven. [...] An apple is a self-replicating and nourishing object and can be used for more than just a single task, unlike the microwave oven.

You can feed yourself with apples but die of hunger with a microwave, even if both fall into the kitchen category; so a human brain is not comparable to a CPU. It's like comparing a diesel engine with a nuclear reactor: a few basic concepts in common, but so far apart.
 
Here is the abstract of a presentation given by Ray Kurzweil at SC'06. He gives us 25 years max before AI catches up. In terms of simple petaflops, the IBM P7 and AMD/Cray Cascade will get us to 2 petaflops sustained, which is about 16% of the human brain, by 2010. Sun's new Rock processor will probably be comparable.

The Coming Merger of Biological and Non Biological Intelligence

Session: Keynote

Event Type: Invited Speaker

Time: 8:30am - 10:00am

Session Chair: Barbara Horner-Miller

Speaker(s): Ray Kurzweil

Location: Ballroom A-D

Abstract:
The paradigm shift rate is now doubling every decade, so the twenty-first century will see 20,000 years of progress at today's rate. Computation, communication, biological technologies (for example, DNA sequencing), brain scanning, knowledge of the human brain, and human knowledge in general are all accelerating at an even faster pace, generally doubling price-performance, capacity, and bandwidth every year. The well-known Moore's Law is only one example of many of this inherent acceleration. The size of the key features of technology is also shrinking, at a rate of about 4 per linear dimension per decade. Three-dimensional molecular computing will provide the hardware for human-level "strong" AI well before 2030. The more important software insights will be gained in part from the reverse-engineering of the human brain, a process well under way.

We are rapidly learning the software programs called genes that underlie biology. We are understanding disease and aging processes as information processes, and are gaining the tools to reprogram them. RNA interference, for example, allows us to turn selected genes off, and new forms of gene therapy are enabling us to effectively add new genes. Within one to two decades, we will be in a position to stop and reverse the progression of disease and aging resulting in dramatic gains in health and longevity.

The fraction of value of products and services comprised by software and related forms of information is rapidly asymptoting to 100 percent. The deflation rate for information technologies, both hardware and software, is about 50 percent per year, providing a powerful deflationary force in the economy. The portion of the economy comprised of information technology is itself growing exponentially, and within a couple of decades the bulk of the economy will be dominated by information and software.

Once nonbiological intelligence matches the range and subtlety of human intelligence, it will necessarily soar past it because of the continuing acceleration of information-based technologies, as well as the ability of machines to instantly share their knowledge. Intelligent nanorobots will be deeply integrated in the environment, our bodies and our brains, providing vastly extended longevity, full-immersion virtual reality incorporating all of the senses, experience "beaming," and enhanced human intelligence. The implication will be an intimate merger between the technology-creating species and the evolutionary process it spawned.

http://sc06.supercomp.org/schedule/event_detail.php?evid=5321 (you can order the full paper from IEEE).
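Taking the figures in that intro at face value, a quick back-of-the-envelope check in Python (these are the post's assumptions, not established measurements):

```python
# If 2 petaflops sustained really is ~16% of the human brain, the implied
# whole-brain estimate is:
sustained_pflops = 2.0
fraction_of_brain = 0.16
print(f"implied brain estimate: {sustained_pflops / fraction_of_brain:.1f} petaflops")  # 12.5

# Kurzweil's "price-performance doubles every year" over his 25-year horizon:
years = 25
print(f"2x per year for {years} years = {2 ** years:,}x")  # 33,554,432x
```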
 
If you don't like this thread, go somewhere else. We like talking about brains.


8O

Go eat some clam cakes along with your grinder and a cabinet.... :wink:


Then go paak ya caaa at beach and look for some shaaaks and throw a potty!
 
How fast can you calculate Pi to the 1 millionth digit? You can't? Well, pretty damn slow compared to a computer, then.


The human brain is the fastest storage medium and fastest input device by far. You can't calculate Pi to 1 million digits in your head, but you could memorize it. You can "record" HD video in your head at hundreds of gigabytes per second. Computers can't touch that.

Sure, robots and AI are clumsy right now, but it would take a team of ~100 engineers working 5 hours a day, 7 days a week, for 10 years to equal the amount of time that goes into "programming" a 40-year-old person. It's the software and storage devices on computers that are slow; the computational power is many orders of magnitude greater.

There are rudimentary self-learning computer programs already operating. Working together as teams and thinking analytically, human beings are inherently capable of designing systems that operate better than we do. This is what any invention is (well, any good invention at least). We design systems, tools, that are better at a task than we are; computers are no different. We're struggling to create computer systems that are all-around smarter than us because we are stupid, relatively speaking, and highly prone to error. Human beings come with a lot of "software" pre-written. Once we fully understand this base platform, implementing it on computers will, I think, be a snap, and from there, with increased computational power, near error-free operation, and the ability to upgrade both hardware and software, such a system should be able to design self-improvements that take it well beyond human capabilities.

@Elbert
Rendering graphics for output is a good example of something that a computer seems to do slowly, but I must point out that human beings can't do it at all. There is no "video out" on the human brain, and we don't really "render" graphics when we "record" them either; we just memorize them. You can imagine graphics, though... that's difficult to quantify, but it could be considered the human brain creating and rendering HD video content, which would be equivalent to, say, re-encoding an HD video. But I think the human brain cheats and uses a lot of pre-rendered stuff, much like the nVidia drivers that were optimized for 3DMark. Impressive nonetheless *ponders*; you might have brought up the best point I've seen so far (feeling, learning, walking, decision making, and math savants really aren't beyond the capabilities of current computer hardware), but it's very hard to quantify.

I must now go ponder this...
Here you're getting into special-purpose processing. For most number crunching, a good old calculator is as fast as a CPU; the difference is that the CPU can do many other tasks. Comparing a calculator to a CPU is the same as comparing a CPU to the human brain.

A robot in a house finds its way around by math and by sensors that let it know when it bumps into things. A human can see and process everything in its way to find its path. A CPU could not process in a thousand years all the textures, objects, and colors in real life to find its way through a house, to say nothing of going outside. A robot is clumsy because it cheats by not processing its surroundings and just uses math to find its way. A step up, uneven surfaces, or even unlevel surfaces can cause a robot to fall.
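Here is a crude Python sketch of that "cheats with math and sensors" style of navigation. The robot object is a fake stand-in so the example runs at all; a real platform would expose something similar through its own drivers:

```python
import random

class FakeRobot:
    """Stand-in for real hardware: reports a bump about 20% of the time."""
    def bumped(self):
        return random.random() < 0.2

    def forward(self):
        print("drive forward")

    def turn(self, degrees):
        print(f"bumped! turn {degrees:.0f} degrees")

def wander(robot, steps=20):
    # Classic bump-and-turn navigation: no vision, no map, no understanding
    # of the room -- just drive until something is hit, then turn away blindly.
    for _ in range(steps):
        if robot.bumped():
            robot.turn(random.uniform(90, 270))
        else:
            robot.forward()

wander(FakeRobot())
```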

Most 2-year-olds can walk, yet CPUs have had nearly 50 years and can't match a young child.

You're incorrect about video out, as we dream, and for the most part it's things we've never done that we dream about, so memorizing is out. Our brains have video out; it's just that we are the only ones who can view the output. Everything we, as humans, have created started from someone picturing it in their mind. Ever seen an elf, hell spawn, plasma rifle, etc. in real life? No, but someone imagined them and drew them on an otherwise blank piece of paper. I admit sometimes what we picture doesn't work, but we share the picture with others and they picture it differently to fix the problems.

Our brain may cheat on some things, but a CPU isn't smart enough to cheat. The CPU has to do positive and negative addition to solve problems, no matter how many times it does pi. Humans can at least learn to multiply and divide, and after we learn how, we don't have to go back to adding and subtracting for them.

Feeling, learning, and walking are limited for today's CPUs, as tricks such as gyros are used for balance while walking. CPUs can't even compensate for uneven/unlevel ground in most cases while using gyros. The brain must process all our surroundings for us to walk. A robot uses math for direction until it bumps into things.

The last test I saw of a computer attempting balance was a trotting-horse design using hydraulics. The horse-like design had to be helped both starting and stopping. It only trotted in a circle and would fall over a lot.
 
I think we are confusing the software and the hardware again. Human OS version 1,000,000,000,000,000 (or higher) has taken a long time (millions of years) to evolve. The amount of time we have spent evolving computers to walk around a house or find their way is tiny. Give us time and we will create machines capable of doing everything we do (and possibly regret it), not necessarily in our lifetimes, but it will happen.
 
Here is the abstract of a presentation given by Ray Kurzweil at SC'06. He gives us 25 years max before AI catches up. [...]

How long after that until it's the same size as the brain for both storage and processing? How long until it has the same input and output hardware to match humans in the same space? What about another important part of Moore's Law, where 25 nm is the all-stop point? True, this stop point has been fixed, but won't this affect Moore's Law? Maybe slowing Moore's Law down or stopping it altogether at another point. I think we will get to that level of processing, but not in our lifetimes.
 
Give us time and we will create machines capable of doing everything we do (and possibly regret it), not necessarily in our lifetimes, but it will happen.
No, the hardware is the most important part; without it changing, it simply is not going to occur. True, we have taken a long time, but in that theory nothing helped us along. I agree it's not going to occur in our lifetime, and nothing a CPU can do now can even compare to the human brain's processing capacity.
 