The Next Big Feature For Intel CPUs is Cognitive Recognition



Processors and chips have instruction sets and firmware, which are basically a kind of software too (just built in).
 
"The executive also mentioned that he does not believe that the smart phone or tablet can be the centerpiece of computing and that PCs will remain alive and continue to evolve. "

While I hate to admit it, I think he's wrong here. Far too many people have allowed the smartphone to become the centerpiece of their computing and their lives. It won't ever be that way for me, and people will still have PCs and such, but they are living off their phones today. That will only get worse.
 
[citation][nom]Prescott_666[/nom]Why would you build this function into silicon? This is the kind of feature that should be implemented in software. I don't see the point.[/citation]

Correction...

Why would you build this function into silicon? This is the kind of feature that should be implemented in HUMANS. I don't see the point.

I hope I'll never need a machine to tell me what to do next...

 
I see this more as a power saver: the CPU can tell what processing capabilities the current user tasks need and power down everything else. No more music streaming keeping all the cores and caches powered up.
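You don't need new silicon to sketch that idea, though. Here is a minimal illustration assuming a Linux box, where cores can already be parked through sysfs; the 25% load threshold and the psutil dependency are my own picks, not anything Intel has announced:

[code]
# Sketch: park spare cores when the workload is light (e.g. music streaming).
# Assumes Linux CPU hotplug via sysfs and the third-party psutil package;
# the 25% load threshold is an arbitrary illustration.
import psutil

def park_spare_cores(threshold=25.0):
    """Take all but core 0 offline if total CPU load is below threshold."""
    load = psutil.cpu_percent(interval=1.0)  # sample load over one second
    online = "0" if load < threshold else "1"
    # Core 0 usually cannot be offlined, so start at core 1.
    for cpu in range(1, psutil.cpu_count(logical=True)):
        try:
            with open(f"/sys/devices/system/cpu/cpu{cpu}/online", "w") as f:
                f.write(online)
        except PermissionError:
            print("needs root to change core state")
            break

park_spare_cores()
[/code]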
 
[citation][nom]beayn[/nom]"The executive also mentioned that he does not believe that the smart phone or tablet can be the centerpiece of computing and that PCs will remain alive and continue to evolve." While I hate to admit it, I think he's wrong here. Far too many people have allowed the smartphone to become the centerpiece of their computing and their lives. It won't ever be that way for me, and people will still have PCs and such, but they are living off their phones today. That will only get worse.[/citation]

What I think he meant is that most of the more advanced computing will not be processed on smartphones. Even if smartphones become the "centerpiece of computing" for most people and PCs eventually die, most of the number-crunching will be done on servers and transferred to your device of choice. This will become more apparent as internet bandwidth and speeds grow, but you can already see some services offering advanced capabilities this way, such as Dropbox and OnLive.
 
[citation][nom]pat[/nom]Correction... Why would you build this function into silicon? This is the kind of feature that should be implemented in HUMANS. I don't see the point. I hope I'll never need a machine to tell me what to do next...[/citation]

...now hit "submit" to continue or F1 if you need help 🙂

 
[citation][nom]pat[/nom]Correction... Why would you build this function into silicon? This is the kind of feature that should be implemented in HUMANS. I don't see the point. I hope I'll never need a machine to tell me what to do next...[/citation]
My thoughts exactly.

Skynet, Silicon Messiah, The Matrix, and I, Robot are a few things that come to mind when discussing a chip recognizing anything past some numbers I throw at it. I certainly don't need my computer deciding it's time to play solitaire just because my mom usually plays a game of solitaire every morning on my PC.
 
I agree with 'pat' above. We operate on at least two different levels, and this would make it difficult for most people to operate on more than one level even when it is necessary to do otherwise. As for how it would be implemented: if this were an offshoot of memristors, then I can see it possibly being done completely in hardware, mimicking how the human brain habituates.
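Habituation itself is easy to caricature in software, for what it's worth: the response to a repeated stimulus decays, and a novel stimulus restores full sensitivity. A toy model (behavioral analogy only, nothing to do with real memristor physics):

[code]
# Toy habituation model: repeated identical stimuli provoke a weaker
# response; a novel stimulus restores full sensitivity. This caricatures
# the behavioral idea only -- it is not a memristor simulation.
class Habituator:
    def __init__(self, decay=0.5):
        self.decay = decay    # fraction of response kept per repetition
        self.last = None      # previously seen stimulus
        self.strength = 1.0   # current response strength

    def respond(self, stimulus):
        if stimulus == self.last:
            self.strength *= self.decay  # habituate to repetition
        else:
            self.strength = 1.0          # dishabituate on novelty
        self.last = stimulus
        return self.strength

h = Habituator()
for s in ["ping", "ping", "ping", "pong", "ping"]:
    print(s, h.respond(s))  # 1.0, 0.5, 0.25, 1.0, 1.0
[/code]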
 
It shouldn't be "what the user wants next"; it should be "what the software wants next." The CPU doesn't know, and will never know, what the user will want next, because it is very, very loosely coupled to the user. Instead, software prediction (branch prediction, on-the-fly loop optimization, etc.), and doing that better, is the better solution, and it has been worked on for several years now...
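For a sense of scale, the prediction hardware already does is pure bookkeeping over past behavior. The textbook two-bit saturating-counter branch predictor (not what any real Intel core actually ships) fits in a few lines:

[code]
# Toy two-bit saturating-counter branch predictor -- the textbook scheme,
# not any shipping Intel design. States 0-1 predict "not taken",
# states 2-3 predict "taken"; the counter steps toward each real outcome.
def predictor_accuracy(branch_outcomes):
    state, hits = 2, 0  # start in weakly-taken
    for taken in branch_outcomes:
        prediction = state >= 2
        hits += (prediction == taken)
        state = min(state + 1, 3) if taken else max(state - 1, 0)
    return hits / len(branch_outcomes)

# A typical loop branch: taken 9 times, then falls through once.
pattern = ([True] * 9 + [False]) * 100
print(f"accuracy: {predictor_accuracy(pattern):.0%}")  # 90% on this pattern
[/code]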

In the end, I think this guy Paul is just bored and talking out of his rear end, because humans are extremely unpredictable from a computer's standpoint...
 
There will never be enough transistors to implement something like this at the silicon level. This would be implemented at the OS level at the lowest, but more likely at the application level. The most Intel could possibly do is add more cores to support doing this without slowing down the rest of the machine. They could also profile the workload and create new SSE/AVX instructions, but the workload won't be much different from existing database workloads, so there's not much to be done there.
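To give a flavor of what "at the application level" could mean, here is a toy first-order Markov model that predicts the user's next action from their history. The event names and the whole API are invented for illustration:

[code]
# Sketch of an application-level "what's next?" predictor: a first-order
# Markov model over the user's action stream. All event names are
# hypothetical; something (the OS, an agent) has to feed them in.
from collections import Counter, defaultdict

class NextActionPredictor:
    def __init__(self):
        self.transitions = defaultdict(Counter)  # action -> follower counts
        self.last = None

    def observe(self, action):
        if self.last is not None:
            self.transitions[self.last][action] += 1
        self.last = action

    def predict(self):
        """Most frequent follower of the last observed action, if any."""
        followers = self.transitions[self.last]
        return followers.most_common(1)[0][0] if followers else None

p = NextActionPredictor()
for a in ["login", "email", "browser", "login", "email", "music", "login"]:
    p.observe(a)
print(p.predict())  # "email" -- it followed "login" twice before
[/code]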

Intel: Taking credit for software developers' work. Well done.
 
"Why would you build this function into silicon? This is the kind of feature that should be implemented in software.

I don't see the point."

Placing this function into the CPU makes it (the function) OS-neutral.
 
[citation][nom]x3nophobe[/nom]Placing this function into the CPU makes it (the function) OS neutral.[/citation]
Not really. In order to make any predictions about what the user wants to do next, the CPU would need to receive information from input devices and other things managed by the OS, and it would also need ties into the OS to act upon its predictions. Otherwise it would have absolutely no clue whether it is processing a keyboard press, mouse movement/click, on-screen button, etc., nor any clue about how to act on any of those.

Considering how complex the data-hoarding/mining algorithms may be, and how intimately the prediction algorithms may need to interact with software, this sort of feature is fundamentally impossible to implement in hardware. At best, they may add new instructions and multi-threaded/GPGPU libraries to optimize those types of algorithms at the OS and application levels.
 


IMO what is meant by "cognitive" is that the computer will be better able to interpret ambiguous input or commands from the user, not tell the user what to do. Sorta like anticipating the user's needs, instructions, wants, etc. This could prove useful in the near future when voice/command recognition is more widely used.
 
I think a lot of people are missing the point. The idea is not that the computer DOES what you want ahead of time; the idea is that the computer becomes more aware of itself and the user's habits, so that it can set aside resources appropriately and better understand the commands from the user (whatever the input). This is not so important for dedicated input systems such as touch, keys, and mouse, where there is a very rigid meaning behind each command, but it is absolutely necessary for better speech control and 'combined commands'.

The idea is that you could tell the computer to check a document for errors and send it to someone. The computer would be able to recognize that you mean spell and grammar check, export to PDF or some other document type that the other user can accept (because cognitive understanding can be applied to other 'users' besides the host user), and then send that document the way the receiver would prefer (email, file transfer, FTP, etc.). It means high-level control of a machine instead of low-level control; it does not mean that the computer is sentient, or makes 'moral' decisions, or any decisions outside the realm of the task at hand.
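Stripped down, that is just mapping one high-level intent onto a pipeline of low-level steps. A deliberately naive sketch, with every step name hypothetical:

[code]
# Naive sketch of high-level intent -> low-level pipeline mapping.
# Every step here is a hypothetical stand-in for a real service
# (spellchecker, document exporter, mail transport, ...).
INTENT_PIPELINES = {
    "proofread_and_send": [
        "run_spell_check",
        "run_grammar_check",
        "export_as_pdf",   # picked because the recipient accepts PDF
        "send_via_email",  # picked because the recipient prefers email
    ],
}

def execute(intent, document, recipient):
    for step in INTENT_PIPELINES[intent]:
        print(f"{step}({document!r}, for={recipient!r})")

execute("proofread_and_send", "report.docx", "alice@example.com")
[/code]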

It also means that the computer can be more self-aware. If you are doing a specific activity, the computer will be better able to cache resources for that activity and predict the user's intent. It does not act until the command is given, but it would be a better end-user experience for your computer to act one way during the work day and another way during the hours you would normally have for recreation. You could still do activities the computer is not expecting, but at least it would be more effective at guessing your needs more of the time than not guessing at all.
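The time-of-day piece is the easiest to sketch: count what the user historically launches in each hour and warm up whatever clears some frequency bar. The log contents, the 50% threshold, and the preload stand-in are all illustrative assumptions:

[code]
# Sketch: warm up applications the user historically runs at this hour.
# launch_log maps hour-of-day -> launch counts; the 50% threshold and
# the print() stand-in for actual preloading are illustrative only.
from collections import Counter

launch_log = {
    9:  Counter({"email": 20, "spreadsheet": 15, "music": 2}),
    21: Counter({"game": 18, "browser": 12, "email": 1}),
}

def preload_for(hour, threshold=0.5):
    history = launch_log.get(hour, Counter())
    top = max(history.values(), default=0)  # rough proxy for days observed
    for app, count in history.items():
        if top and count / top >= threshold:
            print(f"preloading {app}")  # stand-in for caching its resources

preload_for(9)  # warms email and spreadsheet, skips music
[/code]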
 
Yes. For example: the laptop OEM's customized Intel HD graphics drivers are woefully out of date, so initiate an Intel warning to the OEM: "Please update your customized Intel HD graphics drivers, or face losing certification for your product!"
 