The Next Big Feature For Intel CPUs is Cognitive Recognition

I'd have to hear the intricacies of how this "Cognitive Recognition" works before I consider it anything but a "Branch Prediction version 2".
 
Cognition involves understanding and producing language, solving problems and making decisions, memory, and attention (over time). The attention and basic memory parts are no problem from a computer standpoint (the hierarchy may need work). The understanding and producing language portion is already well on its way in software (and would probably be expensive resource-wise in hardware). Solving problems and making decisions is doable from a concrete-terms/symbolism/intuition standpoint, but true abstraction may be problematic...
 
This should have been done long ago, before casual users moved to the mobile market.

This is the only way to increase casual usage of powerful desktop CPUs.
 
This concept makes no sense to me whatsoever.

You simply don't add a lot of transistors to a CPU for a task that can be done better through software, because of software's flexibility.

Seriously, what am I doing as I sit here staring at my computer that the CPU has to be redesigned in order to keep up with my changing facial and body patterns?
 
[citation][nom]photonboy[/nom]This concept makes no sense to me whatsoever.You simply don't add a lot of transistors to a CPU for a task that can be done better through software, because of software's flexibility.Seriously, what am I doing as I sit here staring at my computer that the CPU has to be redesigned in order to keep up with my changing facial and body patterns?[/citation]

Well, you can run Crysis solely on a CPU if you have a modernized 80's graphics/physics engine that doesn't use any GPU processing power.

The only problem is that the FPS would be less than 0.1.
 
This is just an evolution of what processors already do, which is predict what code will be needed next and process it in advance. That way, when the code in question does need to be executed, the task is already complete. Why not take this one step further and let the processor have access to information such as when a user plays a certain game every day? The processor could load the game into memory in advance, so that when the icon is double-clicked the game just pops up with no load time.
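
Something along those lines could be prototyped entirely in software today. Here is a rough Python sketch, where the launch log, the program names and the simple "most frequent at this hour" rule are all invented for illustration:

[code]
# Hypothetical sketch: guess which program the user is likely to launch at
# this hour of the day, then read it once so the OS keeps it in the file
# cache. The launch log and file names below are made-up examples.
import collections
import datetime

# Maps an hour of the day to the programs launched then, as recorded by
# whatever tracks the user's activity.
launch_log = {
    19: ["crysis.exe", "crysis.exe", "browser.exe"],
    20: ["crysis.exe"],
    9:  ["mail.exe", "browser.exe"],
}

def predict_next_launch(hour):
    """Return the program most often launched at this hour, if any."""
    history = launch_log.get(hour, [])
    if not history:
        return None
    return collections.Counter(history).most_common(1)[0][0]

def preload(path):
    """Read the file once so the OS caches its pages in memory."""
    try:
        with open(path, "rb") as f:
            while f.read(1 << 20):  # read in 1 MiB chunks until EOF
                pass
    except OSError:
        pass  # file missing or unreadable; the prediction was only a guess

if __name__ == "__main__":
    guess = predict_next_launch(datetime.datetime.now().hour)
    if guess:
        preload(guess)
[/code]

A real prefetcher would track launches persistently and warm the game's data files as well, not just the executable, but the basic predict-then-preload loop is no more than this.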
 
so this is the start of advanced artificial intelligence.
Terminator, Skynet or maybe even the Matrix.
sooner or later it will lead to Star Trek and Borg assimilation..
RESISTANCE IS FUTILE...

and you think your 'anti-virus' is helping you, right?!
(insert evil laugh..)
 
[citation][nom]jungleboogiemonster[/nom]This is just an evolution of what processors already do, which is predict what code will be needed next and process it in advance. That way, when the code in question does need to be executed, the task is already complete. Why not take this one step further and let the processor have access to information such as when a user plays a certain game every day? The processor could load the game into memory in advance, so that when the icon is double-clicked the game just pops up with no load time.[/citation]
How would that be in any way a CPU feature? This sort of stuff is squarely in the software/OS realm; there is absolutely no need for CPU support there.

Windows' SuperFetch sort of does that as well, by loading at boot time the code and data associated with your most frequently used software.
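
For what it's worth, that kind of prefetch hint already has a plain user-space interface. A minimal Python sketch, assuming a POSIX system with madvise support (Python 3.8+); the file paths are placeholders:

[code]
# Minimal sketch of SuperFetch-style prefetching done purely in user space:
# ask the OS to start paging frequently used files into its cache.
import mmap
import os

FREQUENT_FILES = ["/usr/bin/frequently_used_app", "/home/user/savegame.dat"]

def prefetch(path):
    """Hint to the OS that this file's pages will be wanted soon."""
    try:
        with open(path, "rb") as f:
            size = os.fstat(f.fileno()).st_size
            if size == 0:
                return
            with mmap.mmap(f.fileno(), size, prot=mmap.PROT_READ) as m:
                m.madvise(mmap.MADV_WILLNEED)  # "we will want these pages soon"
    except (OSError, ValueError, AttributeError):
        pass  # missing file or unsupported platform; prefetching is best-effort

for p in FREQUENT_FILES:
    prefetch(p)
[/code]

None of that touches the CPU's instruction set; it is just the operating system's normal file-cache machinery.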
 
[citation][nom]InvalidError[/nom]How would that be in any way a CPU feature? This sort of stuff is squarely in the software/OS realm; there is absolutely no need for CPU support there.Windows' SuperFetch sort of does that as well, by loading at boot time the code and data associated with your most frequently used software.[/citation]

CPUs can also render 3D graphics for games through software rendering, but GPUs have mostly replaced them for that job, and are starting to move into the movie rendering segment as well. Care to explain why?

Hardware rendering usually beats software rendering, provided you have software that can actually take advantage of the hardware.
 