i8cookiemonster
Distinguished
Wow, the level of ignorance I'm seeing here is really astonishing. I'm glad to see someone posted a Wikipedia link to Tim Sweeney...I hope a few people actually read it.
My belief is that he is correct in saying that we'll soon see an end to the separation of GPUs and CPUs. It's already been happening, and this course was set in motion ever since DirectX 8 and OpenGL 2.0 introduced programmable pixel and vertex shaders. Over the last several generations the GPU has become more and more robust, coming closer and closer to being able to run 'general purpose' code. Larrabee is essentially the culmination of the GPU and CPU (highly parallel like today's GPUs, yet compatible with everyday x86 instructions), and it IS the natural progression of things.

The benefits of this are numerous. It frees the programmer from the restrictions of the hardware...instead of having to know which functions a piece of hardware can execute, he only needs to know how FAST it can execute them. It's truly a game changer. An example of what this means: take a look at Crysis. It's a great-looking game, but its engine was created for and restricted by the capabilities of the hardware (DirectX 10 specifically). Now imagine a hypothetical future game engine written entirely with a custom software renderer. The implication is that the renderer is limited by the imagination of the programmer more than by the capabilities of the hardware (see the sketch below). DirectX 11 may be the last version before this happens, as it introduces a vast amount of general-purpose capability and lifts many restrictions, bringing the GPU's programming model closer to that of an ordinary CPU. AMD and Intel are both in a great position for this, and NVidia would like you to think it's not happening, but it is:
http://www.theinquirer.net/inquirer/news/1051248/nvidia-announces-x86-chip
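To make the "shader is just code" point concrete, here's a toy sketch of my own (not anything from Sweeney or Epic, and obviously nothing like a real engine): a software renderer where the per-pixel shading step is an ordinary C++ function. On a general-purpose many-core chip, nothing constrains that function to a fixed shader model...only its speed matters.

    // Toy software renderer sketch. The "pixel shader" is plain C++;
    // it could use branches, recursion, tables, anything -- there is
    // no fixed-function pipeline limiting what it can compute.
    #include <cstdint>
    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Color { uint8_t r, g, b; };

    // Any function of (u, v) works here; this one is a simple
    // procedural interference pattern chosen for illustration.
    Color pixelShader(float u, float v) {
        float s = 0.5f + 0.5f * std::sin(20.0f * u) * std::cos(20.0f * v);
        uint8_t g = static_cast<uint8_t>(s * 255.0f);
        return { g, g, g };
    }

    int main() {
        const int width = 256, height = 256;
        std::vector<Color> framebuffer(width * height);

        // The render loop is just code too; on a Larrabee-style chip
        // you'd parallelize it across cores, but the logic is yours.
        for (int y = 0; y < height; ++y)
            for (int x = 0; x < width; ++x)
                framebuffer[y * width + x] =
                    pixelShader(x / float(width), y / float(height));

        // Write a binary PPM so the sketch runs end to end.
        std::FILE* f = std::fopen("out.ppm", "wb");
        if (!f) return 1;
        std::fprintf(f, "P6\n%d %d\n255\n", width, height);
        std::fwrite(framebuffer.data(), sizeof(Color),
                    framebuffer.size(), f);
        std::fclose(f);
        return 0;
    }

The point isn't the pattern, it's that swapping in a completely different rendering algorithm (a raytracer, a voxel engine, whatever) is just a code change, not a wait for the next DirectX revision.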