[citation][nom]Draven35[/nom]Or a CUDA program running on some of the cores to do the same thing.[/citation]
It might, but that's only the tip of the iceberg. The names change over time and between manufacturers, but processing units like the Z-culling hardware, texture compression, fast-zero clears and, most importantly, the thread dispatcher are not present in the GK110, or more precisely are not optimized for graphics.
A dispatcher on a gaming card must divide the frame into tiles and group the vector data (geometry), the scalar data (textures, specular maps, etc.) and the shader program(s) that apply to each tile, then send that dataset to the cores for the final pixel color calculation.
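The tile-binning step described above can be sketched in software. This is a hypothetical toy, not Nvidia's actual (proprietary) scheme: triangles are binned into the screen tiles they overlap, then one work packet per tile bundles the geometry with the shaders and textures that tile needs.

```python
# Toy sketch of what a graphics dispatcher does in hardware:
# bin triangles into screen tiles, then emit per-tile work packets.
# Tile size, data layout and packet format are all illustrative assumptions.

TILE = 16  # hypothetical tile size in pixels


def bin_triangles(triangles, width, height):
    """Group triangles by the screen tiles their bounding boxes overlap."""
    tiles = {}
    for tri in triangles:
        xs = [v[0] for v in tri["verts"]]
        ys = [v[1] for v in tri["verts"]]
        # Bounding-box overlap test, clamped to the screen.
        x0 = max(0, int(min(xs)) // TILE)
        x1 = min((width - 1) // TILE, int(max(xs)) // TILE)
        y0 = max(0, int(min(ys)) // TILE)
        y1 = min((height - 1) // TILE, int(max(ys)) // TILE)
        for ty in range(y0, y1 + 1):
            for tx in range(x0, x1 + 1):
                tiles.setdefault((tx, ty), []).append(tri)
    return tiles


def make_work_packets(tiles):
    """One packet per tile: the geometry plus the shaders/textures it needs."""
    packets = []
    for (tx, ty), tris in sorted(tiles.items()):
        packets.append({
            "tile": (tx, ty),
            "geometry": tris,
            "shaders": sorted({t["shader"] for t in tris}),
            "textures": sorted({t["texture"] for t in tris}),
        })
    return packets
```

Each packet would then be handed to a core cluster for shading, which is exactly the per-tile dataset dispatch the post describes.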
This has been perfected (at a cost of millions of dollars) by trial and error, to the point that the pipeline works 2-3 frames ahead of the one currently displayed. None of that is present in the GK110; it was not needed, and Nvidia will not put it in the drivers, since doing so would expose its most valuable industrial secrets to competitors (and to the Linux community).
Also, the GK110 does not have raster backends (ROPs). The output from the cores is useless if it is not written out as pixels: technically, 32-bit values, burst-written tile by tile and displayed as full frames. If you try to write a frame using individual bytes, the effective memory bandwidth will sink faster than the Titanic.
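A rough back-of-the-envelope shows why byte-granular writes kill bandwidth. The bus width and burst length below are illustrative assumptions (a 256-bit bus with GDDR5-style bursts of 8), not GK110 specs:

```python
# Back-of-the-envelope: cost of writing a 1920x1080 32-bit frame
# with burst writes vs. one memory transaction per byte.
# Bus width and burst length are illustrative assumptions.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4                      # 32-bit pixels
FRAME_BYTES = WIDTH * HEIGHT * BYTES_PER_PIXEL

BUS_BYTES = 32                           # assumed 256-bit memory bus
BURST_LEN = 8                            # assumed burst of 8 transfers

# A burst write moves a full bus-width * burst-length chunk per transaction.
burst_transactions = FRAME_BYTES // (BUS_BYTES * BURST_LEN)

# A byte-at-a-time write still occupies a whole transaction slot each,
# so the bus carries 1 useful byte where it could have carried 256.
byte_transactions = FRAME_BYTES          # one transaction per byte

print(f"frame size:      {FRAME_BYTES:,} bytes")
print(f"burst writes:    {burst_transactions:,} transactions")
print(f"per-byte writes: {byte_transactions:,} transactions")
print(f"efficiency loss: {byte_transactions // burst_transactions}x")
```

Under these assumed numbers, per-byte writes need 256 times as many transactions as tile bursts for the same frame, which is the "sinking bandwidth" the post is talking about.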
If the technical hurdles are not enough to ditch the idea of using binned chips, let's examine the economics:
a) Any card based on binned chips will have unstable availability; it will nosedive after launch. Early adopters of the GTX 680 remember how much "fun" it was to click the refresh button at Newegg.
b) With the foundries working near the top of their capacity and no production runs to spare, why would anybody in the world want to produce a very expensive die and then cripple it on purpose, just to meet low-profit demand? Idiotic.
c) Why produce a card that will be considered "expensive" at a grand, when you can produce a workstation card for two grand+ and have it considered a bargain, even in a clearance sale?
In fact, the GK104's double-precision performance is so bad that if the GK110 ever sees the light of day as a $1,000 card, I don't think gamers will get many. CAD users and Hollywood will eat them up.
Paired with a 6-8 core PC, it would have to be the best deal for ray tracers since the Amiga + Video Toaster.
What I expect is a new design (call it GK111 or whatever) with all the missing bits and pieces, that gamers can buy at will.
But I keep daydreaming of a GK110 in a workstation...