What if Intel...

GrimzardTech

Honorable
Jan 31, 2014
Hey, I've been thinking about graphics cards lately. There's mainly Nvidia and ATI, so I wonder: what if Intel made dedicated graphics cards? Why do they only do integrated graphics?
 
Solution
Intel licensed their graphics tech from Nvidia in a cross-licensing deal that also let Nvidia use some of Intel's tech in their chips (but without the x86 licence - Nvidia wanted that for the ARM chips they've been developing). Intel's GPU tech is several generations behind AMD and Nvidia in some key areas - don't get me wrong, it's *much* better than it was, but I doubt they'd be that competitive with discrete cards.
 
Solution
Long story short, the executives at Intel looked at the size of the discrete market, did a bit of math, and concluded that the profit to be made didn't justify the investment and, more importantly, the risk (what if the Intel solution sucks? What if it's good but doesn't sell?). The number of people who actually go out of their way to buy a discrete video card is a small minority compared to the number who are perfectly content with integrated graphics on the PC client side. That number is also shrinking as integrated graphics become more and more competitive. Enterprise servers (where most of the money's at) usually don't even have a screen at all; that market segment doesn't care about the latest and greatest 3D effects, they just want a command prompt... For HPC and massively parallel workloads, Intel has the Xeon Phi line to go up against ATI Stream and Nvidia CUDA.

Besides, the computing market has shifted to (ultra) mobile where it's about efficiency and dedicating just enough resources to get a job done without wasting energy. Discrete graphics wouldn't fit well into that roadmap.

Edit: Being a new entrant to the SoC market, Intel's part has to be almost strictly better for them to get any foothold; they're no longer the 800 lb gorilla there. Thus, they have to improve their integrated graphics offerings to compete against Mali, PowerVR, etc. That investment is conveniently shared with their consumer desktop line (so the cost of development is amortized across both). Hence the huge increase in performance and efficiency in their more recent generations.

Edit 2: Also, it's not too long ago that Intel tried their hand (again) at discrete graphics. Look up Larrabee. The hardware itself wasn't bad; it's just that the drivers absolutely sucked. Intel is first and foremost a hardware company, and the software side hurts them to this day. The Xeon Phi line is the direct descendant of the original Larrabee project.