Evolution Of Intel Graphics: i740 To Iris Pro (Archive)

Something I think needs mentioning is that the way people count GPUs kind of makes Intel win regardless. Every Intel CPU marketed toward consumers has an iGPU inside it, and that iGPU gets counted as market share even if the user isn't using it and never planned to. I have an i7-4790K, which comes with an HD 4600 iGPU. I also have two Nvidia GTX 980 Tis in a custom water-cooling loop, and while I have never used, and will never use, that HD 4600, I'm still counted as having one. So every computer with an Intel CPU gives Intel graphics market share on paper. I think a better metric would be how many systems have an Intel GPU and no other graphics adapter installed.
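For what it's worth, that stricter metric is easy to state precisely. Here's a minimal Python sketch that counts only systems whose sole display adapter is Intel; the survey record format is made up for illustration and isn't any real dataset's schema.

```python
# A toy version of the stricter metric proposed above: count only
# systems whose ONLY display adapter is an Intel iGPU. The record
# format below is hypothetical, not a real survey's schema.
surveyed_systems = [
    {"adapters": ["Intel HD Graphics 4600", "NVIDIA GeForce GTX 980 Ti"]},
    {"adapters": ["Intel HD Graphics 4400"]},
    {"adapters": ["AMD Radeon R9 290"]},
]

def is_intel(adapter_name: str) -> bool:
    return adapter_name.lower().startswith("intel")

# Systems like the i7-4790K + dual 980 Ti box above get excluded here,
# even though a raw per-adapter count would credit Intel with a GPU.
intel_only = [
    s for s in surveyed_systems
    if s["adapters"] and all(is_intel(a) for a in s["adapters"])
]

share = len(intel_only) / len(surveyed_systems)
print(f"Intel-only share: {share:.0%}")  # 33% for this toy sample
```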

Granted, it isn't a perfect representation of the entire desktop/laptop gaming community, but wouldn't Steam's hardware survey avoid this issue? It should only count GPUs that are actually in use and ignore unused integrated graphics, right?
 

cub_fanatic


I agree that it would be more accurate if they could somehow distinguish between people who use the Intel iGPU and those who don't. It might be a lot harder than it sounds, though. When it comes to desktops with Intel Core i-series CPUs, I assume Steam can tell when someone is gaming on their GTX or Radeon rather than their HD Graphics simply from the driver that is installed. On a desktop where the discrete card went in before any drivers, there would be no HD Graphics driver on the system at all. I know that in some cases, like with Virtu MVP, the HD Graphics might be used alongside the discrete GPU, but most people use only their discrete GPU and have only the discrete GPU's driver installed.
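As a rough illustration of that driver-based heuristic, here is a Windows-only Python sketch that enumerates display adapters via the (now deprecated) wmic tool and flags systems where a discrete GPU sits alongside the Intel iGPU. The vendor-keyword classification is a guess for illustration, not how Steam's survey actually works.

```python
import subprocess

# Enumerate display adapters Windows knows about. Only adapters with a
# driver installed show up here, which is the crux of the heuristic.
out = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name"],
    capture_output=True, text=True, check=True,
).stdout

# Skip the "Name" header row and blank lines.
adapters = [line.strip() for line in out.splitlines()[1:] if line.strip()]

has_intel = any("intel" in a.lower() for a in adapters)
has_discrete = any(
    vendor in a.lower()
    for a in adapters
    for vendor in ("nvidia", "geforce", "amd", "radeon")
)

if has_discrete:
    print("Discrete GPU present; the iGPU is likely idle or Optimus-only.")
elif has_intel:
    print("Intel iGPU is the only display adapter in use.")
```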

When it comes to laptops, though, it is a little tricky, since almost all of the ones with an Intel CPU plus Nvidia dGPU use Optimus to switch back and forth between the iGPU and dGPU to conserve battery life. All of those laptop users are technically using their HD Graphics unit as a basic display adapter.

Even if they were able to accurately distinguish the people who are actually using their Intel iGPU, I think the number using it as their only GPU would still be very high, even after you subtract all the desktop and laptop discrete users.

I think that if Intel released an unlocked i5 and i7 identical to the regular unlocked i5 and i7 but without an iGPU, it really wouldn't cut into sales of the regular unlocked parts as much as we would like to think. The only way I can see that happening is if the price difference were so significant that it would be cheaper for non-gamers and the like to buy the iGPU-less CPU plus an entry-level dGPU as a basic display adapter.

On the other hand, if Intel released an i3 and a Pentium with the top-end Iris Pro iGPU for something like $20 or $30 more, it would eat up a significant number of i3 and Pentium sales with lower-end HD Graphics. Also, imagine if that i3 were unlocked as well, which I read in a rumor Intel might do with Skylake i3s, even paired with low-end H- and B-series boards. That would make a pretty sweet entry-level gaming rig, and it might even compete well with AMD's A10 APU line, if it existed. At least one part of that might become reality, though.

A lot of people have pointed out that the people who need iGPUs like the Iris Pro most are not i7 users but entry-level budget users who might not be able to afford a discrete GPU, or wouldn't want to buy an entry-level one since most of them have terrible price-to-performance ratios. Putting the flagship iGPU in a $150 chip instead of a $350 chip would be the wise thing to do, IMO. The best part is that once they outgrow the iGPU or save up enough money, they can simply slap in a decent mid-range GPU down the road and deactivate the iGPU.
 


I didn't think about the laptops that use Optimus; that's a good point.

Yes, it's still the majority; I just meant that the numbers would be decently fair, especially from a gaming point of view, where people usually pay at least a little attention to their graphics performance. The number of unused iGPUs shouldn't further skew the results.

I have long thought that an unlocked i3 would be an incredible addition to the lower-end options for overclockers if it were priced well under the locked i5s. Add in the newest Iris Pro with 72 EUs plus 128MB of eDRAM and you've got a winner. The Broadwell models, with considerably fewer execution resources and weaker graphics overall, were able to keep up with a Radeon 7750/R7 250, so one that's probably around 50% faster (or more, if we compare Skylake to comparable Haswell models in graphics and count the improvements made within the same GPU configuration) would be able to kill off the entry-level graphics market almost completely.
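A quick back-of-envelope check on that ~50% figure, assuming performance scales linearly with EU count, which is a simplification that ignores clocks, eDRAM bandwidth, and per-EU architectural gains: Broadwell's Iris Pro 6200 has 48 EUs versus the 72 in Skylake's top Iris Pro.

```python
# Rough sanity check of the "~50% better" estimate above. This only
# counts the EU increase; real-world scaling would differ.
broadwell_eus = 48  # Iris Pro 6200 (Broadwell GT3e)
skylake_eus = 72    # Iris Pro 580 (Skylake GT4e)

uplift = skylake_eus / broadwell_eus - 1
print(f"EU-count uplift alone: {uplift:.0%}")  # prints "50%"
```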
 

cub_fanatic


Not only can I see something like that competing well with the higher-spec'd desktop APUs and, depending on its price, probably killing off entry-level dGPUs in the sub-$100 range, it would also make a really sweet entry-level laptop chip, good enough that those laptops could honestly be called entry-level gaming laptops. They would easily handle decent 720p gaming, and even 1080p with older or better-optimized games. Of course, a chip with those specs would actually be called a mobile i5, since it would be a dual-core CPU with Hyper-Threading that can Turbo Boost. The non-turbo mobile i3 would still be a sweet, even cheaper alternative, but the i5 version would be the lowest-spec'd one I would consider a gaming laptop. They would also sip power, unlike Nvidia laptop GPUs and AMD laptop APUs in the same price range.

Intel has never had a legitimate budget gaming GPU. Their HD Graphics line was okay for casual gamers, but frame rates in AAA games released within a few years of each GPU were always unacceptable, and the parts that have come close so far are way too expensive for the gaming performance you get. It's too bad that neither the desktop nor the laptop chip I'm dreaming about here will likely ever exist. I think Intel is happy giving consumers a GPU that isn't so low-end that it becomes completely useless for gaming in a year or two, yet good enough that it would have been considered lower mid-range a couple of years earlier. They probably could give budget consumers the chips I'm dreaming about, but that would mess up their yields.
 