Evolution Of Intel Graphics: i740 To Iris Pro (Archive)

Status
Not open for further replies.

Larry Litmanen

Reputable
Jan 22, 2015
616
0
5,010
Can Intel ever compete on par with AMD and nVidia in this sector when there's such a big difference in size? GPU cards these days take two or even three slots on the mobo, while Intel is keeping everything in a tiny package.
 

Achoo22

Distinguished
Aug 23, 2011
350
2
18,780
Can Intel ever compete on par with AMD and nVidia in this sector when there's such a big difference in size? GPU cards these days take two or even three slots on the mobo, while Intel is keeping everything in a tiny package.

They don't have to compete with desktop graphics cards; they only need to be good enough to compete with the underpowered and overpriced options available for laptops. The only hurdle is driver support, but all signs point to Intel nailing it.
 

megamanxtreme

Distinguished
May 2, 2013
59
0
18,630


You seem to ignore the fact that AMD makes APUs.
Also, Nvidia is pushing that Tegra line very fast.

Plus, the i7-5775C, although very pricey compared to an A10-7870K, shows that Intel can deliver. (Of course, I know the process-node advantage plays a part, and AMD will bring something later, but it shows Intel can still push.)
 

madtech01

Distinguished
Jun 2, 2011
52
0
18,630
The thing is, Intel by volume has become the most common GPU maker in desktops and laptops by a very large margin over AMD and nVidia. That is why you do not really see the lower-end discrete GPUs on the market anymore; you don't need them. As for high-end GPUs, Intel has decided not to compete there, for now at least, so nVidia and AMD are trying to really outdo one another in that smaller-volume but higher-profit-margin market.

If you want to play video games with lots of bells and whistles, you need the high end, but for almost everything else Intel graphics is more than enough. I can play a lot of games at 1080p with no noticeable fluctuations on a desktop Core i5-4690 and its HD 4600 graphics, no problem. But games where I want lots of detail, eye candy, physics effects, etc., I play on my rig with a GeForce GTX 980. I am really waiting for Pascal and AMD's next high end to see what HBM2 memory will do for 4K gaming.

Right now, for the average consumer, Blue gives them everything they need, and for high-end users Red and Green will duke it out. It will be scary what happens if Blue ever really tries for the high-end add-in GPU again.
 
It has been amazing to see how Intel has improved in such a relatively short time. I mean, their onboard solutions were often less than useless until ~5 years ago, and in that time they have gone from barely being able to run a 1080p Windows desktop without glitches to now being able to run games at respectable (low to medium... but still very playable) settings.

The funny thing now is that I have no real desire to upgrade my Sandy Bridge desktop for the sake of more CPU horsepower. But for the sake of better on-board graphics (I am doing more and more video work), better chipsets, faster buses and drive connections, the temptation to upgrade is almost there.
They are never really going to replace a gaming GPU, but Intel's graphics division really deserves some kudos for the work they have done over the last few years!
 

falchard

Distinguished
Jun 13, 2008
2,360
0
19,790
Intel was forced to make halfway decent IGPs in recent years. The pressure came from AMD buying ATI and from chipsets like the 780G. The baseline graphics you get out of an AMD platform can actually play 3D games where an Intel platform could not, and there might be people out there who choose AMD over Intel, despite the computational difference, purely because of the cost difference.
It came down to Intel simply not being able to do it, while an AMD was good enough when paired with a GPU that could.
 


It's not so much about size but IP. Intel didn't have much to start with and had to develop everything themselves, including the hardest part, which is drivers. AMD inherited their technology from ATI and has since kept it largely intact. The AMD APUs are just a fully featured ATI Radeon bolted onto the side of the x86 CPU. ATI had a very long time to develop graphics technology and refine drivers, second only to nVidia. This puts Intel at a severe disadvantage and leaves them a few years behind ATI / nVidia in graphics innovation.

Iris Pro is kind of a cheat tactic, since it gets most of its performance advantage from a super-fast graphics buffer located on the CPU package itself. Remove that buffer and you're back to regular Intel graphics; it's also the reason they only include it in expensive premium models. It has always been a mostly paper launch for publicity and tech demonstrations. The ATI Radeon core inside the AMD APUs is a stronger graphics unit whose primary limitation is the memory controller and memory bus, which is why AMD is pursuing on-package memory technologies like HBM. Including a single stack of that stuff would give their APUs a ridiculous advantage, though of course their lagging CPU technology is still an issue.
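To put rough numbers on that bandwidth gap, here is a minimal back-of-the-envelope sketch in Python. The figures are approximate published peak rates, not anything measured in this thread, and the ~50 GB/s eDRAM value is the commonly quoted per-direction figure for Crystal Well.

def peak_bandwidth_gbs(transfer_rate_mts, bus_width_bits, channels=1):
    # Peak bandwidth in GB/s: transfers per second * bytes per transfer * channels.
    return transfer_rate_mts * 1e6 * (bus_width_bits / 8) * channels / 1e9

# Dual-channel DDR3-2133: about the best a desktop APU of that era could feed its iGPU.
apu_ddr3 = peak_bandwidth_gbs(2133, 64, channels=2)      # ~34 GB/s

# One first-generation HBM stack: 1024-bit interface at 1 Gbit/s per pin.
hbm1_stack = peak_bandwidth_gbs(1000, 1024)              # ~128 GB/s

# Iris Pro's 128 MB Crystal Well eDRAM is commonly quoted at roughly 50 GB/s each direction.
edram_each_way = 50.0

print(f"Dual-channel DDR3-2133: {apu_ddr3:.0f} GB/s")
print(f"Single HBM1 stack:      {hbm1_stack:.0f} GB/s")
print(f"Iris Pro eDRAM:         ~{edram_each_way:.0f} GB/s each way")

Even a single HBM stack roughly quadruples what dual-channel DDR3 can deliver, which is the "ridiculous advantage" being described above.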

Anyhow, things are going to be very interesting in another three years, when high-bandwidth local interfaces become cheap enough to be standardized on integrated solutions.
 

SteelCity1981

Distinguished
Sep 16, 2010
1,129
0
19,310
Could Intel compete with the likes of Nvidia and AMD for the GPU crown? Of course; they have the resources and the know-how to do it, but they choose not to, because they see that mainstream iGPU solutions are where the market is, considering the vast majority of computer users are not hardcore gamers and are looking for something that can handle a little of everything with good power efficiency. Intel knows that the bulk of computing devices now are portable, and this is why we have seen Intel make a big shift in recent years to a performance-per-watt approach in many of its mainstream products.
 
Something I think needs mentioning is that the way people count GPUs kind of makes Intel always win regardless. Every Intel CPU marketed towards consumers has an iGPU inside it, and that iGPU will be "counted" as market share even if the user isn't using it and never planned on using it. I have an i7-4790K that comes with an HD 4600 iGPU. I also have two nVidia GTX 980 Tis in a custom WC loop, and while I have never used, nor will ever use, that HD 4600, I'm still counted as having one. Thus every computer with an Intel CPU is giving Intel, on paper, graphics market share. I think a better metric would be how many systems have an Intel GPU and don't have any other graphics adapter installed.
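Just to make that proposed metric concrete, here is a minimal sketch of what the per-machine check might look like on a Linux box with pciutils installed. The vendor IDs 8086, 10de, and 1002 are Intel, nVidia, and AMD/ATI respectively; the script is only an illustration, not anything an actual market survey uses.

# Does this machine count as "Intel graphics only"? A rough illustration.
import re
import subprocess

# 'lspci -nn' lists PCI devices with their [vendor:device] IDs in brackets.
listing = subprocess.run(["lspci", "-nn"], capture_output=True, text=True).stdout

gpu_vendors = set()
for line in listing.splitlines():
    # Discrete and integrated GPUs show up as VGA or 3D controllers.
    if "VGA compatible controller" in line or "3D controller" in line:
        ids = re.search(r"\[([0-9a-f]{4}):[0-9a-f]{4}\]", line)
        if ids:
            gpu_vendors.add(ids.group(1))

# 8086 = Intel; anything else (10de = nVidia, 1002 = AMD/ATI) disqualifies the box.
print("Intel-graphics-only system:", gpu_vendors == {"8086"})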
 

Aspiring techie

Reputable
Mar 24, 2015
824
9
5,365
I wonder what would happen if Intel entered the discrete graphics market targeting gamers. A 14 nm GPU die as big as it can get, with 8 GB of VRAM, would make Nvidia and AMD cringe.
 

Tapsucker___

Reputable
Oct 26, 2015
1
0
4,510
In the early nineties, Texas Instruments built a processor family that was popular in high-end graphics cards from companies like Number Nine. Intel released a processor to compete with these; I think it was the i860. It failed to take off.

The i740 was not originally an Intel design. Lockheed Martin spun off a division called Real3D to commercialize their graphics technology, and they began making 3D graphics cards. I think this might have been a bit of a joint venture with Intel (I don't recall), but it was eventually acquired by Intel.

Interesting trivia: Real3D also made one of the first commercial 3D scanners.
 

Candre23

Reputable
May 11, 2015
4
0
4,510
you do not really see the lower-end discrete GPUs on the market anymore; you don't need them.

Bingo. 6+ years ago, Intel's integrated GPUs were so awful that it was worth getting a discrete GPU even for basic productivity purposes. There were any number of <$100 cards that weren't particularly good at gaming but were a huge improvement over Intel's garbage. Those cards weren't a huge portion of the market, but they weren't negligible either.

Now, the <$100 entry-level card market practically doesn't exist. Intel's integrated graphics are definitely "good enough" for just about any non-gaming purpose, as is their driver support.
 


It isn't discussed because it isn't true. PowerVR graphics have been used with some low-end Intel Atom processors in the past, but Intel's graphics technology is completely separate. It isn't based on PowerVR, licensed from Imagination (which owns PowerVR; PowerVR isn't actually a company, it's a series of products), or related in any other way.

Also, the Wikipedia link you are using to support your statement does state that Intel has licensed PowerVR graphics, but it does not state that Intel's graphics technology is based on or licensed from PowerVR. What it is referring to are the times in the past when Intel licensed PowerVR graphics for use in systems such as the low-powered Atom computers mentioned above.
 
IInuyasha74 is correct.

While there have been some Intel SoC chips that have used tech licensed from PowerVR in the past, the graphics cores in their Core series (and the derivative Celeron / Pentium CPUs) are of Intel's own design. Those same cores are used in the Bay Trail / Cherry Trail Atom SoCs.


I suppose it would have been interesting if Larrabee had actually gone into production, but I am sure it would simply have been crushed by both AMD and nVidia. There would have been a lot of people who scoffed at the idea of buying an Intel discrete GPU.

Since hindsight is 20/20... integrating the graphics core inside the CPU was the best decision Intel could have made. An integrated GPU lowers the overall cost of a PC or laptop for the average non-gamer or office machine because a dedicated GPU is no longer necessary. But let's not forget that AMD planned on integrating a GPU inside a CPU, which was part of the reason they bought ATI Technologies.

Thanks to both AMD and Intel, low-end GPUs are fading out, though AMD still makes them for Dual Graphics purposes. But probably the biggest impact of integrated graphics for the average consumer is in laptops, where it is possible to play some games on a budget laptop using only the integrated GPU.
 
But let's not forget that AMD planned on integrating a GPU inside a CPU, which was part of the reason they bought ATI Technologies.

This is actually an interesting point you bring up, one I plan to go into more detail on later. AMD actually jumped into this integration idea a lot earlier than most people realize. Most people have forgotten about Cyrix, but for a time in the late 1990s they produced a CPU package that also contained a graphics and audio processor. AMD later bought that line, by then called the Geode, and merged even more hardware into it, including the memory controller, around the same time they released the Athlon 64. They were able to sell it at a relatively low price and were extremely successful in terms of sales, and a lot of this was because of the highly integrated, all-in-one design. AMD has essentially been working towards a high-performance version of that since 2000, and the Kaveri CPUs have all of the same features (integrated GPU, memory controller, DSPs for audio, etc.).

Just an interesting bit of information to think over.
 

Jalapenoman

Honorable
Feb 3, 2015
23
0
10,510
With their vast resources and engineering talent, Intel could create a truly powerful enthusiast GPU. Instead, they have always opted for good enough.
 

Xander Konrad

Reputable
Apr 15, 2014
57
0
4,630
Can Intel ever compete on par with AMD and nVidia in this sector when there's such a big difference in size? GPU cards these days take two or even three slots on the mobo, while Intel is keeping everything in a tiny package.


wat.
 

cub_fanatic

Honorable
Nov 21, 2012
1,005
1
11,960
A high-end discrete Intel GPU would be very cool. Hell, any high-end discrete GPU from a third major company would be great for PC users in general, just like a third major x86 CPU maker would. It would definitely make the competition more interesting and probably bring prices down while accelerating the advancement of the technology. That would be especially true with desktop GPUs if Intel were that third company, because they have their own fab plants. Right now, they would have the only sub-16 nm GPUs, and they aren't dependent on TSMC or GlobalFoundries. If Intel had a Titan- or Fury-class GPU out right now, it would be the most powerful and most power-efficient of them all, and in a perfect world it would cost less than both the Titan X and Fury X.
 