Financial Analysts Say Intel Killed the Discrete Graphics Card

The average consumer may not need anything more than Ivy Bridge's junky GPU, but that doesn't mean I'm going to let the average consumers who come to me for help selecting a machine buy something that lacks discrete graphics. Lots of people regret passing on discrete graphics later, and it gets really annoying having to explain to people, over and over again, that the piece of software they're trying to use would work normally if their graphics chip weren't an IGP.

Honestly, I see the discrete graphics market expanding in the coming years. We've got Valve looking to collaborate on a Steam box HTPC-type setup. We've got Apple looking for new ways to add value to their TV solution. We've got rumors of consoles that can be upgraded like PCs. We've got a blossoming indie game scene on PC that is about a year away from some killer PC exclusives. We've been in a lull for a bit thanks to some terrible ports by major publishers. I see that changing.

I want to reiterate that I think Nvidia and AMD should open up game studios to promote their PC graphics solutions. I'm really getting tired of them supporting companies like EA that keep putting out the sloppiest ports possible.
 
If the analyst had written, "Intel's Nehalem architecture marks the death of the MCH," we'd all be in agreement. Intel and AMD chips still need a PCH/SB chip, but the traditional NB chip with the memory controller that Intel used to make (e.g. 975X, 945G, etc.) - and that used to house the IGP, for that matter - is dead. Buh-bye. But we don't protest this, because I doubt many of us were ever excited about NB chips to begin with.

But most of the responders clearly care about discrete GPUs - even the ones who agree with the analyst. The problem is, the analyst is correct. ATi and NV relied on the profits from the chipset IGP and low-end discrete markets for their bread and butter. The IGP market is completely gone, as are the profits either of those companies was able to get from it. And the low-end discrete market is shrinking as an increasing percentage of PC buyers are able to play whichever games they might want to play - if indeed any games at all - with the graphics cores on the APU.

If NV needed to sell, say, 1 million 440M IGPs to fund the development of Fermi, then how are they supposed to fund the R&D for the successor to Kepler? Where will they get their R&D dollars? Without that bread-and-butter cash flow, they need to get R&D funding from another product line (e.g. mobile), or the rate of development will slow down dramatically.

For AMD, the situation is a bit different, because they are still selling bread-and-butter GPUs built into their APUs. But their lack of CPU performance allows Intel to strip away most of their profit, so they too don't have enough R&D budget to fund future discrete GPUs at historic rates of development.

Then, while both NV and AMD/ATi are slowing down their discrete GPU R&D, Intel will continue to ramp theirs up. This will further eat away at the low-end and eventually the mid-range discrete GPU market, until the point at which AMD and NV cannot afford to invest in future discrete GPU technology at all. Consider that HD 3000 alone is already comparable to high-end NV and ATi discrete GPUs from 2005, mid-range GPUs from 2006, and low-end GPUs from 2007. If the rate of GPU R&D at NV and AMD slows by 75%, then Intel's rate of GPU advancement will outstrip theirs and allow Intel to catch up to them in a matter of years.
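To put rough numbers on that catch-up claim, here's a back-of-the-envelope sketch in Python. The 4x starting gap and the 40%/10% annual improvement rates are purely illustrative assumptions on my part, not measured figures:

[code]
# Toy model of the catch-up argument above. Every number here is an
# illustrative assumption, not a real benchmark or roadmap figure.

def years_to_catch_up(gap, leader_growth, chaser_growth):
    """Years until the chaser closes a performance gap, with each
    side improving at a compounding annual rate."""
    years = 0
    while gap > 1.0:
        gap *= (1 + leader_growth) / (1 + chaser_growth)
        years += 1
    return years

# Say discrete GPUs start out 4x faster than Intel's IGP. If a 75% R&D
# slowdown cuts discrete improvement from ~40%/yr to ~10%/yr while
# Intel keeps improving at ~40%/yr:
print(years_to_catch_up(gap=4.0, leader_growth=0.10, chaser_growth=0.40))
# -> 6 (years, under these assumptions)
[/code]

Change the assumed rates and the answer moves, but in any scenario where the R&D gap flips, the catch-up is measured in years, not decades.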

Those of you who have played the board game "RISK" know what it's like when one player starts to dominate the board. They mercilessly strip away at their opponents' armies (analogous to features and performance advantages) and territories (analogous to GPU markets), and do so at an increasing rate while their army generation rates increase and their opponents' rates decrease. This is where Intel, NV and AMD now see themselves in the GPU market. It is unfortunate. It is saddening. And it is only a matter of time, unless AMD and/or ARM can find a way to take Intel's bread-and-butter.
 
Integrated graphics can't kill the discrete graphics card. We still need a discrete card to get better performance in all kinds of graphics work, because it increases graphics processing throughput in a way an embedded GPU can't match in modern game titles.
 
[citation][nom]trumpeter1994[/nom]I bet they also think that 95 percent of computer users believe that OS X is immune to malware.[/citation]
It is actually a fact that 95% of OS X users believe they are immune to malware.
 
So he is saying that only HARDCORE gamers need a discrete GPU? Oh, I forgot... people playing Facebook games are considered gamers too by this ignorant analyst. 95% of people could get by with an integrated GPU for 3 or more years...
 
What a load of bull! Ivy Bridge can't even play a game at half the resolution that gamers are used to. When a chip can play at least 1080p, then we will start talking about the death of the GPU market. Intel is, as always, trying to push up its shares. If you're the kind of person who buys a new next-gen i7 chip, then you're the kind of person who already has a killer GPU card!!!!
 
The market is changing, but I don't see them killing off discrete graphics.
It's true that many average people have changed: where they used to sit at a desk at home playing solitaire on a PC, they now use an iPad to play Angry Birds and be on Facebook all the time from the comfort of their sofa... but there are many other users who are tech savvy and want a portable device with the power of a desktop PC to do their work, rendering, and gaming, and also to use as an entertainment computer.
Hardcore gamers buy desktop PCs, not laptops or iPads, to play on. As games get more and more graphically intense, Intel's integrated graphics will always be struggling to keep up, so companies like Nvidia and ATI are here to stay.
 
They also forgot about 3D modelers, architects, and engineers. We need discrete graphics in order to design and render our projects! We wouldn't dare choke our productivity with something designed for 95% of consumers!
 
Regardless of whether the article's title is correct or not, Intel (and others) did have an impact on the discrete graphics market. But rather than just one or a few companies shaping that market, it's more likely the changes in the way people engage in their activities. With the current trend toward mobile devices and box devices (consoles, etc.), the old desktop/workstation way is not going to fit everyone. People have certain needs and will choose a platform based on those needs. Each market will have people who specifically choose it. Markets shrink and grow, but they remain the same type of market they were before. Discrete 3D graphics was a niche market way back in the days of 3dfx, Nvidia, etc. The market grew some and shrank some, but it was mostly considered a niche market. There will be even more changes ahead as inventors and people change the way the PC and the net are used.
 
The financial analyst must own tons of Intel stock and wish it would go through the roof.

I am both the 95% and the 5% of users he mentioned, with a gaming machine (i7, GTX 570...) and tons of other desktops, laptops, smartphones, an iPad, etc. There is no incentive for me to upgrade my 10-year-old laptop, which runs Win XP just fine. The only device I am willing to spend more money on is the gaming PC, e.g., by adding another GPU. There is no Intel inside any of the other mobile devices. So where will Intel's extra revenue come from: selling more high-end i7xxx/i5xxx chips to the 5% who are gamers, or more low-end Atoms to the 95% of mobile devices? Neither, I think. You figure it out.
 
[citation][nom]Rusmurf[/nom]Why even post an opinion like this, from someone just looking for the attention that's gained by starting arguments. Cell phones are dying, btw; in 40 years we won't be using them anymore. We will have chips in our brains that enhance our telepathic abilities.[/citation]
But how will we play Angry Birds?
 
@monsta: I like what you're saying, but with regard to this:
"Intel's integrated graphics will always be struggling to keep up"
can you tell me why you think this is the case?

@mr_tuel: I don't know very much about how commercial GPUs function to the benefit of such software and users. I am under the impression that rough-approximation renderings are done using consumer-esque parts of the hardware, while full-fledged rendering (e.g. RenderMan), physics simulation (a la ANSYS), and other production-quality work is done using GPGPU technology. And in fact, this may be where Intel's domination is the least decided. They have Larrabee, but AMD's 7970 and NV's 580 technologies both offer better prospects at the moment.

Do you have thoughts on how you see this other market evolving? Is there an R&D differential? Are there clear market dominators in possession of prohibitive barriers to entry? Does Intel have a shot with its Pentium-core Larrabee strategy, or is it better served by combinations of APUs and GPGPUs? I'm not baiting; I'm honestly interested, and hopeful, in fact, that this may stave off the near-term demise of consumer GPUs.
 
I wouldn't say that "Intel can't make it into mobile devices" (phones and pads); their first attempt was better than expected and shows potential.
 
Three words: Thermal Design Power. All things being equal, you can't put a 300W -- or even a 150W -- GPU under the same cap as a 77-95W CPU. Needless to say, a certain segment of creative professionals and gamers will probably be demanding something more than HD 4000 and its ilk for at least a few years to come. Unfortunately for the world's GPU purists, Intel's Quick Sync was really what put the nail in the coffin for low-end discrete graphics, and it did it a year ago. PC gamers will be PC gamers, but most PCs won't be doing anything more intense than Aero and high-definition video, and even Sandy Bridge covers that. Further, next-gen consoles seem to be shaping up about on par with a mid-priced gaming laptop, so integrated graphics will probably breeze through AAA titles in three to five years' time -- albeit minus the anti-aliasing and the more serious eye candy, of course.
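To make that thermal point concrete, here's a quick sanity check in Python using the round wattages from the post above; the 65 W core-load figure is my own assumption:

[code]
# Quick sanity check on the thermal budget argument. Wattages are the
# round numbers from the post; the 65 W core-load figure is assumed.
PACKAGE_CAP_W = 95   # typical desktop CPU package budget
CPU_CORES_W = 65     # assumed draw of the CPU cores under load

igp_headroom = PACKAGE_CAP_W - CPU_CORES_W
print(f"Watts left for on-die graphics: {igp_headroom}")  # -> 30

for discrete_w in (150, 300):
    print(f"A {discrete_w} W discrete card has "
          f"{discrete_w / igp_headroom:.0f}x that power budget.")  # 5x, 10x
[/code]

However you shuffle the assumptions, a discrete card gets several times the power budget of anything sharing a CPU package, which is the whole point.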
 
This computing sector expert then went on to opine about the implications of next-generation games like Angry Birds Space.

What an idiot.

Most people only use their computer for the Internet and silly games that do not require add-on video cards. The inclusion of add-on video cards in store-bought computers was helpful for ATI and Nvidia, but not needed for the vast majority of users who bought computers.
 
The vast majority of people who own computers do not purchase games for their computers; they use one of the game consoles for that purpose.
 
[citation][nom]Proxy711[/nom]Huh? Intel tried multiple times to get into the discrete graphics market. After spending millions on Larrabee and delays, they realized releasing Larrabee a gen too late was a bad idea and scrapped it. I don't see canceling a multi-million-dollar project they heavily promoted being a good way to pick their battles. If they were picking battles they would only win, they wouldn't have even tried to make Larrabee.[/citation]

Intel never scrapped Larrabee. They converted it into the professional compute cards called Knights Corner and such because professional cards have higher profits. Intel knew that the consumer graphics market is hardly worth the effort, so they left it.

[citation][nom]BulkZerker[/nom]Well, this market pro probably doesn't know about the recent trend of multi-display setups... at least here locally. After a few clients see my computer hooked to a 24" and a 22" monitor as well as a 32" HDTV, playing a demo of Stalker CoP Reloaded, with email open, an Excel-like program, a browser with 14 tabs, and a movie playing on the TV, they all seem to almost go weak in the knees with awe. And subsequently "gotta have something like that". So until integrated graphics starts supporting 3+ monitors natively, at least the low-end discrete GPU market is safe.[/citation]

HD 4000 is supposed to support three monitors.

[citation][nom]danny2000[/nom]Only if they are using old outdated equipment.[/citation]

Until HD 4000, I don't recall ANY modern integrated GPU that supported more than one or two monitors.
 
Hey! You folks in the game engine business: think how many more games you would sell if you started utilizing OpenCL more, tapping the integrated GPU's general-purpose compute ability while simultaneously using a discrete GPU for the graphics end of the game. I see no reason not to, since the Ivy Bridge processor supports OpenCL. Hey, GPU folks: start building discrete GPUs with drivers that work alongside Intel's integrated GPU, so both can be used simultaneously for gaming on laptops!
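For what it's worth, the device-selection half of that idea is expressible today. Here's a minimal Python sketch using pyopencl that picks out the Intel integrated GPU for compute work, leaving the discrete card free to render; the "intel" vendor-string match is an assumption about how a typical system reports its devices:

[code]
# Minimal sketch: stand up an OpenCL compute queue on the integrated
# Intel GPU so the discrete card stays free for rendering.
# The "intel" vendor-string match is an assumption, not a guarantee.
import pyopencl as cl

def find_intel_igp():
    """Return the first Intel GPU exposed through OpenCL, if any."""
    for platform in cl.get_platforms():
        try:
            gpus = platform.get_devices(device_type=cl.device_type.GPU)
        except cl.Error:
            continue  # this platform exposes no GPU devices
        for device in gpus:
            if "intel" in device.vendor.lower():
                return device
    return None

igp = find_intel_igp()
if igp is not None:
    ctx = cl.Context(devices=[igp])
    queue = cl.CommandQueue(ctx)  # physics/AI kernels would be enqueued here
    print(f"Compute work goes to: {igp.name}")
else:
    print("No Intel GPU exposed via OpenCL on this system.")
[/code]

The harder half, of course, is drivers and engine support that make the split worthwhile, which is exactly what the post is asking for.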
 
Integrated graphics have been around for a very long time now... they just moved from the motherboard to the CPU. The onboard graphics on a motherboard have always been good enough for 95% of users.

Financial analysts should probably stick to finance.
 
It does not matter whether Intel has HD 2000, HD 3000, HD 4000, or HD 10,000. Intel allows laptop OEMs to customize its HD graphics drivers, and it probably will continue to do so. That leaves laptop owners at the mercy of lazy OEMs who, after customizing the Intel HD graphics drivers, will almost never, if ever, update them. Laptop owners will continue to be surprised when they go to the Intel HD graphics update website, only to learn that Intel cannot update an OEM's customized HD graphics drivers! Even lightweight computer games that may be playable on Intel's HD 4000 Ivy Bridge graphics can and do require driver updates, but if the laptop OEM is lazy or cheap, do not expect any HD graphics driver updates from them. With some laptop manufacturers, expect to have to buy a brand new laptop just to get updated Intel HD graphics drivers, unless you make sure the laptop you are buying comes with Intel's generic HD graphics drivers!
 