Financial Analysts Say Intel Killed the Discrete Graphics Card

While it's true that most consumers outside the gamer and professional categories are easily covered by integrated graphics, that doesn't mean an end to discrete graphics.

As stated before, the professional sector heavily utilizes graphics cards for things like 3D modeling, 2D CAD design, and graphic design. The gaming sector always stays up to date with graphics card technology to maintain high framerates in games. Even the large average-consumer sector occasionally upgrades from integrated graphics to a discrete card when pursuing a computer upgrade.

I agree... asking financial analysts is like asking your plumber for medical advice.
 
Why even post an opinion like this, from someone just looking for the attention that comes from starting arguments?

Cell phones are dying too, by the way; in 40 years we won't be using them anymore. We will have chips in our brains that enhance our telepathic abilities.
 
Nvidia's and AMD's graphics divisions won't be in any trouble; Intel is the one that should be worried. I do 3D graphics, and for rendering I've depended almost entirely on the GPU for a while now, because it's incredibly powerful, costs a fraction of what a CPU does, and makes it very easy to upgrade the processing power. And the GPU isn't just for 3D rendering; it's useful for 2D graphics too. Filters and other operations in graphics programs are processed by the GPU these days, not the CPU.

When I purchase my next computer, I will invest more in the motherboard (many PCIe lanes for multiple video cards, and the latest PCIe version) and of course put a LOT more money into the video cards; the CPU won't play as important a role as it used to.

Intel is the one that might be screwed in the future, not the GPU sector. AMD made a wise decision by purchasing ATI.
 
AMD's APUs can do this. I've seen it. Intel has been claiming the same thing every generation for a while now, and until I see it for myself, I won't believe it. Look at the recent numbers: AMD's integrated APUs still stomp Intel's, and Intel's were released more recently.
 
[citation][nom]aaron88_7[/nom]This all makes sense, but I still think a computer looks empty without a massive GPU inside[/citation]
And way too quiet. Real computers sound like vacuum cleaners!
 

The way I see it: Intel started the processor-embedded GPU, but AMD outperformed Intel's GPU. So if Intel 'killed' the discrete graphics card, then AMD 'kills' Intel's graphics with their integrated GPUs.
IMO, AMD offered a 'proper' iGPU before Intel did; only the HD 4000 comes close to AMD's current top iGPUs.
 
That's funny, Intel killed discrete cards??

Come on, if you want to surf the net, watch some movies, and play solitaire, you have enough graphics with the onboard GPU on the motherboard!


Also, AMD chips are lacking lots of things, but graphics? That's the one thing where they're better than Intel; AMD's APUs have been tested with good results, as long as you aren't going to play Crysis 2. XD
 
Well, this market pro probably doesn't know about the recent trend of multi-display setups... at least here locally.

After a few clients see my computer hooked up to a 24" and a 22" monitor plus a 32" HDTV, playing a demo of Stalker CoP Reloaded, with email open in a browser, an Excel-like program, another browser with 14 tabs, and a movie playing on the TV, they all seem to go weak in the knees with awe and subsequently "gotta have something like that". So until integrated graphics start supporting 3+ monitors natively, at least the low-end discrete GPU market is safe.
 
No one has "killed the discrete graphics market". There have been integrated video solutions since before the media GX came out in the mid 90's, and integrated graphics is a very..very...very... small market. I don't know a single person who doesn't have an ATi or Nvidia card in their computers at home, and only users that need to play solitaire, or 5 year old games will be happy with the Intel HD 4000's crappy performance which is on par with a Geforce 8600GT. Even an Nvidia 220 GT beats the crap out of it!

Did these "financial ANALysts" actually do any research? Cause it doesn't look like it.

P.S. Don't get me started on Intel's less-than-lackluster OpenGL drivers.
 
What are they talking about? This 95/5 split has existed for at least five years now. I haven't seen a non-tech person go out and buy a discrete graphics card for as long as I can remember; people who are just looking for a computer usually take whatever comes with it. What these financial analysts are confused about is that there is now no need for a separate graphics chip on the motherboard, since it's on the CPU, but that should have no effect on the people who have typically bought a graphics card. I guess it could have an effect on low-end graphics cards, but I never understood that market anyhow. I also agree that, from what I've seen so far, the Fusion chip is way better at gaming than the Intel HD 3000, although the HD 3000 has AMD beat at video transcoding thanks to Quick Sync.

I think these financial analysts are sometimes paid by hedge funds to spin things a certain way in order to manipulate the market. It seems like every week the analysts change their story: "the job market is great", two weeks later "unemployment is the highest in the past 5 years", two weeks later "the economy is bouncing back"... blah, blah, blah.

I actually have stock in Nvidia and AMD, and unfortunately sold my Intel six months ago. With the mobile market, I still think Nvidia has a chance, and people are just beginning to see the benefit of AMD's Fusion chips.
 
I have like 4 computers.

I only need one REAL GPU for gaming, and I'll be buying maybe one every 5 years (versus about two CPU/mobo upgrades).

The rest are fine with graphics that can render 1080p with DXVA (thanks to Hi10P, I'm giving up DXVA anyway).

So over those 5 years I might have bought 5-6 CPU/motherboard combos but only one GPU, at around 200 USD.

So... it makes sense to me. And think about typical office users: they don't need a dedicated GPU.
 
Actually, they are correct. First of all, Intel's HD 4000, while not a desktop monster, IS a laptop graphics MONSTER. It's a totally different world on mobile, and the HD 4000 is king in that realm right now (until Trinity, anyway). The desktop market is a SHRINKING market while the mobile market is GROWING. You don't need to be an analyst to figure out that desktops are dwindling, and even on desktops the HD 4000 will handle most users' graphics work. It does well in PCMark, well enough to play games at low resolutions, plenty good enough for Adobe-type work and rendering high-res pictures, and it is obviously stellar at video encoding and decoding.

On the mobile side, there is little need for a discrete card. Few people game on mobile, and the HD 4000 on the mobile platform is actually a better graphics chip than the desktop version. The 3920XM, for instance, clocks up to 1300 MHz, faster than the desktop part; the others clock up to 1200 MHz, except the low-end 3610s at 1100 MHz. A mid-level laptop running Ivy Bridge will have exceptional graphics. Adding a graphics card only wastes battery life and does little else for people who aren't playing games. Most low-end game titles played fine according to online tests, and high-end games played OK at lower resolutions. The thing is, people aren't buying those laptops for gaming; they're buying them for everything BUT gaming.

Therefore, the analysts are right. 95% of the market will not need discrete graphics, which only sucks up battery life on mobile computers, and mobile computers will be the main market. The HD 4000 on the mobile side is a much better and different animal. On the desktop side you'll still see a lot of discrete cards, but they too will dwindle as people are satisfied with the HD 4000 and the A10. It's a far cry from the GMA 950 or whatever that was called, and even that old coprocessor did most everything needed for daily tasks and games for the kids.
 
This doesn't take all the factors into account. Mainstream users (much of that 95%) want PCs that last at least a few years. Even if Ivy Bridge graphics are almost adequate for current games, they will become less and less capable of delivering acceptable framerates at acceptable detail levels in the years to come.

Thus, in two years, when Ivy Bridge graphics can't play anything current at all, a $50 add-in card will look very attractive as a way to keep your PC relevant for a few more years.
 
Perfect article for a good laugh. It belongs next to the article that said SSDs are a dying breed - in 2025. It's amazing people get paid to produce such rubbish.
 
I can see why this could be true to some extent. I think most people are disagreeing because the people who actually use this site are gamers and tech-craving enthusiasts who use discrete GPUs. I mean, that's what this site provides: reviews, benchmarks, and articles about new hardware coming out. And who actually reads articles like that? Gamers and tech people who want the best-performing hardware out there, and as such they'll be using discrete GPUs.

This article is true to some extent because of the significant improvement in Intel's IGP. Some might say that AMD has the better IGP and that they are the ones who "killed" discrete GPUs, but the percentage of people using Intel over AMD is significantly higher. So while AMD does have the better IGP, the majority of the market is on Intel, and a better AMD IGP doesn't affect the discrete GPU market as much as Intel's IGP improving drastically, like it did with Ivy Bridge. With significant IGP improvements from both Intel and AMD, everyone now has access to an improved IGP whichever brand they use.

Most people are still using older generations of Intel and AMD hardware, but those users are most likely included in the "95%" estimate of people who will only need an IGP for everyday use. The average user who only uses a computer for web browsing, streaming, movies, etc. will not need the latest and most powerful technology, since their current system serves them well enough. When they eventually feel the need to upgrade, whether for performance reasons or because of marketing temptations, their choice will be either Intel or AMD, both of which now have significantly more powerful IGPs. This will eliminate the need for lower-end discrete GPUs, as the average user can get by with the Intel or AMD IGP.

That leaves only the remaining "5%" of hardcore gamers who will be using discrete GPUs. Looking at the big picture, if 95% of the market no longer has a need for discrete GPUs, does that not in fact warrant the phrase "killed discrete GPUs"? The remaining 5% is a very small slice of the market, and while they will still buy discrete GPUs, the majority of the market will no longer need them. With the improvements in IGPs, the discrete GPU market becomes a niche focused on that remaining 5% of hardcore gamers.
 
Actually, the title is right, even though it goes too far. I wouldn't say it kills discrete cards, but it will hurt them badly. Why? Just look at the HD 4000's performance on notebookcheck.net and you'll see it scores higher than the AMD HD 7470 or the GeForce 620M. So why should we keep using those discrete GPUs if we can get longer battery life and cheaper notebooks with the HD 4000?
 
I just read a story saying 9 out of 10 financial analysts are now flipping burgers or working at Starbucks; apparently this guy will be joining them soon.
They forgot to study some key factors:
1) How long resolutions like 640x480, 800x600, 1280x1024, and 720p each lasted before 1080p, which should indicate when the pixel count will jump again.
2) How long graphics applications will keep demanding workstation graphics, such as news and advertising media, CAD design, and then games.
3) Future-proofing: everything is going mobile, and Intel chips aren't exactly low-power enough for mobile nor good enough for consoles (though technically, in my book, Intel graphics are, because consoles are played on low-resolution 60 Hz TVs, not monitors).
4) His numbers don't pan out against total graphics cards sold in the last 10 years. They do pan out if you only compare a few PC game titles' sales against 2nd-gen Core i sales. (I'm not sure where he got his sales numbers, but I'm guessing retail or Steam, not both, as that would skew the numbers closer to 90%.)

Anyway, there are too many factors, such as the decline in the economy as inflation surged and large parts of the world sank into a seven-year depression, plus a serious lack of homework.
 
Integrated graphics may have improved, but they still can't run most modern games effectively. Even Starcraft 2 (which is over a year old and not graphically intensive) wouldn't run well with integrated graphics. I'm guessing that these analysts just think that Angry Birds is the type of game most people play.
 