Financial Analysts Say Intel Killed the Discrete Graphics Card

  • Thread starter: Guest
Status
Not open for further replies.
That "95% of the consumer market" figure also excludes all those who buy professional graphics cards for 2D work. The "average consumer" who watches movies, views Flash-based content, and plays Solitaire can get by with AMD's HD4250 or Intel's HD2000. That's been true for over a year now, and the gaming market is still boisterous enough to have AMD and nVidia developing and releasing new cards.
I smell some bovine fecal material here...
 
Intel actually knows how to pick its battles well. Why go for discrete graphics where there are already two mature giants duking it out for the favor of that 5%, when you can subsidize your own improved integrated graphics and reel in all the other fish?
 
This computing sector expert then went on to opine about the implications of next-generation games like Angry Birds Space.

What an idiot.
 
Five Star Equities states that Ivy Bridge essentially kills the discrete graphics card because the integrated graphics would be good enough for 95 percent of computer users.

Hmmmmmm I guess I'm part of the five percent

I bet they also think that 95 percent of computer users believe that OS X is immune to malware.
 
Not really. You don't need to be a hardcore PC gamer to need a discrete card. You just need to be a PC gamer, as most modern games will not run well with IB's IGP.

Let's face it, if you don't game or do any GPU-intensive tasks, you're not going to get a discrete GPU. That was true before IB's HD 4000 and it's true after it. I don't see how that distribution changes, so how exactly will the HD 4000 kill the discrete GPU market?
 
Great, we've just gotten over "desktops are dying!" and now it's changed to "standalone graphics cards are dying! Integrated is the way to go!"

As long as someone wants to buy, someone will sell. End of.
 
How has Intel killed discrete graphics when AMD's Fusion processors still have significantly better graphics than HD4000? Shouldn't it be "AMD killed discrete graphics"?

I'm not even on the AMD bandwagon. Intel makes better processors right now. I just don't understand how it's the HD4000 that "killed discrete graphics" (which aren't even close to dead anyway).
 
Also, it's not like developers wouldn't use more powerful graphics power if they could. If anything is killing discrete graphics, it's that they have to develop for outdated console hardware that integrated graphics has caught up to.
 
[citation][nom]heerherherher[/nom]Intel actually knows how to pick its battles well. Why go for discrete graphics where there are already two mature giants duking it out for the favor of that 5%, when you can subsidize your own improved integrated graphics and reel in all the other fish?[/citation]
Huh? Intel tried multiple times to get into the discrete graphics market. After spending millions on Larrabee and repeated delays, they realized releasing it a generation too late was a bad idea and scrapped it.

I don't see canceling a multi-million-dollar project they heavily promoted as a good way to pick their battles. If they were only picking battles they could win, they wouldn't have even tried to make Larrabee.
 
Why post this article? NVIDIA and AMD will be making cards as long as people are buying laptops, desktops, and servers. A standalone GPU will always be bigger, stronger, faster than a dinky integrated GPU.

In other news, electric cars killed all gas guzzlers.
 
That title gave me a good laugh. What do financial analysts know about computers? Other than using financial programs, which they aren't even good at.
 
Excellent
 
[citation][nom]brickman[/nom]That title gave me a good laugh. What do financial analysts know about computers? Other than using financial programs, which they aren't even good at.[/citation]

Indeed. Citing financial analysts to predict technological changes doesn't even make sense. Like asking a butcher to tell us all about fashion designers.

There is, however, a tendency in the press to pay attention to, or actually ask for, the opinions of people not qualified to give them. Financial analysts are one of the glaring examples, and this one just makes it evident why we shouldn't listen.
 
I've been hearing and reading about the death of the dedicated graphics card for over a decade now. It's getting old. Heck, I remember reading how the PC would be dead in a few years (that was around the time of the smartphone revolution), and I'm pretty sure that both the GPU market and the PC market are doing just fine. Shrinking, sure, but market demand for any and all products shrinks and grows in a cyclical pattern. If Intel had been successful with Larrabee, I'm sure we'd be listening to a different analyst with different opinions...
 
This title made me lol. Maybe for the average consumer who watches movies and browses the internet on their PC, but the gaming arena will always require power that integrated graphics can't provide.
 
Well, this guy would be correct if people bought PCs by a "good enough" criterion. Fortunately for GPU makers, people still tend to spend $2000+ on a PC they will use to send the occasional email and play Solitaire. Intel could kill the GPU market if it cared enough to make decent drivers and if people were content spending $500-$750 on a PC that is good enough for what they need.
 
I have to agree with him. We PC gamers are a minority, and the hardcore among us are even smaller in number. In fact, I plan on building a new rig for my folks soon, and it will be exactly as this article predicts, because they simply do not need a dedicated GPU.

Even we hardcore gamers are finding little need to buy new GPUs, as most games simply do not push them near their limits; with so few PC-exclusive games, graphics get dumbed down to a level equal to, or not much better than, the consoles. I know it's painful to admit, but the guy actually is correct. However, this may not be an entirely bad thing. If dedicated GPU sales continue to drop, makers will have to lower prices, which in turn may increase sales to make up for those lower prices. This could actually turn out to be a good thing for everyone in the end.
 
I haven't read the report, but if the quote is anything to go by, I like how they seemingly forget that more than just "hardcore" gamers need discrete graphics cards, like the professional sector that uses CAD modeling tools and others who need the compute performance.
 