Financial Analysts Say Intel Killed the Discrete Graphics Card

That "95% of the consumer market" figure also excludes all those who buy professional graphics cards for 2D work. The "average consumer" who watches movies, views Flash-based content, and plays Solitaire can get by with AMD's HD4250 or Intel's HD2000. That's been true for over a year now, and the gaming market is still boisterous enough to have AMD and Nvidia developing and releasing new cards.
I smell some bovine fecal material here...
 
Guest
Intel actually knows how to pick its battles well. Why go for discrete graphics where there are already two mature giants duking it out for the favor of that 5%, when you can subsidize your own improved integrated graphics and reel in all the other fish?
 
Guest
This computing sector expert then went on to opine about the implications of next-generation games like Angry Birds Space.

What an idiot.
 

trumpeter1994

Honorable
Mar 27, 2012
311
0
10,810
Five Star Equities states that Ivy Bridge essentially kills the discrete graphics card because the integrated graphics would be good enough for 95 percent of computer users.

Hmmmmmm, I guess I'm part of the five percent.

I bet they also think that 95 percent of computer users believe that OS X is immune to malware.
 
Not really. You don't need to be a hardcore PC gamer to need a discrete card. You just need to be a PC gamer, as most modern games will not run well on IB's IGP.

Let's face it: people who don't game or do any GPU-intensive tasks will not get a discrete GPU. That was true before IB's HD 4000 and it's true after it. I don't see how that distribution changes, so how exactly will the HD 4000 kill the discrete GPU market?
 

Goldengoose

Distinguished
Jul 12, 2011
486
0
18,860
Great, we've just got over "desktops are dying!" and now it's changed to "standalone graphics cards are dying! Integrated is the way to go!"

As long as someone wants to buy, someone will sell. End of.
 

830hobbes

Distinguished
May 30, 2009
103
0
18,680
How has Intel killed discrete graphics when AMD's Fusion processors still have significantly better graphics than HD4000? Shouldn't it be "AMD killed discrete graphics"?

I'm not even on the AMD bandwagon. Intel makes better processors right now. I just don't understand how it's the HD4000 that "killed discrete graphics" (which aren't even close to dead anyway).
 

830hobbes

Distinguished
May 30, 2009
103
0
18,680
Also, it's not like developers wouldn't use more graphics power if they could. If anything is killing discrete graphics, it's that developers have to target outdated console hardware that integrated graphics has caught up to.
 

proxy711

Distinguished
Jun 5, 2009
366
0
18,790
[citation][nom]heerherherher[/nom]Intel actually knows how to pick its battles well. Why go for discrete graphics where there are already two mature giants duking it out for the favor of that 5%, when you can subsidize your own improved integrated graphics and reel in all the other fish?[/citation]
Huh? Intel tried multiple times to get into the discrete graphics market. After spending millions on Larrabee and its delays, they realized releasing Larrabee a generation too late was a bad idea and scrapped it.

I don't see canceling a multi-million-dollar project they heavily promoted as a good way to pick their battles. If they only picked battles they could win, they wouldn't have even tried to make Larrabee.
 
Guest
Why post this article? NVIDIA and AMD will be making cards as long as people are buying laptops, desktops, and servers. A standalone GPU will always be bigger, stronger, faster than a dinky integrated GPU.

In other news, electric cars killed all gas guzzlers.
 
Guest
That title gave me a good laugh. What do financial analysts know about computers? Other than using financial programs, which they're not even good at.
 

shasheng

Honorable
May 1, 2012
3
0
10,510
Excellent
 

Marfig

Distinguished
Apr 30, 2012
25
0
18,530
[citation][nom]brickman[/nom]That title gave me a good laugh. What Financial analysts know about computers? Other than using financial programs, which they are also not good at that.[/citation]

Indeed. Citing financial analysts to predict technological changes doesn't even make sense. It's like asking a butcher to tell us all about fashion designers.

There is, however, this tendency of the press to either pay attention to, or actually solicit, the opinions of people not qualified to give them, financial analysts being one of the glaring examples. And this one just makes it evident why we shouldn't listen.
 

omnimodis78

Distinguished
Oct 7, 2008
886
0
19,010
I've been hearing and reading about the death of the dedicated graphics card for over a decade now. It's getting old. Heck, I remember reading that the PC would be dead in a few years (that was around the time of the smartphone revolution), and I'm pretty sure that both the GPU market and the PC market are doing just fine. Shrinking, sure, but market demand for any and all products shrinks and grows in a cyclical pattern. If Intel had been successful with Larrabee, I'm sure we'd be listening to a different analyst with different opinions...
 

bavman

Distinguished
May 19, 2010
1,006
0
19,360
This title made me lol. Maybe for the average consumer who watches movies and browses the internet on their PC, but the gaming arena will always require power that integrated graphics can't provide.
 

hetneo

Distinguished
Aug 1, 2011
451
0
18,780
Well, this guy would be correct if people bought PCs on "good enough" criteria. Fortunately for GPU makers, people still tend to spend $2000+ on a PC they will use to send the occasional email and play Solitaire. Intel could kill the GPU market if it cared enough to make decent drivers and if people were content to spend $500-$750 on a PC that is good enough for what they need.
 

NuclearShadow

Distinguished
Sep 20, 2007
1,535
0
19,810
I have to agree with him. We PC gamers are a minority, and the hardcore among us are even smaller in number. In fact, I plan on building a new rig for my folks soon, and it will do just as this article predicts, because they simply do not need a dedicated GPU.

Even we hardcore gamers are finding little need to buy new GPUs, as most games simply do not push them near their limits; with so few PC-exclusive games, graphics get dumbed down to be equal to, or not much better than, the consoles'. I know it's painful to admit, but the guy actually is correct. However, this may not be an entirely bad thing. If dedicated GPUs continue to drop in sales, makers will have to lower prices, which in turn may increase sales to make up for those lower-price "losses". This could actually turn out to be a good thing for everyone in the end.
 

xenol

Distinguished
Jun 18, 2008
216
0
18,680
I haven't read the report, but if the quote is anything to go by, I like how they seemingly forget that more than just "hardcore" gamers need discrete graphics cards, like the professional sector that uses CAD modeling tools and others who need the compute performance.
 