Financial Analysts Say Intel Killed the Discrete Graphics Card

The only silver lining I can see is that PC gaming will have a new generation of console ports to justify owning a graphics card... ick. Also, it isn't as if browser games won't widely support GPU acceleration... ick.
 
[citation][nom]alxianthelast[/nom]The only silver lining I can see is that PC gaming will have a new generation of console ports to justify owning a graphics card... ick. Also, it isn't as if browser games won't widely support GPU acceleration... ick.[/citation]

Several games, even console ports, are still very graphically intensive. For example, Crysis 2 (a console port) with the DX11 patch is very demanding, especially if you get a quality mod.
 
Did they miss the articles talking about how GPGPUs were making really powerful CPUs obsolete? This sounds like the people who keep saying that consoles are decimating PCs when it comes to gaming. Yet, a decade later, PCs are generating more revenue for game developers than consoles.
 
Oh my gosh! What are they on about!? Motherboards had been shipping with integrated graphics for years before Intel integrated them into their CPUs! There's simply no connection between having integrated graphics in CPUs and a decline in discrete video card sales; the market hasn't really changed much just because Intel moved the graphics onto the CPU!
 
95 percent of financial analysts are idiots who have no clue about graphics cards. Their only graphics-intensive application is a spreadsheet creating fancy pie charts.
 
Finally!!! It was time to kill the low-end graphics cards! On-board for the low end, off-board for the mid and high end looks better! I also think it will keep AMD/ATI from wasting any time on crap and just give us more powa! =D
 
[citation][nom]DriverUpdateHellRaiser[/nom]DriverUpdateHell is a loser. You lost money to Toshiba because you are stupid. You will not gain a penny back posting the same stupid message on TomsHardware. LOSER![/citation]

He/she brought up a valid problem, even if it was done in a kind of whiny and repetitive way. However, you are an insulting jerk who shouldn't have said anything like this. There is absolutely no point behind what you said other than to mock someone who was fooled by the OEM companies through their unfair practices.

Furthermore, you didn't even have the courage to use your own account to voice your insult; you made a new account just to mock him/her.
 
I hope everyone on this thread realizes that the Intel IGPs still aren't as good as the Llano A8s, let alone cheap video cards such as the Radeon 5550, 5570, or 6570. So, they accomplish nothing that is remotely ground-breaking.

For the vast majority of computer users, the older integrated solutions were enough. Hell, you can even play 720p video on the ancient garbage called the GMA 950 that shipped with many of the six-or-so-year-old Core 2 and Pentium Dual-Core systems. The HD 2000/3000 is enough for 1080p and even 3D 1080p playback. What does the average person do that is more intensive than that? Well, for them, there is now the HD 4000. If they really care, they could go well beyond even the A8s once Trinity comes out.

Regardless of all of this, the low-end video card market is still more or less the same. All this does is bump up the performance of the cards considered low end. I.e., the 6450 will start to be abandoned as a low-end offering, it probably won't be replaced with a similarly performing card, and Nvidia probably won't ever make a card with similar performance again either.

Instead, cards as powerful as the 5550, 5570, and 6570 will become the minimum of what is considered the low-end, a.k.a. entry-level, graphics market.
 
Discrete GPUs are like Formula 1 race cars. Their technology will eventually trickle down to their "lesser" siblings. They are the platform for developing and maturing a technology. Just like race cars, there will always be a market for discrete GPUs.
 
And why does anyone listen to these people? Do they ever put out information that is actually accurate, instead of just selling ideas to justify their salaries?
 
Actually, Intel's IGPs are better than Llano A8s on the mobile side. Check out blazorthorn's link: http://www.notebookcheck.net/Intel-HD-Graphics-4000-Benchmarked.73567.0.html

From the link, ahem:
"Even the top AMD Llano chips cannot compete with the HD 4000, at least in our benchmark comparisons above. Intel has the upper hand by about 15 percent or more compared to AMD's Fusion Llano offerings."

"Perhaps more impressively, the Intel GPU beats the now ancient Radeon HD 7450 handily. Because of this, one may have to ponder if low-end dedicated GPUs from AMD or Nvidia would be a viable alternative at all."

Funny, blazorthorn's own links reach completely different conclusions about discrete graphics than blazorthorn does. They said the low-end cards may not be viable alternatives. This supports the actual article posted, which says that Intel is pretty much killing off those low-end cards. Since laptops are outselling desktops these days, this is the conclusion that matters the most.

Most people are fine with integrated graphics of that performance, as they are not heavy gamers; therefore, the premise of the article is spot on. Intel, the biggest mobile CPU producer, will likely kill off the low- and mid-level cards, as the IGP is good enough that there is no reason to spend the extra money on them. Also, those cards drain battery life, as has been mentioned. Longer battery life and excellent all-around performance mean death for low- and mid-level graphics card add-ons. The high-end systems that most of us use will still have the cards of our choice.
 
Wait what...?

Intel killed it with the HD4000?

Uhm... yeah, let's forget about Llano's IGP being faster than Intel's HD series, and Trinity actually being ~50% faster than Llano. Analysts, my ass.
 
Hey! Tom's Hardware, the next time you review a laptop with Intel integrated graphics, please be sure to tell the reader whether the graphics drivers are Intel generic drivers, which can be updated at the Intel graphics driver website, or laptop-OEM-customized Intel HD Graphics drivers, which cannot be updated by Intel. Letting readers know this should always be part of the review of any laptop that comes with Intel integrated graphics.
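If you want to see which driver build is actually installed before trying Intel's generic package, something like the sketch below should do it. This is just an illustration, assuming a Windows laptop with PowerShell available; Win32_VideoController is the standard WMI class for display adapters, but the exact fields and output formatting will vary from machine to machine.

[code]
import subprocess

# Rough sketch: list each display adapter with its driver version and date
# so you can compare against what Intel's generic driver download page offers.
query = (
    "Get-WmiObject Win32_VideoController | "
    "Select-Object Name, DriverVersion, DriverDate | Format-List"
)
output = subprocess.check_output(
    ["powershell", "-NoProfile", "-Command", query],
    text=True,
)
print(output)
[/code]

If the reported version is older than Intel's generic download and the Intel installer refuses to update it, that is a good sign you have the OEM-customized drivers I'm talking about.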
 
[citation][nom]TruthSeekerVII[/nom]Actually, Intel's IGPs are better than Llano A8s on the mobile side. Check out blazorthorn's link: http://www.notebookcheck.net/Intel [...] 567.0.html From the link, ahem: "Even the top AMD Llano chips cannot compete with the HD 4000, at least in our benchmark comparisons above. Intel has the upper hand by about 15 percent or more compared to AMD's Fusion Llano offerings." "Perhaps more impressively, the Intel GPU beats the now ancient Radeon HD 7450 handily. Because of this, one may have to ponder if low-end dedicated GPUs from AMD or Nvidia would be a viable alternative at all." Funny, blazorthorn's own links reach completely different conclusions about discrete graphics than blazorthorn does. They said the low-end cards may not be viable alternatives. This supports the actual article posted, which says that Intel is pretty much killing off those low-end cards. Since laptops are outselling desktops these days, this is the conclusion that matters the most. Most people are fine with integrated graphics of that performance, as they are not heavy gamers; therefore, the premise of the article is spot on. Intel, the biggest mobile CPU producer, will likely kill off the low- and mid-level cards, as the IGP is good enough that there is no reason to spend the extra money on them. Also, those cards drain battery life, as has been mentioned. Longer battery life and excellent all-around performance mean death for low- and mid-level graphics card add-ons. The high-end systems that most of us use will still have the cards of our choice.[/citation]

Everything I talked about in my post was desktop graphics. That is why I included desktop graphics cards in it. Desktop Llano easily beats the HD 4000. I didn't say that the mobile HD 4000 gets beaten by mobile Llano.

The mobile HD 4000 is only in the i7s. The i5s and i3s will only get the HD 2500, so Llano is still superior on the mobile end to its competitors (the i7 and Llano are NOT competitors).

Funny how people don't pay attention to the context of my post before criticizing me incorrectly... The HD 4000 doesn't beat the mobile mid-range discrete cards, nor the desktop mid-range cards (low-end desktop cards, such as the 6570, beat it with ease).

Furthermore, I didn't provide any links in this thread...

And no, this article is completely wrong, because even entry-level discrete graphics cards such as the 6570 hammer the HD 4000. The only graphics card from AMD's current lineup of Radeons that would be pointless to buy because of the HD 4000 is the Radeon 6450, but that was never even an entry-level card, just an upgrade option or an HTPC card; it's not a gaming card at all. So, these financial guys said that the low-end discrete graphics market would get killed. What market is getting affected? AMD wouldn't have made a card as slow as the 6450 again, so they were abandoning that market already, regardless of whether the HD 4000 performs a little better than a 6450 or not.

So, the vast majority (95% or whatever) who aren't gamers would not have chosen a graphics card anyway, so they aren't affecting the market, and the 5% wouldn't have chosen the only current Radeon that would have been beaten by the HD 4000, the 6450. Basically, the HD 4000 has no effect at all on the low-end graphics market!

Oh, but go ahead, try to prove me wrong when I'm not. People love to do that, and I am rarely wrong (although I'll admit that I have been wrong a few times), no matter how many times people tell me I am. People are usually wrong because they failed at reading comprehension and/or they don't know the subject well enough. Congratulations, you've managed both.

But go ahead, tell me how the low-end market gets killed by an IGP that doesn't even beat any of the low-end cards that will be released in the next generation of video cards. Even if a card slower than the HD 4000 happens to be released, it's purely for upgrading older systems and has no bearing on the market that the HD 4000 is for (which is not upgrading old computers). The slowest card in the Radeon 6000 family other than the Radeon 6450 is the 6570, which more or less doubles the performance of the HD 4000 (and is about 25 to 35% faster than the FM1 A8's 6550D IGP). It can be bought for $50 to $60 and will get even cheaper as time goes on. The 6670, the next card up, is available for $60 to $70 and is a good deal faster still. Even then, these are still entry-level, low-end cards.

Moving on to mobile graphics, neither Llano nor the HD 4000 can come close to the mid-range mobile graphics, so it's still not beating those either. However, it does do a good job of shaking up the mobile low end. Regardless, I wasn't talking about the mobile market in my previous post. The 5550, 5570, and 6570 are ALL desktop graphics cards.
 
Intel has been and will continue bringing up the rear as far as processor graphics is concerned. Intel HD Graphics will only ever define the top of the bottom! When will Intel pull out all the stops? They have no motivation to do so. It's only a matter of time before IBM and the independent chip fabs cross-pollinate their technology and once again hold Intel's feet to the fire!
 
Blazorthorn, once again, you have no clue. 95% of the market wouldn't choose a graphics card because there is already integrated graphics that can do the things they need: 1080p video, DX11, and good performance in almost every application they have. AMD was abandoning the market BECAUSE of Intel's IGP, not because they just felt like it. They would sell whatever they felt a consumer would buy. They just knew they had to redefine the bottom line because faster IGPs were coming to the market. They didn't just blatantly kill a product for the sake of nothing. So, we know now that you are not a marketing major, and certainly not a VLSI designer either.

It is funny how often you insist you are not wrong while bloating up a bunch of posts with useless information. When one actually reads what you write, you have basically not supported your claims and just parroted information freely available from Tom's site. You are a moron and an embarrassment.
 
[citation][nom]TruthSeekerIX[/nom]Blazorthorn, once again, you have no clue. 95% of the market wouldn't choose a graphics card because there is already integrated graphics that can do the things they need: 1080p video, DX11, and good performance in almost every application they have. AMD was abandoning the market BECAUSE of Intel's IGP, not because they just felt like it. They would sell whatever they felt a consumer would buy. They just knew they had to redefine the bottom line because faster IGPs were coming to the market. They didn't just blatantly kill a product for the sake of nothing. So, we know now that you are not a marketing major, and certainly not a VLSI designer either. It is funny how often you insist you are not wrong while bloating up a bunch of posts with useless information. When one actually reads what you write, you have basically not supported your claims and just parroted information freely available from Tom's site. You are a moron and an embarrassment.[/citation]

No, AMD didn't abandon the market; they abandoned the 6450 because it was time to move on. Have you ever heard of progress? AMD isn't going to make a graphics card with the same performance as the lowest-performing card of the last series. It's common sense.

Furthermore, that 95% doesn't care about DX11, and they were already happy with the weaker HD 2000 and 3000 and even weaker GPUs.

I don't understand why you want to twist the truth into thinking that the HD 4000 is changing anything for AMD's discrete cards just because some financial idiot (who was probably paid to say this) said something. You don't realize that these people just say things to manipulate the stock market, do you?

I'm a moron for giving information freely available at Tom's? You're the moron ignoring that information just because some financial idiot said so! The 6450 will not be replaced by a similarly performing card. I find it hard to believe that this is such a hard concept to understand.

Why would AMD want to release a card that is only as good as their old, weakest card? It makes no sense except as an upgrade option for older systems, and that has nothing to do with today's market!!!
 
[citation][nom]TruthSeekerIX[/nom]Blazorthorn, once again, you have no clue. 95% of the market wouldn't choose a graphics card because there is already integrated graphics that can do the things they need: 1080p video, DX11, and good performance in almost every application they have. AMD was abandoning the market BECAUSE of Intel's IGP, not because they just felt like it. They would sell whatever they felt a consumer would buy. They just knew they had to redefine the bottom line because faster IGPs were coming to the market. They didn't just blatantly kill a product for the sake of nothing. So, we know now that you are not a marketing major, and certainly not a VLSI designer either. It is funny how often you insist you are not wrong while bloating up a bunch of posts with useless information. When one actually reads what you write, you have basically not supported your claims and just parroted information freely available from Tom's site. You are a moron and an embarrassment.[/citation]

When you actually pull your head out of your butt, ignore the idiot financial analysts, and read what blazorthon and the links to this very site say, you see that his only mistake is assuming that you're intelligent enough to know what you're talking about. You are assuming that just because the HD 4000 is entering the bottom of today's low-end market, the low-end market isn't going to progress. I've got news for you. The same thing happened a year ago when Intel made the HD 3000, which, on the i7s, can match and even beat the HD 5450 (about half of the 6450 in gaming performance). It was then the bottom of the low end. However, the low end progressed, and now both of them are below the low end. The same thing will happen with the next generation of graphics cards that redefine the low-, mid-, and high-end markets. When the new low-end cards come out, there will be nothing to match a 6450 because it will have been left in the dust, the HD 4000 with it, possibly along with some other cards such as the 5550 (which just happens to beat the HD 4000).
 
Progress? Good grief, definitely not a marketing major. Why spend if it gains you little profit? The fact is, there was no profit to be had from those older cards anymore, so they have to spend R&D money to improve the bottom line. They don't just do it because they like you.

By the way, it is not just the financial analysts; apparently PC World reached the same conclusion:
http://www.pcworld.com/article/254178/ivy_bridge_graphics_entrylevel_cards_are_dead.html

Oops! Let me rephrase the title for you:

"Ivy Bridge Graphics: Entry-Level Cards are Dead"

OK, so it is not just me, nor the business analysts who have forgotten more about sales and marketing than you have ever learned. Apparently a wider group of people has reached the same conclusion. What nVidia and AMD will do to react to this is build better graphics cards, which is what they are doing. You can't rest on your laurels, or you will lose all profitability in this case. The problem for AMD and nVidia is that most users will not need the extra power they are pumping into those cards, nor will they want the loss of battery life. They are in a bit of a pickle now, because people will not automatically pick up a cheap graphics card if they feel the internal IGP performs well enough.

At this point, I have no idea why you keep arguing. How many more people have to tell you the same thing before you believe them? You can bury your head in the sand, I suppose; I don't care. Just don't come on here acting like you are the expert in everything when you are just a fanboy. That is what got you into this argument in the first place. Almost nothing supports your conclusions, not even the few links you did post.
 
[citation][nom]TruthSeekerXI[/nom]Progress? Good grief, definitely not a marketing major. Why spend if it gains you little profit? The fact is, there was no profit to be had from those older cards anymore, so they have to spend R&D money to improve the bottom line. They don't just do it because they like you. By the way, it is not just the financial analysts; apparently PC World reached the same conclusion: http://www.pcworld.com/article/254 [...] _dead.html Oops! Let me rephrase the title for you: "Ivy Bridge Graphics: Entry-Level Cards are Dead" OK, so it is not just me, nor the business analysts who have forgotten more about sales and marketing than you have ever learned. Apparently a wider group of people has reached the same conclusion. What nVidia and AMD will do to react to this is build better graphics cards, which is what they are doing. You can't rest on your laurels, or you will lose all profitability in this case. The problem for AMD and nVidia is that most users will not need the extra power they are pumping into those cards, nor will they want the loss of battery life. They are in a bit of a pickle now, because people will not automatically pick up a cheap graphics card if they feel the internal IGP performs well enough. At this point, I have no idea why you keep arguing. How many more people have to tell you the same thing before you believe them? You can bury your head in the sand, I suppose; I don't care. Just don't come on here acting like you are the expert in everything when you are just a fanboy. That is what got you into this argument in the first place. Almost nothing supports your conclusions, not even the few links you did post.[/citation]

Most people didn't need a discrete card BEFORE the HD 4000!!! It changes nothing! All it means is that AMD and Nvidia need to push the market forward! Do you think AMD and Nvidia wouldn't have progressed if Intel hadn't made the HD 4000? You are implying that Intel is AMD's and Nvidia's only reason for progress, and that is wrong. AMD and Nvidia would have progressed regardless of Intel, because they need to keep their hardware good enough for the software! You're not even trying to use common sense.

This changes absolutely nothing. People bought many machines that use HD 2000 and HD 3000. Both of these are capable of doing anything and everything up to and including 3D 1080p video playback. The average user doesn't do much of anything even that intensive, so why would they have bought a discrete card before HD 4000? They wouldn't have! It changes nothing. All it shows is that Intel progressed and AMD and Nvidia will also progress. That's what the tech industry does.
 
The lowest-end entry-level graphics card is the Radeon 6570. The 6450 is not a gaming card. The average user would not have bought the 6570 for their computer even though it was many times faster than the HD 2000 and 3000 that were very common, present on all Sandy Bridge machines with an i3, i5, or i7 (Pentiums and Celerons have worse graphics than the HD 2000). The average person did not need more, so the average person generally did not get more. Now the HD 4000 has come out. What has changed?

The year-and-a-half-old Radeon 6450 is no longer an attractive purchase if you want an Ivy Bridge system (not that it ever really was, because it was more powerful than necessary for the average user and even for HTPCs and such, but too weak for much of anything else). So, Intel can beat a crappy, one-and-a-half-year-old card. Big whoop. Later this year, cards better than the 6450 will be made (not that there's a shortage of them right now anyway) at the same price point where the 6450 sits right now. The low-end market is then redefined because the lowest cards will be better than today's lowest cards, sending the 6450 below even the low-end market and the HD 4000 with it. Nothing has changed about the HD graphics' position relative to the discrete cards, except that it is now better than the worst card from a generation that will be a year and a half old (maybe almost two years) when this happens. Impressive, isn't it?
 
[citation][nom]alphaalphaalpha[/nom]The lowest-end entry-level graphics card is the Radeon 6570. The 6450 is not a gaming card. The average user would not have bought the 6570 for their computer even though it was many times faster than the HD 2000 and 3000 that were very common, present on all Sandy Bridge machines with an i3, i5, or i7 (Pentiums and Celerons have worse graphics than the HD 2000). The average person did not need more, so the average person generally did not get more. Now the HD 4000 has come out. What has changed? The year-and-a-half-old Radeon 6450 is no longer an attractive purchase if you want an Ivy Bridge system (not that it ever really was, because it was more powerful than necessary for the average user and even for HTPCs and such, but too weak for much of anything else). So, Intel can beat a crappy, one-and-a-half-year-old card. Big whoop. Later this year, cards better than the 6450 will be made (not that there's a shortage of them right now anyway) at the same price point where the 6450 sits right now. The low-end market is then redefined because the lowest cards will be better than today's lowest cards, sending the 6450 below even the low-end market and the HD 4000 with it. Nothing has changed about the HD graphics' position relative to the discrete cards, except that it is now better than the worst card from a generation that will be a year and a half old (maybe almost two years) when this happens. Impressive, isn't it?[/citation]


Haven't you seen the news that AMD has released the HD 7470M and Nvidia the 620M? They are still sure about that market, and those cards' scores were below the HD 4000's. I'm talking about mobile graphics here.
 
Yes, AMD and nVidia are absolutely pushing up their timetables to compete with Intel. If you don't believe that, then there is absolutely nothing more I can say to you. This is a FAR greater motivator than software improvement. There is not much you can do if you upgrade your software and it doesn't run as fast as you want on your hardware, so that is not nearly as big a financial motivator; they know you will buy a card anyway. Spending millions and billions producing a new architecture must be done for good reason. nVidia and AMD are both trying to a) one-up each other and b) stay relevant in a market where IGPs are improving in performance. They cannot sit on their year-old cards, because those will be overtaken by IGPs and no one will buy them, which means a hit to the bottom line. If the IGPs did not exist, they could sit on their current architectures and not have to produce a new architecture as quickly, regardless of what is going on with the software. Competition is what motivates improvement, as do sales and profit.

For example, Intel made Larrabee to compete in the discrete graphics market UNTIL they realized that margins were too low, competition was too high, and IGPs were the way of the future as far as profit is concerned. Thus, they turned Larrabee into Knights Corner to compete in the HPC market, where they can sell the boards for a great deal more just by removing the display component and adding device support. They didn't make Larrabee just to make progress in the graphics market. They are trying to make money, and Knights Corner made more sense from a financial and competitive standpoint. Bigger margins, better profit.

The bottom line IS the bottom line. These are corporations with boards of directors that have to be satisfied. They want to make money. That means big profit with as little spending as possible to get there. Unfortunately, with more competition, getting that profit requires spending more; otherwise their update cycles would be s-l-o-w-e-d for as long as profits stayed high.
 