Financial Analysts Say Intel Killed the Discrete Graphics Card

The niche discrete graphics market has been shrinking for a few years now. Intel was never serious about entering the discrete market. They are happy just to absorb the 3D functions into their IGP and then into their CPUs; that is their money maker and market. Back when Nvidia started pushing the GPU for compute power, it was aimed at Intel's CPU compute power. Intel did R&D on Larrabee to enhance its own compute power, but was also looking into consoles and mobile devices, which are bigger markets than discrete graphics. Nvidia got out of the IGP market and is also heading out of low end discrete. Their mid and high end are in trouble and getting more troublesome to make. Nvidia has already moved toward a market that is a lot greener a money maker than the one they are in now, and AMD will head for that market as well. Both will still stay in the discrete market, but only to R&D new tech to incorporate into the hot markets most are going to (i.e. mobile devices and consoles). The bulky box desktop/workstation market is shrinking as new devices offer comparable functions. With that said, and with how the trend in computing is going, the old market is just going to shrink as new emerging markets take away from the old.
 
[citation][nom]glenricky[/nom]Haven't you seen the news that AMD has released the HD 7470M and Nvidia released the 620M? They're still sure about that market, and the scores were below HD 4000. I'm talking about mobile graphics right here[/citation]

The 7470M was released before HD 4000 and so was the 620M, so they don't even work for your examples anyway. Besides that, the 7470M is for doing CF with the Llano GPUs, and both the 7470 and 620M are for Sandy Bridge notebooks (that's why it's Sandy that has them). Hence my point about slower/similarly performing cards only being released for older systems. You didn't even check the release dates on them or the systems that have them! Furthermore, there are often idiotic exceptions (such as a Sandy Bridge desktop with a CPU that has HD 3000 but a Radeon 5450 as the graphics card, despite the HD 3000 and the 5450 being fairly parallel, how close depending on the HD 3000's frequency).

[citation][nom]TruthSeekerXII[/nom]Yes, AMD and nVidia are absolutely pushing up the timetable on progress to compete with Intel. If you don't believe that, then there is absolutely nothing more I can say to you. This is a FAR greater motivator than software improvement. It is not like you can do anything about it if you upgrade your software but it doesn't run as fast as you want on your hardware, so this is not nearly as big of a financial motivator. They know you will buy a card anyway. Spending millions and billions producing a new architecture must be done for good reason. nVidia and AMD are both trying to a) one-up each other, and b) stay relevant in a market where IGPs are improving in performance. They cannot sit on their year-old cards because they will be overtaken by IGPs and no one will buy them, thus a loss in the bottom line. If the IGPs did not exist, then they could still sit on their current architectures and not have to produce a new architecture as quickly, despite whatever is going on in the hardware. Competition is what motivates improvement, as are sales and profit. For example, Intel made Larrabee to compete in the discrete graphics market UNTIL they realized that margins were too low, competition too high, and that IGPs were the way of the future as far as profit is concerned. Thus, they turned Larrabee into KC to compete in the HPC market, where they can sell the boards for a great deal more just by removing the visual component and adding device support. They didn't just make Larrabee to make progress in the visual market. They are trying to make money, and KC made more sense from a financial and competition standpoint. Bigger margins, better profit. The bottom line IS the bottom line. These are corporations with boards of directors that have to be satisfied. They want to make money. That means big profit with as little spending as possible to get there. Unfortunately, with more competition, to get that profit, you have to spend more, or else their update cycles would be s-l-o-w-e-d so long as profits were high.[/citation]

All of AMD's low end cards use the VLIW5 architecture, so there's not even much money being spent except for a minor redesign to fit the new cards' performance profiles.

Consider this. Let's ignore everything except for the HD 2000, 3000, 2500, 4000, the Radeon 6000 family, and the Radeon 5000 family right now.

AMD has the Radeon 5450, the slowest member of the Radeon 5000 family. A few months after it came out, Sandy Bridge came out. Several months after that, Intel finally made HD 3000 i3s and i5s that did not have the K moniker. Now, Intel has finally made something that beats the Radeon 5450. The problem? The Radeon 6450 is already out and is more than 60% faster than the HD 3000. Several months after that, there is HD 4000 and it is beating the 6450. The problem with this? The Radeon 6450 didn't really do anything that the HD 3000 couldn't do except for having DX11 support and supporting three monitors. So, the 6450 is really only good for a little more future-proofing and for support for three independent monitors at high resolutions. Now, HD 4000 is out, and all it changed is that a card that was already not very useful is no longer needed. The problem for AMD? Well, there isn't really a significant problem at all. All that happened is that one card that was not a very useful purchase for most people is no longer even useful at all, except for upgrading very old systems so they can be used as HTPCs, or adding support for another three monitors to a cheap or old machine. Not much has changed for AMD nor for the 6450 in particular. So, what happened? The cheapest card, one so slow it's not even able to be properly classified as an entry level graphics card, has slightly fewer uses. Does this mean that Intel killed the low end graphics market with HD 4000? Not at all.

You're also thinking that Intel's IGP is a significant motivator for AMD and Nvidia. However, it really isn't. What would happen if AMD couldn't sell cards with performance similar to a 6450 anymore? Not much at all. If Intel made an IGP that could match or even beat the 6570, then AMD could be in a little trouble. If Intel made an IGP that could match the 6670, then Intel would have temporarily killed off the entry level (i.e. low end) graphics market. However, that didn't happen. Intel doesn't provide much motivation for improvement because AMD and Nvidia already have cards that are faster than any card that Intel manages to catch up to.

Intel did make a dent in the low end graphics market for notebooks, but all that means is that AMD makes APU notebooks with a low end discrete card and then blows HD 4000 out of the water, even though HD 4000 on the mobile side isn't even what AMD needs to compete with, because it's only on mobile i7s in notebooks that are out of the budget range that AMD is in. And that's all that AMD needs to do before Trinity comes and beats out the integrated competition. So, AMD is still winning here. I don't get why this is difficult.

Tell me, how could Intel be killing low end graphics if their HD 4000 IGP is only on their high end processors? The lowest end Ivy processor to sport the HD 4000 is the i5-3570K and it costs more than $200 and is only for the desktop.

All AMD needed to do was beat HD 2500, and they do that by a large margin.
 
You all should also realize that the architectures used on the high end compute cards and the consumer cards are the same. All of the Quadros, even those multi-thousand dollar ones with 6GB (I think that's the Quadro 5000), use the Fermi architecture. AMD's professional cards use the same architecture as its consumer cards, too, and AMD's next compute/professional cards will undoubtedly sport the GCN architecture. AMD and Nvidia don't need to spend R&D money on both compute cards and gaming cards if they don't want to; they can use the same cards with minor changes. For example, the Quadro 5000 is just a Big Fermi card with the 1:8 DP-to-SP ratio changed and the 128MB memory chips swapped out for 512MB chips.

The profit margins in the professional market are greater, but neither AMD nor Nvidia would abandon the gaming market because, although it has less money in it, it is still a multi-billion-dollar market. That's larger than many other markets and industries that have far more competitors. Besides, AMD and Nvidia are still so far ahead of Intel in performance that there's little reason to abandon the market when they are ahead, and that doesn't look like it's changing any time soon. Intel's IGP performance growth will slow down sharply unless Intel is willing to allocate more and more of the silicon and power usage per processor to the graphics, so Intel's performance increases are going to slow down to roughly the rate of AMD's and Nvidia's improvements.

Also, for the gaming cards (as of right now, 6570 and up), the software is a huge motivating factor; far greater than Intel can ever be, unless Intel does some pretty crazy things that are almost as ridiculous as they are unlikely (such as Intel joining the discrete graphics market or making chips to compete with Trinity).

The average person doesn't need them, but the average person doesn't need better graphics even if they have the HD 2000 or 3000 (or in many cases, even if they have worse). The HD 4000 is revolutionary for Intel, but it doesn't really change the market much at all. It simply decreased the difference between Intel's top IGP on their most expensive processors that have IGPs and the discrete cards. Until Intel can match at least the minimum in entry level graphics, it will not change the market at all.

Llano had a far greater effect because when it came out, it was excellent for what it did. Now, it's not so great. For example, some games can't even be played above 1024x768 with decent frame rates on even the A8s. However, unlike with HD 4000, you can just pop in a 6450 and CF it and then it's good enough for entry level play for now, so it has a saving grace. Even when that's not enough, it just needs a jump up to a 6570 or 6670 and it's yet again good to go for a while longer. Trinity seems to be even better, if the rumors about it supporting an asynchronous type of triple CF with two low end graphics cards and the IGP are true, because it can then get upgraded twice without you needing to replace a card like you would for every Intel system and for the second or third Llano graphics upgrade.

Intel simply can't match this extreme future-proofedness (it's technically not a real word, but I think that it gets the point across very well), let alone the performance of even a single desktop A8, and the HD 2500 doesn't match anything from AMD at this time, except maybe the A4s. The problem with Intel is that they keep their IGPs unbalanced in comparison with their CPUs. The i3s and i5s are fairly powerful CPUs, yet they get the HD 2500. The AMD A6s and A8s that compete with them don't have powerful CPUs in comparison, but they have much more powerful GPUs. The A4s can probably compete with the i3s and i5s in graphical performance, but are a mere fraction of them in CPU performance. They compete more with the Pentiums and Celerons, which have, yet again, far worse GPUs than them despite the more powerful CPUs.

Intel refuses to properly compete with AMD. They are making you choose (if you want an integrated solution) between excellent CPU processing power and excellent GPU processing power. At least with Trinity, it should be a little less of a compromise, especially with how much faster the graphics will be on Trinity.
 
Um, AMD may have produced those cards before Intel produced Ivy Bridge, but recall that Ivy Bridge has been delayed almost 6 months (4th quarter 2011 -> May 2012). AMD was well aware of the performance metrics of Ivy Bridge and likely has been for a year. So, unfortunately, your point is invalid.

I'm ignoring the middle of your post because it is the same rehash and fanboy talk.

I will say that Intel's mobile lineup is all HD 4000, so AMD apparently has to beat HD 4000. So, unfortunately, I haven't seen anything yet that dispels the point I and other sites have made. Intel is killing off the low end such that the vendors must redefine the low end by building new products in a hurry. Your problem is that you act like the release date is significant when these actions take place well before then. If AMD waited for release dates, they would be painfully far behind the curve and would suffer greatly for it. Knowing what your competition is doing is all part of corporate espionage these days, and that is important in staying afloat.

Look, blazorthon. You are an AMD fanboy, and your long, tired posts always say the same thing despite what everyone else says. You have been at this for months. I know it is important to you, but it is falling on deaf ears at this point. Good luck on your crusade.
 
What does Intel's delay have to do with it? AMD's cards were delayed too. Do you think that AMD and Intel tell each other how well things perform? They don't and AMD found out how well HD 4000 does when benchmarks for it came out and not before. Companies generally don't talk about their upcoming products in such a way with their competitors.

I am not a god damned AMD fanboy. All but two of my machines have Intel CPUs and several have Nvidia graphics. My only two machines with AMD CPUs are my old Gateway laptop (Turion 64 X2) and my Phenom II X6 desktop (for virtual machines; it was a choice between a Phenom II X6 and an i7, and the Phenom II was cheaper, so I bought it). AMD has better graphics at each price point than Intel, and that's how it's always been and how it still is right now. I even went as far as to say what Intel would need to do to beat AMD in this within reason. I even talked about Nvidia throughout it (although I didn't mention individual Nvidia cards much because I saw no reason to; the AMD cards were enough to prove the point).

I'm not on some crusade except to explain how things are. If I were an AMD fanboy, then why do I rarely recommend AMD CPUs over Intel? I've recommended Intel CPUs ever since I joined Tom's Hardware because I knew that they were better, plain and simple, for most work. I'm telling you how it is and you're not listening. That's your problem, but it does NOT make me a fanboy.

EDIT: Okay, I caught my mistake on the HD 4000 versus 2500 being used in the mobile chips; the link I had neglected to mention that it was referring strictly to the desktop chips when it said what it said. In this case, yes, the Llano APUs alone are then beaten by the Intel HD 4000-equipped i3s and i5s. However, AMD still has the option of CF between the 7470 (also many other low end cards) and the A6s and A8s to beat the HD 4000 within the same budget range, so fine, it's a more level playing field. Fine, for the mobile market, Intel's HD 4000 beats the A8's 6620G when there are no other factors to consider right now. However, like I said, there's still the CF option and it's still within the same price range (often cheaper, but then it's pushing it against some of the i3 machines). Trinity will still bring AMD out on top for IGP graphical performance, and that will be the end of that at last, until Haswell. Regardless, none of that makes me a fanboy; I simply made a minor mistake. AMD is still competing on graphical performance right now; they just need to CF a discrete card until Trinity. Then it will still be Intel winning in CPU and AMD winning in GPU for the integrated solutions, and AMD will still have the CF option should it become necessary when Haswell comes out, if Trinity doesn't have a successor by then, or if software necessitates it.
 
[citation][nom]blazorthon[/nom]The 7470M was released before HD 4000 and so was the 620M, so they don't even work for your examples anyway. Besides that, the 7470M is for doing CF with the Llano GPUs, and both the 7470 and 620M are for Sandy Bridge notebooks (that's why it's Sandy that has them). Hence my point about slower/similarly performing cards only being released for older systems. You didn't even check the release dates on them or the systems that have them! Furthermore, there are often idiotic exceptions (such as a Sandy Bridge desktop with a CPU that has HD 3000 but a Radeon 5450 as the graphics card, despite the HD 3000 and the 5450 being fairly parallel, how close depending on the HD 3000's frequency).[/citation]

I know the release dates, and they're very close to each other, not more than 2 months apart. And some sites had leaks about the HD 4000 performance for a long time. And no, not all of the 7470s are being CF'ed with Llano; my friend just bought a notebook with an i5 and a 7470.
 
[citation][nom]glenricky[/nom]I know the release dates, and they're very close to each other, not more than 2 months apart. And some sites had leaks about the HD 4000 performance for a long time. And no, not all of the 7470s are being CF'ed with Llano; my friend just bought a notebook with an i5 and a 7470.[/citation]

[citation][nom]blazorthon[/nom]The 7470M was released before HD 4000 and so was the 620M, so they don't even work for your examples anyway. Besides that, the 7470M is for doing CF with the Llano GPUs, and both the 7470 and 620M are for Sandy Bridge notebooks (that's why it's Sandy that has them). Hence my point about slower/similarly performing cards only being released for older systems. You didn't even check the release dates on them or the systems that have them! Furthermore, there are often idiotic exceptions (such as a Sandy Bridge desktop with a CPU that has HD 3000 but a Radeon 5450 as the graphics card, despite the HD 3000 and the 5450 being fairly parallel, how close depending on the HD 3000's frequency).[/citation]

You could at least have the courtesy to read and comprehend the whole sentence before replying with information that helps to prove me right. I made a typo when I forgot to put the "M" moniker after the 7470 in the second reference to it, but that shouldn't matter because there is no other Southern Islands 7470 anyway, so it's obviously still referring to the 7470M. You can clearly see that I put into bold where I say that the 7470s are also for the Sandy Bridge systems, not just for CF with Llano.

Also, it's not like AMD's corporate heads are the kind of people who care about such *leaks* (let alone the kind of people who go to such sites), especially since leaks are often completely, or at least partially, wrong.
 

All true, but Tom's is apparently in love with Intel, and Intel still has a larger market share of CPUs.
 


I don't think that Tom's is "in love" with Intel. They simply recognize that Intel has a very large advantage and that AMD can't compete in the high end gaming CPU market at this time. Here's hoping that Piledriver closes the gap considerably and that Steamroller kills off the gap between AMD and Sandy/Ivy :)
 
I like the part where the market for discrete graphics isn't shrinking. Steam recently surpassed 5 million concurrent users. I personally hate integrated graphics. It was torturous having to use the HD2000 in my CPU while I was waiting on the RMA to process after my GPU blew up. Stupid thing can't even play Minecraft on the lowest settings properly.
 
[citation][nom]Kranchan[/nom]I like the part where the market for discrete graphics isn't shrinking. Steam recently surpassed 5 million concurrent users. I personally hate integrated graphics. It was torturous having to use the HD2000 in my CPU while I was waiting on the RMA to process after my GPU blew up. Stupid thing can't even play Minecraft on the lowest settings properly.[/citation]

"Hating" something because it does not do well in a task that it is not intended for seems a little over the top.
 
AMD APUs' graphics > Intel HD 4000, so how is it that Intel's HD 4000 killed discrete GPUs when AMD offers far more graphics performance? Oh, who am I kidding, even if it were true it's not like any company would actually say it, let alone give AMD credit for it in an article. No, no, no, Intel pays Tom's way too much money to ride the Intel dick for them to be giving AMD any due credit.
 
Five Star Equities states that Ivy Bridge essentially kills the discrete graphics card because the integrated graphics of the CPU would be good enough for 95 percent of computer users.

Let me know when I can run GTA IV at a decent resolution, in medium or high detail.

I think this market share is more representative of the device in general rather than its GPU capabilities... and by the looks of things AMD, which has consistently brought innovative products to the market including APUs, has an even better IGP in the works, something that's considerably faster than the Intel HD 4000.

I don't think 95% of PC users would welcome an IGP for modern games. They're just not designed to run most of them in medium or high detail at native resolutions AND keep the frame rate high. The best they can hope to do, even the very latest ones, is keep pace with the budget cards. In terms of GPU development I'd say that, in my opinion, Intel has been absolutely dire compared to the competition. Not so long ago their GMA series was the principal reason people complained about sub-standard performance and stability, even on older titles like GTA: San Andreas. Intel tried to keep up with AMD, but struggled, and Ivy Bridge is the result. For serious gaming it's just not good enough, and I've been a proponent of IGPs for ages. Intel didn't kill the discrete market... it couldn't even compete.
 
I sell computers for a living:
95% of the users that only need Office never buy a discrete graphics card, except when they change OS and there are no drivers.
The discrete cards market will not change...
95% of the users who are aware of the IGP's performance and buy it anyway will eventually buy a discrete card. Begging their parents, saving, hunting on eBay or whatever, but they do.
The discrete cards market will not change...

Therefore, IMO, the discrete cards market will not change. The market analysts are wrong; it's a lot of bovine fecal material.

I hope Nvidia soon designs a 28nm low-end chip. On modern silicon wafers it would be abundant, ridiculously cheap, and with a thermal profile perfect for motherboard integration. Then everyone will see Ivy Bridge HD 4000's true colors:

A terrible waste of good die space.

Instead of adding more cores, memory channels, more cache, etc.,
they came up with an IGP that 95% of users either don't care about or find too underpowered and will disable as soon as possible.
 
[citation][nom]mamailo[/nom]I sell computers for a living: 95% of the users that only need Office never buy a discrete graphics card, except when they change OS and there are no drivers. The discrete cards market will not change... 95% of the users who are aware of the IGP's performance and buy it anyway will eventually buy a discrete card. Begging their parents, saving, hunting on eBay or whatever, but they do. The discrete cards market will not change... Therefore, IMO, the discrete cards market will not change. The market analysts are wrong; it's a lot of bovine fecal material. I hope Nvidia soon designs a 28nm low-end chip. On modern silicon wafers it would be abundant, ridiculously cheap, and with a thermal profile perfect for motherboard integration. Then everyone will see Ivy Bridge HD 4000's true colors: a terrible waste of good die space. Instead of adding more cores, memory channels, more cache, etc., they came up with an IGP that 95% of users either don't care about or find too underpowered and will disable as soon as possible.[/citation]

Intel needs to improve their IGP every now and then or else it will get too outdated even for regular work. That, and Nvidia's 680 can beat the HD 3000 at encoding/transcoding. Intel doesn't want to lose on one of the few tasks for which such customers actually use the IGPs.
 
Seriously! I wonder where they get these so-called "analysts"... making speculations about products or market segments they don't fully understand. Idiots are being placed in overpaid jobs to churn out useless crap!
 
I have two PCs. I have discrete video cards in both of them. I don't game on either of them. So, I don't know where they pulled that magic number of 95% from. My guess is that it is purely made up. Oh, I'm sure the number of people who can make do with onboard video is large, but I have a feeling it is nowhere near 95%.

Onboard video is not exactly a new idea. It's been around for decades. And just like it was decades ago, onboard video still sucks in comparison. Always will.

And to respond to an above comment: AMD buying ATI was not wise in my opinion. ATI has always made horrible gfx cards, from back in the old days up to now. Heck, they still haven't figured out what the proper default refresh rate for 800x600 resolution should be (yes, people do still use low resolutions like that for some applications).
 
[citation][nom]Penguins2012[/nom]I have two PCs. I have discrete video cards in both of them. I don't game on either of them. So, I don't know where they pulled that magic number of 95% from. My guess is that it is purely made up. Oh, I'm sure the number of people who can make do with onboard video is large, but I have a feeling it is nowhere near 95%. Onboard video is not exactly a new idea. It's been around for decades. And just like it was decades ago, onboard video still sucks in comparison. Always will. And to respond to an above comment: AMD buying ATI was not wise in my opinion. ATI has always made horrible gfx cards, from back in the old days up to now. Heck, they still haven't figured out what the proper default refresh rate for 800x600 resolution should be (yes, people do still use low resolutions like that for some applications).[/citation]

Unless you can name a problem more relevant than not working with 800x600 very well, you're wrong. AMD's graphics are just as top notch as Nvidia's and the graphics market has been AMD's lifeline for a while now. AMD would have been much worse off without it.
 
Really, this is news? The integrated graphics on my old laptop has been able to play videos and perform other non-intensive graphics-related tasks for over 8 years now. That doesn't mean I haven't desired a better GPU; that just means it was cheap!
 