AMD Piledriver rumours ... and expert conjecture

Page 68 - Tom's Hardware community forums
Status
Not open for further replies.
We have had several requests for a sticky on AMD's yet to be released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame baiting comments about the blue, red and green team and they will be deleted.

Enjoy ...
 
well I read the SIMD link that Palladin supplied
fascinating though my brain hurts
I did find this link in that SIMD Wiki
http://en.wikipedia.org/wiki/Many_Integrated_Core_Architecture

which I know is Intel but I think fits into the context of this thread

finding a better way to incorporate parallel processing seems to be the future
in CPU design


Pretty much yeah.

It's something we've been doing in the Sparc world for a very long time. It's cool to see the x86 world finally catching up. They've started to push MIMD (Multiple Instruction Multiple Data) architectures to the very limit of physics. You simply won't be getting much faster in non-parallel code. Intel's done an amazing job with their branch predictor in turning non-parallel code into parallel executed micro-ops. Did people really think that SB was executing those single threads one instruction at a time?
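The point about turning a serial instruction stream into parallel micro-ops can be sketched with a toy scheduler. This is purely conceptual and not how any real core is built (hardware uses register renaming and reservation stations, not Python lists); it just groups instructions whose inputs don't depend on still-pending results, the way an out-of-order engine finds independent micro-ops to run together.

```python
# Toy illustration of instruction-level parallelism: split a serial
# instruction stream into "issue groups" of mutually independent ops,
# roughly the way an out-of-order core executes several micro-ops per
# cycle. Conceptual sketch only.

def issue_groups(prog):
    """prog: list of (dest, src1, src2) register-name tuples (acyclic)."""
    produced = {d for d, _, _ in prog}
    # inputs that no instruction writes are ready from the start
    done = {s for _, s1, s2 in prog for s in (s1, s2)} - produced
    pending, groups = list(prog), []
    while pending:
        group, writes = [], set()
        for ins in pending:
            dest, s1, s2 = ins
            # issue only if both sources are ready and we don't clash
            # with a register already written in this group
            if {s1, s2} <= done and not ({dest, s1, s2} & writes):
                group.append(ins)
                writes.add(dest)
        pending = [i for i in pending if i not in group]
        done |= writes
        groups.append(group)
    return groups

prog = [
    ("r1", "a", "b"),    # r1 = a op b
    ("r2", "c", "d"),    # r2 = c op d   (independent of r1)
    ("r3", "r1", "r2"),  # r3 = r1 op r2 (depends on both)
]
groups = issue_groups(prog)
# the two independent ops issue together; the dependent one waits
```

So three "serial" instructions finish in two issue groups, which is the whole trick: the code looks single-threaded, but the hardware isn't executing it one instruction at a time.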
 
I know on average that my Win7 runs around 500 threads or more at any instant.
Now if I got it right, that doesn't lend itself to parallel processing, since they are many independent and separate items with different data sets.
A lot of them are the OS "doing its thing", which is usually a lot of I/O,
besides the threads that apps are running.
So if a CPU was doing single threads one instruction at a time,
then the OS alone would cripple it.
combining the CPU and GPU is just logical
they complement each other
and the idea of the GPU handling SIMDs instead of the CPU just makes sense
the GPU is better suited for it

here is my analogy if I may

CPU - heavyweight fighter, good at taking on large opponents
GPU - faster fighter, better at taking on multiple smaller opponents

having them "fight" together as a team makes each stronger
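The analogy can be made concrete with a toy step count: a scalar (CPU-style) pass handles one element per step, while a SIMD/GPU-style pass handles a whole batch of lanes per step. The lane count and step model below are illustrative only, not tied to any real part.

```python
# Toy model of the analogy: a scalar (CPU-style) loop handles one
# element per step, while a SIMD/GPU-style pass handles a whole
# lane-width of elements per step. Illustrative numbers only.

LANES = 8  # pretend vector width

def scalar_steps(data):
    """One element per 'cycle' -> len(data) steps."""
    return len(data)

def simd_steps(data):
    """LANES elements per 'cycle' -> ceil(len(data)/LANES) steps."""
    return -(-len(data) // LANES)

data = list(range(1000))
# scalar 'heavyweight': 1000 steps; 8-wide 'fast fighter': 125 steps,
# provided every element needs the exact same operation
```

The catch, of course, is the proviso in the last comment: the wide fighter only wins when the work is uniform, which is why the two complement rather than replace each other.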


 
I know on average that my Win7 runs around 500 threads or more at any instant.
Now if I got it right, that doesn't lend itself to parallel processing, since they are many independent and separate items with different data sets.
A lot of them are the OS "doing its thing", which is usually a lot of I/O,
besides the threads that apps are running.
So if a CPU was doing single threads one instruction at a time,
then the OS alone would cripple it.


Most of those threads are idle though. They each have specific tasks assigned to them and only do those tasks when told to by the OS or their underlying service. They start up and sit there waiting for instructions.

Most benchmarks are done with games these days; games typically only have one or two extremely busy threads with a dozen other lightly worked threads. What SB does is take those heavy threads, use the additional L3 cache to hold as much of their data pool as possible, then try to execute as much of it in parallel as possible (as micro-ops) while leveraging Intel's advanced branch prediction and loop detection units.
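The "idle threads" point is easy to demonstrate: a worker blocked on a queue consumes no execution resources until work arrives. A minimal sketch with Python's standard threading and queue modules (worker count and the doubling task are made up for the example):

```python
# Sketch of why ~500 OS threads don't cripple a CPU: most are parked,
# blocked waiting for work. A worker sleeping in queue.get() costs
# scheduler bookkeeping, not CPU time, until something is posted.
import queue
import threading

tasks = queue.Queue()
results = []

def worker():
    while True:
        item = tasks.get()     # blocks (sleeps) until work is posted
        if item is None:       # sentinel: shut down
            break
        results.append(item * 2)
        tasks.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()

# All four workers are now idle until we hand them something:
for n in range(10):
    tasks.put(n)
tasks.join()                   # wait until every item is processed

for _ in threads:
    tasks.put(None)            # wake each worker once so it can exit
for t in threads:
    t.join()
```

Scale the same pattern up to hundreds of service threads and you get a typical Windows process list: lots of threads, almost all of them asleep.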
 
I will say it once more: provide some links that will convince me that Intel's IGP is going to be better performance-wise than AMD's IGP. When it comes to the E series and Llano, the problem with the IGP at this time is mainly down to low clocks and, more importantly, bandwidth issues that leave the IGP starved, so they underperform for their unit counts. As for Intel's IGPs, their designs have traditionally been licensed PowerVR designs, and not just in Atom. If their 4K series IGP is in-house, then provide the links.

I never said it was. In fact I am pretty sure that until at least Haswell, AMD's IGPs will be better. That's if anything about Haswell is true, which we all know can change at any time. I am saying that what Triny is saying is wrong. He is claiming that SB does not take on any discrete entry-level GPUs, when it clearly does:

http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/11

HD Graphics 3000 is a huge step forward. At 71.5 fps it’s 70% faster than Clarkdale’s integrated graphics, and fast enough that you can actually crank up some quality settings if you’d like. The higher end HD Graphics 3000 is also 26% faster than a Radeon HD 5450.

http://www.xbitlabs.com/articles/graphics/display/intel-hd-graphics-2000-3000_7.html#sect0

http://www.xbitlabs.com/images/video/intel-hd-graphics-2000-3000/farcry-2.png


In both cases, ahead of a HD5450 but behind a HD5570, which I would expect.

As for PowerVR, the only ones right now are in Atom CPUs. HD3K and HD4K are Intel. I searched the web and there are no articles claiming that HD3K is PowerVR-based. HD4K, and even the GT1/GT2/GT3 tiers, are all based on the same graphics core as Sandy Bridge, just enhanced. HD2K/3K are an Intel design. So unless you have an article refuting that HD is Intel-made, let's not talk about it again.

I could go on but no need to.

Oh snap! We're in panic mode!? I didn't get the memo!

* Not speaking for Intel Corp *

PANIC!!!!!!!!!!!

With mobile sales in the lurch, Intel will be without a credible IGP solution,
nor will slapping lipstick on HD3000 help.
AMD is keeping Intel honest, or you would still have HD1000.

You realize that Sonoran works for Intel, right? If anyone has info, he does.

http://www.tomshardware.com/news/AMD-ATI-Nvidia-GPU-Tegra,14795.html
Report: AMD Considered Buying Nvidia Before ATI Purchase

" After the acquisition, AMD struggled to integrate its newly acquired graphics business as Nvidia unleashed a flood of strong products, consuming large chunks of market share. AMD eventually gathered its forces and fought back, but Nvidia had already moved aggressively into the mobile SoC market by introducing its ARM-based Tegra chip.

With Tegra installed in tablets and smartphones, Nvidia now has a market capitalization of $9.7 billion whereas AMD is worth just $5.2 billion. "


now what were you saying again in that imaginary world of yours..? :heink:

As I said, nVidia is doing pretty well. And Tegra will sell.

I thought this thread was about the Piledriver?

Yea it is but PD info is slower than a snail racing a turtle around the world.

If nvidia fails on Tegra, they will effectively need to compete only with their GPUs; they would be in the same boat as AMD.

Not really, as up until the HD7K series, nVidia has always been ahead in single-GPU setups. The HD6K, which was geared against the GTX5 series, does not compete as well with it. If Kepler is better than HD7K, then nVidia will do fine.

Plus CUDA is more well developed than ATI Stream.
 
nvidia's desktop market has been on the decline since Fermi. CUDA will eventually be replaced with OpenCL; there is no way nvidia can keep CUDA ahead of OpenCL, since Intel and AMD own more than 80% of the graphics market and both will eventually use OpenCL as the standard for GPGPU computing once software starts supporting it more.

Being ahead on the single fastest GPU means nothing if they can't capture other parts of the market. The low-end nvidia offerings are pathetic, and that also happens to be the larger segment of the market. Nvidia is really banking on Tegra because they are trying to branch out from the GPU market; if they don't succeed, they will be very limited in how much they can actually sell because of APUs and Intel integrated graphics entering the market.

nvidia isn't doing badly yet, but if Tegra fails they will need to seriously beat AMD in discrete GPUs, because the market just isn't that big any more. Not saying Tegra will fail, but it's just something to consider.
 
afaik nvidia made more money than amd in the discrete gpu sector despite amd's cards being cheaper and more power friendly. their professional and workstation graphics cards bring in a lot more revenue than amd's do. nvidia sold more discrete gpus (mostly gaming cards) during the delay in the 7000 series introduction and during the bf3 and deus ex hr launches. amd's relationship with ISVs and driver support are other factors. i think one of amd's top executives in the graphics dept was let go for this.. i am not sure about that though.
nvidia's entry-level cards are indeed crap. they can't get close to amd's offerings in terms of performance, price and efficiency.
 
nvidia's desktop market has been on the decline since Fermi. CUDA will eventually be replaced with OpenCL; there is no way nvidia can keep CUDA ahead of OpenCL, since Intel and AMD own more than 80% of the graphics market and both will eventually use OpenCL as the standard for GPGPU computing once software starts supporting it more.

To be more precise based on Q4 2011 sales:

Intel = 59.1% (down from 60.2% in Q3)
AMD = 24.8% (up from 23% in Q3)
nVidia = 15.7% (down from 16.1% in Q3)

That leaves 0.4% for other video card manufacturer(s)... VIA???

http://hothardware.com/News/AMD-Grabbed-GPU-Market-Share-from-Nvidia-Intel-in-Q4/


I will assume that if any Sandy Bridge or Llano PC / laptop sold with a discrete card, the integrated Intel / AMD graphics core would not be part of the statistics, because it would be a "double count" which would lead to a total above 100%.
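The quoted figures check out arithmetically; a quick sanity pass on the numbers from the post (the percentages are the ones cited above, not independently sourced):

```python
# Sanity check on the Q4 2011 share figures quoted above, plus the
# residual "other" share. Figures come from the post itself.
q4 = {"Intel": 59.1, "AMD": 24.8, "nVidia": 15.7}
other = round(100 - sum(q4.values()), 1)
# other -> 0.4, matching the "0.4% for other manufacturer(s)" remark.
# Counting a machine's IGP *and* its discrete card would add two
# shares for one PC, pushing the total past 100% - hence one count
# per system, which supports the "no double count" assumption.
```
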
 
I never said it was. In fact I am pretty sure that until at least Haswell, AMD's IGPs will be better. That's if anything about Haswell is true, which we all know can change at any time. I am saying that what Triny is saying is wrong. He is claiming that SB does not take on any discrete entry-level GPUs, when it clearly does:

http://www.anandtech.com/show/4083/the-sandy-bridge-review-intel-core-i7-2600k-i5-2500k-core-i3-2100-tested/11

HD Graphics 3000 is a huge step forward. At 71.5 fps it’s 70% faster than Clarkdale’s integrated graphics, and fast enough that you can actually crank up some quality settings if you’d like. The higher end HD Graphics 3000 is also 26% faster than a Radeon HD 5450.

http://www.xbitlabs.com/articles/graphics/display/intel-hd-graphics-2000-3000_7.html#sect0

http://www.xbitlabs.com/images/video/intel-hd-graphics-2000-3000/farcry-2.png

In both cases, ahead of a HD5450 but behind a HD5570, which I would expect.

Not trying to rain on your parade, but picking the best-case scenario for HD 3K isn't very representative of its overall performance. Half of the time it can't keep up with the slowest A4, and it is barely, if at all, faster than the 5450, with the exception of Dragon Age and Far Cry 2.

Here the Core i7-2600K and 2500K fall behind the Radeon HD 5450. The 5450 manages a 25% lead over the HD Graphics 3000 on the 2600K
- Dawn of War II

http://www.xbitlabs.com/images/cpu/amd-fusion-intel-core-i3/civilization.png


http://www.xbitlabs.com/articles/cpu/display/amd-fusion-intel-core-i3_7.html#sect0

The A4-3300 is 160 shaders at 430 MHz; the 3400 is at 600 MHz. Even with a massively superior CPU, the graphics can't even compete. Intel needs a boost of 300% in Civ V to give AMD any competition; 67% isn't going to cut it.

I would have to agree with you in part: if AMD z is slower than A4 Llano, then HD 4K might have a chance, but how many CPUs are even going to see the HD 4K label? Currently only the 2600K and 2500K had it at launch, and later the 2105 and 2125 have the 3K. I don't expect to see very many HD 4K CPUs.
 
Not trying to rain on your parade, but picking the best-case scenario for HD 3K isn't very representative of its overall performance. Half of the time it can't keep up with the slowest A4, and it is barely, if at all, faster than the 5450, with the exception of Dragon Age and Far Cry 2.

Here the Core i7-2600K and 2500K fall behind the Radeon HD 5450. The 5450 manages a 25% lead over the HD Graphics 3000 on the 2600K
- Dawn of War II

http://www.xbitlabs.com/images/cpu/amd-fusion-intel-core-i3/civilization.png

http://www.xbitlabs.com/articles/cpu/display/amd-fusion-intel-core-i3_7.html#sect0

The A4-3300 is 160 shaders at 430 MHz; the 3400 is at 600 MHz. Even with a massively superior CPU, the graphics can't even compete. Intel needs a boost of 300% in Civ V to give AMD any competition; 67% isn't going to cut it.

I would have to agree with you in part: if AMD z is slower than A4 Llano, then HD 4K might have a chance, but how many CPUs are even going to see the HD 4K label? Currently only the 2600K and 2500K had it at launch, and later the 2105 and 2125 have the 3K. I don't expect to see very many HD 4K CPUs.
Like I said, AMD got the jump on Intel, and they have to do whatever they can to keep their lead. In many cases Intel is going to need 2x the graphics power to catch up to Llano. Intel has no discrete graphics segment, so they are working to get more power out of the 3000 without having the luxury of taking an older, better arch and putting that in their chip.

Advantage: AMD

If AMD gets 50-60% on Trinity, then Intel will have been put farther behind. (A 50% gain on the APU's IGP is worth more than a 50%, or even 70%, gain on the Sandy Bridge IGP.)

HUGE problem for AMD though. I doubt this will be the case with Ivy, but if both Ivy and Haswell get 50% gains, that is the point where Intel's IGP will be "good enough". 90% of the market only wants to watch videos on YouTube or play some flash games. The moment Intel's solutions can do that, AMD is in a major hole. Most people wouldn't care either way at that point. If it's good enough, they will take it, because they aren't pushing it to its limits anyway.

Intel is better known than AMD, so when Intel's solution is "good enough", far fewer people will be searching out AMD for even better graphics.

AMD needs to hit with Trinity, and hit hard. They need to get their name out there, and this may be the last time they have a jump on Intel.
 
Not trying to rain on your parade, but picking the best-case scenario for HD 3K isn't very representative of its overall performance. Half of the time it can't keep up with the slowest A4, and it is barely, if at all, faster than the 5450, with the exception of Dragon Age and Far Cry 2.

Here the Core i7-2600K and 2500K fall behind the Radeon HD 5450. The 5450 manages a 25% lead over the HD Graphics 3000 on the 2600K
- Dawn of War II

http://www.xbitlabs.com/images/cpu/amd-fusion-intel-core-i3/civilization.png

http://www.xbitlabs.com/articles/cpu/display/amd-fusion-intel-core-i3_7.html#sect0

The A4-3300 is 160 shaders at 430 MHz; the 3400 is at 600 MHz. Even with a massively superior CPU, the graphics can't even compete. Intel needs a boost of 300% in Civ V to give AMD any competition; 67% isn't going to cut it.

I would have to agree with you in part: if AMD z is slower than A4 Llano, then HD 4K might have a chance, but how many CPUs are even going to see the HD 4K label? Currently only the 2600K and 2500K had it at launch, and later the 2105 and 2125 have the 3K. I don't expect to see very many HD 4K CPUs.

First, I never was comparing it to the A series. I said that Llano is obviously better than HD3K.

In the XBit article I linked, they tested the HD3K against the HD5450 and HD5570 with the same CPU, and in every game they ran, it was still ahead of the HD5450 in FPS. My entire point was that HD3K is equal to or better than a HD5450; Triny said that Intel's IGP wouldn't even catch up to a HD5450 until at least Haswell, when SB already has an IGP that does that.

That means that IB's IGP could possibly be near a HD5570 in performance.

Is it just me, or do people misread everything?
 
For those individuals who may not be very familiar with what exactly FPU / SIMD units do these days:

http://en.wikipedia.org/wiki/SIMD

http://en.wikipedia.org/wiki/Floating-point_unit

FPUs are very, very old. No modern CPU has an actual "FPU" anymore; they have a SIMD unit that can also handle FPU instructions. Old 8087 instructions have largely been replaced with SIMD equivalents (MMX/SSE/AVX/FMA/XOP). FPU/SIMD units are not accessed the same way the CPU integer units are. They don't share pipelines or schedulers, and they don't need to be decoded into micro-ops. They are dispatched directly to the SIMD units for execution, with some rearranging done beforehand for optimal execution time. Due to SIMD not including ~any~ logical conditional instructions (CMP/JMP), there is no need for branch prediction. They have their own directly addressable registers and do not share the x86 register file. It's best to treat the SIMD unit as a separate coprocessor, even though it inhabits the same die and uses the same memory controller that the integer units use.

Due to this, SIMD instructions will always be highly parallel in nature. The whole purpose of the SIMD unit is to execute multiple operations simultaneously on large arrays of data.
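That "one instruction, many data elements" semantic can be sketched in a few lines. This is a pure-Python stand-in for hardware lanes, only meant to show the execution model; the 4-lane width and the helper names (`vadd`, `vmul`, `fma_array`) are made up for illustration, not real intrinsics.

```python
# Minimal model of what a SIMD unit does: one instruction applied to
# every lane of a fixed-width vector register in a single operation,
# with no per-element branching. Conceptual stand-in for SSE-style
# 128-bit registers.
WIDTH = 4  # e.g. four 32-bit floats in one 128-bit register

def vadd(a, b):
    """One 'instruction': element-wise add across all lanes at once."""
    assert len(a) == len(b) == WIDTH
    return [x + y for x, y in zip(a, b)]

def vmul(a, b):
    """Element-wise multiply across all lanes."""
    assert len(a) == len(b) == WIDTH
    return [x * y for x, y in zip(a, b)]

def fma_array(xs, ys, zs):
    """Multiply-then-add over whole arrays, WIDTH lanes per step.

    Assumes array length is a multiple of WIDTH, like aligned
    vectorized loops do before a scalar tail.
    """
    out = []
    for i in range(0, len(xs), WIDTH):
        prod = vmul(xs[i:i+WIDTH], ys[i:i+WIDTH])
        out += vadd(prod, zs[i:i+WIDTH])
    return out
```

Note there is no `if` on any individual element anywhere in the hot path, which is exactly why such code needs no branch prediction.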

Your last couple of posts have been excellent; you're a scholar and a gentleman, sir.
 
First, I never was comparing it to the A series. I said that Llano is obviously better than HD3K.

In the XBit article I linked, they tested the HD3K against the HD5450 and HD5570 with the same CPU, and in every game they ran, it was still ahead of the HD5450 in FPS. My entire point was that HD3K is equal to or better than a HD5450; Triny said that Intel's IGP wouldn't even catch up to a HD5450 until at least Haswell, when SB already has an IGP that does that.

That means that IB's IGP could possibly be near a HD5570 in performance.

Is it just me, or do people misread everything?

Until a full Tom's review, going by the Chinese performance evaluations,
it is in some cases close to the AMD A6 series (HD 6530D).
 
If Intel can get HD4000 to reach Llano performance, they will avert a landslide in mobile,
but mobile will still tilt to AMD.
If AMD can beat or equal Intel's mobile CPUs, the avalanche will be big.
 
First, I never was comparing it to the A series. I said that Llano is obviously better than HD3K.

In the XBit article I linked, they tested the HD3K against the HD5450 and HD5570 with the same CPU, and in every game they ran, it was still ahead of the HD5450 in FPS. My entire point was that HD3K is equal to or better than a HD5450; Triny said that Intel's IGP wouldn't even catch up to a HD5450 until at least Haswell, when SB already has an IGP that does that.

That means that IB's IGP could possibly be near a HD5570 in performance.

Is it just me, or do people misread everything?
Xbit Labs is good at picking and choosing to fit their goal of showing what they want someone to believe. They almost never use the same games for their tests from one review to the next.

The Anandtech article had the HD 3K ahead of the 5450 in only 3 of the 12 games tested. Add that to Xbit's 4/5 (Metro 2033 is flipped over to HD3K somehow), and that's still only 7 of 17 games.

In order for IB to catch the 5570, it needs to be at least twice as fast as the 3K according to both articles. If it's only 67% faster, that's definitely going to put it faster than the 5450, on target for the 6450. I was going to say the 5550, but I found a review; the 5550 is actually faster than the 5570.
 
If Intel can get hd4000 to reach lano performance they will avert a landslide in mobile
but mobile will still tilt to AMD
if AMD can beat or equal intel mobile cpu the avalanche will be big

If the Intel HD 4000 does indeed get the estimated 60% performance increase, then it would be roughly equal to the Radeon HD 5550, which would put it on par with the graphics core in the Llano A4 APU. Of course that will change when Trinity is released. I think I read somewhere that it would have about a 30% increase in graphics performance compared to Llano.

Intel's next CPU based on a totally new architecture, Haswell, is not expected to increase the graphics core's performance by much, though. An article I read a couple of weeks ago basically stated that the emphasis of Haswell is going to be on its CPU processing power. The new graphics core will only get an incremental increase in performance. What's incremental? <shrug> 10% - 15% maybe?
 
Xbit labs is good at picking and choosing to fit their goal of showing what they want someone to believe. They almost never use the same games for their tests from one review to the next.

Actually, that is not exactly true. Below is a list of games from six different Xbitlabs reviews, each a few months apart, spanning from Nov. 21, 2010 to Feb. 16, 2012. The recurring games (shown in bold red in the original post) are consistent across all the reviews; the exception would be Metro 2033, which got replaced by the newer Metro 2033: The Last Refuge. That's a total of 6 consistent games used for reviews over a span of a little more than a year.

Why don't they use the same games over and over again? New games come out all the time, so I suppose they select new games with relatively high system requirements to replace similar older games with lower requirements. They can't simply keep adding new games on top of old ones for benchmark testing, because you'd end up having to run 30+ games for new cards. That's time-consuming.



Catch Me if You Can: Sapphire Radeon HD 7950 OC 3 GB Graphics Card Review
2/16/2012
http://www.xbitlabs.com/articles/graphics/display/sapphire-radeon-hd-7950-oc.html

Aliens vs. Predator (2010)
Batman: Arkham City
Battlefield 3
Crysis 2
DiRT 3
Hard Reset Demo
Just Cause 2
Left 4 Dead 2
Lost Planet 2
Metro 2033: The Last Refuge
S.T.A.L.K.E.R.: Call of Pripyat
Sid Meier's Civilization V
StarCraft II: Wings of Liberty
Tom Clancy's H.A.W.X. 2
Total War: Shogun 2


Inexpensive Hi-End: MSI R6950 Twin Frozr III 1 GD5 Power Edition/OC Graphics Card
10/19/2011
http://www.xbitlabs.com/articles/graphics/display/msi-r6950-twin-frozr-iii.html

Aliens vs. Predator (2010)
BattleForge: Lost Souls
Crysis 2
DiRT 3
Hard Reset Demo
Just Cause 2
Left 4 Dead 2
Lost Planet 2
Metro 2033: The Last Refuge
S.T.A.L.K.E.R.: Call of Pripyat
Sid Meier’s Civilization V
StarCraft 2: Wings of Liberty
Tom Clancy's H.A.W.X. 2
Total War: Shogun 2
World of Planes (alpha)

GeForce GTX 560 from EVGA, Gigabyte and MSI
7/26/2011
http://www.xbitlabs.com/articles/graphics/display/geforce-gtx-560.html

Aliens vs. Predator (2010)
BattleForge: Lost Souls
Crysis 2
DiRT 3
Just Cause 2
Lost Planet 2
Metro 2033: The Last Refuge
S.T.A.L.K.E.R.: Call of Pripyat
Sid Meier’s Civilization V
StarCraft 2: Wings of Liberty
Tom Clancy's H.A.W.X. 2
Total War: Shogun 2

AMD Radeon HD 6790 Graphics Card Review
5/4/2011
http://www.xbitlabs.com/articles/graphics/display/radeon-hd-6790.html

Aliens vs. Predator (2010)
BattleForge: Lost Souls
Borderlands: The Secret Armory of General Knoxx
Crysis
Crysis 2
F1 2010
Far Cry 2
Grand Theft Auto IV: Episodes from Liberty City
Just Cause 2
Left 4 Dead 2
Lost Planet 2
Mafia 2
Metro 2033: The Last Refuge
Resident Evil 5
S.T.A.L.K.E.R.: Call of Pripyat
Sid Meier’s Civilization V
StarCraft 2: Wings of Liberty
Tom Clancy’s H.A.W.X. 2
Warhammer 40 000 Dawn of War II: Retribution

ASUS HD 6870 DirectCU: Chasing the Leader
2/21/2011
http://www.xbitlabs.com/articles/graphics/display/asus-eah6870-directcu.html

Aliens vs. Predator
Battlefield: Bad Company 2
BattleForge
Call of Duty: Black Ops
Crysis Warhead
F1 2010
Fallout: New Vegas
Just Cause 2
Lost Planet 2
Mass Effect 2
Metro 2033
S.T.A.L.K.E.R.: Call of Pripyat
StarCraft II: Wings of Liberty

Natural Born Winner: Nvidia GeForce GTX 580 Review
11/21/2010
http://www.xbitlabs.com/articles/graphics/display/geforce-gtx-580.html

Aliens vs. Predator
Battlefield: Bad Company 2
BattleForge
Call of Duty: Modern Warfare 2
Colin McRae: Dirt 2
Crysis Warhead
Fallout: New Vegas
Far Cry 2
Just Cause 2
Lost Planet 2
Mass Effect 2
Metro 2033
S.T.A.L.K.E.R.: Call of Pripyat
StarCraft II: Wings of Liberty
Tom Clancy’s H.A.W.X. 2 Preview Benchmark
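The "6 consistent games" tally can be checked mechanically by intersecting the six lists. The short keys below are my own hand normalization of the titles (both Metro 2033 variants collapse to one key, as allowed above, and "StarCraft 2"/"StarCraft II" likewise), so treat this as a sketch of the counting, not Xbit's data.

```python
# Intersect the six review game lists to verify the "6 consistent
# games" count. Titles are hand-normalized to short keys; the exact
# spellings are mine, not Xbit's.
reviews = {
    "hd7950": {"avp", "arkham city", "bf3", "crysis 2", "dirt 3",
               "hard reset", "just cause 2", "l4d2", "lost planet 2",
               "metro 2033", "stalker cop", "civ 5", "sc2", "hawx 2",
               "shogun 2"},
    "r6950":  {"avp", "battleforge", "crysis 2", "dirt 3", "hard reset",
               "just cause 2", "l4d2", "lost planet 2", "metro 2033",
               "stalker cop", "civ 5", "sc2", "hawx 2", "shogun 2",
               "world of planes"},
    "gtx560": {"avp", "battleforge", "crysis 2", "dirt 3",
               "just cause 2", "lost planet 2", "metro 2033",
               "stalker cop", "civ 5", "sc2", "hawx 2", "shogun 2"},
    "hd6790": {"avp", "battleforge", "borderlands", "crysis",
               "crysis 2", "f1 2010", "far cry 2", "gta4",
               "just cause 2", "l4d2", "lost planet 2", "mafia 2",
               "metro 2033", "re5", "stalker cop", "civ 5", "sc2",
               "hawx 2", "dow2"},
    "hd6870": {"avp", "bfbc2", "battleforge", "black ops",
               "crysis warhead", "f1 2010", "fallout nv",
               "just cause 2", "lost planet 2", "mass effect 2",
               "metro 2033", "stalker cop", "sc2"},
    "gtx580": {"avp", "bfbc2", "battleforge", "mw2", "dirt 2",
               "crysis warhead", "fallout nv", "far cry 2",
               "just cause 2", "lost planet 2", "mass effect 2",
               "metro 2033", "stalker cop", "sc2", "hawx 2"},
}
common = set.intersection(*reviews.values())
# exactly the six titles claimed: AvP, Just Cause 2, Lost Planet 2,
# Metro 2033, S.T.A.L.K.E.R.: CoP, StarCraft II
```
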

 
If the Intel HD 4000 does indeed get the estimated 60% performance increase, then it would be roughly equal to the Radeon HD 5550, which would put it on par with the graphics core in the Llano A4 APU. Of course that will change when Trinity is released. I think I read somewhere that it would have about a 30% increase in graphics performance compared to Llano.

Intel's next CPU based on a totally new architecture, Haswell, is not expected to increase the graphics core's performance by much, though. An article I read a couple of weeks ago basically stated that the emphasis of Haswell is going to be on its CPU processing power. The new graphics core will only get an incremental increase in performance. What's incremental? <shrug> 10% - 15% maybe?

Actually, I have read that the top end, GT3, will have 40 EUs compared to IB's 16 EUs. They will be the same overall, but still, that's more than double the EUs. Kinda like HD4870 -> HD5870: double the SPUs.
 
AMD buying ATI was a big a$$ gamble. They could easily have gone bankrupt during that time.

Thankfully it seems to have paid off, and both companies are stronger for it, as evidenced by the growing popularity of the APUs (especially in mobile devices) and ATI's newer cards. I'm mostly an Nvidia user myself, but I can appreciate the power of ATI's cards.

Lessee now, in 2006 AMD had something like 30% market share in servers - the most high-margin and profitable CPU segment by a long shot. Today they are what - 5%?

In 2006 they also had a much larger share in desktop - again, a more expensive and thus more profitable CPU segment.

Today they are competitive in low-end mobile, which is pretty low-margin.

By investing that $5.4BN in buying ATI, they had to cut back on R&D and sell off various parts of themselves, including their fabs, and now appear to have pretty much given up on regaining the lead, according to the statements Read made during the past few months, including at the analyst day meeting last month.

Thanks to GloFlo, they apparently lost out on selling lots of Llanos to Apple because they couldn't deliver enough.

In short, it seems ATI has reaped the benefits of the merger far more than AMD has.
 
If the Intel HD 4000 will indeed get the estimated 60% performance increase, then it would be roughly equal to the Radeon HD 5550 which would put it on par with the graphics core in the Llano A4 APU. Of course that will change when Trinity is released. I think I read somewhere that it would have about a 30% increase in graphics performance compared to Llano.

Intel's next CPU based on a totally new architecture, Haswell, is not expected increase the graphics core's performance by much though. An article I've read a couple of weeks ago basically stated the emphasis on Haswell as going to be on it's CPU processing power. The new graphics core will only get an incremental increase in performance. What's incremental? <shrug> 10% - 15% maybe?

Intel doesn't need to beat Trinity's IGP; if they reach Llano, they may escape losing big in the mobile market.
Same with AMD: all they need to do is be competitive with Intel's SB CPU; they don't have to beat it.
 
If the Intel HD 4000 will indeed get the estimated 60% performance increase, then it would be roughly equal to the Radeon HD 5550 which would put it on par with the graphics core in the Llano A4 APU. Of course that will change when Trinity is released. I think I read somewhere that it would have about a 30% increase in graphics performance compared to Llano.

Intel's next CPU based on a totally new architecture, Haswell, is not expected increase the graphics core's performance by much though. An article I've read a couple of weeks ago basically stated the emphasis on Haswell as going to be on it's CPU processing power. The new graphics core will only get an incremental increase in performance. What's incremental? <shrug> 10% - 15% maybe?

Hmm, there are other links posted in the IB/Haswell thread stating that Haswell will be (and I hate to use Baron's hyperbole 😛) a "graphics monster". Something along the lines of 2X or more of IB's GPU. Last fall I saw an Intel slide promising "7X" performance over SB's GPU, although that could have been the HD2K. If Intel does stick low-power dedicated DDR on top of the die with silicon interposer tech, then they could get massive improvements with that feature alone.

I guess we'll all find out in about 14 months from now.

Personally, being a gamer, I'll probably be happy with IB and a high-end 28nm discrete GPU, either AMD or NV depending on which gives the most bang for the buck.
 