AMD Piledriver rumours ... and expert conjecture

Status
Not open for further replies.
We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic, or it will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame baiting comments about the blue, red and green team and they will be deleted.

Enjoy ...
 


It's just hard to see how a company with no GPU core experience can produce a chip that will all of a sudden be at a "monster level", which to me sounds like something that will be seen in Steamroller, using a state-of-the-art GPU core design...

That 7x brought it closer to Llano, but they were still behind by a decent degree. If Intel gets it right then I will happily say well done, but I think assuming "monster" without context is rather ambitious. "Monster", for all we know, could mean Llano-level in Intel terms.
 


Most of the time it's not about the hardware, but the software around it.

My biggest complaint over the years about Intel graphics has been the support, not the "specs" per se. After Sandy, they've come a long way in both, and continuing that trend with Haswell could turn out to be a good thing for everyone.

Cheers!
 
Yes, it is very much about the software, but in this instance AMD and Nvidia have a history of hardware and software support experience; to assume Intel is going to produce a graphics core at the snap of a finger, let alone on a die far ahead of any GPU die, is very presumptuous.
 



Let's just say Intel is behind in this department and, for the first time in their life, they're taking it seriously. I still don't get why they didn't enter the high-end GPU market back in the '90s; if you ask me, they deserve to be behind in the GPU department. I remember when Vista came out, their current graphics couldn't run Aero :kaola: .

Now I'd say Intel needs to work on their drivers more than anything. Based on benchmarks, the Intel HD 4000 graphics are around Llano level, maybe just 10% below. But their drivers are nothing compared to AMD's, which is kind of sad if you think about it.

As for me, I'll take a $650 A10 laptop over a $650 i3 Ivy laptop any day. But at anything over $700 Intel has my vote, though I would never spend that much on a laptop anyway, lol.

Off topic
I'm going to keep my A8 laptop until Steamroller, then I'm going to give the A8 laptop to my mom. For now I'm only going to get a nice 128GB Samsung SSD (cheap on Newegg) to put in it and wait.
 



Funny you mention that, since my friend has an i7 Sandy laptop and he has nothing but trouble with their drivers on Fedora. Me, I'm fine; in fact it's quite good without drivers.
 


Ummm...Intel has like 60% of the graphics market...

Granted, their old integrated chips STANK, but it's not like they haven't built GPUs before...
 



Because it's practically forced on users. I would never have bought a laptop before APUs; I could not stand Intel's horrible graphics at the time, and I was not going to spend $800+ on dedicated graphics.
 


If you're pulling the Linux card, remember Intel doesn't have a closed binary there. The actual "Intel Video Driver" is an open source project (of some sort, lol) AFAIK.

Anyway, Intel can totally pull it off from nowhere. The driver development they've had from SB to IB has been really good, it seems. They're adding features little by little, but at a steady pace.

I wouldn't be surprised if they manage to get a very big set of features (video post processing, TV support, game profiling, etc) in the short run. Intel has the money for it, they just don't think it's time to spend it IMO.

Cheers!
 

This worries me. A lot.
 


Well, for people wanting something inexpensive just to play videos on, I've never ever recommended an Intel rig. *That thinking is starting to change since SB came out.

We could start from there. The so called "econoboxes" for computing, haha.

You know there are a lot of folks (even some of us) that actually have the heavy duty and the "kids toy"/"sofa" computer.

Cheers!

EDIT: *Added
 


How is Intel HD Graphics forced on users any more than AMD Fusion? How many people do you think there are that are saying to themselves "Damn Intel forced me to use these HD graphics, I'd much rather pay an extra $50-$100 for an entry level discrete video card that is no better than the one Intel provides for free."

That 60% is the amount of people who DO NOT have a discrete card. That means that 40% of people have CHOSEN to use a discrete card (Or fusion). Nobody is forcing consumers to do anything.



Many of us travel and need a laptop with sufficient graphics to play HD video, but don't intend to play any games whatsoever. Not everybody wants to play Crysis on their laptop. A SB Celeron browses Facebook and runs Word and Excel just as well as an A10, at a fraction of the price. Fusion is only more balanced if the user intends to utilize the graphics power.



That's because Intel graphics WERE horrible before APUs. It wasn't until SNB that their performance and driver stability were good enough for today's average user. That's the exact point I was making before. You are poisoned against Intel graphics from past generations, so you haven't given Intel the chance to show you that the HD 4000 has improved tremendously.
 


Are you kidding me? Even Atom plays 1080p video flawlessly. You don't need an APU to play videos. In fact, the lowest-end single-core SB Celeron has SIGNIFICANTLY BETTER media ("video playing") performance than the Fusion APUs, at the same dirt-cheap price.

Don't get media ("video playing") confused with graphics.
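The media-vs-graphics distinction being argued here can be roughed out with some back-of-envelope arithmetic. This sketch is purely illustrative (the bitrate figure is an assumed typical H.264 1080p30 value, not a measurement): decoding a Blu-ray-class stream is a fixed, modest workload that a dedicated decode block handles without touching the shader array, which is why even low-end chips manage plain playback.

```python
# Back-of-envelope: why plain 1080p playback is a decode problem, not a shader problem.
# The bitrate below is an assumed typical value, not measured data.

width, height, fps = 1920, 1080, 30
pixels_per_second = width * height * fps   # raw pixels the decoder must produce

# H.264 1080p30 streams commonly run on the order of ~15 Mbit/s;
# a fixed-function decoder chews through that without using the shader array.
bitrate_mbps = 15
bits_per_pixel = bitrate_mbps * 1_000_000 / pixels_per_second

print(f"{pixels_per_second:,} pixels/s")            # 62,208,000 pixels/s
print(f"{bits_per_pixel:.2f} compressed bits/pixel")
```

The takeaway is that the decode workload is bounded by resolution and bitrate, so it is cheap to implement in dedicated silicon; shader ("graphics") horsepower only enters the picture once you add post-processing on top.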
 


If you're talking about the ION platform, that's Nvidia's. And no, Atom can't play 1080p with filters applied.

Cheers!
 


Intel has had graphics cores for a long, long time; they just didn't focus on them. Imagine if they had: there would be no AMD (ATI) or Nvidia today.
 

IMO, if someone has a quad-module BD, PD is okay to skip. I made my assumption based on the rumors/info that:
AMD will incorporate 3rd-party IP, namely RCM (according to my rather limited knowledge, it should work up to a certain frequency, or maybe multiple resonant frequencies defined by the power management for stock use),
the clockspeed gets a minor bump,
both BD and PD are 32 nm chips,
and the high vcore AMD CPUs (both BD and the PD E.S. from OBR's screenies) use at high clockrates even at stock, etc.
Use of RCM can be seen from different angles. One is that AMD made a wise choice using RCM for its effectiveness. Another is that AMD Failed to rein in BD's power-hogging tendencies by themselves and Had to use RCM. Yet another is that AMD is too short on money to rework the whole CPU, so they're incrementally improving their product: introducing the modular arch, then improving perf/watt, after that maybe improving the IMC, PCIe controller, power efficiency and so on. All of these seem to contribute to improving stock performance. I don't see how overclocking fits into this. I'm not saying AMD won't make overclockable CPUs; they certainly will have unlocked CPUs. I'm guessing they're leaving the OC-related things up to the people who will OC the CPU. That means everything that comes with overclocking is the user's responsibility, including power use, heat dissipation and cooling management. They'll probably do another PD 8-core + AMD-branded LCS bundle like the one with the closed-loop liquid cooler + FX-8150.
Then there's the mantra: "real gamers/enthusiasts do not care about power efficiency." 😀
AMD will definitely claim credit in case anyone manages to break any GHz records. :)
Early reviews will make it worse. I noticed (after noob pointed it out, IIRC) that early CPU reviews almost always use high vcore for overclocking, to see how far the CPU goes in terms of clockrate, and then make the power efficiency verdict on that info. Tom's does better, with their OC efficiency analysis.
 


I'm referring to Intel's far superior media performance due to fixed-function decode/encode hardware (on Core). There's no need for an APU in an HTPC intended for HD video playback.
 


I see your point, but really, how much graphical juice does it take to play an HD video? My 2-year-old Intel Pentium @ 2.1 GHz plays HD videos fine.
 
There's nothing wrong with Fusion; it's great for any average user. I just don't think it's a clear winner over Intel in anything except low-priced gaming machines, on both desktop and laptop.

I see your point, but really, how much graphical juice does it take to play an HD video? My 2-year-old Intel Pentium @ 2.1 GHz plays HD videos fine.

Zero. Video playback uses media processing power, not graphics processing. APU fans like to shout from the rooftops how great Fusion is for HTPCs, but you don't really need the graphics engines to play back video.
 


Right, because Intel can automatically outperform any other company at anything they choose to do, huh? By the way, in spite of their focusing on CPUs, we still have this other mildly successful company around, which is called AMD.
 


The problem with fixed-function logic circuits is that when you want something a little more custom, they fall flat on their face. Intel's QS is no exception to that rule (nor are DXVA and nVP).

APUs do offer better OGL compliance, so you can use more specific GPU capabilities that Intel CPUs can't muscle out.

Try using denoise on a Celeron during 1080p BD playback, for instance. Neither the HD 2000 nor the HD 2500 will get even close to an A6 or A10 in that regard. Sorry to be such a videophile, but that's the truth.

And actually, not even the E-350 ("Brazos 1.0") could handle 1080p + filters, but it was able to handle 720p + filters and pure 1080p playback.

That's just for video playback and not thinking about games.
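To see why filters change the picture, here is a crude lower-bound estimate of the shader work a post-processing chain adds on top of decode. The tap and pass counts are assumptions for illustration; a real motion-adaptive temporal denoiser costs far more per pixel. The point is only that filter work scales with resolution × fps × kernel size, and none of it is absorbed by the fixed-function decode block.

```python
# Crude lower bound on extra shader work from simple post-processing.
# taps and passes are illustrative assumptions, not measured figures;
# real denoise (temporal, motion-adaptive) is considerably heavier.

width, height, fps = 1920, 1080, 30
taps = 9          # e.g. a 3x3 spatial kernel per pass
passes = 3        # say: denoise + sharpen + scaling

macs_per_second = width * height * fps * taps * passes
print(f"{macs_per_second / 1e9:.2f} G multiply-accumulates/s")  # 1.68 GMAC/s
```

Even this toy estimate lands in the billions of operations per second, which is why a wider shader array (Llano/Trinity class) keeps up with filtered playback where a minimal one struggles.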

You would really need a side-by-side comparison of the playback on either system to believe me, but take a look here: http://www.anandtech.com/show/5906/amd-llano-htpc-builders-guide

Cheers!
 



I just think it's sad how quickly people forgive Intel for their old graphics designs, when they had the money and resources to do better, unlike AMD when they mess up. They were (key word) just unacceptable. I didn't say people didn't need to buy a laptop at the time; I said I wouldn't have, and I would never have bought an AMD one either, since it was a power hog and ran hot, but I do understand people who have to do work. Then there are people who don't know anything, such as my ex-girlfriend's dad, who bought his daughter a laptop for 3D rendering (college): he spent over $800 on an i7 Sandy at Best Buy that they said would be good for the job, until they took it home and it skips, while mine does it perfectly for $250 less. So if someone forced me to have A10 graphics I think I would be fine on a laptop, but if I had an i7 Sandy with Intel HD 3000 graphics I would not be happy.

As for the basic user who does Facebook/browsing and that's it: those people are moving to tablets/smartphones, and if not, they don't need to spend more than $400 on a laptop, and for that price I would rather have a nice A6, which is on Newegg for cheap ($380 and $400). If I had to go cheaper I would get an E-450 for $330, and if I had to go even cheaper I would check for used products or buy Google's new Nexus 7 for $200.
 