AMD Piledriver rumours ... and expert conjecture

We have had several requests for a sticky on AMD's yet-to-be-released Piledriver architecture ... so here it is.

I want to make a few things clear though.

Post a question relevant to the topic, or information about the topic; anything else will be deleted.

Post any negative personal comments about another user ... and they will be deleted.

Post flame-baiting comments about the blue, red, and green teams and they will be deleted.

Enjoy ...
 
techreport is blowing the story out of proportion, and it really goes to show what they care about. The author of the anti-AMD article was happy to test and publish benchmarks limited to the ones Intel allowed back when he did the Conroe review.

It's not like AMD even forced him to do anything. Judging by the CPU performance numbers on Tom's, it shouldn't surprise anyone that the GPU is the highlight of what AMD wants people to see.
 


I simply can't relate your statement to my experience of programming in OpenGL.

OpenGL and DirectX copy freely from each other. Both APIs are low-level; nothing wrong with that in a hardware interface. There are plenty of OpenGL games around that have no trouble running on both AMD and Nvidia hardware.

People use OpenGL precisely because it provides platform independence. Of course the system is not perfect and there are platform-dependent details, but OpenGL provides mechanisms to test for platform capabilities and execute code conditionally. The main use for this code is NOT handling AMD/Nvidia differences, but handling cards from either manufacturer that support different revisions of OpenGL.

The concept of extensions, by the way, is built into the fabric of OpenGL - this is how new API features are rolled out.
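To make that concrete, here is a minimal C sketch of the capability-testing mechanism, assuming an OpenGL context has already been created and made current (via GLUT, SDL, or similar). The report_capabilities helper name is my own invention, but glGetString and GL_ARB_vertex_buffer_object are standard:

[code]
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Query what the current context supports. Assumes a context is
 * already current; error handling omitted for brevity. */
static void report_capabilities(void)
{
    /* Version/vendor strings identify the driver and GL revision. */
    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));

    /* Classic (pre-GL3) extension query: one space-separated string.
     * A robust check should tokenize rather than use strstr, since
     * one extension name can be a prefix of another. */
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    if (ext && strstr(ext, "GL_ARB_vertex_buffer_object"))
        printf("VBOs available via GL_ARB_vertex_buffer_object\n");
}
[/code]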
 


And there's a LOT that don't, or that required several driver revisions to fix.




You apparently weren't around back before we had unified shaders, when NVIDIA and ATI each had their own, incompatible pixel shader extensions (1.1 for NVIDIA, 1.4 for ATI). So either you had to have two totally different rendering engines, or you had to pick one model over the other. (Coincidentally, MSFT basically copied Pixel Shader 1.1 into the official DX specification, hence why NVIDIA always tended to have better shader performance than ATI/AMD.)

Extensions end up killing compatibility, because you won't have developers coding for every possible piece of hardware. You get one architecture that gets the attention, and the rest fall back into a default path (standard OGL).
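For what it's worth, the "one tuned path plus a default path" pattern being described looks roughly like this in C. has_extension and both draw_* functions are hypothetical stand-ins, while GL_NV_register_combiners is a real pre-unified-shader NVIDIA extension of the kind mentioned above:

[code]
#include <string.h>
#include <GL/gl.h>

/* Naive extension check (see the earlier note on strstr prefixes). */
static int has_extension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all != NULL && strstr(all, name) != NULL;
}

/* Hypothetical render paths, stubbed for the example. */
static void draw_scene_nv_combiners(void)   { /* vendor-tuned path */ }
static void draw_scene_fixed_function(void) { /* standard GL path  */ }

void draw_scene(void)
{
    /* One architecture gets the tuned path; everything else falls
     * back to plain OpenGL - the default path described above. */
    if (has_extension("GL_NV_register_combiners"))
        draw_scene_nv_combiners();
    else
        draw_scene_fixed_function();
}
[/code]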
 



Nice analogy :kaola:



AMD can do ALL it wants; if anything it's going to bite them at the end of the day when the REAL benchmarks come out to play!



I agree, but as long as the GPU and the CPU are balanced then it's fine. I have to admit that for a balanced platform AMD is ahead of Intel. I just don't want to see their CPU limiting what their GPU can do.
 

You can pretty much sum up what happened in one statement:

AMD's marketing team got fired for the BD fiasco.

With that said, there isn't much different in this benchmark info from what we saw in the first A8-3850 reviews; both were compared against the i3 CPUs and tested with the IGP.

What has changed from then to now? Techreport being bigger douchebags is all, trying to create more Intel fanboys. The reviews are the same.
 
The GPUs are solid, and iGPU solutions keep pushing integrated graphics forward; now it is about evolving the core technology. When you are so entrenched in K10, moving to a modular architecture was a leap of faith: a) AMD were late, and that is down to bad management and a bad fab; b) it was released unready, and that just made things worse.

Anyway, BD has come and gone; moving on to PD now.

 
The power/performance ratio looks good too, and it still blows the doors off the HD 4000.

[chart: system-power.jpg - total system power consumption]


http://www.legitreviews.com/article/2043/10/

Considering the A10's CPU performance was right there with the A8's, that's good for power/performance; RCM (the resonant clock mesh) seems to be working.

http://www.tomshardware.com/reviews/a10-5800k-a8-5600k-trinity-apu,3241-7.html
 
If you consider the Far East movement towards APUs, it is a market that I believe Intel is wary of: a large number of the CPUs sold there this year were APUs, so it is obviously something people are getting into. Its evolution into a fully fledged APU with muscle would be interesting. Roll on Kaveri.
 



lol yea, pretty much. They hurt their own credibility and fanbase by overhyping 'dozer. As for techreport, I've said repeatedly that all these mud-slinging a-holes get their dollaz from intel. So no wonder.
 
AMD explains its reasons for staged info release
http://techreport.com/news/23644/amd-explains-its-reasons-for-staged-info-release

@techreport bashing: amd fanboys here at toms actually [strike]supported[/strike] showered tr with praise and called them "the most credible hardware site" when they published amd-friendly articles, such as this one below:
http://techreport.com/review/21865/a-quick-look-at-bulldozer-thread-scheduling
the above link has been posted many, many times in these forum threads to "show" how bd cpus "can be competitive with intel cpus", "beat core i5 and i7 cpus in gaming", "beat them by a high margin of 20-30%" and "can be more power efficient (LOL) than sandy/ivy bridge" - all by amd fanboys (or is it a single guy with duplicate accounts? i dunno... 😗 ). if tr hadn't tested bd like that, the fanboys (or is it a single person?) would have absolutely nothing to back up their claims and wouldn't be able to incessantly claim what t(he)y are claiming. 😀
tr published it as early as october last year, right after bd was released. that was before toms did the bd overclocking efficiency tests and the post-os-scheduling-patch benches, and way before the very revealing sub-$200 gaming cpu roundup. tr only went into exploring bd's gaming strength recently, as bd was nearing the end of its cycle, right before the pd refresh.
http://techreport.com/review/23246/inside-the-second-gaming-performance-with-today-cpus
and this time they went public claiming amd wanted to control their editorial independence with trinity (p)reviews.
i am not "standing up" for techreport. just saying that fanboys are fickle. :sol:
viva irony!! :pt1cable:
 


Bitchthorn :pfff:

it's getting to the point where i have stopped reading comments on any AMD-related article and review. All i can see is this bitchthorn guy harping continuously about that TR article and BD. :fou:
 
^^
:sweat: looks like a build-up of frustration....
anyway, strictly technically, you didn't "name" a person, so i won't call it a personal attack, although your intention kinda shows... i don't condone personal attacks anyway. my opinion hardly counts in the grand scheme of things. :)
i was merely pointing out fanboys flip-flopping their support and opinions and thus contradicting themselves because of strong personal bias. i'd rather make fun of that situation than single someone out. :ange:
on topic: i keep hearing that top-end trinity will launch at $130-140-ish. that's a reasonable price range imo, and offers good value against the sb/ivb core i3.
 
I'd say it's totally worth it to get a Trinity chip over an i3 if you aren't using a dedicated GPU. The extra features and performance in the GPU are nice to have around. If you want 4-monitor support later on, Trinity would actually work. Not sure how well overclocking will work.
 


Don't worry; read through the threads and you will see certain users recommending i3s over APUs, just because.
 



If they were going to use a 7850 or something in their build, I would tell them to get an i3, but if they were just going to use the APU, I would definitely say Trinity. Actually, I think Tom's showed that an A10 and an i3 were pretty close in CPU performance, which makes it the better buy over an i3 unless the person plans to upgrade to something better in the future.
 