AMD "Zembezi


illfindu

Hey, I'm looking towards the future and I'm eyeing the 8-core AMD Bulldozer CPUs coming down the line. I've seen some sources say they're going to use an AM3+ socket, and I'm wondering if that means you'll be able to toss one into a current AM3+ compatible board?
Will my current http://www.newegg.com/Product/Product.aspx?Item=N82E16813130297&nm_mc=OTC-Froogle&cm_mmc=OTC-Froogle-_-Motherboards+-+AMD-_-MSI-_-13130297 MSI 870A Fuzion work? I noticed just now that it's AM3, not AM3+, so I'm guessing that means I'll need a new motherboard because the sockets aren't compatible?
 
Solution


That "market" is no different from any other market. Not all software makes the best use of 24 cores; some of the tests in that link didn't even make use of 12 cores. How is a lower-clocked 24-core server supposed to perform against a higher-clocked 12-core server when the workloads are only optimised for 12 cores?
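To make the clocks-vs-cores point concrete, here's a rough back-of-the-envelope sketch; the clock speeds and core counts are made-up illustrative numbers, not figures from the linked benchmarks:

```python
# Why a higher-clocked 12-core can beat a lower-clocked 24-core when the
# workload only scales to 12 threads. All numbers are hypothetical.

def effective_throughput(cores, clock_ghz, threads_used):
    """Throughput ~ clock speed times however many cores the workload
    can actually keep busy."""
    busy_cores = min(cores, threads_used)
    return busy_cores * clock_ghz

workload_threads = 12  # software only optimised for 12 threads

wide_server = effective_throughput(cores=24, clock_ghz=2.2, threads_used=workload_threads)
fast_server = effective_throughput(cores=12, clock_ghz=3.0, threads_used=workload_threads)

print(f"24 cores @ 2.2 GHz: {wide_server:.1f} (arbitrary units)")  # 26.4
print(f"12 cores @ 3.0 GHz: {fast_server:.1f} (arbitrary units)")  # 36.0
```

The extra 12 cores just sit idle, so the higher clock wins.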

[Image: 22156.png]


The result of this scaling is that for once, you can notice which CPUs have real cores vs. ones that have virtual (Hyper...
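If you want to see for yourself which of your "cores" are real and which are Hyper-Threading logical cores, a quick check like this works (psutil is a third-party package and is assumed to be installed; the standard library alone only reports logical cores):

```python
import os

# os.cpu_count() counts logical cores, so an HT quad-core reports 8.
print("Logical cores:", os.cpu_count())

try:
    # psutil can separate physical cores from logical (HT) ones.
    import psutil
    print("Physical cores:", psutil.cpu_count(logical=False))
    print("Logical cores :", psutil.cpu_count(logical=True))
except ImportError:
    print("psutil not installed; only the logical count is available.")
```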
I've got a doc file on my PC, from somewhere, that shows the whole process of making a CPU. It said Intel took some of the chips that failed and made key rings out of them. But they won't cut off part of a chip and sell it.
Key rings? That's an interesting way to deal with CPUs that didn't pass. I wonder why Intel won't just reduce them to a lower spec and sell them cheaper... they must have more room for error or something.
 
No, they make fewer duds. Probably because they're making their own stuff, unlike AMD, who use TSMC.

TSMC only makes the GPUs. AMD was making their own CPUs but will now rely on GlobalFoundries for that.

And as for Intel, unlike a lot of companies they start the process early so that by release it has matured. Then as time goes on it matures even more, producing fewer bad CPUs and better CPUs. Look at a Sandy Bridge 32nm CPU vs. a Westmere 32nm CPU: Sandy Bridge is more efficient and clocks much higher.
 


It is quite funny. But I think they forget that Intel already beat them to an 8 (real) core CPU in the server market.

:lol:

I also like how it's using the old ATI symbol. Interesting how AMD kills off the ATI brand name but keeps using the symbol for Radeon.

[Image: ati-radeon-logo-oct08-300x300.jpg]




I doubt it. AMD has an LGA socket for servers, but AM3+ is supposed to support AM3 CPUs, and that can't be done if the new socket is LGA.

Oh and this is just grand:

[Image: amd-bulldozer-ars.jpg]


http://arstechnica.com/business/news/2010/08/evolution-not-revolution-a-look-at-amds-bulldozer.ars

I guess they don't know what a Bulldozer is:

[Image: bulldozer.jpg]


HAH!!!!

BTW, this is interesting.

http://arstechnica.com/business/news/2011/02/amd-to-break-new-ground-with-32nm-bulldozer-design.ars

We've described Bulldozer earlier as a "1.5-core" design, and that's still true. The core represents a kind of extreme approach to simultaneous multithreading, where instead of just replicating some of the instruction flow parts of the machine, AMD has also replicated the entire integer unit execution block. In one of the papers, AMD gave fresh details about the design of this integer block, and specifically about the out-of-order window.

So what the hell is a module? Is it 1.5 cores or is it two real cores? JF might be able to explain this. This was just 12 days ago as well.
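For what it's worth, here's how I picture the module described in that quote: two integer clusters that each run a thread, sharing one front end and one FPU. This is my own simplified toy model, not anything from AMD's documentation:

```python
# Toy model of a Bulldozer-style "module": two integer clusters sharing one
# front end and one floating-point unit. Purely illustrative.

from dataclasses import dataclass

@dataclass
class Module:
    integer_clusters: int = 2   # each can run its own thread
    shared_fpus: int = 1        # one FP unit shared by both threads
    shared_frontend: bool = True

    def integer_parallelism(self, threads: int) -> int:
        # Integer-heavy threads can keep both clusters busy.
        return min(threads, self.integer_clusters)

    def fp_parallelism(self, threads: int) -> int:
        # FP-heavy threads end up queuing for the single shared FPU.
        return min(threads, self.shared_fpus)

zambezi = [Module() for _ in range(4)]  # an "8-core" Zambezi = 4 modules

print("Marketed cores:", sum(m.integer_clusters for m in zambezi))  # 8
print("Integer units :", sum(m.integer_clusters for m in zambezi))  # 8
print("FP units      :", sum(m.shared_fpus for m in zambezi))       # 4
```

Whether that counts as 8 cores or 4-and-a-bit is exactly the argument.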
 
Half a CPU and a GPU vs. half a GPU and a CPU.
Lol more like 2/3 of a CPU and a GPU vs 1/16th of a GPU and a CPU.


I thought AMD came out with an 8-core Opteron first?

Exactly. And I doubt they will call their server models "FX". Although, didn't the original FX series use the server socket in a platform designed for client?

JF continues to state they are cores. Just because they share more resources doesn't make them not cores. Like the guy from your article said at the end: "Either way, the benchmarks will tell the tale, and we'll refrain from making any predictions one way or the other about this novel architecture until we see some hard data."





The processors look more like Socket C32 processors to me.


It's pretty obvious the pictures are fake. LGA C32 socket processors with a picture on them? What the hell is "unlocked" even supposed to mean?
 


AMD did push out an 8-core on March 29th. It was an MCM: two 45nm quad-cores tied together via HyperTransport.

http://www.techspot.com/news/38392-amd-launches-ten-8-core-and-12-core-server-cpus.html

Intel released the Beckton-based 8-core Xeons on March 30th, and that is a native 8-core design:

[Image: Intel_Nehalem-EX_dieshot_01.jpg]


As for the "Unlocked", I would assume it's the same as the old FX CPUs: an unlocked multiplier, which is what BE (Black Edition) has been for until now.
 
2 things inside the same package and separate cores. But it's a good way to make money: selling quads as 8-core CPUs, dual-cores as quads...
They are cores...

Intel could have called the 2600K an 8-core processor. It has 4 "real" cores and 4 logical cores. The last four just don't perform as well as the other 4. Instead they market it as 4 cores with HT, because the performance of the HT cores is only 20-30% of a regular Intel core in the best situations.
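Putting rough numbers on that, with the 20-30% figure quoted above, a 4-core chip with HT behaves more like 5 cores than 8:

```python
# Back-of-the-envelope "effective core" count for a quad-core with
# Hyper-Threading, using the 20-30% figure mentioned above.

physical_cores = 4
ht_siblings = 4

for ht_benefit in (0.20, 0.30):
    effective = physical_cores + ht_siblings * ht_benefit
    print(f"HT worth {ht_benefit:.0%} of a core -> ~{effective:.1f} effective cores")
# Roughly 4.8 to 5.2 effective cores -- a long way from a true 8-core part.
```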
 

What is the definition of a core? That's why jimmysmitty is having such a hard time with this.

AMD's definition of a core is different from his. It can still do FP and integer calculations like any other processor. It's just that the one FPU per two integer cores is why people believe these shouldn't be counted as two "real" cores.

All that matters to me is the performance; who cares if they call their CPU cores 30 oranges that can process instructions. As long as their oranges beat Intel's cores in the workloads I want, that's all that matters. Now, that would be very confusing to the less tech savvy; this is mostly why AMD wants to call BD an 8-core. It's better marketing if people think you're getting double the cores for the same price. They might not perform better in certain workloads, but the regular consumer doesn't know that.
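To put the FPU-sharing point in the same rough terms (my own framing, not a spec or a benchmark prediction): how "8-core" Bulldozer looks depends on how FP-heavy the workload is, since there are eight integer clusters but only four FPUs.

```python
# How much parallelism an "8-core" (4-module) Bulldozer can offer, very
# roughly, depending on workload mix. Illustrative bookkeeping only.

def usable_parallelism(threads, fp_heavy, integer_units=8, fp_units=4):
    """Threads that can make progress at once, ignoring everything else."""
    limit = fp_units if fp_heavy else integer_units
    return min(threads, limit)

print("Integer-heavy, 8 threads:", usable_parallelism(8, fp_heavy=False))  # 8
print("FP-heavy, 8 threads     :", usable_parallelism(8, fp_heavy=True))   # 4
```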
 

Yeah, you're right, the majority of consumers won't know the difference, and I bet a lot of them don't even care. So the only difference is that each pair of AMD cores shares one floating point unit, whereas a complete core has its own FPU?
 