MU_Engineer
de5_Roy :
@centurion: Is it possible to fit a 220 W CPU in a motherboard that supports CPUs only up to 140 W? If so, how? I asked before and got no answer....
You can physically put one of the rumored high-TDP FX CPUs in any AM3+ motherboard; whether or not it will work is another question. Some lower-TDP boards, such as those rated only to 95 watts, won't run current 125-watt CPUs: they detect the CPU and give an "unsupported CPU" BIOS error, because running 125 watts through VRMs specced for only 95 watts is a recipe for ruining the VRMs and the board. Older AM2/AM2+ boards didn't routinely do this check, so the 140-watt Phenom X4s ruined quite a few 95- and 125-watt boards and gave a bad name to board makers who provisioned just enough VRM capacity for the TDP levels they officially supported. I would bet the better enthusiast AM3+ boards would support the high-TDP chips just fine, since they are already over-provisioned for overclocking. The mediocre 125-watt boards out now would either refuse to run or do okay until their VRMs went up in smoke.
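The reason TDP headroom matters is simple arithmetic: the VRMs convert the 12 V input down to the core voltage, so the current they must deliver scales with power divided by vcore. A rough back-of-the-envelope sketch (the vcore, phase count, and efficiency figures here are illustrative assumptions, not from any board's datasheet):

```python
# Rough VRM load estimate: current = power / core voltage, shared
# across phases. All parameter defaults are assumed example values,
# not specs from any real board or CPU datasheet.
def vrm_phase_current(tdp_watts, vcore=1.35, phases=4, efficiency=0.85):
    """Approximate current each VRM phase must deliver for a given CPU TDP."""
    input_power = tdp_watts / efficiency   # conversion losses raise input power
    total_current = input_power / vcore    # I = P / V at the CPU core rail
    return total_current / phases          # assume current splits evenly

for tdp in (95, 125, 140, 220):
    print(f"{tdp:>3} W TDP -> ~{vrm_phase_current(tdp):.0f} A per phase")
```

With these assumed numbers, a 220 W chip asks each phase for more than double the current a 95 W chip would, which is why marginally provisioned VRMs overheat or fail outright.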
hcl123 :
Not usually places I would visit or discuss, but generally I'm convinced that enthusiast overclockers extract much more than 220 W in most of their exercises... yet none of those chips has been rated above 125 W... but likewise, expect no official endorsement that touches warranties and certifications.
AMD, Intel, and motherboard makers already disclaim liability when overclocking. I'd bet the 220-watt chips would come with a limited warranty if run only at the 220-watt "spec," but they would also come with a big warning that they could ruin motherboards.
UPDATE: also, it's not to be discarded that this new FX is nothing but fake news, like many other "news" items from someone with nothing better to do, or, if true, that it's not 220 W but certified up to 140 W... it's possible to tweak SOI processes that much. I think the IBM part with 600 mm² chips at 5.5 GHz is even better than what the new FX suggests... only very few would believe it, the official "popular" mantra is quite different, and GF inspires lots of salt (lol)...
IBM's POWER microarchitecture, macroarchitecture, and platform architecture are much different from AMD's. The 5.5 GHz POWER chips are in-order RISC parts, liquid-cooled, put in custom IBM motherboards and chassis, and VERY, VERY expensive. You can't compare the IBM parts to the clock speed of an out-of-order CISC AMD chip built to be cooled with a moderately sized air cooler, carry an MSRP under $250, and work well in any pile-of-garbage made-in-China motherboard and chassis. It would be like telling NVIDIA they suck because their GPUs only run around 1 GHz on bulk silicon while Intel pushes CPUs to about four times that on bulk silicon.
8350rocks :
That's it! EM64T... I kept thinking it was EM-something, but that didn't make any sense... it was very similar in a lot of ways. The instruction sets AMD brought out were all adopted, though... I am not sure what of Intel's 64-bit architecture actually made it in... certainly some of it... but how much of it, I am unsure.
Intel's own 64-bit architecture was IA-64, aka EPIC, used in the Itanium. It went over like a fart in church, but Intel does keep it in a zombified state, inexplicably and randomly bringing out a new one every handful of years on a very old process. I don't think much came from IA-64 into x86_64 other than Intel learning that most code is awful, that a good auto-parallelizing and auto-scheduling compiler is really, really hard to write, and that backward compatibility with older x86 applications and OSes was critically important in the 2000s. What is in x86_64 was mainly AMD's doing, and both companies added various subsequent SIMD extensions such as SSE3, the various SSE4s, and the AVXes.
Way off topic, but I suppose Intel tried to resurrect some of the Itanium in the also very ill-fated Larrabee. Larrabee was likewise a very wide in-order processor, with an Intel ISA lock-in as a goal and a massive TDP as a side effect. It also suffered from two of the things that plagued Itanium. It relied on having an excellent software renderer to get any sort of performance, just as Itanium relied on having a God compiler or a bunch of hand tuning. Both also broke compatibility with the popular ISAs of the day: Intel wanted to move from DirectX/OpenGL to x86 with Larrabee just as it wanted to move from x86 to IA-64 with the Itanium. Just some food for thought.