AMD CPU speculation... and expert conjecture



You can physically put one of the rumored high-TDP FX CPUs in any AM3+ motherboard. Whether or not it will work is another question. Some lower-TDP boards, such as those rated only for 95 watt CPUs, won't run the current 125 watt chips. They detect the CPU and throw a BIOS error of "unsupported CPU," because pulling 125 watts through VRMs specced for only 95 is a recipe for ruining the VRMs and the board. Older AM2/AM2+ boards didn't routinely do this check, so the 140 watt Phenom X4s ruined quite a few 95/125 watt boards and gave a bad name to board makers who put in just enough VRM capacity for the TDP levels they officially supported. I would bet the better enthusiast AM3+ boards would support the high-TDP chips just fine, since they are already over-provisioned for overclocking. The mediocre 125 watt boards out now would either refuse to run them or do okay until they turned their VRMs into smoke.
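Back-of-the-envelope, the VRM problem is just current: I = P / (V × efficiency). A minimal sketch, assuming a 1.35 V core voltage, 90% VRM efficiency, and a 4-phase budget board (all illustrative numbers, not from any spec sheet):

```python
# Rough per-phase VRM current by CPU TDP class.
# All constants below are assumptions for illustration only.
VCORE = 1.35        # assumed core voltage (V)
EFFICIENCY = 0.90   # assumed VRM conversion efficiency
PHASES = 4          # assumed phase count on a budget board

for tdp in (95, 125, 220):
    total_amps = tdp / (VCORE * EFFICIENCY)
    print(f"{tdp:3d} W -> ~{total_amps:5.1f} A total, "
          f"~{total_amps / PHASES:4.1f} A per phase")

# 95 W  -> ~ 78.2 A total, ~19.5 A per phase
# 125 W -> ~102.9 A total, ~25.7 A per phase
# 220 W -> ~181.1 A total, ~45.3 A per phase
```

Since resistive losses scale with current squared, the 220 watt part puts well over four times the heat into MOSFETs sized for the 95 watt class, which is exactly how VRMs turn into smoke.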



AMD, Intel, and motherboard makers already disclaim liability when overclocking. I'd bet the 220 watt chips would come with a limited warranty as long as they were only run at the 220 watt "spec," but they would also come with a big warning that they could ruin motherboards.

UPDATE: we also can't rule out that this new FX is nothing but fake news, like so many other rumors from someone with nothing better to do, or, if it is real, that it isn't a 220 W part but one certified at up to 140 W... It is possible to tweak SOI processes that far; I think IBM's 600 mm², 5.5 GHz chips are even more impressive than what the new FX suggests. Only a very few would believe it, though: the official "popular" mantra is quite different, and GF inspires lots of salt (lol)...

IBM's POWER microarchitecture, macroarchitecture, and platform architecture are much different from AMD's. The 5.5 GHz POWERs are in-order RISC chips, liquid-cooled, placed in custom IBM motherboards and chassis, and VERY VERY expensive. You can't compare the clock speed of an out-of-order CISC AMD chip, built to be cooled with a moderately sized air cooler, sold at an MSRP under $250, and designed to work well in any pile-of-garbage made-in-China motherboard and chassis, to the IBM parts. It would be like telling NVIDIA they suck because their GPUs only run at around 1 GHz on bulk silicon while Intel pushes CPUs to about four times that on bulk silicon.



Intel's 64 bit architecture was IA64, aka EPIC, used in the Itanium. It went over like a fart in church, but Intel does keep it in a zombified state, inexplicably and randomly bringing out a new one every handful of years on a very old process. I don't think much came from IA64 into x86_64 other than Intel learning that most code is awful, that a good auto-parallelizing and auto-scheduling compiler is really, really hard to write, and that backward compatibility with older x86 applications and OSes was critically important in the 2000s. What is in x86_64 was mainly AMD's doing, and both companies added various subsequent SIMD extensions such as SSE3 and the various SSE4s and AVXes.

Way off topic, but I suppose Intel tried to resurrect some of the Itanium in the also very ill-fated Larrabee. Larrabee was likewise a very wide in-order processor with Intel ISA lock-in as a goal and a massive TDP as a side effect. Larrabee also suffered from two of the things that plagued Itanium. It relied on having an excellent software renderer to get any sort of performance, just as Itanium relied on having a godlike compiler or a bunch of hand tuning. Both also broke compatibility with the popular ISAs of their day: Intel wanted to move from DirectX/OpenGL to x86 with Larrabee, just as they had wanted to move from x86 to IA64 with the Itanium. Just some food for thought.
 


Slot A Athlons and Slot 1 Pentium IIIs, much?

Here is some nostalgia (no bait intended): http://www.tomshardware.com/reviews/intel-admits-problems-pentium-iii-1,235.html

Created by none other than the real Tom, who I suspect may have become a Mac user :lol:
 

Cazalan



One analyst estimated the embedded RAM costs about $40. With Intel having spare fab capacity and now being back in the memory business, they could eventually couple it with their lower-end processors. Or they could reserve it for their server lines.
 

Cazalan



I keep hearing AT is biased, but their last article threw that notion to the wind when they recommended this:

"A CPU for Single GPU Gaming: A8-5600K + Core Parking updates"
"The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580"

Only for dual GPU did they recommend going to an i5.
 

Cazalan



Hah, I vaguely remember that article. Those things were priced so high I don't think it mattered. There are always some things broken in CPUs; it's a matter of severity whether or not they hold up production. Windows 7 went gold with something like 10,000 logged bugs, but that didn't stop it from becoming a much-desired OS. On the hardware side, you have to dive into the errata lists to see what's broken.
 

Cazalan


noob2222



Take out the OpenCL and GPU (F@H and ratGPU) tests and it drops to 102.7%, since those aren't part of core IPC. Doesn't look so great then, does it?

As for almost double the performance of the 5800K, sure... for triple the price.
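To illustrate with made-up numbers (not the actual review data), here is a minimal sketch of how dropping the GPU-accelerated entries can swing an averaged relative score; every benchmark name and value below is hypothetical:

```python
# Hypothetical relative scores (ratio vs. a baseline CPU) -- illustration only.
scores = {
    "x264 encode":  1.08,
    "7-Zip":        1.05,
    "Cinebench":    1.02,
    "F@H (OpenCL)": 1.55,  # GPU-accelerated
    "ratGPU":       1.40,  # GPU-accelerated
}
GPU_TESTS = {"F@H (OpenCL)", "ratGPU"}

def mean(vals):
    vals = list(vals)
    return sum(vals) / len(vals)

overall  = mean(scores.values())
cpu_only = mean(v for k, v in scores.items() if k not in GPU_TESTS)

print(f"all tests: {overall:.1%}")  # 122.0% -- the headline number
print(f"CPU-only:  {cpu_only:.1%}") # 105.0% -- much less impressive
```

Two strong GPU results out of five tests are enough to move the headline average by 17 points.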
 

noob2222



Centurion is nothing more than a high-leakage part, AKA TWKR, AKA a 220 W CPU for the current PD lineup. This time, instead of giving them away and just watching them sell on eBay, they are attempting to make a little money off of it.

http://www.xbitlabs.com/news/cpu/display/20090720222305_AMD_Phenom_II_X4_42_TWKR_Microprocessor_Sold_for_11_600_at_Auction.html

http://www.ebay.com/itm/AMD-Phenom-II-TWKR-42-Black-Edition-VERY-RARE-ONLY-100-EVER-MADE-/281027561154?pt=UK_Computing_CPUs_Processors&hash=item416e8c46c2

If AMD wants to make a little money off the "extreme crowd," then I say go for it. I wouldn't be surprised to see the same thing happen again: people will buy them up and re-list them on eBay at a 200% markup or more. There won't be very many of them.

http://www.techpowerup.com/184960/amd-centurion-is-fx-9000-scrapes-5-00-ghz.html
 

noob2222


Interesting find. I would guess that the NDA lifts at the same time the customers receive the part, i.e. in 1 or 2 days.

... but Microcenter has them too... so... that theory is out. http://www.microcenter.com/product/414754/A10_6800K_Black_Edition_41GHz_Quad-Core_Socket_FM2_Boxed_Processor

So... where are the reviews?
 


Why take it out? It's used, isn't it? Oh wait, AMD doesn't do as well with it, so we have to disregard it from the results set.
 

8350rocks



1.) It's not cocky...it's the truth.

2.) OpenCL is not a competitor to DirectX in any way, shape, or form...that's OpenGL.

3.) NVIDIA and AMD both support DirectX 11 and OpenGL 4.2.

4.) OpenCL is something completely different...and while NVIDIA supports it somewhat, AMD is big on OpenCL. NVIDIA's counterpart is CUDA, which must be specifically programmed for and only works on NVIDIA GPUs, as it's proprietary. Any GPU can run OpenCL, as it's an open standard (see the sketch after this post)...that's why CUDA will fail.

5.) OpenCL and CUDA are about using the GPU to do parallel processing by running compute functions on GPUs.

6.) DX and OpenGL are about 3D hardware acceleration and graphics APIs, as well as physics computations and other media tasks.

You might want to do some homework before you run around calling someone "cocky".
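For the curious, here is roughly what that vendor neutrality looks like in practice: a minimal vector-add sketch using pyopencl (assuming it is installed; the kernel source is plain OpenCL C and runs on whatever device the runtime finds, whether AMD, NVIDIA, or Intel):

```python
# Minimal OpenCL vector add via pyopencl -- the same kernel source
# runs on any vendor's OpenCL device, unlike a CUDA kernel.
import numpy as np
import pyopencl as cl

a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)

ctx = cl.create_some_context()   # picks whatever OpenCL device is available
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)   # same result on any vendor's hardware
```

The equivalent CUDA kernel would look almost identical, but it only compiles for and runs on NVIDIA hardware.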

 

noob2222

^^ @gamerk316
Are we talking IPC or not? Or has IPC changed yet again to mean CPU + GPU? I was talking Intel vs. Intel.

If you remove those same benches from AMD, it gives Intel even more of a lead: 199% vs. 190%. So saying "AMD doesn't do as well" with them is quite the opposite of the truth; those are some of AMD's better results, F@H being the best two scores.
 

juanrga



Richland is a kind of Trinity 2.0 before Kaveri arrives.



If you watch the Nvidia interview linked above, price is only one of the reasons why OEMs are massively rejecting Haswell GT3e; the other reason is high power consumption.
 


Speaking of the Phenom series, the FX-8150 reminds me so much of the Phenom 9650, which was eventually developed into the Phenom II 965 BE.
 


If history is good for anything, it's lessons. Once OCL gets adopted by the big industry players, it will stagnate big time. It already happened to OGL, leaving the Khronos Group unable to push innovation due to "legacy" restrictions.

Hell, that's a common pattern across the industry for "industrialized tech". Programs staying 16-bit, and now 32-bit, is another example of it. And the list keeps going and going.

I'd say there's one big point in favor of CUDA even if it gets relegated to second place: it will be the DirectX of GPGPU, and just like DirectCompute in DX, it will get quick updates for games and other consumer-oriented stuff.

I really don't know how HSA is going to play in that area to be honest, but if Khronos serves as an indication, not good.

Cheers!
 
Unless Microsoft or Google starts supporting only CUDA, which is completely unlikely to ever happen, CUDA will never gain traction in the consumer space again. CUDA will never run on AMD GPUs, so there would need to be a new GPGPU standard even if OCL fails. A big software company like Microsoft or Google is probably the only kind of player that can push a standard across all its platforms, maybe eventually dominating the market.
 


Well, MS could, but otherwise I agree. And even more: since nVidia has a big ego (much like Sony right before the PS3), they won't play in the consumer space, I'm sure. Developing a standard with no big software ecosystem behind it sounds like a hard thing to do, alright.

Microsoft and Google (less likely IMO) do have the size and resources to push pretty much any standard they want. I'd say IBM can be in those shoes as well, but they're working on other things; interesting things, haha.

So, given those points, will Microsoft push DirectCompute inside DirectX in the upcoming years as the GPGPU solution for games (again)? Haha.

Cheers!
 


The Itanium didn't use a slot edge connector like the Slot 1, Slot 2, and Slot A PIIIs, Xeons, and K7 Athlons did. It did sit in a cartridge, but the cartridge was mounted flat against the motherboard with a pin grid array on its flat surface instead of an edge connector. The PAC418/PAC611 Itaniums were actually much closer to the MCA retention mechanism of the Mobile Pentium IIs than to anything else.

Mobile PII MCA package, bottom: [image: S_Intel-MobilePII-266-512 (bottom).jpg]

Itanium PAC418 package, bottom: [image: L_Intel-80541KZ8002M (bottom).jpg]


Speaking of PIIs, PIIIs, and Athlons: those were the "good old days," at least as far as Intel was concerned. PIII 450s running at 600 MHz, dual Socket 370 Celerons overclocked in an ABIT BP6 and later replaced with Coppermines, and the wonderful chip known as the PIII-S Tualatin. You overclocked cheap chips to match the performance of extremely expensive top-line stuff. Nobody overclocked top-end stuff; it was pretty well tapped out from the factory. Besides, the point was performance on the cheap, not "oooh, look at how much I spent on my Extreme Edition CPU!!" Intel didn't intentionally cripple dual-CPU SMP on its chips, so you could build a nice dual-CPU workstation without shelling out for Xeons. Those were the days.

Now you have to pay an extra thousand bucks or so to get the Xeon DP "equivalent" of a $300ish Core i7, and Intel killed overclocking on anything less than the most expensive Core i5. We keep hearing rumors that in the near future anything below Intel's enthusiast chips (read: anything not using the same socket as the DP servers) will become BGA. Boo. At least Steamroller will keep using AM3+, AMD will apparently keep using the FM line of sockets on the desktop, and any AMD desktop chip with a TDP over 65 watts (most of them) is overclockable. I think AMD will get a lot more enthusiast notice if Intel really does go off the idiot end and force people onto LGA2011's successor and >$300 chips if they don't want a non-replaceable BGA CPU setup.

Okay, old man rant over.
 

8350rocks



I agree to some degree with what you're saying, especially about Khronos Group. They did not actively push the standard as hard as they could have, nor did they attempt to keep the software updated as quickly or as often as MS did with DX.

However, HSA is not in the hands of the Khronos Group, and frankly...I think that's a good thing. OpenGL now is nearly as good as DX in many ways, but if Khronos had pushed like MS did back in the days of 3D hardware's infancy, we might be talking about the two in reverse. Especially considering that OpenGL had many supporters in the beginning, up until DX became so much more comprehensive and easier to use.

I do think a software design environment using HSA and OpenCL will take off much faster, though. Partly because developers have been asking for HSA/HUMA for quite a while, and partly because, with GPU advancements coming faster and with much greater percentage improvements each generation, more work will be offloaded onto GPUs in the future.

Once the software gets on board with OpenCL (companies like Adobe are already optimizing software for OpenCL and HSA), the hardware will get sufficient optimization from software, and the performance increases we should see from HSA and HUMA will be forthcoming. However, as always, the software is a few years behind the hardware.

 

^This
Gotta love booting up into Windows 2000 and exploring the magical world of the Internet, running your overclocked and beloved Pentium III or Athlon...
One word, ABIT.
Fast forward to 2003-2005: gotta love overclocking those Athlon 64s and P4 Northwoods on an ATi Radeon 9700 Pro and an Abit KV8/IS7. Man, I want my 3200+ @ 2.3 GHz back!

Nostalgia time:
http://www.youtube.com/watch?feature=player_detailpage&v=cAqlA9EJ4ME
http://www.youtube.com/watch?feature=player_detailpage&v=y39D4529FM4
http://upload.wikimedia.org/wikipedia/commons/4/43/Slocket_PCB_Slot_1_to_PGA370.jpg
http://upload.wikimedia.org/wikipedia/en/4/44/AMD_Athlon_Processor_Logo.svg
http://upload.wikimedia.org/wikipedia/en/6/65/AMD_Athlon64.jpg
http://upload.wikimedia.org/wikipedia/en/3/34/Pentium4ds.jpg
http://upload.wikimedia.org/wikipedia/en/7/7e/ABIT_Logo.svg
 

juanrga



Anomalies are usually eliminated from data because they otherwise distort the statistics of small samples.
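A toy illustration with hypothetical scores (none of them from any real review) of why a single outlier matters so much in a small sample: the mean moves a lot, while a robust statistic like the median barely does.

```python
import statistics

# Hypothetical relative benchmark scores (% of baseline); small sample.
sample = [101, 103, 99, 102, 100]
with_outlier = sample + [190]   # one anomalous result

print(statistics.mean(sample))          # 101.0
print(statistics.mean(with_outlier))    # 115.83 -- one point adds ~15
print(statistics.median(with_outlier))  # 101.5  -- barely moves
```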
 

Cataclysm_ZA



I still haven't seen reviews of the Richland mobile chips on their own, but those chips are still saddled with DDR3-1600 controllers. Meh.
 


Is it sad that I remember that stuff like it was yesterday? I remember watching the second video when it first came out and the resulting kerfuffle it created. K7s were *supposed* to have a power-off mechanism similar to the K8s in the first video, but some board vendors cheaped out, so the CPUs' magic smoke got let out.

I had an ABIT too, a K8N SLI with a (for the time) very expensive X2 4200+ Manchester. It would go to 2.64 GHz at 1.45 V, and that's as far as I ever pushed it. I had an ATi X1900GT, a 74 GB Raptor, two optical drives (so I could copy a disc the lazy man's way), two 1600x1200 monitors, and 4 GB of DDR-433. That was quite the setup in those days: 64 bit, SMP, and a massive improvement over the 2.2 GHz Northwood I had before it.

The Northwood was a big improvement over the K6-2/500 I had before that, which was a quantum leap over the 486SX2/50 before that, since the K6-2 could get on the Internet (dial-up at the blistering speed of 28.8 kbps) and ran Windows 98SE instead of OS/2. I played a lot of Civilization (the original) and SimCity 2000 on the K6-2. Yes, the 486 ran OS/2 and its crappy "freeze on right click" Windows 3.x emulation, because my dad was an IBM man. But that was still much better than the 10 MHz 286 with 512 K of RAM, since it could technically multitask. The 286 did have a whopper of a 30 MB hard drive, a color screen, a mouse, and took "hard disks" (3.5" floppies) as opposed to the actually floppy 5.25" floppies that the Apple ][es with their monochrome screens used. Oregon Trail, Breakout, and Bomber on the ][es were still fun though. You also had to remember the magic keystrokes to turn on the 80-column card to type much of anything. Geez, I feel old. But at least I never used punch cards or TTYs like my dad did way back when :D
 