AMD Says It Is Not Abandoning Socketed CPUs

[citation][nom]CaedenV[/nom]Intel currently only has 3 chip designs on the market, and they get amazingly good yields on their chips. On the low end you have your base model chip which covers all Atom CPUs. On the midrange you have a chip design that covers all Pentiums, i3s, i5s, and consumer i7s. Then on the high end you have the Xeon based i7's and server processors. If you have a Pentium G CPU today, it is very likely you have a fully functioning i7 that has been artificially 'broken' so that it functions as what you purchased. So the idea is that you buy a motherboard with BGA CPU integrated into the board. Need an upgrade from that i3 that you could afford when you bought it? Send Intel some money, and they send you a code to enter in to upgrade the system to an i5 or i7. Personally, I am not a huge fan of this idea...[/citation]
...and man, neither am I. I get making your chips modular, so that if part of them fails, you can still sell it at a reduced price. But taking a chip that's fully functional and deliberately zapping off features is just...ugh. If you can take a chip and deliberately neuter it and sell it for less, you could sell the full-featured chip for less, which means you're artificially jacking up prices on the chips that you don't break.

And making it so that the chip isn't even broken in hardware, and can be upgraded by some software process...no. Just no. Count me out. Sell the chips that are legitimately limited to low-end OEMs and sell un-neutered chips to high-end OEMs and Newegg et al. If Intel makes this chip upgrade nonsense SOP, I'm off to AMD.

Really, though, I'm probably gonna use AMD for my next build anyway. Yeah, I get it, they're slower (and less energy efficient), but the worst thing they do is fuse off viable cores. Presumably, when they bin their processors, the bin is actually based on clockspeed/voltage testing, or proximity of the die to the center of the wafer, or whatever.
 
People need to stop freaking out until they get all the details. If high-end PC enthusiasts keep the ability to swap out CPUs, then what's the problem? As for the $300-700 PCs that get sold at Best Buy: do you really think the consumers who bought them are going to care that they can't upgrade the CPU, when that processor will serve their basic needs just fine for 5-8 years anyway? By then they'll just buy a whole new PC. Those willing to spend $1k+ on a configuration will probably be taken care of with what we have today. The model Intel wants seems like a good idea for the masses. And if you're a gamer on a budget, the processor you could afford will suffice until your next PC. Yes, the option to upgrade it later is taken away from you, but it seems Intel has done enough research to determine that the percentage of people who actually upgrade their processor with every new generation isn't reason enough to stop them from choosing this business model.
 
[citation][nom]atminside[/nom]BGA packages.......i think of xbox 360 red ring. I don't care if BGA has come a long way....I will never trust it.[/citation]
lol.

BGA is actually among the most reliable packaging out there, which you should be thankful for since it is the only practical way to connect the 1000+ "pins" from the small chipset or CPU die to the rest of the world. Nearly all ICs in mobile devices are BGA. All DDR3 ICs are BGA. All core logic chipsets are BGA. All SoC CPUs in routers, tablets, smartphones, STBs, etc. are BGA. BGA is omnipresent in modern digital devices.

The problem with the X360 never was the BGA packaging; it was the grossly under-engineered HSF Microsoft put in the original launch designs. Much the same goes for Nvidia's long streak of abnormal failures, which were caused by excessive and uneven heat generation, excessive current per ball, and mismatched ball-vs-substrate material combinations that made balls fail prematurely.

BGA is perfectly fine and technically superior to mechanical contact when the mechanical, thermal and electrical designs are sound.

BTW, Intel has been using Flip-Chip BGA to attach CPUs to their Slot1/SPGA/LGA substrates since the P3 about 14 years ago, so there is absolutely nothing new about it. Also, the motherboard socket itself is commonly attached to the motherboard using BGA reflow soldering.

So if BGAs scare you, you are more than a decade too late to start worrying.
 
[citation][nom]NightLight[/nom]why is there such a upheaval about this? intel will cut costs for the consumer for low end or mainstream users by doing this, it's all good![/citation]
Intel will cut costs for themselves. Whether they will pass on the savings to the consumer or increase their own profits as a result...well, hope springs eternal, but I'm guessing it'll be the latter. They don't have much incentive to do otherwise with AMD in such a weak position. 🙁
 
[citation][nom]paidbyhalf13[/nom]Do you really think those consumers that bought that pc is going to care that they won't have the ability to upgrade their CPU[/citation]
Even in the socketed enthusiast crowd, how many people will ever actually bother upgrading? Most people who already have a 25xxK for gaming have little to no reason to bother with either the 3570K or the 3770K.

Between the savings from eliminating the CPU socket and the costs being shifted from motherboard to CPU with Intel integrating the VRM into the CPU package, Broadwell motherboards (at least those without fancy extras) will likely end up marginally more expensive than CPUs alone are today.
 
[citation][nom]milktea[/nom]AMD says it will continue to offer socketed CPUs and APUs in 2013 and 2014. But what about 2015 or 2016? It's not far from now. No plans for BGA in next year, doesn't mean it won't in the later years. Sooner or later, AMD will have to follow Intel's foot steps, because that's where the market is heading.*haha*[/citation]
You're assuming AMD will actually last that long.... In their current state, there's no guarantee that AMD will still exist by 2015.
 
Yeah... a big corporation passing savings on to their consumers... har har! That's a good one.

More like passing savings on to their shareholders!
 
Personally, I couldn't care less if Intel goes away from sockets. I mean, seriously, I can't think of the last time I just replaced my CPU. By the time I need to replace the CPU, it's usually time to replace the mobo too.

Though I guess, given how AMD handles its sockets, it's a good thing they're keeping them.
 
I kind of don't blame Intel. Once you hit a certain performance level, you lose a lot of reasons to upgrade. The game studios kind of hit a wall at a $200 Core i5/Turbo-boosted Phenom II.

That being said, I still prefer being able to upgrade my CPU. There have been a few times where I was in a bind and got something like a $40 CPU to put in a good motherboard while I built up enough money to get something better.
 
I've got no problem with BGA chips as long as you can keep them cool enough that the RoHS lead-free solder doesn't fail, and as long as they only show up in mobile devices like laptops.
If they pull this crap in the desktop/server market, they are going to be in some hot water.
 
[citation][nom]Shin-san[/nom]The game studios kind of hit a wall at a $200 Core i5/Turbo-boosted Phenom II.[/citation]

What bull.

The only limit PC game devs have now is consoles. Even now, Crysis 1 is still one of the best-looking games out there.
 
I'm no architexpert (see what I did there?), but perhaps there's a paradigm shift coming. While we currently have the northbridge, southbridge, and RAM on the motherboard, perhaps we'll see a "Thingamajig": a part with the BGA CPU, north/southbridge, RAM slots, and perhaps PCIe slots for GPU(s), which plugs into the "Doohicky" with its PCI slots, USB, etc. In other words, "primary function" and "secondary function" boards.

Your primary board would be more expensive than a CPU is currently, but when you factor in not having to buy a whole new motherboard to get the latest and greatest chipset (i.e. you're not paying for all-new PCI slots, USB ports, etc., just the chipset), maybe it would work out cheaper? Having a direct connection from CPU to chipset sounds like a good idea to me.
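
To put toy numbers on that hunch, here's a quick sketch; every price in it is invented for illustration, not based on any real BOM:

[code]
# Toy upgrade-cost comparison for the split-board idea.
# Every price below is an invented assumption.

cpu = 220.0              # hypothetical CPU price today
full_mobo = 140.0        # hypothetical full motherboard price today

primary_board = 300.0    # guessed "Thingamajig": CPU + chipset + RAM/PCIe slots
secondary_board = 60.0   # guessed "Doohicky": PCI slots, USB, LEDs (kept across upgrades)

traditional_upgrade = cpu + full_mobo    # new CPU plus a whole new motherboard
split_upgrade = primary_board            # only the primary board is replaced

print(f"Traditional upgrade: ${traditional_upgrade:.0f}")
print(f"Split-board upgrade: ${split_upgrade:.0f} (the ${secondary_board:.0f} Doohicky is reused)")
[/code]

Under those made-up numbers the split design saves money from the second upgrade onward; whether the real premium on the primary board would stay that low is anyone's guess.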

Secondary board would be chosen based on how many peripherals you need and the colour of LEDs.

If the interface between primary and secondary can be universal, then maybe competition between AMD and Intel will increase as people can swap between the two more easily.

That's a huge change, though! I just think that if they're going to fix the CPU to the mobo, there would be too many combinations needed to meet everyone's needs: either they would have to limit people's choice, or think of an alternate way to provide that choice.
 
[citation][nom]robthatguyx[/nom]im an intel guy,but more and more i see myself leaning twords amd as my next build platform when my 2500k is to out of date[/citation]

We get hung up on "out of date". Phenom IIs and Core 2s are still very reliable processors; I still use them. For me to upgrade, I would have to throw the baby out with the bathwater on the Intel system.
 
[citation][nom]yeesh[/nom]Of course nobody wants the Intel BGA-only rumor to be true. That said,1. Manufacturers aren't too invested in what you do with the old computer parts, and would probably prefer that you throw them out.2. Yes, the last thing Intel wants is to end up like Apple, the largest, richest, most successful company of all time. I'm sure Intel executives have nightmares about going down THAT path.3. But would you really ditch Intel? I prefer AMD myself, but as of today it's clear whose chips are faster. AND (I'm sorry to say) it looks like Intel's lead should be even more pronounced if AMD basically sits out 2013. Will people like us who care enough about PC performance to frequent sites like this REALLY choose a significantly slower AMD CPU rather than submit to Intel's BGA scheme? How much speed would the average enthusiast give up for principle and the ability to upgrade CPU and mobo separately (which I never seem to do, I don't know about everyone else)? What if the Intel chip is 20% faster? 30%? 50%? This is a site where a 20% difference in FPS is a slam dunk trouncing. How many of us would REALLY opt to buy the slower CPU just to have a socket?I'm just sayin'.[/citation]

How much will overclocking be hindered by the Intel system at that point? I'd think that it'd be hindered a lot except maybe in premium-priced models and at that point, AMD will likely still have its place so long as they're still around and have continued the progress from Bulldozer and Piledriver with Steamroller and Excavator.
 
[citation][nom]Shin-san[/nom]There's been a few times where I have been in a bind and got something like a $40 CPU to put in a good motherboard while I build up enough money to get something better.[/citation]
The biggest problem with putting any $40 CPU on a motherboard is that the socket and related accessories cost nearly as much as the CPU itself, and that isn't going to get any cheaper with electrical and mechanical requirements becoming more stringent with each iteration. Not making plans to get rid of that cost overhead at the low end is going to hurt when PC margins start getting squeezed by more powerful embedded platforms being integrated into just about everything in every form factor.
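
As a back-of-the-envelope illustration of that overhead (all of the component costs below are my own guesses, not quoted BOM figures):

[code]
# Rough share of socket-related overhead on a budget platform.
# All costs are assumptions for illustration only.

cpu_price = 40.0          # the bargain CPU in question
socket_cost = 12.0        # guessed LGA socket + retention hardware
lga_packaging = 8.0       # guessed extra substrate/pin cost vs a plain BGA package
board_provisions = 10.0   # guessed extra board cost to support swappable CPUs

overhead = socket_cost + lga_packaging + board_provisions
print(f"Socket-related overhead: ${overhead:.2f}, "
      f"or {overhead / cpu_price:.0%} of a ${cpu_price:.0f} CPU")
[/code]

With those guessed numbers the overhead comes to $30, i.e. 75% of the CPU's price, which is the "nearly as much as the CPU itself" situation described above.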

For people whose computing activities are not intrinsically tied to some PC-exclusive software (at least half the people I know), the embedded world is effectively on a collision course with the PC. There is not much point in buying a $300 PC when the $40 SoC embedded in the LCD already does everything they would otherwise have used a PC for anyhow.

The biggest limiting factor right now is RAM: not much multitasking and productivity to be had with only 0.5-1GB on most devices currently in the wild. Things should improve with recently launched 2GB devices but the real fun won't start until 4GB goes mainstream in 2-3 years.
 
[citation][nom]billgatez[/nom]I got no problem with BGA chips as long as you can keep them cool enough that the RoHS lead free solder don't fail. And they only show up in mobile devices like laptops.If they pull this crap in the desktop / server market they are going to be in some hot water.[/citation]
Just about every major/critical digital IC in a modern device is a BGA.

Your CPU die is attached to its LGA/PGA substrate by micro-BGA, the CPU socket is attached to the motherboard using BGA, the chipset is BGA, your GPU is attached to its OLGA substrate by micro-BGA and that substrate to the GPU's PCB with BGA, your DDR3 RAM is BGA, your GPU's DDR3 or GDDR5 is BGA, etc.

BGAs are extremely reliable as long as you do not mess up the thermal, electrical, and mechanical engineering that makes them so, as Nvidia did a couple of times many years ago. You will not find many PCs, servers, or densely packed digital electronics manufactured within the last 12+ years that do not rely extensively on BGA packaging/attachment.

BGA is a vital enabling technology for current and future devices. It has been with us for most of the past 20 years and is in every new PC/server/etc. you have touched over the past decade. If there were anything fundamentally wrong with it, I think we would have read about it by now.
 
[citation][nom]SuperVeloce[/nom]@CaedenV... I heard a different story. dual core sandy and ivy's are supposed to be true dual core yields, not disabled quads.[/citation]
Looks like you are right: the i5 and i7 share a design, and the Pentium G and i3 share a design. Haswell and Broadwell, however, will supposedly merge the two into a single design.

[citation][nom]Old_Fogie_Late_Bloomer[/nom]...and man, neither am I. I get making your chips modular, so that if part of them fails, you can still sell it at a reduced price. But taking a chip that's fully functional and deliberately zapping off features is just...ugh. If you can take a chip and deliberately neuter it and sell it for less, you could sell the full-featured chip for less, which means you're artificially jacking up prices on the chips that you don't break.And making it so that the chip isn't even broken in hardware, and can be upgraded by some software process...no. Just no. Count me out. Sell the chips that are legitimately limited to low-end OEMs and sell un-neutered chips to high-end OEMs and Newegg et al. If Intel makes this chip upgrade nonsense SOP, I'm off to AMD.Really, though, I'm probably gonna use AMD for my next build anyway. Yeah, I get it, they're slower (and less energy efficient), but the worst thing they do is fuse off viable cores. Presumably, when they bin their processors, the bin is actually based on clockspeed/voltage testing, or proximity of the die to the center of the wafer, or whatever.[/citation]
How does it not make sense? The fewer chip designs you need to make, the less time and effort it takes to produce a chip, and the more fab space can be dedicated to a single streamlined process. Sure, if something is damaged, you fuse off part of it and sell it as a cheaper chip. But if (for the sake of argument) you have demand for 100 i5 chips and none of the dies are coming out damaged enough to be sold as i5s instead of i7s, then you have to disable parts of fully working dies deliberately in order to meet that demand. The other option is to lower the price further and sell more i7 chips, but there are other market and legal issues which prevent that.
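
To make the yield argument concrete, here's a toy simulation; the wafer size, defect rate, and demand figures are all invented for illustration:

[code]
# Toy model of die harvesting; all numbers are invented for illustration.
import random

random.seed(42)

WAFER_DIES = 200     # hypothetical dies per wafer
DEFECT_RATE = 0.05   # chance a die has a defect confined to one core/cache slice
I5_DEMAND = 100      # orders for the cut-down SKU this wafer must help fill

natural_i5 = 0       # dies defective enough that they can only be sold as i5s
good_i7 = 0          # fully working dies

for _ in range(WAFER_DIES):
    if random.random() < DEFECT_RATE:
        natural_i5 += 1
    else:
        good_i7 += 1

# With good yields, defective dies alone cannot satisfy i5 demand, so
# perfectly good i7 dies get features fused off on purpose to fill the gap.
fused_down = max(0, I5_DEMAND - natural_i5)

print(f"naturally harvested i5s: {natural_i5}")
print(f"fully working dies:      {good_i7}")
print(f"i7s fused down to i5s:   {fused_down}")
[/code]

At a 5% defect rate, only about 10 of the 200 dies bin as i5s on their own, so roughly 90 working i7 dies would have to be deliberately cut down to fill the hypothetical demand for 100.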

Anywho, at the end of the day, I will buy whatever meets my needs at the lowest price point. In the past that was Intel, even though AMD was better on $/performance, because AMD chips at the time did not have the extensions my pro audio software required. But once they did, I was all over AMD. Currently AMD is plenty capable of the workloads I need, but Intel is simply cheaper on the higher-end chips (which is still really, really weird to me). If a BGA chip is capable and cheap, then I will go BGA for that gen. If a good old socketed design is capable, then I will go socketed. And if, when I am ready to upgrade in five years, there happens to be a dockable ARM or x86 tablet capable of what I want to do at a good price... then I will jump all over that platform.

How many times have you upgraded a CPU in a rig you have built? For me, it has only been twice. The first time, I didn't know what I was doing and simply purchased the wrong part for what I wanted to do (turns out the 'pro' in 'Pentium Pro' meant something, lol). The second time was recently, when I kept the CPU but upgraded the mobo for the sake of TRIM-over-RAID1 support on the Z77 boards. But that is twice over the course of some six machines I have built for myself over the years. When I also count all the builds I have done for other people (not a single one of whom has ever upgraded the CPU without also doing a mobo), and all of the laptops I have owned over the years, we are talking about two CPU changes over the course of ~25-30 systems, and I am sure that in the general market CPU changes are even rarer.

I am not saying that it is not annoying... because it is. But a single gen lost to the high-end market (the 'Sky' series is already confirmed to be getting a traditional socket) is hardly going to be the end of anyone's world. And as long as they offer a high-end chip on a high-end mobo at a decent price... there is really very little reason to complain.
 
[citation][nom]CaedenV[/nom]... But once they did, I was all over AMD. Currently AMD is plenty capable of the workloads I need, but Intel is simply cheaper on the higher end chips (which is still really really weird to me). ...[/citation]

Where are you seeing AMD's high-end CPUs as more expensive than Intel's high-end CPUs? AMD's highest-end consumer CPU, the FX-8350, can be had for $200-220 at most retailers, while Intel's high-end CPUs start around $300 with the quad-core Sandy/Ivy i7s. Even if you counted Intel's i5s as high-end, they're only cheaper if you're comparing multiplier-locked models to AMD's multiplier-unlocked FX-8xxx CPUs, and even then, some of AMD's high-end CPUs are still cheaper than Intel's LGA 1155 i5s.
 
I thought this was originally a SemiAccurate report?

Anyway, my point is simple: Intel most likely won't kill their enthusiast/DIY market. The BGA transition might not be total and complete, and we don't know what Broadwell will bring beyond the die shrink. Then someone mentioned that the following tock cycle (Skylake) will revert to the LGA package. So I don't really think we should shake our fists at the sky without official word from Intel. Yes, they're moving (and have already moved) to the SoC side of things, which isn't all bad.

So my theory is this:
"Tock" cycles: LGA. This way mobo makers continue to make profits, Intel sells chips to the DIY market, etc.
"Tick+" cycles: BGA, for the OEM/mobile market, refreshed every two years or so. Obviously this has implications for Intel's fight against ARM, so maybe Atom SoCs will stick to their own current cycle that only loosely follows the main tick-tock model. But for the rest of their non-DIY parts, BGA.

Each tick+ cycle would see more and more integration along with the die shrinks, while tocks would see architecture changes and only minor integration. Though I'm not entirely sure it's so black and white anymore; they seem to be blurring the lines between the two cycles (with Haswell, for example).

Bottom line: if they kill the enthusiast market, most of us will jump to AMD for stuff like gaming rigs, and everyone who wants low-power stuff will stick with Intel. Will Intel kill the enthusiast market? I doubt it.
 