No, Coffee Lake Will Not Run In Z270 Motherboards (And Here’s Why)

Page 3 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Status
Not open for further replies.


You're a good example of someone who still has significant headroom in their platform. A used 2700K would be a hefty boost (it OCs easily to 5GHz), and the drop to PCIe 2.0 really doesn't matter that much. PCIe adapters for NVMe offer good storage upgrades, and in some cases modded BIOS files exist for NVMe booting.




Can't agree there; there are numerous examples in the past, e.g. it's why SB-E was core-borked to begin with (tech sites at the time did say this), and why X79 was never updated (Intel just didn't need to; AMD couldn't even compete with P67). The entire nature of the X299 release, etc., absolutely smacks of a panic response to AMD: hot CPUs, crazy thermals, exotic cooling required, and a total mess of PCIe stifling (no wonder the manuals are getting thicker, with so many slot/port usage caveats depending on which CPU is installed). Would Intel have done what they did, when they did it, if Ryzen/TR didn't exist? No chance.

Ian.

 
I found a typo that says, "drop into the physically identical LGA1551 socket found on previous-". It's the third sentence from the top and the first LGA1151 typed. Looks like the rest are all correct. Good article, so it looks like Z370 is a must.
 

That's assuming he wants to pass up all of the updated I/O and other platform benefits. I have an i5-3470 and I'm not going to buy a used CPU to put on my H77 board. I also have an i7-2600K, but I can't use it due to lack of hypervisor support.



And all of your 'examples' are still chipsets/platforms that only lasted two CPU generations. No Intel platform ever lasted longer than that regardless of what AMD did. Competition does not factor into Intel's decision to limit platforms to two CPU generations regardless of how rushed those platforms may have been.
 
I'm sure the 8700K will be to my 3930K what the 7700K is to the 2700K. Not worth the price for an unnoticeable improvement.
 
So much noise and complaining about the requirement of a new motherboard. Anyone who cares about this kind of thing is someone who has the cash to upgrade anyway. The vast majority of consumers don't buy the newest CPU each year it comes out, and if you do, then you're a rich premium customer willing to spend hundreds of dollars for even a 5% performance gain, so you should have the money to pay $100 or more for your nice new shiny thingy anyway (Apple-fan style). A lot of us don't regularly update our computers; we build them on 5-to-7-year cycles. For my needs this platform seems really good. After six years it's time to say goodbye to my 2500K and hope that a well-OC'd 8700K will last me as long as its Sandy Bridge predecessor did.
 


Well, it looks like they never got around to it... probably because soldered BGA packages make sense on certain systems (laptops, "embedded" systems, etc.), where you're likely to break the motherboard if you even look at it like you're going to try to open up the case. But even for OEM machines, desktops with soldered CPUs won't go over well with customers, because the OEMs not only want to offer their systems in multiple configurations subject to customer demand, they also want to be able to charge -- sorry, "offer" -- their customers for "upgrade kits".

And DIY systems? Oh, I can just imagine the vitriol & backlash that would happen. How long do you think ASRock, for example, would maintain that policy if they sold their motherboards with the policy of, "if you want a new CPU, you need to buy an identical motherboard with the CPU pre-soldered onto it...& if you try to de-solder your existing one, it 100% voids your warranty". They'd see sales drop so much you'd think you were riding Tower of Terror or Demon Drop.
 

Don't worry, soldered CPUs will come around to desktops in due time; the PC market just needs to wither some more, to the point where product diversity needs to be thinned out because the market isn't large enough to bear the duplicate R&D costs anymore. We're already seeing that with laptops, NUCs, transformables, compute sticks, etc. offering only 1-3 CPU options out of 12+ possible SKUs.

From the technical side of things, increasingly fast IO interfaces require increasingly tight signal integrity, and we'll inevitably reach a point where sockets become a major hindrance to further IO speed improvements. We may already be seeing this with DDR4, which was originally poised to bring 3200 MT/s to the masses (double the DDR3-1600 mainstay) but ended up officially standardizing on 2133-2666 MT/s, only 33-67% faster.
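A quick back-of-the-envelope check of those DDR4 numbers (treating DDR3-1600 as the baseline, as the post does):

```python
# Uplift of common JEDEC DDR4 speed grades over the DDR3-1600 mainstay.
ddr3_baseline = 1600  # MT/s

for ddr4 in (2133, 2400, 2666, 3200):
    uplift = (ddr4 / ddr3_baseline - 1) * 100
    print(f"DDR4-{ddr4}: +{uplift:.0f}% over DDR3-{ddr3_baseline}")

# DDR4-3200 would have been the promised doubling (+100%);
# the 2133-2666 grades land at only +33% to +67%.
```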
 
I'll give you that Intel has always had a history of only using particular sockets for one or two generations of architectures. However, they also have a history of using multiple sockets for the same architecture. Take Kaby Lake: LGA 1151 for the "mainstream" desktops, LGA 2066 for the "HEDT" desktops, BGA 1356 for the "U" & "Refresh" mobiles, BGA 1440 for the "H" mobiles, & still they felt the need to use BGA 1515 for their three "Y" mobiles (i5-7Y54, i5-7Y57, & i7-7Y75), even though there are only minor differences between them (i.e. the mobiles tend to have "configurable TDP"), because they're all the same architecture.

And I suppose we shouldn't be surprised at Intel using a socket that technically has the same number of pins but isn't backwards- or forwards-compatible; they did the same thing with Socket LGA 2011 -- their LGA 2011-1 CPUs didn't work with LGA 2011 sockets, & LGA 2011-3 didn't work with either earlier version.
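The point about LGA 2011 can be boiled down to a trivial rule: compatibility follows the exact socket revision, never the pin count. A hypothetical sketch (the socket names are from the post; the pin-count table is just for illustration):

```python
# The post's point: identical pin counts don't imply socket compatibility.
# All three LGA 2011 revisions share a pin count, yet CPUs only fit the
# exact revision they were built for.
PIN_COUNT = {"LGA 2011": 2011, "LGA 2011-1": 2011, "LGA 2011-3": 2011}

def cpu_fits(cpu_socket: str, board_socket: str) -> bool:
    """A CPU only works in the exact socket revision it targets,
    even when the pin counts match."""
    return cpu_socket == board_socket

# All three revisions share one pin count...
assert len(set(PIN_COUNT.values())) == 1
# ...yet an LGA 2011 CPU does not fit an LGA 2011-3 board.
print(cpu_fits("LGA 2011", "LGA 2011-3"))  # False
```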

It ends up making a very confusing road map for Intel when trying to figure out when features are available. And even within the chipsets for a single generation, they weren't always consistent. Sandy Bridge introduced SATA III... but not on the H61 chipset. And some of the Sandy Bridge & Ivy Bridge chipsets still had a legacy PCI slot when others in the same generation didn't. And Haswell didn't finally eliminate SATA II ports for all-SATA III configurations until the 87-series.

Say what you will about AMD, they may have some convolutions in their sockets & chipsets, but Intel takes the cake (& no, that's not something Intel should be proud of).
 
I bought an H110 chipset with an i3-6100 for a budget build about a year ago. I did that with the hope to drop in a Coffee Lake i5 in another two years--then I'd have a solid gaming machine through 2024 or so (considering my i7-2600K still beasts through almost any title).

I did get screwed over by Intel. I will be dropping in an i5-7600 instead, I guess. But it's still a pretty big let down.

I can do a cheap build with a Ryzen R3 right now and upgrade it with an R5 in 3 years. That is actually really nice. It's what I've been telling my students to do for their $500 gaming builds lately.

But I guess Intel couldn't care less what my high school students want. Their target is people with deeper pockets... and that's exactly my problem with them.

 
A year ago, nobody knew Intel would bring hex-cores to the mainstream with Coffee Lake. So if you were planning an eventual upgrade of your i3, it would have been to a quad-core, which you can still totally do, and are planning on doing. So nothing has really changed.

Upgrading from an R3 to an R5 would be analogous to upgrading from an i3 to an i5, which again is something everyone can still do.

Or you could just spend an extra ~$50 on a cheap H310 board (or whatever they're going to be called) once they're out and upgrade to Coffee Lake anyway.

Look, I get that it kinda sucks that us current LGA 1151 owners can't get a hex-core CPU without upgrading mobos. But no one had any reason to expect to be able to make that upgrade in the first place. All this self-righteous indignation and these feelings of betrayal are just silly as far as I'm concerned.
 


Most, I suspect, adopt a far less frequent CPU/motherboard swap cycle, like the 2600K/2700K owners :)
 
Let's take the 7700K, for example, cranking out 150 fps in BF1; are those results suddenly 'woefully inadequate' if the 8700K releases and can crank out 180 fps (probably a tad optimistic, but that shows a 20% increase)? :)
 
Remember AMD is no saint in this regard either: there are effectively three AM3-era sockets, AM3, AM3+, and the revised AM3+ boards that can cope with the FX-9XXX chips.

Do you need to upgrade every time something new comes out? Unless you are making money from the upgrade, more fool you.
 


Not at all, especially if the minimum FPS are high on both of them. If anything, that extra performance ends up getting wasted, as I think there are literally only a handful of monitors that could even display that kind of FPS...although, I guess, if you can afford those monitors, you can probably afford to replace 80% of your system to go from Skylake/Kaby Lake to Coffee Lake anyway...
 

It's like they haven't heard of brand loyalty.
 


You think you can upgrade to an R5 in three years? Well, you couldn't fit an FX-8350 in an AM3 board.
 
Well, that kind of varied from supplier to supplier, didn't it? Gigabyte supported some of the AM3+ CPUs on some of their AM3 motherboards, like the GA-890FXA-UD7 (http://www.gigabyte.us/Motherboard/GA-890FXA-UD7-rev-20#support-cpu). Limited, to be sure; their GA-880GM-USB3 supported one FX CPU, the FX-4130 (http://www.gigabyte.us/Motherboard/GA-880GM-USB3-rev-1x#support-cpu), for example. And it's not like they advertised those boards as Socket AM3/AM3+ (unlike my current GA-990FXA-UD3, which they did market that way). Either way, definitely more flexible than Intel has been for some time.
 

Flexibility is of limited use when all it does is confuse the heck out of the market, much like what Intel did with Kaby Lake on X299 which is soon going to be made even more nonsensical when Coffee Lake begins doing laps around the 7800X and anything weaker next week.

Flexibility for flexibility's sake when it isn't backed by uniform support is largely pointless. Without uniformity, your "flexible" purchase is entirely speculative - it may or may not meet future CPUs' requirements, it may or may not properly support all of said future CPUs' updated or new features, etc.

As I wrote earlier, I'll believe the AM4 futureproofing hype when Zen 2 launches without requiring new chipsets and motherboards to fully support the CPUs' new features and requirements. Since AMD has admitted that Zen was a worst-case scenario (new everything), I wouldn't be surprised if Zen 2 ended up requiring AM4+ to address some of those mistakes.
 
What I mean is that I specifically bought Skylake because it was unusually compatible with an extra generation of hardware. I thought this was thanks to Intel's new tick-tock-tock cycle. Intel doesn't keep things compatible like they say.

I can count on AMD to let me upgrade to something new I'm not expecting in 3 years. That's something I appreciate.

But you're right, it is a silly feeling of betrayal. I guess it's all these years of Intel pricing things however they want that has me a bit bitter.

 


Of course, that 'something new' could be akin to the widely respected FX-9590. Or the much loved Bulldozer.
 