Intel: We 'Forgot' to Mention 28-Core, 5-GHz CPU Demo Was Overclocked

Status
Not open for further replies.

The lie is that they're willing and able to deliver a product with that kind of performance by the end of the year. What was implied is that it would also be in a desktop form factor, and the presentation framed it as gamer-oriented, which strongly suggests things about pricing.


There's a big difference between failing to uphold a realistic promise, backed by a good faith effort, vs. making a promise knowing that you have no ability or intention ever to come close to meeting it.

Intel has a strong track record of delivering products that work. This was built up over many decades. I'm dismayed that they would jeopardize that reputation over cheap tactics like this, which really don't have much upside for them.
 

I do blame Intel for lying, but I also blame tech journalists (especially those who should know better) for not injecting a healthy and prominent note of skepticism in their initial coverage.

We do expect tech news to cover such announcements (it's Intel - you pretty much have to), but the public also needs them to highlight when something seems too good to be true.

I credit Tom's for following this up. Certain other news outlets I've seen have yet to do so. Also, they did point out signs of external cooling.
 

Your first good point in this thread.

; )

Sometimes I get down-votes and am at a complete loss as to what they disagree with. A one or two sentence reply would at least make it clear and maybe help me get my facts straight.
 
^^Again, I am not disagreeing that this was a slimy/cheesy move. It reminds me of late-night infomercials selling crap that falls apart after a month or does not perform as expected. It also reminds me of the old days of video game developers/publishers who would enhance videos and screenshots of upcoming games that looked nowhere near as good when you got them home. That's why these days you see disclaimers on upcoming games that say "actual gameplay video" on videos or screenshots. They got the memo.

But apparently I have failed to be clear in that there is a big difference between a stage show on a pre-production product not even on the market yet, and what you buy and install in your PC. I give up and am crying uncle!
 

Yes and no. Until now, they didn't really have a reason to doubt Intel's product announcements.

To the less savvy, who hear enough "gee whiz" tech announcements, 5 GHz on 28 cores might not seem so unbelievable. You probably had to be familiar with overclocking, or with the thermal shortcomings of Intel's high core-count chips, and know their 10 nm status and HCC design pipeline, for this one to immediately raise red flags.

That's why I'm so unforgiving of those who should've known better. Because everybody else depends on them to call BS. And to the extent that Intel gets away with it, it just makes it more likely that they or others will try it again.


Were you there? From the coverage, it seems the only external sign something wasn't right were the pipes protruding from the rear, which seem to have been made as difficult to see as possible (painted flat black?).

So, nobody at the presentation saw the chiller, but you're right that its presence should've been inferred.

Funny thing is, long ago I read about how Intel had patented some kind of cryogenic chiller that was supposed to be about the size of a quart of milk. The idea being we could buy these and run some kind of LN2 loop to cool our PCs. I wonder whatever happened to that.
 

If Intel wants to be seen as no more trustworthy than late night infomercials, I'd say they're well on their way.
 

High-end silicon design has long lead times and is incredibly capital-intensive. Nvidia might be the only ones who have the luxury of "sitting on a design", right now.


First was Kaby Lake. Without Ryzen, I don't see them releasing a CPU so closely on the heels of Skylake that's just a bit overclocked, with only minor process enhancements. You could say the Haswell Refresh was similar, except that came long after Haswell's launch, and largely because Broadwell was late.

The second move was Coffee Lake. I think that's widely accepted to be a response to Ryzen, but maybe they were already thinking about it as a contingency plan for 10 nm delays.
 

You're completely missing the point. If the article wasn't clear enough about the real issues, I think my comments and others' should've clarified them for anyone truly interested in understanding the grievances and not just trying to stir the pot.


How about starting with being a reading king? Please read all my posts in this thread. Then, if you disagree with any points, feel free to reply to them.

If there's anything I can say without repeating myself, I will try to do so. At a certain point (and we might already be there), two people can agree to disagree. I can only explain why I think this was such a bad move by Intel and a failure by many in the tech press.
 
^^Thanks again Bit. That's what I meant :). The people who are into PC hardware and attend/pay attention to CES are the same as at Computex. Also the same group of people like us here at Tom's. This issue has exploded on the internet well beyond Tom's, who first started sniffing around.
 
Well, I watched the demo on YouTube. Thing is, Intel did this for bragging rights. This isn't a true product. Yes, it will be released, just not at 5 GHz. Shows how much pressure Intel is under: now they've done this demo, which can be called fiction, vs. AMD's 32-core, which is fact. AMD didn't release much info on core speeds etc., but I think they will be around 3.4 GHz with a boost clock to 4 GHz or a little more. Hard to say at this point in time.
 
That is hilarious coming from a company such as Intel.
I kept defending Intel on the forums, saying their 10 nm has always been ready but they were intentionally delaying it.
This time, I hope I'm dead wrong and AMD eats them for breakfast next year! They are unbelievable, worse than a child. They could have been frank and just mentioned that it was overclocked to show it in its best light.
 
"The area we were in simply didn't have enough dedicated circuits for the task [to handle 2300W]."

Didn't have a SINGLE 20-amp circuit? Novice.
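For anyone checking the math behind that quote: a branch circuit's capacity is just volts × amps, and electrical code conventionally limits continuous loads to 80% of the breaker rating. A quick sketch (the 120 V supply and the 80% continuous-load derating are standard North American assumptions, not figures from the article):

```python
# Rough check of whether common branch circuits can feed a ~2300 W demo rig.
# Assumes standard North American 120 V circuits and the usual 80% rule
# for continuous loads (a code convention, not stated in the article).

def circuit_capacity_watts(amps: int, volts: int = 120, derate: float = 0.8) -> float:
    """Continuous-load capacity of a branch circuit, in watts."""
    return amps * volts * derate

demo_load_w = 2300  # chiller + system, per Intel's quoted figure

for amps in (15, 20, 30):
    cap = circuit_capacity_watts(amps)
    verdict = "OK" if cap >= demo_load_w else "insufficient"
    print(f"{amps} A circuit: {cap:.0f} W continuous -> {verdict} for {demo_load_w} W")
```

Interestingly, under the 80% rule even a single 20 A / 120 V circuit (1920 W continuous) falls a bit short of 2300 W, which may be exactly why Intel cited a lack of dedicated circuits.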
 
I dunno. 5 GHz on 28 cores, when Intel was hyped about a CPU that can only boost 1 core of a 6-core processor to 5 GHz, and their 18-core has an all-core turbo clock of 3.4 GHz, and Skylake-X runs stupidly hot. It should have taken you all of 3 seconds to say either Intel has pulled out the magical pixie dust, or something isn't right here.

I watched the event live, and I looked at the tubing leading outside of the computer, which was clearly visible and wrapped in duct tape, and thought: well, that's sub-ambient cooling, so this is an overclock, and an unrealistic one at that.

And you guys are the tech press, and I am some random dude who doesn't have any real specialty in tech... Please just think before you write an article next time, okay?
 
Hey guys, maybe think before you post an article in future, like, seriously. The tubing leading outside the computer covered in duct tape was a dead giveaway.

Seriously, the 18-core Skylake-X consumes 500 W+ at 4.7 to 4.9 GHz, and they were just hyping a 6-core CPU that managed a 5 GHz clock on a single core... In what world can Intel improve clock speeds that much without a new architecture or node? Cascade Lake is an optimization on the 14nm++ node with Spectre and Meltdown hardware mitigations, and 14nm++ isn't a new node: it's the same node Coffee Lake uses, the one where Intel was bragging about a 6-core processor with a single core that can boost to 5 GHz.
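The skepticism above can be put in rough numbers. Dynamic CPU power scales roughly with frequency × voltage², and voltage generally has to rise with frequency, so power grows close to the cube of the clock. A back-of-envelope sketch (the 500 W / 4.8 GHz baseline is the commenter's figure; the cubic scaling is a textbook approximation, not measured data):

```python
# Back-of-envelope: scale a known power figure to a higher clock using the
# common dynamic-power approximation P ~ f * V^2, with V roughly proportional
# to f, giving P ~ f^3. All numbers are rough assumptions for illustration.

def scaled_power(p_base_w: float, f_base_ghz: float, f_target_ghz: float) -> float:
    """Estimate power at f_target assuming power scales with frequency cubed."""
    return p_base_w * (f_target_ghz / f_base_ghz) ** 3

# Commenter's figure: an 18-core Skylake-X draws ~500 W around 4.8 GHz.
p18 = scaled_power(500, 4.8, 5.0)  # 18 cores pushed to 5 GHz
p28 = p18 * 28 / 18                # naive linear scaling to 28 cores
print(f"~{p28:.0f} W")             # well north of 800 W
```

Even this crude estimate lands far beyond anything a retail-boxed CPU and cooler could sustain, which is the whole point of the comment above.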

Can we PLEASE think before we write.

 


I'm not sure if you are referring to other press here, but we reported that it was overclocked before we even saw the tubing. So, yes, we did think, and reported accurately from the get-go. That obviously didn't happen everywhere.

From our initial coverage:

The impressive display of multi-threaded performance probably consumed a hideous amount of power, which wouldn't lend itself well to a reasonable TDP rating for the processor. As such, we assume the processor was overclocked for the presentation.


...and then we spent an entire paragraph talking about the power implications of a setup such as this.
 
I totally blame AMD for this Intel 28-core fiasco. If AMD did not produce the 32-core monster, Intel would have produced 4-core beasts for the next 10+++++ years using 14nm+++++! It's all AMD's fault for making Intel intentionally forget about the 1 hp chiller.
 


People have been saying that every decade or so since the '80s. Cyrix or Via, anybody? I'm not saying nobody ever enters the market with an x86 chip, just that the best anyone does is briefly achieve rough performance parity for a few years before disappearing entirely. That's different from being competitive.

Do things change? Sure. Is AMD putting chips in PCs and the datacenter? Yeah, and they have before. Stuff like Athlon and Opteron were viable alternatives for a few years before AMD abandoned the market entirely for long periods of time (leaving VARs hanging, I might add)... but again, that's different from being competitive.

I hope things are better this time around, but I've seen this movie about a half dozen times before. It's going to take more than a chunk of the "gamer" PC market for a year or two (during a historically rare period of process-node stagnation by Intel, I might add) before I share your enthusiasm.

Oh, and BTW, you're wrong about AMD being the only company to develop all three of the PC components you describe, but since I apparently "don't keep up with current events", I'll let you figure out who the other is.
 


Lol
 


I don't think that it's even remotely reasonable to compare Cyrix and Via to AMD. First, AMD started about 2 decades earlier than either Cyrix or Via.

Second, AMD has not only been near or at parity with Intel in the past, but at some point had actually surpassed them, performance-wise, at least in the Thunderbird Athlon and Duron, during the Pentium 3/Pentium 4 transition era. If there were other times that they'd outdone Intel, I don't know, as I was sort of out of the PC game in general for several years.

Your assessment is way off.

Did AMD slack off for a while? Yes. But then again, so did Intel - not to mention their slipping up a bit in the late P3 early P4 days.
 


I don't need to figure out anything. The mainstream consumer market (keyword: CONSUMER) for desktop/laptop and console gaming comes down to Intel, AMD, and Nvidia. Sure, you can buy a Raspberry Pi or a Qualcomm processor too, at the corporate level. But is that market leading with the big dogs who do the Big Three? Nah. That's my overall point. Let me know when we see a gaming Qualcomm CPU/APU outside of a smartphone or server. I'll wait.
 