Skylake: Intel's Core i7-6700K And i5-6600K


royalcrown

Distinguished
I agree with you; I'm not saying it's the most efficient or best solution. It's just what the end of the line will have to be if we want the fastest platforms.



 

royalcrown

Distinguished


That and they charge too much. This is coming from someone who owned a 27" Core i7 iMac, an iPad, and an iPhone. If I could've upgraded the GPU just a bit, though, I'd still have it. They really do perform well (iMacs). MacBooks are awesome, but they need a lot more airflow than Apple gives them.
 

royalcrown

Distinguished
Steve Jobs never wanting to update the hardware was the biggest reason they never got market share back when it would've been easier. That, and the ugly iMac that looked like a lamp.

Tim Cook has done a lot for the company as far as improving things; he should've been in charge a lot sooner. I do have my gripes about them, though, such as how much it costs to get more memory in anything Apple, the cost of their stupid cables, and MacBooks cooking themselves. They have problems for sure.



 


It was a double-edged sword. Without Jobs, Apple was failing (in the '90s). With him they succeeded, but they remained a niche product.
 

InvalidError

Titan
Moderator

If you want the absolute fastest CPU/GPU for a specific task, you need to go full-custom ASIC for your application.

For anything else, there will always be a set of compromises between performance in some cases and performance in others, for the benefit of better overall balance. It is extremely unlikely that any single architecture will ever be simultaneously best at everything, so there will always be a need for architectures geared more heavily toward specific applications. Look at Intel: it has the Core iX series for desktop software that relies heavily on single-threaded performance, and Xeon Phi for servers and high-performance computing, which uses many more, much simpler quad-threaded cores for massively threaded workloads. Two different use-cases that take drastically different approaches to deliver performance for the applications they are intended for.

Even 50 years from now we will probably still have a handful of different architectures for different purposes and multiple versions of each architecture to accommodate different scales or mix of computing power requirements.
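
A back-of-envelope way to see why no single design wins everywhere is Amdahl's law. The core counts and relative per-core speeds below are made-up illustrative numbers, not real Core iX or Xeon Phi figures:

```python
# Amdahl's law sketch: a few fast cores vs. many slow cores.
# All figures here are hypothetical, chosen only for illustration.

def speedup(parallel_fraction, cores):
    """Overall speedup with `cores` processors when only
    `parallel_fraction` of the work can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Desktop-style chip: 4 fast cores at relative speed 1.0.
# Many-core chip: 60 simple cores at 0.3x the per-core speed.
for p in (0.50, 0.95, 0.99):
    fast = 1.0 * speedup(p, 4)
    many = 0.3 * speedup(p, 60)
    print(f"parallel fraction {p:.2f}: 4 fast cores -> {fast:5.2f}x, "
          f"60 slow cores -> {many:5.2f}x")
```

With mostly serial work, the few fast cores win; once nearly all of the work parallelizes, the many slow cores pull far ahead, which is exactly the Core iX vs. Xeon Phi split described above.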
 


I believe Steve Jobs' design philosophy was "freedom from choice." It's actually based on human psychology and how we react to having to make decisions. If you take an average human, "Bob," and give him a task that involves pushing a single button, he will gladly push that button with very little stress. Put two buttons in front of him, one red and one blue, then tell him to push the red button when there is a red light and the blue button when there is a blue light, and you have increased his stress level slightly. Continue to escalate this with more and more levers, buttons, dials, and gauges, and Bob's stress level goes up and up. He doesn't want to have to read gauges and be responsible for making the right choice. This holds true even if the task remains simple (red / blue) but you introduce useless "fluff" buttons and gauges. Humans don't like being responsible for making choices.

Jobs capitalized on that human nature and pressured his design departments to limit the number of user decisions required for any action. Make a device that requires minimal input, so that users don't feel decision anxiety when faced with the UI. It worked marvelously for turning their products into household names.

As enthusiasts, or in some of our cases engineers, we like having the option to read gauges and make decisions. For us it's almost like a game or a test of skill: how we personally optimize a specific build, balance the GPU and CPU, tune the OS, and do all the cool stuff that makes our rigs personal. For the vast majority of humans, this would just add stress to the purchasing decision; they just want something "that works" without having to make it work.
 

royalcrown

Distinguished
Along with that, I believe he was heavily into minimalism as far as design was concerned, hence the home button.



 


There is no denying that when Jobs was alive, Apple did seem to be pushing newer, more popular products. Look at the iWatch - a big flop on Tim Cook's part there. Jobs had fantastic ideas for products and really did help bring about the modern-day smartphone and tablet.
 

ThatDevGuy

Reputable
Aug 17, 2015
1
0
4,510
It's worth noting that the Broadwell desktop "C" socket 1150 parts with superior graphics, listed in all the comparisons, are *still* not available anywhere in the USA, either for end users or in OEM systems, despite being listed by Intel as released last quarter in June. With Skylake essentially here, there's a lot of speculation that Intel is going to quietly kill off the Broadwell 1150 CPUs, having wasted a lot of reviewers' time by even sending out review samples.

It's also really sad that even relatively high-end Skylake parts like these don't have Iris Pro graphics and, as this article clearly shows, their graphics performance is far inferior to the nonexistent Broadwell "C" parts.

I get that most gamers will use discrete graphics, but there's still a very significant market for Iris Pro on the desktop. And, as of now, we have no idea when that will be available in Broadwell or Skylake. Intel is doing a *really* poor job these days with its CPU roadmap and roll-outs. I suppose we can partly blame AMD for not providing any real competition.

Ultimately, Intel is only hurting its own sales by causing many people to just keep their existing desktop hardware. It has been teasing us for nearly a year with Iris Pro for the desktop, and it's still not available. It also seems like a giant waste of all the R&D that went into the i7-5775C and i5-5675C if they're never going to be made widely available.

With any luck they'll announce something more promising at IDF, but I'm not holding my breath. Despite their press releases, and the coddling of reviewers at Tom's Hardware and elsewhere, Intel's actions say they don't care much about the desktop anymore.
 


Because there will be Skylake parts with better graphics coming out. There's no reason to waste time and money producing products that won't be widely bought when a superior version is on the way.
 

BadBoyGreek

Distinguished


Your 3930k is nothing to sneeze at. Unless you have a burning need or desire for the new features that the Z170 platform brings, the 3930k is still a very capable CPU that you should get at least a couple more years from.
 

PCDesignerR

Honorable
BANNED
Jul 30, 2014
401
4
10,795
I don't understand the negativity. It's brand new. Even the mobos and memory will be fresh and new. I honestly think that Intel still has a sucker punch waiting to be revealed, depending on what Zen is capable of. Why bring out your very best when mediocre is enough to beat your closest competitor, after all? It's far more lucrative to keep providing small incremental updates and charge a premium for every iteration than it is to say wham-bam, provide the masses with your very best effort, and then have to charge a massive premium for it.

I say be happy; every new bit of tech comes with advantages. I AM disappointed about the IGP, though. These days I find myself playing MOBA games that don't require a lot of muscle - I'd love it if I could get a brilliant CPU and a more-than-adequate IGP without having to shell out more money for a discrete GPU. I hope they do release something more top-tier in both areas on the same CPU - I'll definitely purchase something like that.

Because some people always have to be negative about something. I personally am running an i7-5960X Haswell-E and I love my processor. Get the best, forget the rest; and all of the negativity, y'all can hold onto that as well.
 

Jeff_2

Reputable
Aug 23, 2015
1
0
4,510
I'm really shocked and disappointed that Intel would take a step backward with the HD 530 integrated graphics on what is supposed to be its top-of-the-line 6th-generation 14nm Skylake desktop processors. Intel is supposed to be at the forefront of cutting-edge technology. The fact that the previous generation's Broadwell i7-5775C and i7-5775R, with Iris Pro 6200 graphics, beat Skylake's HD 530 is inexcusable! Plus, you can barely find any motherboards that support those two Broadwell desktop processors, and everyone is out of stock on them; they are ridiculously expensive at well over $400.

On top of that, the new Skylake desktop processors use 91 watts, which is more power than Broadwell (65 watts), Haswell (84 watts), and Ivy Bridge (77 watts). And yet they promised us greater speed with LESS power because of the 14nm architecture. Plus, and this one really makes me mad, the CPU PassMark for the Broadwell i7-5775C is 10,757, while the Skylake i7-6700K is only 10,807 while drawing 26 watts more power than Broadwell. Skylake is only 50 PassMark points higher than Broadwell (source: http://www.cpubenchmark.net/high_end_cpus.html). Skylake is a freaking fiasco!

I think for now I'll just stick with my Ivy Bridge i7-3770 with WQHD (2560 x 1440) graphics support and just wait for 4K support with the 7th-generation 10nm processors. Maybe Intel will get its act together by then. But physics tells me they will have a very difficult time getting the architecture down to 10nm.
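
For what it's worth, here's a quick performance-per-TDP-watt calculation from the figures quoted above (a rough sketch only: TDP is a thermal design envelope, not measured power draw, as the reply below points out):

```python
# Performance per TDP watt, using the PassMark scores and TDP figures
# quoted in the post above. TDP is not actual measured draw, so these
# ratios are rough at best.

chips = {
    "i7-5775C (Broadwell)": (10_757, 65),  # (PassMark score, TDP watts)
    "i7-6700K (Skylake)":   (10_807, 91),
}

for name, (score, tdp) in chips.items():
    print(f"{name}: {score / tdp:.0f} PassMark points per TDP watt")
# -> roughly 165 vs. 119: on these (contested) numbers, Broadwell
#    looks far more efficient per TDP watt.
```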
 
Jeff, Broadwell processors are clocked much lower, which is why they use less power. At the same frequency, Skylake uses less power. Also, LGA1150 motherboards support Broadwell, so I'm not sure how you can say you can barely find any motherboards that support Broadwell CPUs.

Also, CPUBenchmark is not a good source. It puts the 4790K above the 6700K; it's all faulty and does not represent actual performance.
 


The Skylake CPUs are a direct replacement for Haswell and have better integrated graphics than Haswell. Broadwell is targeted more at HTPCs, and it shows in how the focus was on the iGPU rather than the CPU.

Skylake's CPU is a better performer than Haswell's and uses less power. The TDP is always higher than what the chip actually draws.

There will be a Skylake CPU with GT4e graphics, which should be up to 50% faster than the GT3e in Broadwell.

The i7/i5 6000 series are meant for enthusiasts who tend to use a dedicated GPU rather than the iGPU.
 


The 3930K will easily beat any quad-core Skylake in heavily threaded tasks.
 


I would say it could be close if the heavily threaded task can take advantage of newer features Skylake has that the 3930K doesn't.

But in most cases, yes, it should. 50% more cores and all...
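
A quick sanity check of that reasoning (the 25% per-core uplift below is a purely assumed placeholder for Skylake's newer-architecture advantage, not a measured figure):

```python
# Rough throughput comparison for a perfectly threaded workload.
# ASSUMPTION: Skylake cores do ~25% more work per clock than the
# 3930K's Sandy Bridge-E cores -- a hypothetical figure for
# illustration, not a benchmark result.

IPC_UPLIFT = 1.25               # assumed Skylake per-core advantage
skylake_quad = 4 * IPC_UPLIFT   # 4 cores * relative per-core speed
sandy_hex    = 6 * 1.00         # 6 cores at baseline speed

print(f"Quad-core Skylake: {skylake_quad:.1f} relative units")
print(f"Six-core 3930K:    {sandy_hex:.1f} relative units")
# -> 5.0 vs. 6.0: the 3930K still leads by ~20% when all cores
#    are busy, matching the "50% more cores" point above.
```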
 

Faceboy

Reputable
Aug 28, 2015
2
0
4,510
I just installed the i5-6600K into an ASUS Hero 8 motherboard with a Kraken X61 for cooling, 16 GB of RAM, and an EVGA 980 Ti card. I installed the Kraken incorrectly at first and then reseated it after removing it; I did not clean off the old paste or add new paste. My CPU idles at around 40°C and pops up to 75°C during benchmarking. Based on this article, it seems it should be much lower. My question is: will properly reseating it after cleaning and applying new thermal paste fix this issue?
 

Aspiring techie

Reputable
Mar 24, 2015
824
9
5,365
What Intel needs to do at every die shrink (e.g., Ivy Bridge, Broadwell) is design the CPU the same way as the previous generation, just with smaller transistors. If Intel did that, then Broadwell's design would be exactly the same as Haswell's, just at a smaller process node. That would allow for higher clock rates and lower temps.

I'm just wondering if this would work. Am I just dumb, or is this actually a viable idea?
 