Skylake: Intel's Core i7-6700K And i5-6600K



It normally is the same but has some tweaks to the design to help foster that. Sometimes they can get the same clocks at lower temps/power draw, or higher clocks. Penryn was a good example of higher clocks on a smaller process with some tweaks to the design. Of course, that was a twofold change: not only was it a smaller node (65nm -> 45nm), it also employed a new material design for the transistors (high-k/metal gate) that allowed it to drop power consumption while increasing transistor switching speed.

The last one that did that was Ivy Bridge, which went from 32nm -> 22nm with 3D transistors.
 


I would definitely try cleaning off the old paste and applying a bit of new thermal paste. 40C at idle with a massive water cooler says you are not transferring heat to the cold plate correctly (or your room's ambient temperature is really high).

I use coffee filters and rubbing alcohol to clean off paste, and Arctic Silver for new paste. I also follow the 'tint' procedure from the Arctic Silver website instructions. GL.
 
I disagree about WHY people are negative. You are right that some people will always be crabby, and that Intel makes the best overall CPU. There just isn't a compelling reason for the average user (including the average computer ENTHUSIAST) to upgrade. Sure, if you need it because of a specialized software application and that little bit of difference adds up over the long run, then go for it. If you want to own it for the platform as a whole or just to have the "latest," that's also valid. However, these days, 100 MHz and another 2 FPS aren't worth it when the AVERAGE system can play everything and is so fast that it idles most of the time anyhow. Not for all the money it costs to get increases in the low to mid single digits. All this hype for such lackluster increases is the real reason for the apathy, IMO. Just my 2 cents :)

I've been a computer junkie since my Atari 7800, and I was born in 1970, so I've seen firsthand how things have really stagnated over the last decade (as far as CPU performance goes). I realize it's physics, but a lack of competition kills off what is left after that.







 
Seriously, who's going to buy the latest enthusiast Intel CPU, motherboard, RAM, and PCIe SSD, and then run integrated graphics? Stop complaining about integrated graphics performance.
 


Who upgrades every time Intel comes out with a new design? Who buys a new car every time a model is updated?

People should realize by now that any new model that comes out is not going to blow the doors off the previous model. You upgrade when your current system is struggling.
 

Assuming there is a new model worth upgrading to, which is often not the case with CPUs where current models are only 30-40% faster than their nearest equivalents from 4-5 years ago.
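For what it's worth, a 30-40% total gain over 4-5 years is consistent with gains in the high single digits per generation compounding. A minimal sketch (the ~7%-per-generation figure is an assumed illustration, not a measured number from this thread):

```python
# Compound a small per-generation gain to see the total uplift.
# The ~7% per generation here is an assumed figure for illustration only.
def cumulative_gain(per_gen_gain: float, generations: int) -> float:
    """Total speedup factor after compounding a per-generation gain."""
    return (1 + per_gen_gain) ** generations

total = cumulative_gain(0.07, 5)  # five generations at ~7% each
print(f"~{(total - 1) * 100:.0f}% faster overall")  # lands in the 30-40% range
```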
 


If you aren't gaining 30-40%, your system likely isn't struggling, unless you're playing a game that is just horribly designed.
 

Look at it the other way: if your system is struggling, 30-40% more performance from upgrading to the current equivalent of whatever you had likely won't make it feel a whole lot better.

This is one of the reasons why I do not bother with overclocking: by the time I might wish for the extra performance, the 20-30% performance gain from overclocking or upgrading to a higher-end CPU on the same socket wouldn't be anywhere near enough to make me happy again.
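To put rough numbers on why a 20-30% bump doesn't rescue a struggling system, a quick sketch (the FPS figures are assumed for illustration):

```python
# Assumed example: a game struggling at 20 FPS against a 60 FPS target.
struggling_fps = 20.0
uplift = 0.30  # roughly what overclocking or a same-socket upgrade buys
boosted_fps = struggling_fps * (1 + uplift)
print(f"{boosted_fps:.0f} FPS")  # 26 FPS: better, but nowhere near 60
```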
 


Five years? I doubt it. A lot of things in the pipeline will push well beyond what Sandy Bridge CPUs can do.
 


Or you could have a 4790K at stock clock rates with better performance and a lower TDP...



30-40% is a huge bump for a CPU. Imagine if the clock rate went up 40% instead: 3.0GHz would become 4.2GHz.

Except it's even better than that, because now that 3.0GHz CPU is faster than your 4.2GHz CPU, has bonus features, low-power states, a smaller TDP, can be cooled passively, etc.
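That clock-rate equivalence is easy to sanity-check; a trivial sketch using the poster's numbers:

```python
# A 40% uplift on a 3.0GHz baseline equals a 4.2GHz effective clock.
base_ghz = 3.0
uplift = 0.40
effective_ghz = base_ghz * (1 + uplift)
print(f"{base_ghz:.1f}GHz * {1 + uplift:.1f} = {effective_ghz:.1f}GHz")
```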
 

From the very modest progress over the past five years, that seems highly optimistic. We might see hex-cores trickling down to mainstream i5 status five years from now, maybe another 25% combined clock+IPC gain, with mainstream software still mostly single-threaded.
 


Turbo Boost 2.0 (3.0)? We're going to see much stronger single/dual-thread turbos for consumers. There's no reason we couldn't have an i3 that boosts up to 4.4GHz on a single thread today, and I think that's coming soon.

2C/4T produces a lot less heat than 4C/4T or 4C/8T, so theoretically an i3 should be capable of higher clock rates than its i7 counterparts, and I'm sure the only reason Intel doesn't do this is that they wouldn't be able to sell i5s.

Respectfully, I think they need to cut out the i5 completely and simplify their design process. There should be two processor tiers, desktop (i3) and performance (i7), with 95W/65W/35W (or K/S/T) versions of each. The 95W 2C/4T would have competitive or better clock rates than the 4C/8T and would be the clear winner for productivity users. The i5 just fills a niche right now for gamers and people trying to save money. If they cut down to six CPUs, I think they'd make more money, or at least market them better than they do now.
 


I don't mean on the CPU side, but the other side of things. Sandy Bridge only supports PCIe 2.1, which isn't a bottleneck now but could become one in a few years. We also have 3D NAND, HMC, etc. While I always enjoy faster CPUs, sometimes it's what is going on around the CPU that warrants an upgrade.
 


I'm a big-time gamer; I use the iGPU for streaming to Twitch and YouTube...
 