The Core i7-4770K Review: Haswell Is Faster; Desktop Enthusiasts Yawn

Sandy Bridge a disappointment to you? It represented Intel's biggest architectural jump in quite some time. What is actually disappointing is that a 4770K is not noticeably faster than a 2600K from 2011, other than in iGPU performance, and honestly, who buys an i7 for its integrated graphics anyway? Then there is the fact that getting to 4.5 GHz without Chernobyl part deux is nigh on impossible, unless maybe you want to drop $500 on a motherboard.
 
So why the new socket? No, really, things are just looking cluttered now. This is barely an upgrade over socket 1155, maybe 1% at most. Why even make an i7 for it? It's not much of an upgrade over the i7-3770K, and the rich extremists would just get a Socket 2011 CPU, which I still feel is barely worth the money. With this whole BGA thing, why not keep the new socket solely for tablets and laptops? That seems like the goal anyway, with the power optimizations and improved graphics, so this would have been better off kept mobile-only, on a socket separate from the desktop lineup.

That way the next i7 could have had more time to be developed and improved upon, LGA 1155 would have stayed in use, and most likely more people would have upgraded, like your typical Core i5 user. The only reason I can see for a socket replacement is more money, and Intel may in fact make less money by doing this. They should have kept these few new features for the mobile market, where they have value, upgraded 1155 instead, and never made 1150 to begin with.
 
The new socket is because the power management is now inside the CPU, and it also runs at lower voltages. It is not the features; it is the support circuitry around the CPU that forced a new socket this time...
 
So it seems Intel played a game with its Haswell release. I feel bad for the people who've already ordered one... it seems the preproduction parts sent out to reviewers ran cooler and overclocked better than the actual retail parts.

Here is an article for reference.

http://www.pcpro.co.uk/news/382267/intel-haswell-hotter-and-slower-than-expected

It seems Intel scammed people with this release. No one is hitting over 4.5 GHz on production-run chips. Some people are overheating at 4.2... the production chips have a massive temperature problem that is far worse than the temperature issues in the preproduction samples... which leads me to think they cherry-picked the best chips from their production run to send to reviewers and companies. I think we have to face the very real possibility that Haswell isn't getting past its turbo speed without insanely robust cooling, and even then it might not get far.

The fact that one person says he can't get any of his i5s over 4.2 GHz is disturbing... 50 chips and not one past 4.2?

That makes the A10-5800K look like an awesome overclocker.

I would like Tom's to look into this and see if there is any truth to it.
 

Someone de-lidded a Haswell to find out whether or not the TIM is worse than Ivy Bridge's.

What he found out:
- there appears to be a ~60 micron gap between the CPU die and the IHS
- using enthusiast paste while duplicating that gap produces worse CPU temperatures than Intel's stock paste, indicating that Intel's paste may be better quality than what most enthusiasts replace it with
- removing the gap improves core temperatures while overclocking by as much as 30C, indicating that the gap between the IHS and the die is the primary factor behind the high core temperatures

So the question is: is Intel deliberately gluing the IHS ~60 microns above the die or was that simply a bad sample/batch?
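
If that ~60 micron figure is real, a back-of-the-envelope conduction estimate makes the ~30C swing plausible. A minimal sketch; the power draw, die area, and paste conductivities below are illustrative assumptions, not measured values:

```python
# Back-of-the-envelope: temperature drop across a TIM layer, delta_T = P*t/(k*A).
# All inputs below are assumptions for illustration, not measurements.

def tim_delta_t(power_w, thickness_m, conductivity_w_per_mk, area_m2):
    """1-D conduction: temperature drop across a uniform TIM layer."""
    return power_w * thickness_m / (conductivity_w_per_mk * area_m2)

power = 100.0   # W, assumed CPU power while overclocking
gap = 60e-6     # m, the ~60 micron gap the de-lidder reported
area = 177e-6   # m^2, roughly a quad-core Haswell die (~177 mm^2)

# k ~ 1 W/(m*K) if the gap is poorly filled, ~5 W/(m*K) for good paste in contact
for k in (1.0, 5.0):
    print(f"k = {k} W/(m*K): delta_T ~ {tim_delta_t(power, gap, k, area):.0f} C")
# ~34 C for the poorly filled gap vs ~7 C with good paste in full contact:
# the right order of magnitude for the ~30 C improvement reported after de-lidding.
```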
 


I would guess the reason Intel's stuff does better in this case is that good paste is made to stay as thin as possible, so it doesn't slow down heat transfer between the IHS and the die, while the pudding Intel is using this time is thick enough to almost fill the gap... almost... So when the ~60 micron spacing is duplicated, the good paste leaves a large void while Intel's goop nearly bridges it.
*sigh* I cannot believe Intel is getting this sloppy only a few years after AMD stopped being a threat...
 

Extra 300 what? If you're making a new build, Haswell costs practically the same as Sandy Bridge, so it's definitely worth it. Of course it's not worth it to upgrade, but then it's practically never worth it to upgrade after just two years.
 


So Intel has maxed out the Core architecture; what's next? More cache, wider pipelines, etc.? I think Intel doubled the cache on Conroe at the cost of only a few extra clock cycles. Wasn't the 256 KB L2 a heated topic among Intel engineers when Nehalem came out?

 

Intel did not just "max out the Core architecture"; I think they have just about reached the practical limit of how much IPC/ILP can be extracted from typical x86 code. From that point forward, even with infinite cache and infinite execution resources, per-thread IPC would remain largely unchanged: the more execution resources you throw at a single thread, the less likely you are to find enough independent work to keep them busy.

The only ways to go from there are more threads and more cores.
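
A toy way to see that wall: on any code with serial dependencies, IPC is capped at the instruction count divided by the critical-path length, no matter how wide the core. A minimal sketch with a made-up dependency graph (the "program" below is an assumption for illustration only):

```python
# Each instruction lists the instructions it depends on (by index),
# assuming 1-cycle latency per instruction.
deps = {0: [], 1: [0], 2: [1], 3: [2], 4: [], 5: [4], 6: [1, 5], 7: [6]}

def critical_path(deps):
    """Longest dependency chain, in instructions."""
    depth = {}
    def d(i):
        if i not in depth:
            depth[i] = 1 + max((d(j) for j in deps[i]), default=0)
        return depth[i]
    return max(d(i) for i in deps)

n = len(deps)
cp = critical_path(deps)
print(f"{n} instructions, critical path {cp} -> IPC ceiling {n / cp:.2f}")
# Even an infinitely wide core cannot beat n / cp on this code.
```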
 



Whoa, hold on there, genius... According to AnandTech's 1440p articles, an A8-5600 is perfect for ALL SINGLE CARDS (so to them the A10-5800 must be a super chip... LOL).
/sarcasm

For anyone who doubts Novuake is stating reality here, just as that little inner whiny voice is about to kick in, check the comments section and Ctrl-F "jian" here:
http://www.anandtech.com/show/6985/choosing-a-gaming-cpu-at-1440p-adding-in-haswell-
It's unbelievable to me that Anand has let his site become the "whiny" voice Novuake is talking about.

Ian Cutress has a doctorate, for crying out loud, and he prints this:
"If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price competitive choice for frame rates, as long as you are not a big Civilization V player and do not mind the single threaded performance. The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580, as well as feel the same in the OS as an equivalent Intel CPU."

So ANY single GPU can do fine with an A8-5600... ROFL. Who are they trying to kid? Read my comments there, which point to articles here and elsewhere showing FAR more games than Civ5 are affected by crappy AMD CPUs. I used Tom's Hardware, HardOCP, TechReport, etc. to show they are so far off from reality they should be ashamed to call themselves a review site. It is no wonder they got a SPECIAL VISIT from AMD and nobody else did. It is no wonder they find every excuse possible not to print FCAT results, or even minimum fps... ROFL. Ian calls 1440p the midpoint of gamers when less than one percent (0.87%) play at 1440p and only 1.25% play at 1440p and up; 1920x1200 and below are the other 98.75%. Between Ian and Ryan Smith's shenanigans, they should just post a sign on their front page that says "WE DON'T REPORT FACTS ANY MORE AND LOVE AMD".

I may complain about Tom's pushing OpenCL, but at least they aren't totally misrepresenting the data; they just aren't using data I consider important. Until you can make money with OpenCL, what's the point? Supporting open platforms? I couldn't care less about that if it's slower or unused; note Tom's Hardware is about to shoot down bitcoin mining for money shortly (ASICs and botnets own that now). I can't get anything from Folding@home but a high electric bill and a warm fuzzy feeling for contributing to humanity... ROFL... I don't care that I won't get a share of the profits from the great cancer cure I helped solve; it won't be FREE after we solve it either (YOU WILL PAY for the cure even after you helped... LOL). But that is completely different from shouting a whiny voice through a megaphone like AnandTech does, saying AMD is an OK CPU choice (it is only for completely BROKE people). A discrete card requires Intel or you're wasting your money. If you're dumb enough to run a $100 CPU at 1440p, you'll still be proven wrong in a few GPU generations when 1440p is doable and you AGAIN see Intel run away from AMD once the GPU is NOT the limiter.

But AnandTech continues to think people like Novuake are on crack... And clear back to the 660 Ti article (where Ryan and I again had a go... LOL, they no longer reply to my comments) they've been recommending giving your Visa to some dude in Korea for a 1440p monitor (or eBaying it from a NOBODY)... I digress... Their Alexa report shows roughly half the traffic pretty much since that 660 Ti article :) and it has trended down ever since. The public isn't as dumb as it used to be. I hope their articles improve, or they'll tank into nothingness. While I hate Intel for forcing me to buy a crappy iGPU (which drove up watts/heat), they are still the best you can buy if you own discrete GPUs or even plan to at some point.
 


So umm, what's your point?
 
The inherent problem with overclocking ever-smaller chip dies is that the smaller and more transistor-packed the manufacturing process, the less surface area the chip has for shedding heat. I don't think anyone here would dispute that the 2500K is much easier to OC than the 3570K, and with Haswell's increase in transistors (1.4B vs. 1.2B) heat dissipation becomes dicier.
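
That intuition can be put in rough numbers as power density. A quick comparison using approximate public TDP and die-size figures; these are ballpark values, and TDP is not the actual heat output at an overclocked operating point:

```python
# Ballpark power density across three generations of quad-core dies.
# TDP and die-size figures are approximate public numbers; treat as a trend.

chips = {
    # name: (TDP in W, die area in mm^2) -- approximate figures
    "Sandy Bridge 4C (i7-2600K)": (95, 216),
    "Ivy Bridge 4C (i7-3770K)":   (77, 160),
    "Haswell 4C (i7-4770K)":      (84, 177),
}

for name, (tdp, area) in chips.items():
    print(f"{name}: {tdp / area:.2f} W/mm^2")
# Smaller dies concentrate roughly the same heat into less area,
# which is part of why Ivy/Haswell run hotter under overclocks than Sandy.
```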

Everyone's concentration on the mobile market leaves a niche open for an entrepreneurial opportunity: create a CPU that maximizes its ability to overclock on the desktop. To a certain degree AMD has done this.

Despite reports of its demise, the desktop is not dead; it is evolving into a gaming console/HTPC/server/productivity device. The market may shrink for the desktop, but as long as it serves these other functions it has value that transcends the mobile market.

Cloud computing is an invitation for hacking and government spying. If you own your own cloud, you are much more secure. At the very least you will not have MS, Yahoo, or Google bending over and greasing up for the government to probe your data.

Haswell is an improvement, but it shows that Intel is moving toward a paradigm of an SoC that can run various OSes. There has been a sea change in attitudes about cyber security, alongside the recent government attitude that it should have access to all things electronic. Haswell, as a branching of technology, is a good innovation; however, all chip makers should see that people will not tolerate Big Brother for long and will demand automatic on-chip encryption and walled-off storage, something Haswell lacks.
 
Getting ready to build into a 900D, and I currently have an i7-3820. Would it be worth it for me to go to Haswell instead of a 3770K? The only advantage I can think of is power consumption.
 

From an i7-3820? Neither would really be worth upgrading to IMO unless you are having hardware problems with your existing build or really need some of the new stuff on newer platforms.
 


^ +1.

If you want to upgrade a 3820, get a 3930K. Going with a quad-core on LGA 2011 is a waste anyway.
 




LOL what if I am on crack?
 
I won't wait in line for this.
Intel is killing its desktop business... and it won't win the mobile fight either. Most people's next desktop would be a dockable 15"-17" ARM tablet if it existed... everybody is now used to Android. If Microsoft keeps protecting Intel, with Win8.1 not being ARM compatible, then Microsoft will follow Intel into limbo (power users will go for Linux rather than use the half-baked Metro UI with a mouse).
 
Do you know what I think? I think the problem is that our benchmarks are old and shitty. These processors have capabilities that are simply underutilized, and the software we are using is clumsy. Lumbering Haswell with software that frankly still performs well enough on a C2D or C2Q is the issue.
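
For what it's worth, a quick way to see which Haswell-era ISA extensions your software could be using is to check the CPU flags. A minimal, Linux-only sketch (it reads /proc/cpuinfo, so the path and flag names are Linux conventions):

```python
# Peek at the ISA extensions Haswell introduced (AVX2, FMA3, BMI1/BMI2)
# that older benchmark binaries typically don't exercise at all.
flags = set()
with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith("flags"):
            flags = set(line.split(":", 1)[1].split())
            break

for feature in ("avx", "avx2", "fma", "bmi1", "bmi2"):
    print(f"{feature}: {'present' if feature in flags else 'absent'}")
# A benchmark compiled years ago uses none of these even when present,
# so it cannot show the corresponding speedups.
```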
 


I read it twice, but I still don't get it... are you saying the APU is crappy? Or AnandTech is crappy? Or both?
 
To all those who keep bringing up the fact that Iris Pro SKUs cost $650: that's 1K-unit pricing.

From what I read on Seeking Alpha, bulk orders from OEMs generally run around 55% of the 1K price (for mobile parts, at least).

So your $650 Iris Pro is really a ~$360 part for Dell/Lenovo, etc.
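
For the arithmetic, a trivial sketch; the 55% OEM fraction is just the Seeking Alpha estimate quoted above, not an official figure:

```python
# Rough OEM price estimate from Intel's 1K-unit tray price.
def oem_price(tray_price_1k, oem_fraction=0.55):
    return tray_price_1k * oem_fraction

print(f"~${oem_price(650):.0f}")  # roughly $358 for the $650 Iris Pro SKU
```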
 

From reviews of ARM/Android "PCs", Android appears to have a long way to go towards addressing keyboard+mouse usability on touch-less desktop displays since everything is designed with touch input in mind.

While ARM/Android is becoming a powerful enough solution for many people's everyday tasks, Android is currently far from being mouse-friendly and that may be enough to rub would-be desktop-Android users the wrong way.

It will probably take a few years for Android to adapt to people wanting to start using it on the desktop... probably too late for most people's next PC but possibly in time for their next-next computer.
 

"Most people" are not attached to mouse input. If they can ban it, they will. Anyways, basic mouse interaction (for basic word processing, spreadsheet, aso) is already supported. The next "desktop" will be a semi horizontal tablet, laying in a flatbed dock surrounded by a small keyboard (lower border), a stylus, and, maybe, a wireless mouse.
Power users (everybody but "most people") will use "true desktop" OSes that are still evolving beside the tablet craze.
 