The Core i7-4770K Review: Haswell Is Faster; Desktop Enthusiasts Yawn

Do you really believe those microseconds make an economical difference for most consumers? From 1st to 4th gen, plus all the other parts that will need to be "upgraded" just to feel the presence of a fourth-gen processor... get a Coke, it's less expensive!
 


😀

I've got my 'old' Opty s939/940 '4' copper heatpipe coolers (with solid copper base) in my will ... still using them to this day. I just moved one into my 'new' HTPC in the last few weeks.



PhII 965BE @ 3.8GHz, 58C, undervolted to 1.3V on my 'stock' cooling :lol: It's got to be 7-8 years old.


 

True, but then they don't qualify for the "Ultrabook" tag, because it's an Intel term for a thin, light notebook. Don't know if it's a formal trademark, though.

But yeah, they'll lose out on the marketing and support from Intel.

Another reason why I'm skeptical about it is that even with a 17W part, Ultrabooks can barely keep cool externally, or even internally, under load. Their CPUs hit 70°C easily.
 

How many people actually care about Intel's Ultrabook brand? I know I don't - whenever I see the term, the only thing I associate with it is overpriced, under-featured ultra-portable machines. I would gladly pick an ultra-portable that weighs slightly more and is slightly bigger without sacrificing I/O or having a bunch of features I do not need (ex.: touch) forced in.

As for temperatures, since when has 70°C been remotely close to being a problem? Ivy Bridge is rated for a 105°C TJmax, so even 90°C is still well within Intel's warranty spec. BGA models have no IHS, which means no IHS-to-die gap to worry about, so cooling those embedded chips should allow a fair bit more creativity with heat dissipation.

All that heat does need to go somewhere once removed from the chip, though. For 17W there is a simple way: put the CPU and other critical/hot components in the LCD/lid and use the metal lid as a heatsink - this is exactly what convertibles do by putting just about everything behind the display and reducing the power/thermal budget while in "tablet" mode.

For 45W things may get a little more challenging due to the need for active cooling, so there may be no getting away from a bit more height and weight... One of the more outlandish ideas I can think of would be an active heat pipe (a hybrid between a heat pipe and a closed liquid loop) to spread heat evenly across the whole lid. A 12" LCD's lid has ~300X the surface area of a TO-220 package, so it should be able to dissipate 50W with less than an 8°C temperature rise over ambient.
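Roughly, that last estimate works out like this (a back-of-the-envelope sketch; the TO-220 free-air rating, the area ratio and the assumption that thermal resistance scales inversely with exposed area are all my own round numbers, not measurements):

```python
# Back-of-the-envelope check of the "lid as heatsink" idea above.
# Assumptions: a free-air TO-220 has a junction-to-ambient thermal resistance
# of roughly 50 C/W, and thermal resistance scales roughly inversely with
# exposed surface area.

TO220_THETA_JA_C_PER_W = 50.0   # assumed free-air rating for a bare TO-220
AREA_RATIO = 300.0              # ~12" lid surface vs. a TO-220 (rough estimate)
POWER_W = 50.0                  # heat to dissipate

lid_theta = TO220_THETA_JA_C_PER_W / AREA_RATIO   # ~0.17 C/W
temp_rise = POWER_W * lid_theta                   # ~8.3 C over ambient

print(f"Estimated lid thermal resistance: {lid_theta:.2f} C/W")
print(f"Estimated temperature rise at {POWER_W:.0f} W: {temp_rise:.1f} C")
```

Which lands in the same ballpark as the ~8°C figure above, given how crude the scaling assumption is.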

Where there is a will, there is a way.
 
Intel does have more powerful processors - 8-core parts with 4-channel memory, better bus performance and a few more extensions added as well - but sadly they are all branded Xeon and are available only at skyrocketed prices. I'd rather buy an older Core i7 or Core i5, like this one http://corei7.com/amz-product/intel-core-i5-3570k-quad-core-processor-3-4-ghz-4-core-lga-1155-bx80637i53570k/
 
I love how the review of the new Intel CPU did not include any game benchmarks with a discrete GPU. Anyone taking bets on the framerate variance between this and the i7-920? Put me down for 8fps. It's really to the point that Intel is spending all of its money trying to edge out AMD in every corner of the market and has done just about nothing to push gaming performance. Sure, content creation performance is improved, but games are what drive the industry.

If Intel is spending all of its time making onboard GPUs to compete with AMD's APUs in the budget sector, then the high-end gaming people are the ones who suffer. Sure, there are more budget buyers, but they buy a $200 CPU and a $50-100 mobo while I buy a $600 CPU and a $300 mobo.

Plus, how many times have we heard that PC gaming is dead, yada yada yada? You think we'd have been hearing that for the last 6 years if games required more computing power than the 360 and PS3 can offer? No. GPUs have outperformed the consoles for the entire 10-year run of the current generation, but it hasn't been enough to call the PC superior. That's why Intel has screwed the pooch on the last few generations of CPU.
 

There is nothing left for Intel to do. Clock rates are about as high as they can go without compromising IPC, execution latency and TDP, at least until 14nm comes along - and that's only if it turns out better for clock rates than the 32nm-to-22nm transition did.

Game developers who need more processing power have no choice but to get used to writing multi-threaded game engines if they want to tap multi-core and multi-threaded CPUs' performance; even more so on next-gen consoles featuring AMD's octo-core SoCs running at merely ~1.6GHz which means less than half the performance per core/thread to work with compared to desktop CPUs.

The ball has been in game developers' court for several years already, but most of them are dragging their feet on embracing multi-threaded programming because writing and debugging multi-threaded software can easily get very complicated when threads need to interact asynchronously with each other.
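For what it's worth, the kind of job-based threading being described might be sketched like this (illustrative only; the per-frame task names are hypothetical, and a real engine would do this in native code since Python's GIL limits true CPU parallelism for threads):

```python
# A minimal sketch of a job-based frame update: independent per-frame systems
# are fanned out to a worker pool and joined before the frame is presented.
# The hard part the post alludes to -- ordering and shared state between these
# systems -- is exactly what this toy example glosses over.
from concurrent.futures import ThreadPoolExecutor

def update_physics(world):      # hypothetical per-frame task
    return f"physics({world})"

def update_ai(world):           # hypothetical per-frame task
    return f"ai({world})"

def update_animation(world):    # hypothetical per-frame task
    return f"animation({world})"

def run_frame(pool, world):
    # Independent systems run concurrently; the frame waits for all of them.
    futures = [pool.submit(task, world)
               for task in (update_physics, update_ai, update_animation)]
    return [f.result() for f in futures]

with ThreadPoolExecutor(max_workers=4) as pool:
    for frame in range(3):
        print(frame, run_frame(pool, world=f"frame{frame}"))
```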
 


Heh, yeah right. If that were true then nobody would be so easily running SB/IB CPUs at 4GHz+ without any need for a voltage increase. 😀 Intel can do better, they just don't have to. That's why they deliberately nobbled IB's cap design so it ran hot.

And what's wrong with, just once, not trying to shrink the damn TDP every single time? Leave it as it is and give us more performance instead, something that's worth an upgrade, instead of the repeated pointless revamps and socket changes we're seeing atm.

Ian.

 

Most people these days are more interested in quieter, cooler systems than higher performance and today's CPUs still have tons of untapped performance due to most software still being largely single-threaded.

If you do a CPU% graph while running today's most CPU-intensive games, you will likely find out that most of them do not even use 50% of an i5-3570's theoretical performance because they aren't adequately threaded.
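If anyone wants to check this on their own machine, a quick per-core sampler along these lines would do (a rough sketch; it assumes the third-party psutil package is installed, and the sampling interval and duration are arbitrary choices):

```python
# Capture a per-core CPU% trace while a game is running in the background.
import psutil

samples = []
for _ in range(30):                                   # ~30 seconds of data
    per_core = psutil.cpu_percent(interval=1, percpu=True)
    samples.append(per_core)
    print(per_core)

# Average load per core over the run; on a poorly threaded game you would
# expect one or two cores near 100% and the rest mostly idle.
cores = len(samples[0])
averages = [sum(s[i] for s in samples) / len(samples) for i in range(cores)]
print("average per-core load:", [f"{a:.0f}%" for a in averages])
```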

If there were many games that ended up pushing i5s to 100% usage on all cores, Intel might be more inclined to increase mainstream core counts.
 


People buying enthusiast hardware are very often using threaded apps (not everyone's a gamer). Plus, modern cooling options, PSUs, etc. all allow performance systems to run quiet anyway.

Producing mobile-focused chips for a desktop market is irritating. We already have the low end of the PC market being eaten away by tablets and the like, understandably so; if Intel doesn't bring out new mainstream and upper-end desktop CPU products that justify the cost in terms of improved performance, then it's inevitable that the upper end of the PC market will also decline. Just look at the IB-E review - that chip is totally pointless. X79 needs an 8-core or 10-core i7 to remain relevant; the chipset can certainly handle it, so what does Intel think it's playing at? Such a shame.

Ian.

 

The high-end/enthusiast segment is only ~5% of PC sales. Investing large amounts of resources to remain relevant there is of relatively little importance.

At the low end, performance has already risen above most people's requirements, so unless average users' processing needs suddenly jump, even the mid-range will start struggling, simply because most people do not need anywhere near that much processing power. Many people are still perfectly happy with their 4+ year old Core 2 Duo, just like I was until I started running out of RAM on a regular basis.

While enthusiasts, high-end gamers and professionals always want more power at (almost) any cost, the harsh reality is that most normal people (the vast majority of the PC market) have not had any use for it in years.
 
InvalidError writes:
> The high-end/enthusiast market share is only ~5% of PC sales. ...

That's misleading - many high street stores make a massively higher percentage of their profits from high-value items, because those are the ones that carry a useful margin. Many budget items, especially HDDs, are often sold at no profit or even a loss in order to keep custom coming in. And that comes from shop keepers I talked to in the US, so it's not just a UK thing, though a computer store owner I knew in Preston told me much the same.

Besides, the push in recent years has without question been to make it so that ordinary users can mess about with video and images in ways never before possible, which requires a lot more compute power (and GPU power) than Core2-era tech had. This is only going to get worse as people expect to be able to work with HD, etc. You can't do this with cloud tech; there isn't the bandwidth and it's not reliable.


> While enthusiasts, high-end gamers and professionals always want more power at (almost) any
> cost, the harsh reality is that most normal people (the vast majority of the PC market) have not
> had any use for it in years.

That much is true, but the majority of users does not equal the majority of revenue. Plus, there's the professional space to consider, which does want a lot more compute power (ask any movie company). Intel is idling, and in the long run it's going to hurt them. If they don't make properly more powerful CPUs, it just encourages companies to look elsewhere for the gains they need, such as GPUs, which certainly isn't going to benefit Intel.

Core2 was a useful leap over what came before. Nehalem was a leap. So was SB.

IB wasn't and neither is HW.

Ian.


 

GPUs are in a completely different league from CPUs, one that CPUs have absolutely no hope of ever matching for tasks GPUs can handle efficiently, so you are expecting the impossible from Intel there. The only way Intel can compete with GPUs on those workloads is by massively upgrading its IGP rather than its CPU cores.

As for IB and Haswell "not being a useful leap", each of them at least doubled IGP performance at any given price point over their respective predecessors. I would take that as a strong indication that Intel views the entry-level market as being of much higher strategic importance than high-end.

Revenue-wise, if Intel sells around 19X more mainstream parts than high-end parts, that is still over $1500 of mainstream parts for every $700 worth of high-end parts, so mainstream still beats high-end on revenue and profit. The low-end and mid-range markets are about twice as important to Intel's continued success as high-end consumer chips, so it's no surprise that Intel puts far more emphasis on the stuff that matters for mainstream PCs and laptops.
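Written out with some assumed average selling prices (the ASPs below are my own round numbers for illustration, not figures from the post or from Intel), the arithmetic looks like this:

```python
# The rough revenue comparison above, written out.
HIGH_END_SHARE = 0.05                 # ~5% of unit sales (from the earlier post)
mainstream_per_high_end = (1 - HIGH_END_SHARE) / HIGH_END_SHARE   # 19 units

MAINSTREAM_ASP = 85.0                 # assumed average mainstream CPU price ($)
HIGH_END_ASP = 700.0                  # assumed high-end CPU price ($)

mainstream_revenue = mainstream_per_high_end * MAINSTREAM_ASP     # ~$1615
ratio = mainstream_revenue / HIGH_END_ASP                         # ~2.3x

print(f"{mainstream_per_high_end:.0f} mainstream units per high-end unit")
print(f"~${mainstream_revenue:.0f} of mainstream revenue per ${HIGH_END_ASP:.0f} high-end")
print(f"mainstream is roughly {ratio:.1f}x the revenue of high-end")
```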
 


I was talking about mid-range and top-end desktops; IGP is completely irrelevant there. Is that really all you can point to as a 'useful leap'? If anything, that just proves my point. IB/HW are signs Intel is simply treading water wrt CPU performance because it doesn't have to do any better.

Ian.

 

And the main reason Intel does not have to do "any better" is that it gets the bulk of its profit from the lower end of its product range, which shares the same general die design, so it optimizes its base design and the related building blocks around that.

If you need more raw CPU performance than the mainstream-oriented chips on LGA1150/1155 can deliver, get an i7-4930K/4960X or a Xeon SP/DP... or maybe AMD's new FX-9590.

As I have been saying for years: if improving CPU performance were so easy that Intel could do it at will, AMD would be able to do (most of) it too. AMD's ongoing failure to catch up with Intel indicates that improving CPU performance is much harder than people think it is.
 

It does not really matter how much more money they have when they are butting heads against practical and theoretical maximums: you can reduce the frequency of branch mispredicts by making prediction buffers larger, but larger buffers are slower, so increasing them may increase execution latency and reduce overall throughput. The same goes for larger register files, caches, execution pipeline depth, etc. Once you have found the optimal size for each structure on a given process, architecture and target clock range, there is nothing more that can be done about individual CPU core performance, no matter how many more billions you may want to throw at it.
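To make the buffer-size trade-off concrete, here is a toy 2-bit saturating-counter predictor (purely illustrative; the branch addresses and table sizes are arbitrary) showing how a small prediction table loses accuracy to aliasing while a larger one does not - the catch being that, in hardware, the larger table is also slower to access:

```python
# Toy 2-bit saturating-counter branch predictor indexed by branch address.
def predictor_accuracy(trace, table_bits):
    size = 1 << table_bits
    counters = [1] * size                 # 2-bit counters, start "weakly not taken"
    correct = 0
    for pc, taken in trace:
        idx = pc % size                   # different branches alias in a small table
        prediction = counters[idx] >= 2
        correct += (prediction == taken)
        counters[idx] = min(3, counters[idx] + 1) if taken else max(0, counters[idx] - 1)
    return correct / len(trace)

# Two branches with opposite behaviour whose addresses collide in a 16-entry table:
trace = [(0x400, True), (0x410, False)] * 500
for bits in (4, 8, 12):
    print(f"{1 << bits:5d}-entry table: {predictor_accuracy(trace, bits):.0%} correct")
```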

When was the last time Intel, AMD, IBM, ARM or any other CPU designer introduced a fundamentally new structure or technique in their CPUs? The newest thing I can think of is NetBurst's trace cache, an idea Intel later revisited in its Core architectures; just about everything else in modern CPUs harks back to the DEC Alpha and SPARC days. So no fundamentally new CPU structure in 10+ years from anyone. They are doing little more than tweaking the balance of well-documented structures, adding new instructions, integrating more stuff into their CPUs and updating external IO interfaces.

The apparent halt in fundamental CPU design progress does not stop with Intel; it is industry-wide, because the whole industry has run out of tricks to push per-core performance any higher. The only reason SoCs are progressing so quickly is that they started from so far behind; their progress will slow down as they catch up.
 
Only one upgrade path from Sandy Bridge, and that's socket 2011. A 3820 build will cost about the same, give you quad-channel memory support and not run as hot. Many also get a good 4.7GHz out of the 3820. As far as Ivy and Haswell go, any CPU you need to delid to keep cool, even on water, is what I would call bad quality control and a defective chip right from the start. Yes, I own a Haswell, APUs, a 3960X and a bunch of 1155-socket chips, as well as a bunch of AMD chips, and TBH if you're getting a CPU to run close to stock clocks, Haswell isn't bad at all. Hell, I was playing Battlefield 3 just tonight with no GPU and it was smooth and very playable.

My suggestion would be: if you have the cash, go for a 3930K.
Or if you just want to spend what you would on a Haswell, go with the 3820.
Lastly, if you're already on Sandy or Ivy, stay there, because nothing exciting is going to come from upgrading either of them.

Personally I love all CPUs, but that's because I mainly use them for testing and benching; my favorite, not counting my 3960X of course, would still be my Sandy 2700K.

Just my 2 cents :)

MybadOmen
 
I just got the i7 4770 because I see no reason to overclock when I can just get on with things until a faster chip comes along at a reasonable price.
Overclocking just takes up precious gaming time.
 


When you buy high-end equipment and you don't intend to break an OC record, you don't need to OC to get great gaming performance.
 
I would say this article is actually pretty kind. I have gotten two 4770Ks and only one was able to get to 4.3GHz @ 1.25V stable. The first CPU I got would not even break 4.2 on cores 1 & 2, or 4.1 on cores 3 & 4, without stability problems. I spent over a month researching and adjusting settings, which leads me to advise: save your money and wait for something better unless you absolutely need to upgrade. Your chances of a decent overclock are a gamble at best.
 


BOTH suck :) Further, you're crazy if you pair a 7970 etc. with this low a CPU (any Trinity, Richland etc., and I'd probably say the same about Kaveri). Your card will be held back in MANY games, and it gets worse year after year. It is at best a poor gaming man's CPU (APU) and shouldn't ever be paired with a $250+ GPU unless you really can't afford Intel or can't wait until you can afford it.

Of course it goes without saying I'm happy about anyone improving the lowest common denominator for gaming, but they still suck for any serious gamer (I don't call Minecraft gaming... LOL). Intel won't hold your NEXT GPU purchase back, but AMD surely will, as it already does with current $250+ cards (heck, lower cards in quite a bit of stuff too; even FX chips hold high-end cards back). If you are a serious gamer you are insane building an AMD-based machine. I'm talking CPUs here; the GPUs are pretty OK, though I'd always opt for better drivers and lower heat/noise (which this year means NV only for me, at least until 20nm brings AMD's heat/noise down).

I might find AMD bearable in a few months as all the OEMs add their own fan/heatsink solutions (though I will still likely recommend NV to friends), but for now they are a no-go, as voiding my warranty on day one to fix AMD is ridiculous, unless you're a water guy who doesn't care and would void it anyway ASAP. I don't see the point in spending $75 on a fan/heatsink as opposed to just buying NV cards from the get-go with a great game bundle (but a water guy has probably already spent his money, and is potentially re-using old stuff).

The second I add $50+ to AMD's GPUs I might as well buy a better card with no need to void the warranty, one that runs cooler and less noisy out of the box. But I voided my Xbox 360 console warranty seconds after a 72hr burn-in was done... ROFL. I did the same to my PS2 ages ago, which I considered a no-brainer as I didn't buy until they hit $200/$130 (bought two), whereas my Xbox 360 was $300. So who am I to judge 😉 If you can afford a $400-550 loss next week, I guess the warranty means as much to you as my console warranty did :) I could afford a $300 loss at the time of my Xbox 360 purchase, and $200/$130 didn't even make me ponder the warranty on the PS2s... LOL.

I like AMD far more than Intel, but my wallet can't afford choosing second best on purpose, knowing I will be bitten later, if not next year, with a 20nm GPU purchase. I will hate an AMD CPU then, and I will REALLY hate it at my next purchase, no matter which AMD CPU/APU you buy today. And that is only counting gaming as important. It's like doubling down on a stupid buying decision if you value your CPU for apps that do real work (I'm not talking browsing, Outlook or Word here).

I'm still baffled by Toms pushing OpenCL. Why use a TITAN for PS CS6, After Effects CS6, Maya etc. and NOT turn on CUDA? Is Tom's Hardware out of its mind? Nobody runs like this, and nobody buys a CPU for serious work that a GPU can do 5-20x faster. That's just dumb, right? We are ALL far more interested in something we may actually RUN - like a CUDA card running CUDA, with AMD/Intel CPUs, and seeing which one lets my GPU run as fast as it can. Who buys an NV card (a $1000 Titan in Tom's Hardware's case... LOL) and turns off CUDA in an app that has run CUDA for years (AE/PS CS6 both run it, Blender, etc.)?

I'm confused by their compression tool choices too. Nobody uses 7-Zip or WinZip for packing stuff; WinRAR is the default for newsgroups etc. I haven't downloaded a 7z or zip file in ages (or anything else - everyone uses RAR). People who compress all day use WinRAR. They need to switch to WinRAR, and add PAR2 tests, since the same people who use WinRAR also use PAR2 for recovery files, and creating your own for backup purposes can take ages on a slow CPU. Just try packing a Blu-ray and creating some PAR2s for it :) Come back after you eat and take a trip to the store and maybe both jobs will be done on a 3GHz dual core (doubtful though - it had better be a long store trip).
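For anyone who wants to time that workflow themselves, a rough harness might look like the following (it assumes the WinRAR `rar` command-line tool and par2cmdline are on the PATH; the input file name and the 10% redundancy figure are arbitrary choices for illustration):

```python
# Time a RAR pack plus PAR2 recovery-set creation over the same input.
import subprocess
import time

INPUT = "testdata.bin"          # hypothetical large input file

def timed(cmd):
    # Run an external command and return how long it took, in seconds.
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

rar_seconds = timed(["rar", "a", "archive.rar", INPUT])
par2_seconds = timed(["par2", "create", "-r10", "archive.par2", "archive.rar"])

print(f"rar pack : {rar_seconds:8.1f} s")
print(f"par2 10% : {par2_seconds:8.1f} s")
```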

Tom's Hardware needs to change a LOT of its benchmarks to stuff we ACTUALLY USE. No, I'm not talking OpenCL here. 90% of the workstation market uses CUDA and NV cards (and 65% of the discrete GPU market is NV). Call me crazy, but I'm thinking CUDA tests should be in EVERY article testing CPUs, and even in GPU articles that include any app testing. It's fine to test CUDA vs. OpenCL, but don't just show OpenCL results as if I won't go home and run CUDA, and it's unacceptable to benchmark them separately as if they don't compete in the very same apps you test with (all Adobe, 3ds Max, Maya etc. should be running CUDA for NV and OpenCL/OpenGL for AMD). It is useless to show NV/OpenCL when NV/CUDA is sitting there ready to go in the same app. It should be illegal to own a computer and do this... LOL. Yes, it's just that dumb. They should be able to take your PC away if you're dumb enough to turn off CUDA when it's available and opt for OpenCL. You don't deserve a PC in that case 😉

Is Toms going to test BF4 on NV and EXCLUDE Mantle because NV can't run it? HECK NO. They will benchmark Mantle vs. NV without Mantle, because YOU SHOULD DO THIS. So why not do the same with CUDA vs. AMD (run AMD any way you want, but NV MUST have CUDA ON)? You buy a Mantle card hoping Mantle will be important soon, right? We would call Tom's Hardware stupid if they ran AMD with Mantle OFF, right? And that is a market share we'd call NICHE at best for likely the next year-plus. CUDA, on the other hand, is already in 200+ apps, including most of the apps Toms uses. As noted before, NV/CUDA owns 90% of the workstation market (that's a HUGE majority, nowhere close to niche). CUDA at 90% is a virtual monopoly, correct? But Tom's Hardware ignores it, while I'd bet money they won't ignore Mantle, which will probably only make it into 1-3 games in the next 12 months (BF4 and what else?). To make matters worse, according to devs at AMD's recent event, you MIGHT see a 20% perf improvement. They said they wouldn't bother for 5%, but I think they shouldn't have bothered for 20% either, knowing it wasn't going into the consoles, which leaves you with a REALLY small niche on PCs until it's the only thing AMD sells and it has been out for a year or three. Make no mistake, AMD knew this all along, despite not openly commenting until MS said NO MANTLE, which killed many hopes and articles, including Anandtech's Mantle fantasy article (wow, Ryan really blew that one, huh?).

Toms needs to get a grip on reality. The second I see Mantle discussed, I expect to see some words on G-Sync too, as that is a major reason for me (and I'm not alone) to buy NV next time, knowing a 1440p monitor purchase isn't far down the road for me. It's a feature that just works if you have the hardware, whereas I have to HOPE Mantle is used in every game by every dev (not likely, as they are not a majority and Mantle isn't in consoles or mobile) and it makes them NO extra money doing it. G-Sync earns monitor makers a premium (it should for a while), and will likely sell some new monitors by next Christmas to people who might otherwise have waited until their current model dies.
 