Official Intel Ivy Bridge Discussion

That's another thing that is totally down to personal preference. My PC is just down below me and I can hear it if I listen for it.
When the sound is on, even at a quiet volume, I can't hear it at all.
If I could it would be too loud in my book.
Mactronix :)
 
I think that telling us how quiet it NEEDS to be could help us make any improvements/replies to his build. Not in dB, but like... do you need to not hear it with a headset? Need to hear your own breathing over it? Those types of comparisons.
 
^That's actually really cool. The desktop doesn't sit just on carpet as it is (it's got feet), but that elevates it more. Just one question: does it obstruct airflow at all, or cause heat issues? I mean, if I put my laptop (it's pretty old and gets VERY hot) on a stand with cross beams, the laptop doesn't get much cooler. If it's on something like a frame, then it gets cooler.
 
That depends on the build itself:
what case, and what fans / how many and their placement;
high-flow fans, adjustable fans, low-flow quiet fans; 140 mm fans instead of 120 mm fans;
motherboard fan headers + BIOS HW monitoring, maybe passive cooling.
Is the case aluminum or steel, is it windowed, how thin is it?
Hardware: GPUs and fan control, a power supply with dual fans or a single 140 mm fan?

point is I can go on.
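One reason fan count matters for noise: sound levels from independent sources add on a power scale, not a linear one. A minimal sketch of combining levels in decibels; the 30 dB figures are made-up example values, not measurements of any build discussed here:

```python
import math

def combined_spl(levels_db):
    """Combine sound pressure levels (in dB) from independent noise sources.

    Incoherent sources add on a linear power scale, so convert each level
    to relative power, sum, and convert back to decibels.
    """
    total_power = sum(10 ** (level / 10) for level in levels_db)
    return 10 * math.log10(total_power)

# Two identical 30 dB fans are ~3 dB louder than one, not twice as loud.
print(round(combined_spl([30, 30]), 1))          # -> 33.0
print(round(combined_spl([30, 30, 30, 30]), 1))  # -> 36.0
```

This is why doubling the fan count only adds about 3 dB, and why one loud fan tends to dominate the total.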


-Silverstone Sugo SG-03 with two 120mm front fans
-Biostar TH67B using the Smartfan control feature
-All aluminum and steel case
-Core i7, GTX 560 Ti, 4 modules of DDR3 @ 1333 MHz, 7200 RPM HDD and a 600 W PSU

I'm a fool for thinking I can stuff all that into a small form factor case so higher noise must be expected.
 
OMG..!!!
I used to have that case.

It's not that loud (for an mATX case), but if you add the cross-flow fan it can get a little noisy.
It is louder than the average tower, yes.

That makes you the perfect person to verify that! Awesome. Yes, it doesn't do too well in the summer, and it's really only after about an hour of gaming that the CPU + GPU fans really start to spool up.

Would a Corsair Carbide be a good alternative for cooling or are there better cases you can think of?
 
Hey guys, I overclocked my IB 3570K to 4.2 GHz. Here's a screenshot of it:
34470626.jpg
 
Due to unforeseen circumstances, I became an Ivy Bridge owner last weekend (i5-3570K), and I want to make a few quick points and observations.

1. Even though it is notably quicker than my i7-860 according to various benchmarks and applications, it is amazing how small the improvement feels on a day-to-day basis. (Perhaps I am just not a demanding enough user.)

2. I am running it at a lazy 4.0 GHz overclock, and I have not touched the voltage; that all seems to be handled very nicely on the fly by my Asus motherboard, the Asus P8Z77-M PRO. However, when I overclocked it to 4.0 GHz, I lost all Turbo features. Dunno if this is how it works for everyone when overclocking, or if there is a way to turn that back on.

3. I haven't bothered to set this system up fully yet, I have heaps more tweaking and tinkering to do in the Bios, but with a highly regarded thermal paste and the Cooler Master Hyper 212 EVO, the temps on this beast are very much in control. I guess things will be different when I run the Intel Burn Test and overclock it higher.

I dunno what Sandy Bridge ran at, but when I run wPRIME on Ivy @ 4 GHz, I get around the same temps as my stock i7-860 (with Noctua cooler) did.

4. The HD4000 IGP seems like a complete dud to me. Sure I was never going to live with a 2012 Era IGP, but this is much worse than I was expecting.

I have downloaded all the Intel drivers I could find, yet when I plugged my Dell 30" monitor into the IGP output connector, I was limited to a resolution of 1600 x 1200, and stuffed if I could find a way to go up to 2560 x 1600. Dunno if there is some setting in my motherboard I failed to turn on?

At that resolution, everything looked ugly too.

Anyway, I installed my ATI HD5770 into my system and downloaded the latest drivers, and StarCraft 2 at 2560 x 1600 has never looked sweeter to me. 😀



 
talk to me Boga...

1. Last Friday when I got home from work and tried to do some internet browsing, my computer rebooted and couldn't get past the Windows start up screen.

To make matters worse, I had arranged to have the Monday off work as ISP technicians were coming around to my place to install a Fibre Optic connection box, connecting me to Australia's new National Broadband Network. So I needed to make sure I had a functioning computer for them to connect me up.

2. My i7-860 was running at stock speeds and I will buy a new hard disk for it and have it as my back up computer.

I purchased the following gear :


1 LiteON SATA DVD-RW 24X Black @$22.00
1 Microsoft Windows 7 Home Premium 64bit OEM @$95.00
1 SeaSonic S12II 620W 80+ PSU @$118.00
1 G Skill 8G(2x4G)DDR3 1600Mhz PC3-12800 CL8(F3-12800CL8D-8GB) @$78.50
1 Logitech MK550 Wireless Wave Combo @$72.00
1 Cooler Master Storm Trooper Black Case @$165.00
1 Cooler Master Hyper 212 EVO w Transparent 12cm @$39.00
1 Netcomm NB16WV ADSL2+ Wireless N300 3G Gigabit WAN Modem/Router Voip @$138.50
1 Intel 520S 240GB/SATA3/R 550MBs,W 520MBs/25nm/3.5"Kit @$359.00
2 Seagate SATA3 2TB 7200RPM Barracuda 64mb Cache @$238.00
1 Corsair 64GB Flash Voyager GT USB 3.0 @$98.00
1 Asus P8Z77-M PRO P8Z77-M PRO.Z77 4xDDR3 3xPCI-E16 GBL SATA3 USB3.0 RA @$135.00
1 Intel Core i5 3570K LGA1155 CPU 3.4Ghz 6Mb Cache Ivy Bridge @$244.00

3. Why leave Turbo Boost disabled? Having said that, I may return my CPU settings back to stock, until such time that I encounter a set of circumstances that need an overclocked i5 3570K :kaola:

4. At some point, just to try things out, I'll take the CPU up to 4.5 GHz to see what the increase in heat output is.
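For a back-of-the-envelope guess at the heat increase from an overclock like that, dynamic CPU power scales roughly with frequency and with voltage squared (P ∝ C·V²·f). A minimal sketch; the voltage figures are assumed example values, not measurements from this particular chip:

```python
def relative_power(f1_ghz, v1, f2_ghz, v2):
    """Ratio of dynamic power P2/P1, using the rule of thumb P ∝ V^2 * f."""
    return (f2_ghz / f1_ghz) * (v2 / v1) ** 2

# Going from 3.4 GHz at an assumed 1.05 V to 4.5 GHz at an assumed 1.20 V:
ratio = relative_power(3.4, 1.05, 4.5, 1.20)
print(f"~{ratio:.2f}x the dynamic power")  # -> ~1.73x the dynamic power
```

The voltage term dominates, which is why an overclock that needs a voltage bump heats up far more than the frequency increase alone would suggest.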

5. In respect of the IGP, I was going to leave the ATI HD5770 that was in my i7-860 system intact (mainly because I am a lazy bastard) and try living with the HD4000 for a while, but it was just terrible. I'm not sure if the main problem was running at 1600 x 1200 on a 2560 x 1600 monitor, but the visuals were even uglier than what my old nVidia 6800GT produced back in 2004.

The only thing I currently need a half decent video card for is StarCraft 2, I'm not tempted by any other games yet.

When Doom 4 and Half-Life 3 eventually come out, then I will buy a decent video card, something equivalent to today's nVidia GTX 670 in terms of where it sits, but obviously more powerful because a year or two will have passed.

This also means that I will now sit out the Haswell generation for desktop, as I expect that the i5-3570K at stock speeds will give me all I need, let alone if I decide to overclock it to somewhere between 4 and 4.5 GHz.

Also from now on, I will have two functioning and functional desktop computers and I will take turns in updating their respective innards, with my i7 860 internals to eventually be gutted for something in 2015 I suspect.
 
^
All I can say is nice on all aspects; the fibre optic modem makes me jealous.
Nice back-up unit when you get it sorted. DO SOME FOLDING...
Now you need an nVidia GPU, though personally I'm not really feeling the GTX 6 series.

Cheers.

God only knows when the foundries are going to go to 20nm, but it may be that my next video card is a GTX 7 series on a 20nm process, or the ATI 8970.

In checking out the benches, pricing, noise levels, heat, and energy consumption, the GTX 670 would definitely be the card I would buy right now, if there were games out that I wanted to play that needed a strong GPU.
 


I've seen this behavior before on the IGP; typically it's that the EDID is not being read properly by the IGP (sometimes due to bad timings on the EDID read of the monitor, sometimes due to [MYSTERIOUS CIRCUMSTANCES]). If the IGP can't find the 2500x1600 in the EDID, the drivers will not offer it as an option. You might want to try another cable (longer, even) to mitigate a possible timings issue. If the explanation is [MYSTERIOUS CIRCUMSTANCES], alas, I can't help you much.
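For the curious, the EDID in question is a 128-byte block in which the monitor's native mode lives in the detailed timing descriptors (18 bytes each, the first starting at byte 54). A minimal sketch of pulling the active resolution out of that first descriptor, assuming you already have the raw EDID bytes from some monitor dump tool (the synthetic bytes below are illustrative, not a real U3011 EDID):

```python
def native_resolution(edid: bytes):
    """Extract (h_active, v_active) from the first detailed timing
    descriptor of a 128-byte base EDID block (bytes 54..71)."""
    d = edid[54:72]
    if d[0] == 0 and d[1] == 0:  # a zero pixel clock marks a non-timing descriptor
        raise ValueError("first descriptor is not a detailed timing")
    h_active = d[2] | ((d[4] & 0xF0) << 4)  # low 8 bits | high 4 bits
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    return h_active, v_active

# Synthetic descriptor for a 2560x1600 native mode (illustrative bytes only):
edid = bytearray(128)
edid[54:56] = b"\x1e\x65"  # non-zero pixel clock (little-endian, 10 kHz units)
edid[58] = 0xA0            # high nibble of h_active (0xA00 | 0x00 = 2560)
edid[59] = 0x40            # low byte of v_active
edid[61] = 0x60            # high nibble of v_active (0x600 | 0x40 = 1600)
print(native_resolution(bytes(edid)))  # -> (2560, 1600)
```

If a flaky cable corrupts this block on the wire, the driver never sees the 2560x1600 descriptor and simply won't offer the mode, which matches the behavior described above.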

Is this being used as a second monitor, or sole monitor? It defaults to Clone mode with two monitors, I believe, which will not let you put 2500x1600 in place unless both monitors are capable.

You can PM me and I'll try to help you through it, but it's been a couple of years since I helped people through the quirkiness of the IGP drivers.
 
Well, nVidia used to have this problem with some particular cables (lengths). I experienced it with my old TNT2. Changing the cable won't be a foolproof solution, though. Intel drivers really have a bad reputation for a good reason 😛

And regarding the CPU change perception, Chad... unless you skip like 4 or 5 generations of improvements, the change will not be that big. I noticed the big change from my old Athlon X2 to the Phenom II thanks to the 2 additional cores and way higher speed (2.5 GHz OCed to 3.9 GHz), but in your case, there's not even a big jump in speed nor core count.

Still, looks like a very good build that still has room to grow and I agree with you that maybe you're not putting real stress into it at all 😛

Cheers!
 
tell us ALL more please...

Not much to tell, really. I'm a hardware guy, but not explicitly involved in IGP either in a design, manufacturing, or SW capacity.

I just happen to be a home theater enthusiast and because of the forums I frequent, I've been helping folks out, where possible, in getting sometimes-quirky drivers to work with their monitors in ways people customarily expect-- including but not limited to getting the resolution your monitor supports out of the IGP which nominally supports it. Sometimes I'm successful, sometimes I have to give up and drop the attempt because I'm making zero progress.

:)
 


I too have been playing with the HD4K on my 3770K, although my main GPU is a factory-OC'd HD7970 (Gigabyte). I have the HD4K connected to a 1080p plasma TV via the Asus Z77 mobo HDMI connector, and the 7970 connected to a 1080p monitor (27"), and so far the only issue I've encountered is that the HD4K doesn't want to adjust the scaling or overscan for the TV, so the desktop or a full-window app extends slightly off the visible TV screen. I haven't found any option for zero scaling ("dot for dot") in the Intel control panel yet, and I installed the latest Intel drivers.

When I have time, I'm gonna see what Lucid's Virtu does for some game benchies, as supposedly it'll use the HD4K for offloading some graphics chores (physics?) from the 7970. Haven't had the time to read up on it yet, let alone decide what and how to bench 😛.. But the main attraction would appear to be Virtu turning the 7970 completely off until needed. IIRC idle consumption is still significant with the 28nm GPUs.
 

Thank you very much for the offer and suggestions.

For now, because I have installed a discrete card, I'm happily running at 2560 x 1600, but I will try using a different cable in the weeks ahead.

It is my sole monitor, a Dell U3011.


 
I guess today is one of those days, as I'm at 83 degrees Fahrenheit indoor room temps already and it's likely to get a bit hotter. I wonder if Intel will ever make any effort to improve their stock CPU cooler enough for it to compete with Cooler Master's 212+ and EVO?