Official Intel Ivy Bridge Discussion



lol

I would try to add something, but I just can't. Well played, sir.

And anyway, on the mini GB subject... I don't trust them since a friend got a 790FX-chipset board (I think it was a UD5 or something) and it only lasted 4 months.

Cheers!
 

josejones

Distinguished
Oct 27, 2010
^ That's actually wise advice. I did wait as long as I could at the advice of others; I simply couldn't wait any more. These Z77s first came out in what, like March? I've been trying to upgrade for 3 years, but something would always come up and I couldn't come up with the funds or whatever. I'd been wanting an Ivy Bridge i7 ever since the epic failure of Bulldozer.

I'd rather have a 2.0 or 3.0 revision for sure. So, I just called the giggles again and asked about the revision issue you just brought up, and the giggles said they were at F8 now and that it has taken care of all the issues. He said they're not having any significant issues to worry about now, and that they doubt they'll need to come out with a Z78 version either. He said even the Z68s were only a rev 1.1 or something like that - maybe just 1.0. So, it appears to be as sound as it could be right now, all things considered. Still, always better to wait for rev 2.

VRM, yeah, I believe it is a 12 phase.

My obsessive giggle-byte thread:
http://www.tomshardware.com/forum/308664-30-gigabyte-claims-world-spot-motherboard-durability
 

josejones

Distinguished
Oct 27, 2010
but then on second look, now I want to say it's a funky 12+2+2 VRM...
hell, I dunno..

now I'm curious and will not stop till I find out.
(still not going to get one though..)
Here's what it says about the Z77 UD5:

12 Phase Power Design

"GIGABYTE's cutting edge 12 phase power VRM design utilizes the highest caliber components to provide unadulterated, smooth power to the CPU. The innovative 12 phase power VRM has been designed and engineered to deliver fast transient response times through quick and seamless power delivery during extensive CPU loading variations. In addition, heat from the VRM is effectively reduced by spreading the load between the 12 power phases, resulting in a cooler, more stable platform."
http://www.gigabyte.com/products/product-page.aspx?pid=4139#ov
Just admit it - you know you want it. Or would you prefer the UP5, with the Thunderbolt ports and Ultra Durable 5 with that new IR3550 PowIRstage in a 6-phase VRM?
http://www.gigabyte.com/products/product-page.aspx?pid=4279#ov
 

WILDLEGHORN

Distinguished
May 2, 2011
Hey guys, I have a question.
I'm gonna be buying the following very shortly:
Core i5 2500K
MSI GTX 670 factory OC
ASRock Z77 Pro4

Somebody told me that I should go with an Ivy Bridge i5 CPU instead of a Sandy Bridge i5. He said doing so will let my system reach its full potential, since my mobo & GPU have PCIe 3.0 support.

Is this really true??? Will going with Sandy effectively not give me max results outta my rig?

I've heard other people say that there's not much real use for Ivy Bridge CPUs in terms of gaming, that they run a bit hotter than Sandy Bridge CPUs, and that they consume more power as well.

So I'm a bit on the fence as to which CPU to go with for 1080p gaming purposes - Sandy or Ivy, with that build above?

Hope someone can clear this up for me, thanks :)
 

josejones

Distinguished
Oct 27, 2010
NOPE..
I'm more proud of the fact I was able to tell the VRM count just by eyeballing it...
:p

I'm still done with Giggle-Byte..
I still think you're a closet Giggle-Byte lover just putting on a drama-queen show here.

You never did answer my question: when was the last time you had a Giggle-Byte mobo?
 

WILDLEGHORN

Distinguished
May 2, 2011
the benefits of Ivy over Sandy do not apply in gaming...

yes, Ivy is a little faster clock for clock, but you're going to overclock anyway.

PCIe 3.0 is not a big deal in terms of speed over 2.0

now if you were doing video editing and other Intel Quick Sync-friendly applications, then go Ivy - but you're gaming...

Yup, no video rendering & other stuff... only gaming @ 1080p & web surfing, etc.
So, I'm taking your advice & getting the i5 2500K :D
How long (in years) do you think my rig can run games at 1080p at a min of 30 FPS (or more)??

Thanks :D
 

ddan49

Honorable
Mar 13, 2012
With a 670, playing games at 1080p at MAX settings... well... it barely cuts it in BF3 right now. I think for medium+ settings at 1080p and 30+ FPS, you'll be good for about five years.
 

WILDLEGHORN

Distinguished
May 2, 2011

Thank You so much :bounce:
 

earl45

Distinguished
Nov 10, 2009
It's hard for me to buy any new video card like the GTX 680 or 7970, because my two XFX 6950s cost me $440.00 and will still beat any one of them running a single-card setup.
 

ctbaars

Distinguished
Dec 16, 2007
Then why do they only get a passing Honorable Mention in the Best Graphics Cards article? The GTX 670 gets Best at $400.00 and the GTX 680 gets a whole Honorable Mention section at $500.00. On Newegg, two 6950s are $500.00 ATM.
 
Well, from a pure gaming perspective, you'll always prefer a single-GPU solution at a given price/performance point.

Unless the dual 6950s/560 Tis deliver more performance at the same price point. Well, they always mention the dual-GPU setups as well, but we all know they require more power and can be a hassle, so...

Cheers!
 


But what typically happens in SLI/CF is that when a single card starts to struggle in a game [be it memory running out, or just too much work], the entire SLI/CF config starts to fail. Remember that most games use AFR (alternate frame rendering): one GPU does one frame, the other does the next. The second one GPU starts to struggle, FPS drops off a cliff.

I view SLI/CF as a cheap way to get more FPS out of a GPU that is starting to age, but that's about it.
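A back-of-the-envelope way to see why AFR drops off a cliff when one card struggles (a toy model only, not how real drivers schedule frames; the function name `afr_fps` is made up for illustration): since the two GPUs alternate frames and frame N+1 can't be shown before frame N, sustained FPS is capped by the slower card's frame time.

```python
def afr_fps(frame_time_a_ms, frame_time_b_ms):
    """Toy AFR model: two GPUs alternate frames, so the pair can
    sustain at most two frames per slowest-GPU frame interval."""
    slowest = max(frame_time_a_ms, frame_time_b_ms)
    return 2 * 1000.0 / slowest

# Both GPUs healthy at 20 ms/frame -> 100 FPS
print(afr_fps(20, 20))   # 100.0
# One GPU chokes (e.g. runs out of VRAM) and slows to 50 ms -> 40 FPS
print(afr_fps(20, 50))   # 40.0
```

In this sketch, one card slowing from 20 ms to 50 ms more than halves the pair's output even though the other card is fine, which is the "cliff" described above.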
 
I'll let you know when and if I experience anything like that... :/
and your view of an SLI/CF-X setup is YOUR view..

I've seen it happen lots of times. Paired 4850s seemed like a good idea, until one of them runs out of memory. That's why paired 4870s lasted significantly longer.

Nine times out of ten, if you have a choice between a single high-end card and a pair of mid-range cards, the single card, while "slower", will end up lasting longer.
 

FLanighan

Distinguished
Jul 19, 2009


I ran a GTX 275 SLI setup a few years back; enabling SLI caused my FPS in Crysis with everything on ultra to go from 25-30 to 50-60. I didn't notice any micro-stuttering. I ended up downgrading to a single 5870 for Eyefinity because Nvidia was taking too long to release its 3D Surround drivers, which I ended up regretting.
 

hixbot

Distinguished
Oct 29, 2007
What bothers me most about this CPU generation isn't that overclocking has not improved - it's that the price has not dropped.
Normally each year we see performance per dollar improve substantially. Not this year.
For example, the 2600K/2700K have sat at the $320 price point since release.
Intel releases its successor, the 3770K, and it's marginally faster, but it's also more expensive! Shouldn't it be at the same price point as its predecessor?! Over one year later, shouldn't we be getting more for our money?
 

Blandge

Distinguished
Aug 25, 2011


I'd like to point out that the 2700K was $369 at release and the 3770K is cheaper than that now. In addition, Intel recommends the same price for the 3770K as the 2700K. It's the distributors like Newegg that control the price to us customers.
 


We're a capitalist economy. "Everything is worth what its purchaser will pay for it." ~ Adam Smith - The Wealth of Nations.

If Intel can maximize its profit by selling CPUs at $300, it's going to sell its CPUs at $300. If it can instead maximize its profit by selling fewer CPUs at $1,000, guess what it'll do? If people are willing to purchase something at a given price, it's worth that price. Simple. Hence why competition on price is a necessity in a capitalist economy.
 
Guest

Guest


Thank you, I now know a reason why someone should buy a Bulldozer.