Intel’s Z77 Express And Lucidlogix MVP: New Features For Gamers


Pedrovsky

Distinguished
Mar 27, 2012
It's just funny when you go to the Asus site and read the specifications of the new Z77 chipset motherboards: you see something like this highlighted, "Lucidlogix Virtu MVP... something something... up to 60% graphics boost."
Then you read the article, and it actually makes for slower fps in most of the graphics benchmarks... Unfortunately we can't tell if the games look better overall with it on, but it certainly seems to be doing nothing.
To me it brings nothing new and seems like a desperate and failed attempt to compete against dual graphics from AMD.
Apart from that, the overall chipset looks to have some new features and comes cheaper than Z68 did, so yeah, it looks like a nice improvement. Let's just hope that time brings some better performance, which I believe it will with some driver updates.
 

Crashman

Polypheme
Former Staff
[citation][nom]Pedrovsky[/nom]It's just funny when you go to the Asus site and read the specifications of the new Z77 chipset motherboards: you see something like this highlighted, "Lucidlogix Virtu MVP... something something... up to 60% graphics boost." Then you read the article, and it actually makes for slower fps in most of the graphics benchmarks... Unfortunately we can't tell if the games look better overall with it on, but it certainly seems to be doing nothing. To me it brings nothing new and seems like a desperate and failed attempt to compete against dual graphics from AMD. Apart from that, the overall chipset looks to have some new features and comes cheaper than Z68 did, so yeah, it looks like a nice improvement. Let's just hope that time brings some better performance, which I believe it will with some driver updates.[/citation]I didn't see any tearing with Virtual Vsync, and HyperFormance does improve performance in some games. That performance gain is a freebie, because you can disable the software per-application under the "Applications" menu for the games it doesn't help.
 

Pedrovsky

Distinguished
Mar 27, 2012
[citation][nom]Crashman[/nom]I didn't see any tearing with Virtual Vsync, and HyperFormance does improve performance in some games. That performance gain is a freebie, because you can disable the software per-application under the "Applications" menu for the games it doesn't help.[/citation]

Did you see any tearing with Virtual Vsync off? And apart from 3DMark, I can't see where the improvement is. Let's wait and see if drivers will change that.
 

tomfreak

Distinguished
May 18, 2011
I'm just wondering: is the configurable TDP in Ivy Bridge available on the desktop chips? I'd love to set it back to a 95 W TDP and let the CPU overclock itself to meet that TDP.

I really hate Intel for dumbing down the desktop Ivy Bridge chips to 77 W instead of clocking them at 4 GHz within a 95 W TDP.
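For a rough sense of what a 95 W ceiling could buy over 77 W, here is a back-of-the-envelope sketch. It leans on the common first-order approximation that dynamic power scales with frequency times voltage squared; the clocks and the two scaling models are illustrative assumptions, not Intel specifications.

[code]
# Back-of-the-envelope: what could a 95 W TDP buy over 77 W?
# Assumes dynamic power ~ f * V^2 (a rough first-order model);
# all numbers below are illustrative, not Intel specifications.

base_tdp = 77.0    # W, stock desktop Ivy Bridge TDP
target_tdp = 95.0  # W, the Sandy Bridge-style ceiling wished for above
base_clock = 3.5   # GHz, e.g. an i7-3770K base clock (assumed)

# Optimistic case: frequency scales linearly at constant voltage (P ~ f).
linear_clock = base_clock * (target_tdp / base_tdp)

# Pessimistic case: voltage must rise with frequency (P ~ f^3 overall),
# so frequency scales only with the cube root of the power ratio.
cubic_clock = base_clock * (target_tdp / base_tdp) ** (1 / 3)

print(f"Power headroom: {target_tdp / base_tdp:.0%} of stock")
print(f"Clock estimate, constant voltage: {linear_clock:.2f} GHz")
print(f"Clock estimate, voltage scaling:  {cubic_clock:.2f} GHz")
[/code]

The two cases bracket roughly 3.7 to 4.3 GHz, so the 4 GHz-at-95 W wish above is at least in the right ballpark.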
 
jaquith

[citation][nom]slicedtoad[/nom]^because 79 will be more expensive. SLI/CF is only useful in a few cases, most people (even enthusiast) can live with a single card.[/citation]
Duh, what?! Try running 5900x1080 plus 3D on any single GPU. I'm waiting now on the non-reference 4 GB GTX 680 for 3-way. Imagine driving three monitors at that resolution and the resulting frame rate, then rendering in 3D, which roughly halves that frame rate.
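A rough sketch of the arithmetic behind that claim, assuming fps scales inversely with pixel count (optimistic) and that stereo 3D costs a flat 2x; the 90 fps starting point is a hypothetical placeholder:

[code]
# Pixel-load arithmetic for triple-1080p Surround/Eyefinity gaming.
# Assumes fps scales inversely with pixels (a rough, optimistic model).
single = 1920 * 1080            # one 1080p panel: 2,073,600 pixels
surround = 3 * single           # 5760x1080; bezel correction pushes the
                                # desktop resolution toward the ~5900x1080 above

fps_single = 90                 # hypothetical fps on a single 1080p panel
fps_surround = fps_single / 3   # the same GPU now pushes 3x the pixels
fps_surround_3d = fps_surround / 2  # stereo 3D renders every frame twice

print(f"Surround pixels per frame: {surround:,}")
print(f"Estimated Surround fps:    {fps_surround:.0f}")
print(f"Estimated Surround 3D fps: {fps_surround_3d:.0f}")
[/code]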

Folks who buy a GPU today that barely cuts the mustard have an easy choice later: add SLI/CF when, not if, it's needed. There are 2-, 3-, and 4-way Z77 boards now; with Z68 the limit was 3-way SLI.

Next, as 4K monitors start to roll out, you'll need extremely strong rendering, and in most cases 2+ of the current GPUs, to drive QFHD (3840x2160) at that resolution. 4K resolution/monitors - http://en.wikipedia.org/wiki/4K_resolution
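For scale, a quick pixel-count comparison using only the resolutions named above:

[code]
# Raw pixel counts: why QFHD needs so much more GPU muscle than 1080p.
full_hd = 1920 * 1080      # 2,073,600 pixels
surround = 5760 * 1080     # triple-1080p, before bezel correction
qfhd = 3840 * 2160         # 8,294,400 pixels

print(f"1080p:    {full_hd:,} pixels")
print(f"Surround: {surround:,} pixels ({surround / full_hd:.0f}x 1080p)")
print(f"QFHD:     {qfhd:,} pixels ({qfhd / full_hd:.0f}x 1080p, "
      f"{qfhd / surround:.2f}x Surround)")
[/code]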
 
[citation][nom]A Bad Day[/nom]What's the performance penalty of running dual GPUs in 4x mode (instead of 8x) and an expansion card?[/citation]
None. If you're on Ivy Bridge you're running x8/x8 PCIe 3.0, which is the same bandwidth as x16/x16 PCIe 2.0. The GTX 680 can barely saturate x8 PCIe 2.0, so the headroom at x16 PCIe 2.0 or x8 PCIe 3.0 is laughable.
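The equivalence falls straight out of the published link rates and encodings (5 GT/s with 8b/10b for PCIe 2.0, 8 GT/s with 128b/130b for PCIe 3.0); a minimal sketch:

[code]
# Usable bandwidth per PCIe link: transfer rate x encoding efficiency / 8.
def lane_gbs(transfers_gt, payload_bits, line_bits):
    """GB/s per lane after encoding overhead."""
    return transfers_gt * (payload_bits / line_bits) / 8

pcie2_lane = lane_gbs(5.0, 8, 10)     # 0.5 GB/s/lane (8b/10b = 80% efficient)
pcie3_lane = lane_gbs(8.0, 128, 130)  # ~0.985 GB/s/lane (128b/130b)

print(f"PCIe 2.0 x16: {16 * pcie2_lane:.1f} GB/s")  # 8.0 GB/s
print(f"PCIe 3.0 x8:  {8 * pcie3_lane:.1f} GB/s")   # 7.9 GB/s, near-identical
[/code]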
 

A Bad Day

Distinguished
Nov 25, 2011
[citation][nom]jaquith[/nom]None. If you're using an Ivy Bridge you're running x8/x8 PCIe 3.0 which is the same x16/x16 PCIe 2.0. The GTX 680 can barely saturate x8 PCIe 2.0 and it's laughable with x16 PCIe 2.0 or x8 PCIe 3.0.[/citation]

Thanks. My friend planned on running two GPUs (7970 or 680) plus two or three other x4 expansion cards.
 

pacioli

Distinguished
Nov 22, 2010
Overall I've been disappointed with the Z68 chipset. I find the benefits it has over the P67 chipset to be performance gains that are hardly noticeable, or simply lacking. Plus, actually using the Z68 features is a pain for very little gain and not worth the effort. On the two Z68 boards I own, the Lucidlogix software is disabled.
Z77 seems to have only slight advantages over the Z68 chipset... and it appears the 'pain in the hind-end' factor is still there.
Hopefully the inclusion of native USB 3.0 will drive the price of these mobos down so there's a reason to get one.
 
[citation][nom]A Bad Day[/nom]Thanks. My friend planned on running two GPUs (7970 or 680) plus two or three other x4 expansion cards.[/citation]
Keep in mind the x4 slots, i.e. the non-GPU slots, run at PCIe 2.0; they're not affected one way or the other by the PCIe 3.0 GPU slots. But those slots are quite often 'shared', so it's a case-by-case basis where thorough investigation is required, i.e. reading the manual(s).
 

dreadlokz

Honorable
Mar 30, 2012
There's no point in this test; it just shows what everyone already knows (at least I knew): you're not getting any difference if you don't change the whole platform! The real test is Z68 vs. Z77 using the i7-3770K!

Also, in gaming you should not see any difference if you're not using Z77 with Ivy Bridge and 3- or 4-way GPUs!
 
[citation][nom]Why_Me[/nom]http://www.anandtech.com/show/5626 [...] i7-3770k/1 <----- AnandTech had completely different results.[/citation]
TweakTown was at least 'honest' (after violating their NDA) about which motherboard was tested with the IB; AnandTech (also violating their NDA) was pretty darn vague about which motherboard was used: 'Intel Z77 Chipset Based Motherboard'. The issue, as I described in another post, is that the BIOS and/or drivers TweakTown used were probably the culprits; see -> http://www.tomshardware.com/forum/329828-28-bridge-cpus-benchmarks-review-3770k-3570k

Therefore, it seems the 'Intel Z77' vs. 'Gigabyte Z77' discrepancy is an issue of BIOS and drivers.

Motherboard:
ASUS P8Z68-V Pro (Intel Z68)
ASUS Crosshair V Formula (AMD 990FX)
Intel DX79SI (Intel X79)
Intel Z77 Chipset Based Motherboard

Hard Disk:
Intel X25-M SSD (80GB)
Crucial RealSSD C300
OCZ Agility 3 (240GB)

Also, when I see erratic benchmarks that simply don't add up, and/or are run at less than 1920x1080, I tend to believe it's ALL screwy, aka useless data - http://www.anandtech.com/show/5626/ivy-bridge-preview-core-i7-3770k/9 A plethora of drivers, settings, and/or testing-methodology issues could explain it.

Further, benchmarking at stock 'bins' is a little deceptive, in that Intel purposely plans stock bins ahead, so you end up comparing e.g. 3.2 GHz vs. 3.5 GHz when either can be ramped up past 4.5 GHz by a blind man.
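To put numbers on that, reusing only the clocks from the post above:

[code]
# Stock-bin gap vs. shared overclocking headroom.
bin_low, bin_high, easy_oc = 3.2, 3.5, 4.5   # GHz, figures from the post

stock_gap = (bin_high - bin_low) / bin_low   # what stock reviews compare
oc_headroom = (easy_oc - bin_low) / bin_low  # what either chip can reach

print(f"Stock-bin difference:     {stock_gap:.1%}")    # ~9.4%
print(f"OC headroom over the 3.2: {oc_headroom:.1%}")  # ~40.6%
[/code]

A ~9% stock gap is small next to the ~40% either chip picks up once overclocked, which is the deception being described.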
 

SuperVeloce

Distinguished
Aug 20, 2011
[citation][nom]nebun[/nom]still...i would rather have 2 pcie lanes at 16X just in case i want to install two nvidia gtx 590 in sli...get the point?....why settle for something less when you can have the cake and eat it too[/citation]
lol, and the GTX 590 is specified as PCIe 2.0, is it not?! For the GPU slots, this board has the PCIe 3.0 standard, twice as fast at the same number of lanes...
I bet 2x 16 lanes is easily achieved in 2.0 mode.
 

husker

Distinguished
Oct 2, 2009
Aside from the technology, the Virtu MVP female/cyborg 3D model shown in the screenshots is horrendous. The disproportionate sagging chest reminds me of one of the body suits worn by Eddie Murphy in The Nutty Professor. The color scheme is also bad, with the bright red sections making it look like we are supposed to see the figure as a cutaway of human tissue, exposing the muscle and brain. Just plain bad design given a shiny high-tech treatment.
 

Cyberat_88

Distinguished
Today's programmers are lamers who know nothing, building bloatware from bloatware, on top of bloatware, inside bloatware. Does anyone remember the efficiency of a Commodore 64, a 64 KB machine?
Between the OS bloatware, the anti-piracy bloatware, and the actual game bloatware, you have to grab a quad-CPU server board and 3x CrossFire AMD HD cards, and slap in 16 GB of RAM, just to be able to pop up apps and games at the snap of your fingers. And technology has not gotten cheaper: they make it in China (crappy) for a tenth of what it used to cost and charge us 2x-3x the old price. Where are the savings? Where is my top-of-the-line gamer under $1000??? We need a revolution, people; we need to put an end to the mega-greed of corporations in the USA.
 