HP 2311 gt 23" Monitor Review: Passive, Polarized 3D On A Budget

Guest
I have the Vizio 42" E3D240 passive 3D TV with an ATI 6970, and I'm using TriDef 5.3. Other than being less than pleased about having to buy TriDef, I'm very happy with my 3D gaming experience. Most games I have work very well, including Crysis 3, Bulletstorm, Oblivion, and Metro 2033. Other games are simply awesome, such as the Batman: Arkham series, Sniper 2, DiRT 3, and Deus Ex: Human Revolution. I experienced shutter 3D gaming way back in the day with eDimensional glasses and a CRT screen, and it worked great. I haven't tried the latest shutter tech, but passive works much better than my old shutter setup: eye fatigue is almost nil and the screen is much brighter. 3D gaming on my 42" Vizio HDTV, cheap junker that it is, simply kicks a$$.

I do NOT notice any significant difference in resolution between playing in 3D and non-3D, but then again I'm playing on a 42" screen and sit about seven feet away. There is no comparison between the enjoyment I get playing in 3D and on my old 24" Samsung T240HD (1920x1200, a very good screen). Yes, there are still bugs and caveats to 3D gaming, but if you can get it to work, it's just better than non-3D gaming. In my opinion, the only thing that really counts is the end experience, not the resolution specs or the screen brightness or whatever else, and my end experience is that 3D kicks the everloving snot out of 2D (when it works, lol).
 
Guest
The explanation for the brightness issue is not quite right:
"When it comes to brightness, HP's 2311 gt is significantly better than the 120 Hz solution. This is because the polarized glasses allow light to pass through 100% of the time, while the active implementation alternately blanks out each eye half of the time."
Yes, the polarized glasses let light through 100% of the time, but they only pass 50% of the light because of the polarizing filter. If the active system were actually able to blank each eye for only half the time, the two solutions would be equal. In reality it blanks each eye for longer, and that's where the difference comes from.
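To put rough numbers on that, here's a quick Python sketch. All the figures below are made-up assumptions for illustration, not measurements of this monitor:
[code]
# Back-of-the-envelope brightness comparison, assuming a hypothetical
# 250 cd/m^2 panel. All percentages are illustrative assumptions.
PANEL_NITS = 250.0

# Passive: each eye sees light continuously, but the polarizing filter
# in the glasses passes only about half of the panel's output.
passive_per_eye = PANEL_NITS * 0.5

# Idealized active shutter: each eye is blanked exactly half the time,
# so on average it also receives half the light.
active_ideal_per_eye = PANEL_NITS * 0.5

# Real active shutter: the LC shutters need transition time and tint the
# image even when "open"; 40% open time and 70% open-state transmission
# are placeholder values, not measured ones.
active_real_per_eye = PANEL_NITS * 0.40 * 0.70

print(passive_per_eye, active_ideal_per_eye, active_real_per_eye)
# 125.0 125.0 70.0 -> the idealized active system ties with passive;
# the real-world shutter losses are what create the brightness gap.
[/code]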
 

Fokissed

[citation][nom]army_ant7[/nom]I forgot if I read this before, but your GPU would have to pump out twice the number of frames for games. That is obviously the case for active shutter 3D displays. I assume that even if polarized 3D displays "interlace" two half-resolution frames into one 3D frame, the processing needed is still for two full-resolution frames. If anyone has better knowledge on this, please correct me. :)[/citation]

Geometry and vertex shaders won't see an increase in work. The rasterizer and pixel shaders will have to work with two different frustums, but they only have half of the pixels to work on per view. Some things will have to be done twice, such as clipping, culling, vertex interpolation, z-buffering, etc., but it shouldn't take nearly twice as much rendering time.
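Here's a toy cost model of that argument in Python. The workload weights are invented purely for illustration; real costs depend entirely on the scene and the GPU:
[code]
# Toy cost model: stereo rendering on a row-interleaved (passive) display.
# The weights (30/10/60) are invented for illustration only.

def frame_cost(shared_vertex_work, per_view_work, pixel_work,
               views, pixels_per_view):
    # Vertex/geometry work is counted once; clipping, culling, etc. run
    # once per frustum; pixel shading scales with pixels per view.
    return shared_vertex_work + views * (per_view_work
                                         + pixel_work * pixels_per_view)

mono   = frame_cost(30, 10, 60, views=1, pixels_per_view=1.0)  # 100
stereo = frame_cost(30, 10, 60, views=2, pixels_per_view=0.5)  # 110

print(stereo / mono)  # 1.1 -> far short of the naive 2x
[/code]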
 

army_ant7

[citation][nom]Fokissed[/nom]Geometry and vertex shaders won't see an increase in work. The rasterizer and pixel shaders will have to work with two different frustums, but they only have half of the pixels to work on per view. Some things will have to be done twice, such as clipping, culling, vertex interpolation, z-buffering, etc., but it shouldn't take nearly twice as much rendering time.[/citation]
Hm... You sound like a programmer. It's like something my brother, who is a game programmer, would say.

Based on what he has taught me, your reasons seem plausible. I say plausible because it would still depend on how the 3D algorithm works, or on how the drivers handle 3D. But they've probably optimized it by taking out redundant operations, and yeah, if the GPU renders the world's vertices and other geometry once, rasterizing images from two views based on that sounds logical.

As for having only half the pixels to work on for each perspective (frame), I'm not sure the rasterizer could be made to select which rows of pixels to render efficiently. It might render the two full frames and just discard or interleave them in that fashion.

I got the idea from how Eyefinity works with bezel compensation. I know it's a totally different technology, but it does end up rendering extra pixels that are discarded for bezel compensation. It might be like that because they can't accurately predict which exact pixels will be discarded, or because they haven't/couldn't implement a way to selectively render some pixels. Just some ideas. Thanks for the reply, and I'm looking forward to another if you ever have more to share. :)
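For what it's worth, here's a minimal numpy sketch of that "render both views fully, then discard" compositing for a row-interleaved passive display (the random arrays just stand in for two rendered frames):
[code]
import numpy as np

# Two fully rendered 1080p frames, one per eye (random data as stand-ins).
H, W = 1080, 1920
left  = np.random.rand(H, W, 3)
right = np.random.rand(H, W, 3)

# Row-interleave them: even rows from the left eye, odd rows from the
# right eye. Half of each rendered frame is simply thrown away, which is
# exactly the waste a rasterizer that skipped alternate rows would avoid.
interleaved = np.empty_like(left)
interleaved[0::2] = left[0::2]
interleaved[1::2] = right[1::2]
[/code]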
 

Fokissed

[citation][nom]army_ant7[/nom]Hm... You sound like a programmer. It's like something my brother, who is a game programmer, would say. Based on what he has taught me, your reasons seem plausible. I say plausible because it would still depend on how the 3D algorithm works, or on how the drivers handle 3D. But they've probably optimized it by taking out redundant operations, and yeah, if the GPU renders the world's vertices and other geometry once, rasterizing images from two views based on that sounds logical. As for having only half the pixels to work on for each perspective (frame), I'm not sure the rasterizer could be made to select which rows of pixels to render efficiently. It might render the two full frames and just discard or interleave them in that fashion. I got the idea from how Eyefinity works with bezel compensation. I know it's a totally different technology, but it does end up rendering extra pixels that are discarded for bezel compensation. It might be like that because they can't accurately predict which exact pixels will be discarded, or because they haven't/couldn't implement a way to selectively render some pixels. Just some ideas. Thanks for the reply, and I'm looking forward to another if you ever have more to share. :)[/citation]
Since pixel shaders are programmable in current GPUs, it seems very possible to selectively choose which pixels to render. As to whether D3D or HLSL allows a reasonable implementation, I have no idea. My experience with D3D programming is somewhat limited, and HLSL is beyond anything I have ever coded.
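For illustration, the per-pixel test such a shader would need is just a row-parity check. Here it is sketched in Python with hypothetical function names (in HLSL the rejection would be done with clip(), in GLSL with the discard keyword):
[code]
def shade_pixel(x, y, eye):
    # eye: 0 = left view (even rows), 1 = right view (odd rows)
    if y % 2 != eye:
        return None                 # "clip": this view doesn't own this row
    return expensive_shading(x, y)  # only half the rows pay full cost

def expensive_shading(x, y):
    # Placeholder for the real per-pixel shader work.
    return (x * 0.001, y * 0.001, 0.5)
[/code]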
 

army_ant7

[citation][nom]Fokissed[/nom]Since pixel shaders are programmable in current GPUs, it seems very possible to selectively choose which pixels to render. As to whether D3D or HLSL allows a reasonable implementation, I have no idea. My experience with D3D programming is somewhat limited, and HLSL is beyond anything I have ever coded.[/citation]
You seem fairly well acquainted with GPU programming concepts nevertheless, given that you were able to make those points. :) My brother has some experience with HLSL shaders, but not much. I'll just be lazy and ask him whether that can be done (instead of researching it). Haha!
 

waxdart

[citation][nom]soldier37[/nom]Tired of seeing these cheap 1080p displays churned out week after week. Where are the 30-inch 2560x1600 LED models to replace my current LCD? Get with the program, guys. Once you go that size, you won't ever want to do 1080p again.[/citation]

I totally agree. Why are they proud of these 2004-era specs? We should be on native 4K PC displays by now!
Dell makes what you want:
http://accessories.us.dell.com/sna/productdetail.aspx?cs=19&c=us&l=en&sku=224-9949
It ain't cheap.

Are people really happy to wear sunglasses that block out 50% of the light for each eye? I can still see two images flicker with both passive and active, so this generation of 3D screens isn't for me. :(
 
Guest
3D is always an illusion, and it always decouples the naturally linked processes of focusing and converging. When you look at things at different distances, you converge and focus together. All 3D systems, to my knowledge, make you converge but not refocus, since the display stays at the same distance in stationary systems (portable systems have compounded problems). I wonder if this unnatural process causes the headaches we hear about so often?
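The mismatch is easy to quantify. A small sketch, assuming an average ~63 mm eye separation and a screen at roughly arm's length (both numbers are assumptions):
[code]
import math

IPD    = 0.063   # interpupillary distance in meters (~63 mm average)
SCREEN = 0.75    # the lenses always focus here, ~2.5 ft (assumed)

def vergence_deg(distance):
    # Angle between the two eyes' lines of sight for a point at `distance`.
    return math.degrees(2 * math.atan(IPD / (2 * distance)))

# An object rendered to appear 0.4 m away demands ~9.0 degrees of
# convergence, while accommodation stays locked at the screen's 0.75 m,
# where the vergence demand would only be ~4.8 degrees:
print(vergence_deg(0.40))    # ~9.0
print(vergence_deg(SCREEN))  # ~4.8
[/code]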
 

DeusAres

[citation][nom]Soldier37[/nom]Tired of seeing these cheap 1080p displays churned out week after week. Where are the 30-inch 2560x1600 LED models to replace my current LCD? Get with the program, guys. Once you go that size, you won't ever want to do 1080p again.[/citation]

Not everyone has the cash to burn on that kind of monitor. Nor is it needed or wanted by some of us, myself included.
 
bystander

[citation][nom]waxdart[/nom]I totally agree. Why are they proud of these 2004-era specs? We should be on native 4K PC displays by now! Dell makes what you want: http://accessories.us.dell.com/sna [...] u=224-9949 It ain't cheap. Are people really happy to wear sunglasses that block out 50% of the light for each eye? I can still see two images flicker with both passive and active, so this generation of 3D screens isn't for me.[/citation]
How could you see flicker on a passive system using an LCD? LCDs do not flicker, and passive systems do nothing to cause flicker. Are you taking your movie-theater experience and assuming this would be the same? The passive system has its own drawbacks, but flickering is not one of them.
 

waxdart

[citation][nom]bystander[/nom]How could you see flicker on a passive system using an LCD? LCDs do not flicker, and passive systems do nothing to cause flicker. Are you taking your movie-theater experience and assuming this would be the same? The passive system has its own drawbacks, but flickering is not one of them.[/citation]

Sorry if flicker isn't the right term; I've not seen this monitor in action. The main 3D screens I've seen outside the cinema have been various brands of TVs at the store. PC monitors may be better. The cinema isn't a great place to watch 3D either.

Active screens, to me, look like interlaced images, worse than old CRT monitors; I keep checking whether the glasses are working. On passive screens, I see both images, as if the glasses aren't cutting out 100% of the image for each eye. The flickering may just be the headache kicking in. Plus, the image is much darker, which ruins things a lot.

I'm not a 3D-screen naysayer. I watched Wings of Courage (1995) at an IMAX, along with lots of other films, and I remember those being great! I want that in my front room. Compared to IMAX, or to what I saw 15+ years ago, the screens I've seen are really bad. They'll keep getting faster and better, so it's just a matter of time before it doesn't bother me.
 
I think you've got the terms active and passive mixed up.

Passive systems are the ones that look interlaced: each eye has light blocked so that it only sees every other line, as if the image were interlaced. (I have no personal experience with these; they don't flicker, but the image does look interlaced.)

Active systems block the image entirely from one eye or the other, alternating every frame. With a poor-quality display you will often see ghosting, because the monitor takes too long to update the image, letting the wrong eye see bits of it. The problem is more pronounced the more contrast there is on screen, since it takes longer to swing a pixel between dark and bright. A higher-quality display minimizes it. Active systems can also appear to flicker and look darker; some are designed to drive the image much brighter than normal to counteract the blanked frames.
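As a rough illustration of how pixel response time turns into ghosting, here's a back-of-the-envelope estimate (both numbers are assumed, not measured):
[code]
refresh_hz        = 120                # 60 frames per eye per second
frame_time_ms     = 1000 / refresh_hz  # ~8.3 ms window per eye
pixel_response_ms = 5.0                # assumed slow dark-to-bright swing

# Fraction of each eye's window during which the panel is still showing
# remnants of the other eye's frame -- visible as ghosting:
crosstalk = pixel_response_ms / frame_time_ms
print(f"{crosstalk:.0%}")              # 60% with these assumed numbers
[/code]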
 
Guest
"The HP 2311 gt’s ideal viewing distance seemed to be about 2.5 feet from the display; that's where its 3D effect really popped out."

You don't understand 3D... the optimal viewing distance for 3D has nothing to do with the monitor; it depends on the amount of depth captured in the content. Changing the viewing distance stretches or flattens the z-axis. If you want to sit only 1.5 feet from the screen, there needs to be more depth in the 3D content.
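The geometry behind that claim is simple similar-triangles math. A sketch with assumed numbers (63 mm eye separation, 10 mm of uncrossed parallax in the content):
[code]
e = 0.063   # eye separation in meters (assumed ~63 mm)
p = 0.010   # uncrossed on-screen parallax baked into the content

# Rays from each eye through its image point fuse at Z = D * e / (e - p)
# from the viewer (behind-the-screen case), so perceived distance scales
# directly with viewing distance D:
for D in (0.75, 1.50):        # ~2.5 ft vs ~5 ft
    Z = D * e / (e - p)
    print(D, round(Z, 2))     # 0.75 -> 0.89 ; 1.5 -> 1.78

# Doubling D doubles Z: the same content spans twice the depth from
# farther away, and flattens as you move closer.
[/code]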
 

cleeve

[citation][nom]stereoscopist[/nom]You don't understand 3D.[/citation]

No, you don't understand the context. I was comparing the same content on a passive screen and an active one.

This is a real-world observation, not theory.
 

perkele666

I have an AMD 6800-series card and I cannot make 3D work on my HP 2311gt! No 3D at all. TriDef does not even recognize the monitor. So, contrary to this article, there are problems with AMD too, not only Nvidia...

If there are any hints on how to make an AMD-based computer work with this monitor, please let me know... I have tried everything!!! I even exchanged the whole monitor and it did not help...
 

cleeve

[citation][nom]perkele666[/nom]I have an AMD 6800-series card and I cannot make 3D work on my HP 2311gt! No 3D at all. TriDef does not even recognize the monitor. So, contrary to this article, there are problems with AMD too, not only Nvidia... If there are any hints on how to make an AMD-based computer work with this monitor, please let me know... I have tried everything!!! I even exchanged the whole monitor and it did not help...[/citation]

You need to install the special HP 2311 gt TriDef driver, not the generic TriDef driver.

Contact TriDef tech support for details.
 
Guest
How can I use this monitor as a TV without a TV tuner card? The tuner card doesn't give as good a picture and color as a TV does. Please help me.
 