Stereo Shoot-Out: Nvidia's New 3D Vision 2 Vs. AMD's HD3D


exposed1234

Distinguished
Oct 17, 2011
[citation][nom]bystander[/nom]The problems you experience from having low settings are likely a result of the distance you sit from the TV/monitor. Just leaning into my monitor and away shows a noticeable difference in depth. The further you are from your monitor/TV in relation to its size, the more convergence you'd want. Incidentally, if you are playing on a 42" TV, I assume this means you either play at 720p or 24 Hz. I'd think that would be frustrating.[/citation]

The further away the TV is, the less convergence is needed (because the image on the TV is already "converged" away from you). On my 22" IZ3D monitor where I'm about 2 feet away, a game like GTA 4 needs -.43000 convergence. On my big screen 3DTV where it's about 8 feet away from the couch, the same game needs only -.1000 convergence.

My Samsung 3DTV, although limited to 1080p 24hz in full HD mode, interpolates frames so 24fps feels more like 60+. This works well for slow paced games. For fast paced games, interpolation introduces too many artifacts and I just use side by side mode instead (60hz).
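
To put rough numbers on the geometry (a quick Python sketch, nothing driver-specific: the 2 cm separation is just an illustrative figure, and it says nothing about which way a particular driver's convergence value should move):

[code]
import math

# Back-of-the-envelope sketch: the same physical left/right separation
# on the screen subtends a much smaller angle at the eye from the couch
# than from a desk, which is why comfortable 3D settings depend so much
# on viewing distance. The 2 cm separation is illustrative only.
def disparity_angle_deg(separation_m: float, viewing_distance_m: float) -> float:
    """Angle subtended at the eye by an on-screen left/right separation."""
    return math.degrees(2 * math.atan(separation_m / (2 * viewing_distance_m)))

on_screen_separation = 0.02  # 2 cm of parallax on the panel (assumed)
for label, d in [("22-inch monitor at ~2 ft", 0.6), ("3DTV at ~8 ft", 2.4)]:
    print(f"{label}: {disparity_angle_deg(on_screen_separation, d):.2f} degrees")
[/code]

From 8 feet the same separation subtends roughly a quarter of the angle it does at 2 feet, so whatever settings feel right up close will need adjusting from the couch.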
 

exposed1234

Distinguished
Oct 17, 2011
9
0
18,510
[citation][nom]bystander[/nom]I am able to use Tridef at full HD with an Nvidia card. Are you referring to an HDMI limitation?[/citation]

That post was directed at Abaddon. We both have Samsung 3DTV's. Tridef doesn't support 1080p/720p frame packing with Nvidia via HDMI, it only supports this method with AMD via AMD HD3D.
 
[citation][nom]exposed1234[/nom]That post was directed at Abaddon. We both have Samsung 3DTV's. Tridef doesn't support 1080p/720p frame packing with Nvidia via HDMI, it only supports this method with AMD via AMD HD3D.[/citation]

Gotcha. You may be right on that, although 3D Vision may support it. I know they do support it with some TVs through HDMI.
 

cleeve

Illustrious


I did, you'd just prefer not to address it and pretend that 2D-3D conversion is comparable to Virtual 3D mode. :)

Regardless, the fact remains that I'm more than satisfied with my conclusions; I've stated why they are what they are.

So I'll once again reiterate: I simply don't agree with your opinions, and you've said nothing to convince me otherwise.

I apologize that this state of affairs seems to frustrate you so, but I have no remedy for you.
 

youssef 2010

Distinguished
Jan 1, 2009
Looking at the last chart for Metro 2033, it seems that SLI scales poorly in this game, or in 3D Vision altogether. And the 6970 is just a few frames behind, although the 580s cost more than double the 6970's price.
 

youssef 2010

Distinguished
Jan 1, 2009
[citation][nom]Badelhas[/nom]And by the way: thanks Tom's Hardware for being the first to benchmark 3D Vision at 720p resolution! I just bought a 3D Vision-compatible 720p projector (BenQ W700) to watch movies and game with 3D Vision, but didn't know what graphics card(s) I should buy. Ended up buying 2 GTX 570s to work in SLI; I wanted to play with everything maxed out, including AA, at 60 fps, so now I know I should be able to! I am, however, kind of sad to know that with just one 6970 I would be able to have a nice experience at that resolution...[/citation]

Nvidia = overpriced
AMD = great price/performance
 

Badelhas

Distinguished
Oct 4, 2011
[citation][nom]youssef 2010[/nom]Looking at the last chart for Metro 2033, it seems that SLI scales poorly in this game, or in 3D Vision altogether. And the 6970 is just a few frames behind, although the 580s cost more than double the 6970's price.[/citation]

Exactly what I noticed; why is that? I've ordered 2 GTX 570s to work in SLI mode with a 720p projector, and I'm starting to think that the difference between one card and two cards is very, very small, even with maximum details, including AA and anisotropic filtering. What do you guys think about it?
 

airborne11b

Distinguished
Jul 6, 2008


Glasses-free 3D IS possible, but like you said yourself, you're limited in where you need to sit to view the image. (You basically corrected yourself in your own post.)

I prefer the 3D glasses over the Nintendo 3DS-style "glasses-free" 3D, because you have to sit almost perfectly still, and in the perfect spot, to get the correct image with glasses-free tech.

Basically, the best way a glasses-free 3D monitor would work would be to include an extremely good stand or wall mount with an arm (so you could freely adjust tilt, height, distance, and pivot).

Is it impossible? No (it's already been done with the Nintendo 3DS).

Does it have limitations? Yes, but so does every electronic device on the planet, so what's your point?

The only thing that matters is whether the limitations bother the buyer. Some people love Nvidia 3D Vision; some people can't stand the glasses or the darkness of the image. The product still sells and some people still enjoy it, and that's all that matters.

I wouldn't be surprised to see glasses-free 21-24" monitors coming out in the next 2-4 years. Not everyone will like them, sure, that's true enough. But they WILL sell.



 

Badelhas

Distinguished
Oct 4, 2011
[citation][nom]exposed1234[/nom]The further away the TV is, the less convergence is needed (because the image on the TV is already "converged" away from you). On my 22" IZ3D monitor where I'm about 2 feet away, a game like GTA 4 needs -.43000 convergence. On my big screen 3DTV where it's about 8 feet away from the couch, the same game needs only -.1000 convergence. My Samsung 3DTV, although limited to 1080p 24hz in full HD mode, interpolates frames so 24fps feels more like 60+. This works well for slow paced games. For fast paced games, interpolation introduces too many artifacts and I just use side by side mode instead (60hz).[/citation]

exposed1234:
Can you please explain what that "-.1000" measure is and where I can see it in the 3D Vision settings? I tried my 3D setup yesterday for the very first time and played some Metro 2033; the only thing I could change was the depth, via the wheel on the back of the emitter. I selected 50%, based on your opinions, rather than staying with the conservative 15% Nvidia standard setting. I also selected the shortcut-keys option for changing convergence in-game, but it seems that nothing happens when I try to change it. I game 9 feet away from my 80" screen, a 720p projector.
Thanks in advance
 

airborne11b

Distinguished
Jul 6, 2008


That's misleading information for sure.

Metro 2033 is a horrible game to use for GPU comparisons, not least because even the most extreme GPU setups drop down to under 20-30 FPS during benchmarks and gameplay.

Also, price/performance is almost always so close between Nvidia and ATI that the only real deciding factor in which GPU brand to go with is what software you want to run (3D Vision, triple-monitor setups *if you have DisplayPort or DVI monitors*, etc.).

Furthermore, people should consider driver quality (ATI has a horrible track record when it comes to drivers, even recently).

Personally, I think the best value right now for anyone running 1080p, 1600p, or greater resolutions is the Nvidia GTX 590: 3 GB of VRAM, it runs almost as well as 2x GTX 580s, and it only costs 750 bucks ($1,100 worth of GPU power for $750 is amazing). It's PhysX-capable for any game that doesn't do well with SLI or doesn't support it altogether, 3D Vision 2 is the best 3D option out now if you're into stereoscopic gaming, and their Surround multi-monitor support is amazing. Not to mention great driver support.

Honestly, I don't even know why anyone would care to save 15 or 20 bucks on an ATI card, with all the pros that Nvidia has over ATI...

Just my opinion though.
 

Badelhas

Distinguished
Oct 4, 2011
[citation][nom]airborne11b[/nom]It's PhysX-capable for any game that doesn't do well with SLI or doesn't support it altogether[/citation]

I think that if the game doesn't do well in SLI, the GTX 590 will not perform well either. That card is SLI, just already connected on one board, and the tech is the same.
 

airborne11b

Distinguished
Jul 6, 2008
[citation][nom]Badelhas[/nom]I think that if the game doesn't do well in SLI, the GTX 590 will not perform well either. That card is SLI, just already connected on one board, and the tech is the same.[/citation]

The point is, you can always switch it to one GPU for video and one GPU for PhysX if the game doesn't do SLI or CrossFire well. (Not saying there are a lot of games that don't use SLI, and a lot of them scale very well; I'm just saying it leaves another option.) I would know: I used to run 2x GTX 295s in quad SLI, and for games that didn't scale well with quad SLI but used PhysX, I used to enable one core for PhysX and have the other three cores as GPUs.

Worked great in some cases.

I think I should have emphasized what I meant by "misleading information" in my last post, anyway.

Simply put, Nvidia is very competitive in pricing now (even though they don't really need to be).

And AMD's CPU price/performance crown was lost as soon as Intel released the 1155 chips. (Looking at my system today, I'd say Intel has had it for longer than that, because my at-the-time-$270 X58 i7 1366 processor is going on two and a half years old and there are still no games out that challenge it.)

AMD/ATI have lost the "competitive price" edge they used to have, imo. Nowadays, even if you go with an AMD/ATI build and save a few bucks with them, you're really shortchanging the longevity of your build, imo of course.

I am without a doubt an enthusiast-type PC builder *for gaming*. I always want the latest and greatest stuff, and I almost always buy it: tri/4-way SLI systems, high-end motherboards, SSDs, multi-monitor setups, stereoscopic monitors, competitive gaming mice/keyboards/headsets, case modding; I love buying and building it all.

So when a processor like the X58 920 comes out and keeps someone like ME happy for 2.5 to 3 years, I chuckle at anyone who thinks it's not worth the few extra bucks for an Intel/Nvidia system.
 
[citation][nom]Badelhas[/nom]exposed1234: Can you please explain what that "-.1000" measure is and where I can see it in the 3D Vision settings? I tried my 3D setup yesterday for the very first time and played some Metro 2033; the only thing I could change was the depth, via the wheel on the back of the emitter. I selected 50%, based on your opinions, rather than staying with the conservative 15% Nvidia standard setting. I also selected the shortcut-keys option for changing convergence in-game, but it seems that nothing happens when I try to change it. I game 9 feet away from my 80" screen, a 720p projector. Thanks in advance[/citation]

3D vision's convergence setting is a bit wonky. In order for it to change, hold down the increase convergence button for up to 20 seconds, then it'll start to change. Once it's changing, you can just press in short durations for it to change.
 

Badelhas

Distinguished
Oct 4, 2011
[citation][nom]bystander[/nom]3D vision's convergence setting is a bit wonky. In order for it to change, hold down the increase convergence button for up to 20 seconds, then it'll start to change. Once it's changing, you can just press in short durations for it to change.[/citation]

Thanks, I will try that! Is it possible to change depth and convergence in .mkv 3D movies, with Powerdvd 11 for instance? And is there any program or site where we can test depth and convergence in real time, like solidworks does?
 
[citation][nom]Badelhas[/nom]Thanks, I will try that! Is it possible to change depth and convergence in .mkv 3D movies, with Powerdvd 11 for instance? And is there any program or site where we can test depth and convergence in real time, like solidworks does?[/citation]

I don't know. The only way I've viewed movies is through my PS3 (about its only use these days; I prefer PC gaming). I also wanted to change depth and convergence with it, but couldn't figure out how.
 

Badelhas

Distinguished
Oct 4, 2011
Cleeve,

When will Tom's Hardware start to use "The Witcher 2" in benchmarks? It's very pretty, has the "ubersampling" rendering option, which is a future-proof effect, and it responds well to GPU overclocking, multiple cores, and CPU overclocking. I personally would very much like to see some 3D Vision benchmarks, at 720p and 1080p.
Cheers
 

Shinobi_III

Distinguished
Oct 13, 2011
Holy cow, Batman! They actually made the shutter glasses see-through!

About goddamn time. I had my first shutter glasses around 1999, with the Elsa glasses, and Sega released theirs around 1987.

They've been the same until now.
 

Badelhas

Distinguished
Oct 4, 2011
[citation][nom]iwares[/nom]The guy in the thumbnail looks like Arnold Schwarzenegger.[/citation]
You're the second guy who's noticed that. It's Arnold's cousin.
 

Travis Beane

Distinguished
Aug 6, 2010
3D gaming is starting to look more and more appealing to me. I tried triple monitors, but my extra monitors ended up being used by someone else with an Xbox/PS3, for gaming and Netflix (we don't own a television).

Better viewing angles, bezel-free monitors, and glasses-free 3D sound like a dream to me.
I don't think it'll actually take too long until we have those. Well, maybe not the bezel-free monitors; a man can hope.
 


The context of the statement was a wide viewing screen, like a theater or home movie watching, which is what the poster was talking about. Same with home office work: making a 27-inch screen "glasses-free" would not only be expensive but nearly useless. Samsung tried this a while back and it didn't go over too well here (South Korea). It works only on things like the DS, where your face is extremely close and in a very predictable spot.

That shouldn't actually be that much of a problem though. The monitor wouldn't need to be made for you, since each eye just needs to be outside the path of light intended for the other. Granted, a wider screen would require more restrictive positioning of one's head, but it shouldn't cause alignment issues for people of varying pupillary distance. It may require you to maintain a specific distance from the screen though, which probably wouldn't be ideal. There have been autostereoscopic screens made with lenses that automatically adjust to the viewer's eye positions though, so that might not necessarily be a problem in future consumer displays.

Unless the light has been lased, it'll disperse uniformly as it travels away from its point of emission. In practical terms, this means light travels in a fan shape, not straight lines; you need to prevent ALL parts of that fan from hitting the wrong eye, not just the central ray. This becomes a monstrous task the bigger the screen is and the further you are from it. Your eyeballs' horizontal alignment is different from other people's; it's a very slight difference (for some), but it's still different. A big screen would have such a small margin of error for the viewing angle that it'd practically have to be made just for you.
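
To put a rough number on that margin (a quick Python sketch; the 63 mm pupillary distance is just an assumed adult average, not something from this thread):

[code]
import math

# From the screen's point of view, how far apart (in degrees) are the
# viewer's two eyes? A glasses-free display has to steer each view into
# a cone narrower than this, so the farther away you sit, the tighter
# the tolerance. The 63 mm pupillary distance is an assumed average.
IPD = 0.063  # metres between the eyes (assumption)

def eye_separation_angle_deg(viewing_distance_m: float) -> float:
    """Angle subtended by the two eyes, as seen from the screen."""
    return math.degrees(2 * math.atan(IPD / (2 * viewing_distance_m)))

for label, d in [("handheld (3DS-ish)", 0.35),
                 ("desktop monitor", 0.7),
                 ("living-room TV", 2.5)]:
    print(f"{label} at {d} m: {eye_separation_angle_deg(d):.1f} degrees")
[/code]

That works out to roughly 10 degrees of slack on a handheld but well under 2 degrees from the couch, which is why the bigger and farther the screen, the closer it comes to being tuned for one viewer.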

As for the person who made the comment about incandescent lights, you do realize those flicker at 60 Hz too, right? The filament is being heated and cooled 60 times per second by the AC circuit.

What you're talking about has NOTHING to do with the 60/70/75/120 Hz cycle rate; it has to do with artificial cold light vs. natural hot light. Even though our eyes don't see in the infrared range, our eyeballs can still sense infrared light striking them. Human eyeballs react differently to light with IR vs. light without IR mixed in. Thrown into this mix is that we've spent the past ten-plus thousand years evolving eyeballs for the natural light produced by our sun, which has a different mixture than the pure white light we use for displays. A display could be set to 75 Hz and you'd still get a headache if you stared at it long enough. If your eyeballs hurt, get away from artificial white light and rest them with dim yellow/orange lights.

And this is coming from someone who has photophobia, a condition where your eyes have an aversion to bright light. Bright lights not only give me headaches but cause me pain if I go outside on a sunny day.
 

pnorman

Distinguished
May 20, 2010
[citation]As for the person who made the comment about incandescent lights, you do realize those flicker at 60 Hz too, right? The filament is being heated and cooled 60 times per second by the AC circuit.[/citation]

This is incorrect in a couple of ways. Firstly, the power supplied is V^2/R, which ends up being a sin^2 curve with a frequency of 120 Hz. Secondly, the intensity of light emitted is a function of the temperature, and the bulb filament does not cool off significantly in the timescales involved.

As proof, look at a scene lit with an incandescent light under a high-speed camera. I've done it with camera speeds above the Nyquist frequency, and there is no brightness variation. I've seen incandescent lighting used for high-speed work in excess of 3000 fps.
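
A quick numeric sketch of that first point (the supply voltage and filament resistance below are made-up illustrative values; only the frequencies matter):

[code]
import numpy as np

# Instantaneous power in a resistive filament driven by 60 Hz mains:
# P(t) = V(t)^2 / R = (V_peak^2 / 2R) * (1 - cos(2 * 2*pi*60*t)),
# i.e. the power ripple sits at 120 Hz, not 60 Hz.
f_mains = 60.0    # Hz
V_peak = 170.0    # V, roughly the peak of a 120 V RMS supply (assumed)
R = 240.0         # ohms, hypothetical hot-filament resistance

t = np.linspace(0.0, 0.1, 12000, endpoint=False)  # 0.1 s sampled at 120 kHz
v = V_peak * np.sin(2 * np.pi * f_mains * t)
p = v**2 / R

spectrum = np.abs(np.fft.rfft(p - p.mean()))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])
print(f"dominant power-ripple frequency: {freqs[spectrum.argmax()]:.0f} Hz")  # 120 Hz
[/code]

Whether you'd see that ripple as flicker then comes down to the second point: the filament's thermal inertia smooths the light output far more than the electrical ripple would suggest.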
 


They're not strobing, but there is a noticeable increase/decrease in luminosity between periods.

And ultimately it doesn't matter; the 60 Hz isn't what's stressing your eyes. Movies are done at slightly less than 30 Hz, and most monitors today run at 75 Hz. It's the artificial white light that is hurting your eyeballs, not the number of times it's oscillating every second.
 

kanaida

Distinguished
Mar 29, 2010
Ugh, TriDef sucks. I bought their software and a pair of glasses before; it wasn't a very nice experience. I ended up returning everything.
In particular, the 2D-3D upconverting media player really sucked. All it did was make the stuff near the bottom appear close and the stuff near the middle/top appear far. I got a better result from red/blue glasses and a crappy DirectShow filter made by some kid online.
 

cleeve

Illustrious
[citation][nom]kanaida[/nom]Ugh, TriDef sucks... ...In particular the 2D-3D upconverting media player really sucked. All it did was make the stuff near the bottom appear close and the stuff near the middle/top appear far. [/citation]

While the TriDef 2D-3D movie conversion does suck (like every 2D-3D movie conversion software and hardware to date), the game driver actually renders each eye separately. It's apples to oranges.

But yeah, 2D-3D movie conversion sucks. No matter WHAT you use, it sucks.
 