Primer: The Principles Of 3D Video And Blu-ray 3D

TheGreatGrapeApe

Nice article, but I think there are a few issues with the overall balance of the information being put forth.

I understand the author's preference for shutter glasses (especially since it's a certain product's method of choice), even if I don't share it. The major limitation is having to buy a pair for every friend who comes over, which gets impractical until they are more commonplace.

Also, polarized solutions are not limited in resolution if they are set up beyond just the example provided in this article (like the dual-projector arrangement used in theatres [see Don's THG review: http://www.tomshardware.com/reviews/3d-polarized-projector,2589.html]), and they may have an improving single-source future with 2K and 4K displays on the horizon. It's a question of preference, but it seems like the full story wasn't explored on that subject.
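
To put numbers on that resolution point, here's a quick sketch of the per-eye arithmetic, assuming the usual 1080p panel/projector resolutions (the dual-projector figures match the setup in Don's review linked above):

```python
# Per-eye resolution arithmetic for the two polarized approaches discussed,
# assuming 1080p displays/projectors.

PANEL = (1920, 1080)  # native resolution of one 1080p display

# Row-interleaved passive display: odd rows carry one eye's image and
# even rows the other's, so each eye sees half the vertical resolution.
interleaved_per_eye = (PANEL[0], PANEL[1] // 2)   # (1920, 540)

# Dual-projector setup: one full-resolution projector per eye, each
# behind its own polarizing filter, overlapped on a silver screen.
dual_projector_per_eye = PANEL                    # (1920, 1080), no loss

print("Interleaved passive, per eye:", interleaved_per_eye)
print("Dual projector, per eye:     ", dual_projector_per_eye)
```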

Now on to a pet peeve: I love the part that says "While set-top Blu-ray players will need to be replaced, PC-based Blu-ray player software can be upgraded." as a subtle product-benefit plug.

Unless it's a free upgrade, you are still replacing the software, not upgrading it (it's not a plug-in), and you're likely forking out nearly the same amount of money for an update that cost maybe 1/100th as much to produce, so it's not like it's a major advantage. Especially when upgrading requires a FULL upgrade to the most expensive model, PowerDVD (version #) Ultra 3D, and I can't simply add it to my existing PowerDVD bundles, potentially breaking my backwards compatibility (Ultra 9 already removed the HD DVD support I had in Ultra 7, which I used with my LG HD DVD/Blu-ray burner and my old Xbox USB HD DVD player too).

Make it a ~$20 independent 3D add-on and then you have a point [ooh, I can save $5 'til May 25 :sarcastic: gee, thanks!]. Until then it's $99 (or $94.95 for loyal saps) vs. $150-200 for a set-top player, and with the set-top route I now have a second Blu-ray/DVD player for another room or to give to a friend (the Blu-ray software on its own is useless to give to someone without a drive), and that's not even counting the free PS3 upgrade.

Also, can someone explain this statement:
"Blu-ray 3D video decoding solutions can be expected for ATI Radeon 5000-series graphics in the future."

Didn't Cyberlink already show their Blu-ray 3D solution on ATI hardware last year? So what's the issue?

Also, why is it limited to "GeForce 300M-series mobile graphics" when the core is often the same as the previous-generation 200M series (e.g., the GTS 350M / 250M)?

And this section: "Full-quality 120 Hz frame-sequential 3D video (such as Blu-ray 3D) is only supported through a High Speed HDMI cable to an HDMI 1.4-compliant TV." seems to miss the dual-link DVI-to-monitor option currently being used for 3D on PCs, as well as monitors/TVs with dual HDMI 1.3 inputs.
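
For reference, a rough sketch of the pixel-clock arithmetic behind that point; the blanking figures below are approximate CVT reduced-blanking values, so treat the output as a ballpark:

```python
# Ballpark pixel-clock math for 1080p at 120 Hz (frame-sequential 3D:
# 60 full frames per eye per second).

H_TOTAL, V_TOTAL = 2080, 1144   # 1920x1080 active + approximate blanking
REFRESH_HZ = 120

required_mhz = H_TOTAL * V_TOTAL * REFRESH_HZ / 1e6
print(f"Required pixel clock: ~{required_mhz:.0f} MHz")   # ~286 MHz

# Maximum pixel clock each link can carry:
links = {
    "Single-link DVI":         165,  # too slow for 1080p120
    "Dual-link DVI":           330,  # enough: how 120 Hz PC monitors do it
    "HDMI 1.3/1.4 High Speed": 340,  # enough: how HDMI 1.4 3D TVs do it
}
for name, limit_mhz in links.items():
    verdict = "OK" if limit_mhz >= required_mhz else "insufficient"
    print(f"{name}: {limit_mhz} MHz -> {verdict}")
```

Which is exactly why today's 120 Hz 3D PC monitors run over dual-link DVI, and why a single-link connection can't do the job.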

A nice little article for people unfamiliar with 3D, but there's a subtle undercurrent of product preference/placement in it, and far too many generalities with little supporting information. :??:
 

hixbot

Distinguished
Oct 29, 2007
818
0
18,990
Well done. I would have liked more detail on the HDMI 1.4 spec, specifically frame packing and the mandatory standards (there is no mandatory standard for 1080p60 frame packing).
Also, some info on AVRs and whether an HDMI 1.3 AVR might pass on 3D video and still decode bitstream audio, or not. Do we need HDMI 1.4 AVRs to decode audio from a 1.4 source? We shouldn't need 1.4 receivers, since the audio standards haven't changed, but my understanding is that in fact we do need new receivers. :/
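
For anyone curious what frame packing actually looks like on the wire, here's a small sketch of the geometry; the figures match what the publicly available HDMI 1.4 3D extraction document describes, but verify against that document before relying on them:

```python
# Sketch of HDMI 1.4 frame packing: left- and right-eye frames are stacked
# into one tall frame, separated by an "active space" gap equal to the 2D
# mode's vertical blanking. The pixel clock doubles versus the 2D mode.

def frame_packed(width, height, v_blank_2d, pixel_clock_2d_mhz):
    packed_height = height * 2 + v_blank_2d      # L eye + gap + R eye
    return width, packed_height, pixel_clock_2d_mhz * 2

# 1080p24 frame packing -> (1920, 2205) at 148.5 MHz (the 1080p60 2D clock)
print(frame_packed(1920, 1080, 45, 74.25))
# 720p60 frame packing -> (1280, 1470) at 148.5 MHz
print(frame_packed(1280, 720, 30, 74.25))
```

That doubled pixel clock is also why 1080p60 frame packing (~297 MHz) was left optional, which is the gap noted above.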
 

ArgleBargle

Distinguished
Jul 17, 2008
150
0
18,680
Unfortunately, for people with heavy vision impairment (astigmatism, etc.) that requires corrective lenses, such 3D technology is out of reach for the time being, or at least next to useless. Until some enterprising company comes out with 3D "goggles," people who wear corrective lenses might as well save their money.
 

boletus

Distinguished
Mar 19, 2010
69
0
18,630
3D is cool, and high-definition video is cool. But Sony's moving target of a BD standard is not cool, and Cyberlink's bait-and-switch tactics are not cool (unless you have bundles of money you can throw at them every 6-12 months). I sent back my BD disc drive (retail, with Cyberlink software) for a refund after finding out that I would have to shell out another $60-100 just so I could watch a two-year-old movie. As far as I'm concerned, high-definition disc video is dead until some more open standards and reliable software emerge.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
Great,

This piece is a prelude to tomorrow's coverage, by Don, of Blu-ray 3D on a notebook and a desktop. Perhaps it will answer some of the questions you were left with here?

As for AMD, Tom and I went back and forth on this piece, and we agreed that it was critical to get AMD's feedback on Blu-ray 3D readiness. The fact of the matter is that AMD isn't ready to discuss the technology. It's behind.

The mention of dual-link DVI was in the first revision of this piece and removed in a subsequent iteration. I've asked the author for additional clarification there and should have an answer shortly.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
So it turns out there were two sections on this and one was cut accidentally. Should be good to go now, though--dual-link DVI is discussed with PC displays!
 

cleeve

Illustrious
[citation][nom]TheGreatGrapeApe[/nom]Also, can someone explain this statement: "Blu-ray 3D video decoding solutions can be expected for ATI Radeon 5000-series graphics in the future." Didn't Cyberlink already show their Blu-ray 3D solution on ATI hardware last year? So what's the issue?[/citation]

It turns out the demo (I think it was at CES?) only used CPU decoding with an ATI graphics card handling output; the Radeon itself did no decoding.

The Cyberlink rep tells me that Blu-ray 3D software decoding is extremely CPU-dependent and might even require a quad-core CPU. He said all four threads were being stressed under software decoding; I'm not sure what quad-core CPU they were using, though.

Definitely something I'd like to test out in the future...
 

Alvin Smith

Distinguished
This was a very informative and well-written article BUT, I chose to skip to the last two pages ... Because ...

These implementations, while ever more impressive, are still being hashed out. Because of possible physiological side effects, I think I will NOT be a first adopter with this (particular) tech (3D).

Anyone ever watch that movie "THE JERK", with STEVE MARTIN ??

= Opti-Grab =

... I can see all these class-action suits by parents of cross-eyed gamers ... hope not, tho ... I *AM* very much looking forward to the fully refined "end game", for 3D ...

Additionally, the very best desktop workstations are only just now catching up to standard (uncompressed) HD-resolution ingest and edit/render ... since that bandwidth IS shared between both eyes, this may be a non-issue.

I will let the kiddies and 1st adopters take on all those risks and costs.

Please let me know when it is all "fully baked" and field tested!

= Alvin = (not to mention "affordable").
 

cyberlink

Distinguished
Jun 15, 2009
9
0
18,510
[citation][nom]cangelini[/nom]As for AMD, Tom and I went back and forth on this piece, and we agreed that it was critical to get AMD's feedback on Blu-ray 3D readiness. The fact of the matter is that it isn't ready to discuss the technology. [/citation]
While AMD has not yet announced its specific plans and schedule for supporting hardware-accelerated Blu-ray 3D MVC decoding on ATI graphics, it was willing to confirm that a solution is coming for Radeon 5000-series graphics.

Tom Vaughan
Cyberlink
 
TheGreatGrapeApe

[citation][nom]Cleeve[/nom]It turns out the demo (I think it was at CES?) only used CPU decoding with an ATI graphics card handling output; the Radeon itself did no decoding.[/citation]

Ah, that makes more sense (of what was trying to be said, not of ATI/AMD's method), which is à la AVIVO on the X1K series: make it 'sound' hardware accelerated. Brilliant! [:thegreatgrapeape:5]

So it's still available, just not hardware-assisted. It's not that it's impossible, as that statement would suggest; you just don't get any hardware benefit. Notice they kept the Intel portion separate, mentioning only dual-stream HD decoding (available since the Radeon HD 4600 series and GeForce 9600 series), implying it's doable on Intel but not on the next stated option until some point in the future. Not well written, if providing clarity is the goal. One would assume from the statement that A) Blu-ray 3D is not possible running on a new HD 5770 with a Core i7-920 through 980X, and B) that when it is 'made possible,' it will only be on the HD 5K series.

[citation][nom]Cleeve[/nom]The Cyberlink rep tells me that Blu-ray 3D software decoding is extremely CPU-dependent and might even require a quad-core CPU. He said all four threads were being stressed under software decoding; I'm not sure what quad-core CPU they were using, though. Definitely something I'd like to test out in the future...[/citation]

Yeah, it sorta gets back to the VC-1/H.264 decoding situation with the early generations of HD-acceleration GPUs.
Still unclear why it's nVidia G300M-centric, though, based on the relationship of the chips as stated above.

BTW, we need to get you some new projectors for a 1080p stereo projector setup. Isn't it tax return time? :whistle:
 

cyberlink

Distinguished
Jun 15, 2009
9
0
18,510
[citation][nom]Cleeve[/nom]The Cyberlink rep tells me that Blu-ray 3D software decoding is extremely CPU-dependant and might even require a quad-core CPU.[/citation]
To clarify: while it's possible to play Blu-ray 3D on a PC without video decoding acceleration (video decoding on your graphics processor), software decoding of Blu-ray 3D MVC takes most of the power of a quad-core CPU. GPU-accelerated decoding is really the way to go, if possible.
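
As a back-of-envelope illustration of why MVC software decode is so CPU-hungry (macroblock counts only; the real per-macroblock cost varies by title, so this is a sketch, not a benchmark):

```python
# Back-of-envelope on MVC decode load: the dependent (second-eye) view
# roughly doubles the H.264 decode work versus a plain Blu-ray stream.

MB = 16                                       # H.264 macroblock: 16x16 px
mbs_per_frame = (1920 // MB) * (1088 // MB)   # 1080p is coded as 1088 lines
FPS = 24

mono_mbs_per_sec = mbs_per_frame * FPS        # plain Blu-ray:    ~195,840
stereo_mbs_per_sec = mono_mbs_per_sec * 2     # Blu-ray 3D (MVC): ~391,680

print(f"2D decode:  {mono_mbs_per_sec:,} macroblocks/s")
print(f"MVC decode: {stereo_mbs_per_sec:,} macroblocks/s")
```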

Tom Vaughan
Cyberlink
 

cleeve

Illustrious
[citation][nom]TheGreatGrapeApe[/nom]So, it's still available, just not hardware assisted.[/citation]

Well, Grape, that's where things get interesting. It might be *possible*, but it can't be *available* until they develop something.

In Nvidia's case, they have their own 3D Vision infrastructure in place, so you plug in the 3D Vision gear and you're off to the races.

Radeons, on the other hand... I think it's safe to say they'll never be 3D Vision-compatible. So I can't think of any way AMD will be able to provide a full-resolution 3D solution, in the near future anyway. Maybe they'll someday be able to plug into 3D TVs and use the TVs' proprietary glasses, but for that they'd need HDMI 1.4, and I'm not sure the 5000 series can handle that with its current hardware.

There's a lot to talk about, but it's easier to direct you toward my article that's coming out tomorrow. Then we can chat. :D

Take care,

- Cleeve
 

geok1ng

Distinguished
Jun 25, 2008
111
0
18,690
[citation][nom]hixbot[/nom]Well done. I would have liked more detail on the HDMI 1.4 spec, specifically frame packing and the mandatory standards (there is no mandatory standard for 1080p60 frame packing). Also, some info on AVRs and whether an HDMI 1.3 AVR might pass on 3D video and still decode bitstream audio. Do we need HDMI 1.4 AVRs to decode audio from a 1.4 source? We shouldn't need 1.4 receivers, since the audio standards haven't changed, but my understanding is that in fact we do need new receivers.[/citation]

Great comments, but ATI is not showing up for the game. If a product is not on the shelves, it will not sell. It's as simple as that, as NVIDIA learned the hard way with Fermi.

The 3D modes are a lose-lose alternative: either an expensive display coupled with inexpensive glasses, or a mildly expensive display coupled with mildly expensive glasses.

No matter which way you go, you lose performance or resolution: single-link DVI and standard HDMI can't carry 3D at 1080p60 per eye. HDMI 1.4 was supposed to be the salvation of 3D, if one can accept 24 Hz signals...

DisplayPort would be the way to go, but TVs are HDMI's domain and will remain so for decades to come, thanks to HDMI audio.

The point is that I see more benefit from higher resolutions than from 3D, and there is no consumer-grade cable today that can deliver 1080p or higher at 60 Hz per eye in 3D. And even modest systems demand so much computational power that heat dissipation issues come into play, much like the performance-versus-heat war in graphics.

It would take a massive change in the way consumer-grade TVs and players are manufactured to bring the high-end visual experience of 3D images and 4K resolutions to the living room. There is no way to produce viable chips on 90 nm or bigger, hotter processes.
 

cyberlink

Distinguished
Jun 15, 2009
9
0
18,510
geok1ng - HDMI 1.4 supports 1080p60 stereoscopic video with frame packing. I'm not sure what you are referring to when you say "if one can accept 24 Hz signals". While the full spec is confidential and available only to HDMI adopters, you can go to HDMI's website and request a subset of the HDMI 1.4 spec. This "extraction" document provides all of the detail about the 3D modes.

hixbot - an HDMI 1.4 3D source (HTPC, Blu-ray player, or other device with HDMI 1.4 output) can choose to output any of several mandatory 3D video signal formats. If an HDMI 1.4 sink (a device with an HDMI input) signals that it supports 3D, it must support all of the mandatory 3D modes (and it can advertise support for additional modes).
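
For readers without the extraction document handy, here is the mandatory sink-side list as commonly reported; this is from memory rather than the confidential spec, so verify against the document itself:

```python
# Mandatory 3D formats a 3D-capable HDMI 1.4 sink must accept, as commonly
# reported from the public extraction document (not the confidential spec).

MANDATORY_3D_SINK_FORMATS = [
    ("1920x1080", 24, "frame packing"),  # film-rate content (Blu-ray 3D)
    ("1280x720",  60, "frame packing"),  # 60 Hz regions
    ("1280x720",  50, "frame packing"),  # 50 Hz regions
]

for resolution, hz, packing in MANDATORY_3D_SINK_FORMATS:
    print(f"{resolution} @ {hz} Hz, {packing}")

# Note what's absent: 1080p60 frame packing (~297 MHz) fits under the
# 340 MHz High Speed limit but is optional, which is the gap hixbot raises.
```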

Tom Vaughan
Cyberlink
 
Guest
It would be cool to be able to manually offset the depth of the video (the intensity of the 3D effect), too.
 

Guimar

Distinguished
Jan 16, 2009
4
0
18,510
Your 3D model of vision is incomplete: 3D can be sensed with one eye because of eye wobble. The effect is more subtle than the two-eye effect, but it is real, and it is unaccounted for by any system that relies on showing each eye a separate image. Since the wobble doesn't occur, objects that are relatively close to the viewer don't have the expected parallax, and the 3D illusion can break down, or the user gets headaches and eye strain.

Test this yourself: close one eye, and objects that are nearby still look 3D.
 
Guest
RE: "Unfortunately, consumer-grade head-mounted displays today are not capable of displaying a high-definition video signal."
eMagin has a 720p OLED 3D headset, but it is not yet available to consumers.
 