help me connect an HDTV to a Dual DVI video card

Status
Not open for further replies.

redheat

Distinguished
Jun 18, 2007
7
0
18,510
I guess that title can't get any more obvious :)

Hi Everyone

This is my first post around here so bear with me a little:

I have a Sony Bravia 32" HDTV, the V series, and here's a link to this TV set: http://www.sonystyle.com.hk/ss/product/tv/klv_32v200a_e.jsp

The video card I use is an XFX 8600 GT with dual DVI-I connections. Take a look at it here: http://www.xfxforce.com/web/product/listConfigurationDetails.jspa?series=GeForce%26trade%3B+8600&productConfigurationId=1062920

In addition, I have a ViewSonic VX2025wm 20.1" LCD monitor; take a look at it here: http://www.viewsonic.com/products/desktopdisplays/lcddisplays/xseries/vx2025wm/

And to connect the HDTV to my video card I bought this cable: http://www.amazon.com/Premium-DVI-I-Component-Video-Cable/dp/B000FZXKLQ/ref=sr_1_2/104-5296207-8730305?ie=UTF8&s=musical-instruments&qid=1182169453&sr=8-2

which is a DVI-I-to-Component cable.

My video card is installed on a Gigabyte GA-965P-DS3 (ver 3.3) motherboard. In addition, I have windows xp professional as my main operating system.

The problem is that I can't get my HDTV to work. Whether I connect the HDTV to my video card along with my monitor or connect it alone (I forgot to mention that my ViewSonic monitor is connected to the card through a DVI-I link), I don't get any picture on the HDTV no matter how hard I try. Just nothing. The screen remains black.

The HDTV set I got has the following connections on its back:

One HDMI port, which is already used by my PS3;
two component ports, which could be used by an HD DVD player and support HD resolutions up to 720p and 1080i;
two S-Video ports; and three composite jacks (red, yellow and white).

Now, from what you've read above, is there something I've done wrong? Did I get the wrong cable? Should I have gotten a DVI-to-Component adapter instead? Why isn't the HDTV showing anything at all? I don't know, and I'm about to go out of my skull over this. XFX support just told me "the card should be able to play them both", that is, the monitor and the HDTV should run simultaneously on the same card using the two DVI links with no problems at all.

Please, any help at all would be really appreciated.

regards
redheat
 

firemist

Distinguished
Oct 13, 2006
209
0
18,680
Your card's specs list "Dual Link DVI - Supporting digital output up to 2560x1600". I do not see analog or component support in the specs. Are you sure your card supports component?
 

TeraMedia

Distinguished
Jan 26, 2006
904
1
18,990
If you're anxious to get yourself up-and-running, go buy an HDMI switcher, plug your graphics card and PS3 into that, and then connect the output from the switcher into your TV's HDMI input. DVI-I to HDMI conversion is pretty simple (it's just a different connector configuration; the signals and conductor counts are the same, albeit of course your TV won't get sound from your computer).

For component, I'm more familiar with the ATi cards, which usually have a special S-Video-like connector to which you attach a component dongle, which then allows you to connect to an HDTV. That's a different set of technologies than a DVI-I or VGA (which = the analog portion of DVI-I) dongle which must supply the EDID (TV capability info) to your graphics card before your card will pump out a signal.

If you can keep it digital, do so. You'll avoid all of the headaches associated with timing, framing, alignment, etc.
 

redheat

Distinguished
Jun 18, 2007
7
0
18,510
Firemist and TeraMedia, thank you both a million times for pointing out where I went wrong, and for telling me what hardware to get for a digital signal.

Firemist, dude/dudette, you pointed out exactly where I went wrong when you said "I do not see analog or component support", and that's where the problem was. I thought that if I bought a DVI-I-to-Component cable I would simply get a digital signal; how stupid and ignorant that was. First off, here's a little tutorial on what DVI is all about: http://www.pacificcable.com/DVI_Tutorial.htm

It's really very simple: my video card sends out a digital signal; they even included a DVI-to-VGA adapter in the same retail box. Say I had an old VGA monitor, a beige box as people loved to call them, with nothing but a VGA connection. I would have to hook the VGA cable to the DVI-to-VGA adapter to connect the beige box to my computer, or I wouldn't get a signal at all.

The same thing applies in the case of a DVI-to-Component cable. First of all, if you've read the tutorial above on DVI, you'll notice that DVI has several variations: DVI-I works both ways, analog and digital; DVI-D is strictly digital; and DVI-A is analog only. So if I want to connect an analog device to my video card, I need an adapter; I can't just plug in the cable, like I stupidly did. And even then, that device will still be seen as an analog device, since that's what it actually is.
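
The three DVI variants and what they carry can be summarized in a quick sketch (the function and names here are just illustrative; the capability table follows the tutorial linked above):

```python
# Signal types each DVI connector variant carries:
# DVI-I = integrated (both), DVI-D = digital only, DVI-A = analog only.
DVI_VARIANTS = {
    "DVI-I": {"digital", "analog"},
    "DVI-D": {"digital"},
    "DVI-A": {"analog"},
}

def can_connect(connector: str, signal: str) -> bool:
    """Return True if the given DVI variant carries the given signal type."""
    return signal in DVI_VARIANTS.get(connector, set())

print(can_connect("DVI-I", "analog"))   # True - a DVI-I port can feed a VGA adapter
print(can_connect("DVI-D", "analog"))   # False - a DVI-D port has no analog pins
```

This is why the DVI-to-VGA adapter in the retail box works at all: the card's DVI-I port carries the analog pins the adapter taps into.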

But in my case, I wanted to connect an HDTV set, which has many different connections that, as Firemist pointed out above, don't all work with a PC. Let me elaborate on this. But first, guys/gals, visit this link to learn what kind of connections to expect on the back of your current or future HDTV set: http://hdtv-basics.classes.cnet.com/lesson-2/lesson-2-page-1/

First of all, my HDTV has the following connections:
1. A VGA connection, also known as D-sub, which is the connector we all had on the back of our older monitors, the beige boxes. This can easily be hooked to my video card using a DVI-I-to-VGA adapter, but in that case the TV will be seen as an analog monitor and you won't be able to use any HDTV features from your PC.

2. HDMI: the latest of the latest when it comes to digital audio and video connections. Some new monitors are coming out with HDMI ports on their backs, as are many new video cards, with support for HDCP; and you have to watch for that little name, because without HDCP you won't get HD. Read about HDCP here: http://www.digitalconnection.com/FAQ/HDTV_12.asp

3. My HDTV also has two component inputs, that is, the red, green and blue connectors on the back. Component is an analog connection, but it can support high definition, and this is where I went wrong, as Firemist cleared up above: "I do not see analog or component support." Component can carry HD if the connected device, such as an HD DVD or Blu-ray player or an HD receiver, can send HD signals over an analog connection. For example, this HD DVD player from Samsung: http://www.samsung.com/products/dvdplayer/hidefconversion/index.asp
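
As a toy illustration of the "analog can still be HD" point (the format limits below are just the ones stated for this particular TV's component inputs, not a general rule):

```python
# Formats this set's component inputs accept, per the post above:
# standard definition plus HD up to 720p and 1080i, but not 1080p.
SD_FORMATS = {"480i", "480p", "576i", "576p"}
COMPONENT_HD = {"720p", "1080i"}

def component_accepts(fmt: str) -> bool:
    """True if this TV's component input can display the given format."""
    return fmt in SD_FORMATS or fmt in COMPONENT_HD

print(component_accepts("1080i"))  # True - HD over an analog connection
print(component_accepts("1080p"))  # False - needs HDMI on this set
```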

4. Then there are about four composite connections on my HDTV set. Composite is the yellow, white and red set of plugs we've been using for the past two decades or so to connect our VCRs. Does anyone even remember what a VCR is? Wow, we're getting old... video cassette recorders, the VHS decks, the 1980s, bad hair days... doesn't ring a bell? Anyhow, these are still around for compatibility only, since hardly anyone uses them anymore.

5. And finally I have two S-Video connections: a step up from composite, but still standard definition only.

So back to my problem: how do I connect my video card to my HDTV? I need a digital connection on both ends, which means I should buy a DVI-I-to-HDMI cable and plug it in. But wait: my one and only HDMI port on the back of the TV is already occupied by my "invincible" PS3, so what do I do? Thank you, TeraMedia, for pointing out what to buy: an HDMI switch, sort of an HDMI router that lets you connect several HDMI devices to a single HDTV input. You can see one here: http://www.amazon.com/HM-501-5-PORT-HDMI-SWITCH-control/dp/B000FHQMFK

Dudes/Dudettes, thank you all for your help. Here's a bunch of really cool gadget websites to chill out with:

www.gizmodo.com
www.x2vga.com
http://crave.cnet.com/
http://www.engadget.com/

For those of you who want to run your applications on the go, plug a flash drive into any PC, and use your data rather than starting from scratch, check out these two portable-app sites:

http://portableapps.com/
http://www.ceedo.com/

And for the inner geek in most of us, go here:

http://www.thinkgeek.com/

And if you love Linux and want the latest news on packages and distros, go here:

http://distrowatch.com/

again thank you all

regards
redheat
 

Bruxbox

Distinguished
Apr 3, 2001
185
0
18,680
First, I'd recommend that you test the Bravia by connecting the PC to the HDMI input on the Bravia. This would involve disconnecting the PS3 temporarily. You will need a DVI-to-HDMI adapter or a DVI-to-HDMI cable.

You want to confirm that the Bravia will receive signals from the PC via the HDMI connection. If not, we know that you can only connect the PC to the Bravia using the VGA connection. The Bravia manual recommends connecting the PC via the VGA. You will need a DVI to VGA adapter for the graphics card. Most people can't tell the difference between a VGA connection and a DVI connection.

If the Bravia will receive signals from the PC via HDMI, you may want to keep that connection for the PC and connect the PS3 to the Bravia using the component connection instead. You may not notice a difference in quality.
 

redheat

Distinguished
Jun 18, 2007
7
0
18,510
BruxBox, the card is designed to deliver an HDTV signal; it has an "HDTV ready" logo on it. I guess they kept the VGA port on the back of the TV because, at the time this model was made (around mid-2006), cards with HDMI or HDTV support weren't even out yet. With the new breed of 8xxx cards coming out with an HDMI port instead of that ill-fated S-Video port, things are bound to change.

I was thinking of something else: what if the only cable available is DVI-D-to-HDMI, I mean not your regular DVI-I (which is dual link) but DVI-D-to-HDMI? Then I'd be left with no choice but to stick with a DVI-to-VGA adapter, and that's it.

regards
 

TeraMedia

Distinguished
Jan 26, 2006
904
1
18,990
I am using DVI-I to HDMI on my Sony. Different model, different vintage (see below), but it works just fine... most of the time. The catch is that when the computer boots, the boot screen and startup is at standard VGA resolution, which the TV doesn't seem to support. Either that or the boot sequence doesn't support rendering at the screen's supported resolutions. So when things start up, I get a black screen for a few moments, and then I get the "Windows Loading..." screen. I get the RAID config screen, but I do not get the Intel boot splashscreen.

If I connect the VGA connector and tune to that input on the TV, I then get the missing boot screens. I seem to remember something about the VGA 640 x 480 resolution being "implied" for one of the two digital connector types, so I don't know if that is perhaps the root cause of the missing-bootscreen problem. But I don't care, because everything works fine in normal operation anyway.

If the OP cares, he can connect VGA, or even connect both. But I still recommend using the DVI-HDMI option for normal operation, as it seems to work better - at least on my TV. The VGA option seems to have a timing issue on my TV, as part of the image is off-screen. Also, many VGA connectors on older HDTVs only support 1280 x 1024, which isn't very useful. The OP's original post didn't mention that he had VGA, so I didn't go into detail on any of this in previous responses.

OP, if you need to see the boot screens, you should be able to do that with straight composite / S-Video, or with an analog VGA connection. Your GPU should be able to generate those during the boot sequence, in addition to generating DVI-I output. I fully agree that you should test out the DVI-HDMI connection before you buy an HDMI switch, in case it doesn't work. But it should work, given the way the specs are written. You might need to connect first with VGA (with DVI-I also connected), run in dual mode, and then configure your DVI-HDMI connection while you can still see the desktop through the VGA, S-Video, or composite connection. For example, if your TV isn't passing the right EDID info to your GPU, the GPU might not show the right set of available resolutions, or might be configured to an incompatible resolution, until you set or force it as necessary.
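
For context on the EDID handshake mentioned above: a display advertises its capabilities in a 128-byte base block that the graphics driver reads over the cable. A minimal sketch of the sanity check a driver performs (the `fake` block below is a hypothetical stand-in, not data from this TV):

```python
# Every EDID base block starts with this fixed 8-byte header.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def looks_like_edid(block: bytes) -> bool:
    """A valid EDID base block is 128 bytes, starts with the fixed header,
    and all 128 bytes sum to zero modulo 256 (the checksum rule)."""
    return (len(block) == 128
            and block[:8] == EDID_HEADER
            and sum(block) % 256 == 0)

# Hypothetical block: correct header, zero padding, and a final checksum
# byte chosen so the total is a multiple of 256.
fake = bytearray(128)
fake[:8] = EDID_HEADER
fake[127] = (256 - sum(fake) % 256) % 256
print(looks_like_edid(bytes(fake)))  # True
```

If a TV hands back a block that fails checks like these (or nothing at all), the GPU has no reliable list of supported modes, which is exactly the situation where you have to force a resolution manually.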
 

Bruxbox

Distinguished
Apr 3, 2001
185
0
18,680
It may be true that your graphics card is designed to deliver HDTV to an HDTV monitor. However, your Sony Bravia may not be designed to accept an HDMI connection from a PC.

I could not download a copy of the owner's manual for your model Bravia, as it appears to be a model intended for Hong Kong; I could only locate a copy in Chinese. However, I was able to find an owner's manual for a US Bravia model that seems to be like yours. It stated that a PC should be connected with VGA. But that does not necessarily mean the PC cannot be connected with HDMI.

Anyway, some brands of LCD HD monitors say that the PC should be connected via the VGA connection. My Olevia LCD monitor manual says to use a VGA connection. However, these models can work with a PC-to-HDMI connection as well. My Olevia works with a DVI connection to my PC.

I suspect that Sony and other manufacturers don't want to support HDMI and DVI connections as it does require some fine tuning by the end user. If you are an ordinary end user who is not very experienced, making an HDMI to PC connection may give you some problems. Sony may not want to give tech support to thousands of end users wanting PC to HDMI connections.

I still recommend that you test out the HDMI connection. Make sure that your PC graphics card Display settings are set at the native resolution of the Bravia and that the screen refresh rate is at 60 hertz, which is standard for LCD panels.

If you can get the Bravia to connect via HDMI, you should be pleased with the results. Otherwise, you should still be pleased with a VGA connection, I'm sure.

Again, if the Bravia to PC will work with the HDMI, then you will have good options. The PS3 should look pretty good with a component connection as well.
 

redheat

Distinguished
Jun 18, 2007
7
0
18,510

Touché... Absolutely, an HDMI-to-DVI cable will work, and for that boot screen I've got one word, actually two: "screw it." I had the same thing when I used to connect my Bravia to the PC using the VGA connection: it played the boot sequence in a mode where the screen got really small, and the moment the Windows screen loaded, the whole thing went back to normal at a resolution up to 1280x1024. I'm sure it's going to work fine; I'm just waiting for the cable to be sent over from Amazon. The TV is almost blind to what gets connected to it. Think of it this way: how can a digital playback device like an HD DVD player connect to the HDTV over a component (i.e., analog) connection, yet play HD within a certain range, up to 1080i (though I think 1080p definitely needs an HDMI connection)? You've got something digital over an analog connection, and it provides resolutions almost identical to those of a digital connection. Most HD DVD players today offer a component output.

Anyway, right on the money, TeraMedia; the cable will work fine. I'm just a little hesitant when it comes to buying the HDMI switch, since it's so friggin' expensive. You should read this article on Gizmodo about the ripoffs on HDMI-related merchandise: http://gizmodo.com/gadgets/home-entertainment/hdmi-ripoffs-continue-89-hdmi-switcher-selling-for-350-239589.php

regards
redheat
 

redheat

Distinguished
Jun 18, 2007
7
0
18,510

BruxBox, you're putting too much emphasis on the VGA port vs. the HDMI port regarding PC connectivity. The TV doesn't care what kind of device is connected to it, be it a PS3, a PC, an HD DVD player, or a VHS deck, as long as the device can output a supported video resolution and can identify itself. For example, Sony let the PS3, a console, be turned into a PC through Yellow Dog Linux. The TV didn't mind that at all, though there was an overscan problem (a black border surrounding the YDL desktop on all sides) caused by a programming issue within YDL itself, not the TV.

The same thing applies here: if my video card lives up to its promise of delivering a fault-free HD signal up to 720p, it will play HD smoothly and trouble-free. Converting between analog and digital is not a problem for either the TV or the video card in my PC. As I explained above in my reply to TeraMedia, if you connect an HD DVD player to the TV using component cables, an analog connection carrying what started as a digital signal, and it still delivers HD quality, that should be an indication that the TV is literally blind to which device is connected to it. When I connected the PS3 for the first time, the TV scanned the port and chose the right mode for me, because the PS3's HDMI output was capable of identifying itself correctly. The same applies to the video card: if it is fault-free, which I know it is, it should identify itself much as my PS3 did, as an HD-capable device that can deliver up to 720p. Not a big deal, really. So it is not "PC" vs. "other HD devices", you know what I'm saying? Think of it like this: video card = HD DVD player = PS3 = whatever out there is capable of playing HD.

regards
redheat
 

xstec

Distinguished
Oct 26, 2006
34
0
18,530
Hi Folks.

Sorry to hijack this thread, but I'm sure it's better than creating a new one.

I have just bought a 42" NEC plasma. It has an HDMI cable coming from the monitor. I'm assuming it's a monitor, as it doesn't have a TV tuner, lol; just an HDMI cable from what I can see.

Now, my graphics card is a 9600 Pro. Will the converter below work to hook my computer up to the monitor?

http://www.maplin.co.uk/hdmi

Thanks in advance.
 

redheat

Distinguished
Jun 18, 2007
7
0
18,510

xstec, can you do me a favor? Take a look at the back of your video card and see what type of DVI it has: DVI-I, DVI-D, or DVI-A. DVI-I works both analog and digital, so you can connect one end of a DVI-D cable to your video card and the other end to your DVI-to-HDMI adapter, then just run an HDMI-to-HDMI cable from the adapter, and that's it. Here's a tutorial on what DVI is all about:

http://www.datapro.net/techinfo/dvi_info.html

regards
redheat
 

xstec

Distinguished
Oct 26, 2006
34
0
18,530
Hi redheat, I'm at work at the moment, so I've only been able to find a picture of my card.


I will read the DVI tutorial, and i hope this will clear things up.

Thanks for the help.

Ste

EDIT: After more searching I have found that it is a DVI-I interface.
EDIT x2: That DVI guide was a great read. I'm sure all I need is the converter and my shiny new monitor will come alive.
 