hooking up 1080p tv to video card

quanger

Distinguished
Jun 6, 2005
164
0
18,680
my friend has a Zotac AMP! 8800GTS and he was wondering if his video card supports 1080p. His package did not come with a DVI->HDMI adapter like my 3870X2 did; his came with a composite cable instead. I am assuming his only option is to connect his HDTV with the composite cable? Is 1080p even possible over composite? I cannot actually find any specs saying the video card supports 1080p TV-out. Can anyone confirm this?

my 3870X2 came with a DVI->HDMI adapter, so I just plug that into a 1080p TV and I'm good to go? I am 100% sure that video card supports 1080p.

I think they should call 1080p SuperHighDefinitionTV or something. I find this very confusing.
 

KyleSTL

Distinguished
Aug 17, 2007
1,678
0
19,790
A DVI->HDMI converter should work. They are electrically identical signals in a different interconnect. I'm using a DVI->HDMI adapter on my computer (Athlon XP 2400+, 1GB, FX5200) to display 1920x1080 on my Panasonic TH-42PZ700U (42" 1080p plasma).

1080p is considered "Full HD", BTW. I know the naming convention sucks, but it's pretty much set in stone, so we just have to deal with how stupid it is.
 

gwolfman

Distinguished
Jan 31, 2007
782
0
18,980
His probably came with "component" cables, which are red, green, and blue (composite is yellow, with white/red for audio). Component can run 1080i, and possibly 1080p depending on the card and TV capabilities. If he wants to use DVI or HDMI because of HDCP, then do what ^^^^^^ (the person above) mentioned: get a $5 DVI->HDMI converter or a DVI->HDMI cable.

Something like this:
http://www.newegg.com/Product/Product.aspx?Item=N82E16812226015


OR

http://www.newegg.com/Product/Product.aspx?Item=N82E16812226013


For the adapter: Make sure the DVI=male and HDMI=female
For the cable: Make sure the DVI and HDMI connectors are male

Good luck
 

stoner133

Distinguished
Mar 11, 2008
583
0
18,990
DVI to HDMI is a pure digital connection; the only difference is that HDMI to HDMI also carries digital audio.
DVI = Digital Visual Interface.

DVI to HDMI is how I connect my video card to my HDTV and it looks fantastic.
 

San Pedro

Distinguished
Jul 16, 2007
1,286
12
19,295
The TV might have a VGA or DVI input, which would be a great way to hook the PC up to the TV. This is the way I do it, and I really like using the PC to watch video files because it upscales them very well, much better than my roommate's Xbox 360 or a regular DVD player.
 

aylafan

Distinguished
Feb 24, 2006
540
1
19,165
Only the ATI HD 3800 series has a true "HDMI" converter (integrated HD audio controller with multi-channel (5.1) AC3 support and 1080p display). Nvidia can probably only do the 1080p display, without the 5.1 audio.

However, if you use a VGA to DVI converter or vice versa, you will get a resolution around 1360x768, not 1920x1080. I haven't tried component cables yet, but they worked for my friend's plasma TV at 1080p.

P.S. I have an ATI HD 3850 video card with HDMI hooked up to a 52-inch Sharp Aquos widescreen LCD TV.
 


DVI->HDMI can include audio too, as demonstrated by the ATI HD series, just not on most GF8/9 series cards.

 


Neither DVI nor VGA is limited to that resolution (I don't even know where that weird resolution comes from [most are either 1280x720 or 1366x768; 1436x736 is off-ratio, neither 16:10 nor 16:9, and not even 1.85:1]).

It's probably a default setting of his monitor or card if anything.

An HDCP requirement wouldn't down-convert to that either.
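As a quick back-of-envelope check on the resolutions being thrown around in this thread, here's a tiny sketch. It's plain width/height arithmetic only; nothing here is card- or TV-specific:

```python
# Aspect-ratio check: which of these common "HD" modes are actually near 16:9?

def ratio(width, height):
    """Aspect ratio as a float, e.g. 1920x1080 -> ~1.7778 (16:9)."""
    return width / height

for w, h in [(1920, 1080), (1366, 768), (1360, 768), (1280, 720)]:
    print(f"{w}x{h}: {ratio(w, h):.4f}  (16:9 is {16/9:.4f})")
```

The output shows 1366x768 and 1360x768 are both within about half a percent of 16:9, which is why they turn up as "720p-class" PC modes on HDTVs.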
 

aylafan

Distinguished
Feb 24, 2006
540
1
19,165


OK, to make you happy: I tested the DVI to VGA converter and I got 1360x768, and that is the highest resolution I can achieve. I was just too lazy to list the exact resolution and test the converter before. So I do believe you are limited by DVI and VGA.

I have the hardware to test this out. Do you?
 
Sounds like something to do with (or wrong with) the interface/converter, because I can output 1080p from both to various displays; either that, or an artificial driver limitation.
It's not the output itself. It may, however, be the drivers, as detailed by Cleeve in his review: the drivers on the HD2400P defaulted to a 1440x? resolution, but it's not a limitation of the interfaces;
http://www.tomshardware.com/2007/10/26/avivo_hd_vs_purevideo/page7.html
 

aylafan

Distinguished
Feb 24, 2006
540
1
19,165


I don't think it's the drivers. I'm using the latest ATI Catalyst 8.3 drivers, and I have an ATI HD 3850, not a low-end budget card like the HD 2400.

It could be my TV's limitation on the VGA input, but the HDMI input can display 1920x1080 for me. However, there is no DVI limitation; just VGA, I believe.
 
OK, you completely changed your post, so here's a second reply; the reply above was to your original post.

 


Yeah, and that could be the case too, as it's not the output that's the limiting factor; the TV may treat all PC inputs the same based on how they achieve their '1080p' image.

If it's displaying at 1366x768, it's probably only a 720p panel like the Aquos 43U; otherwise they're messing with your image quality twice: once by limiting your resolution, and again by interpolating from the down-converted resolution up to full 1080p.

Either way, that would suck. It sounds like a non-1920x1080 monitor, though, if it's pushing you to 720p.
 


aylafan

Distinguished
Feb 24, 2006
540
1
19,165


Ummm... no. It only pushes me to 1360x768 when I use the DVI to VGA converter.

It works perfectly fine with an HDMI cable at 1920x1080.

I have one of the newest Sharp Aquos LCD TV models. You can search for it online: LC-52D64U http://www.sharpusa.com/products/ModelLanding/0,1058,1920,00.html
 

KyleSTL

Distinguished
Aug 17, 2007
1,678
0
19,790
I know my TV (Panasonic TH-42PZ700U, 1080p) is perfect if you use DVI at 1920x1080. The limitation of the screen is 1280x1024 if you use the 15-pin VGA connector. This is a limitation of the SCREEN, not any other hardware. That being said, I'm going to side with GGA, not only because of personal experience, but because I believe he knows his stuff.
Edit:

Only if you mean the VGA input of the screen, not the VGA interface itself. Its (the VGA specification's) limit is 2048x1536.
 
Still no VGA limit, though. VGA goes HIGHER than single-link DVI; it goes almost as high as dual-link DVI. VGA running off the 400MHz RAMDACs found on pretty much all cards since the Radeon 8500 and GF6 series (some FX, not all) will output 2048x1536 @ 75Hz in 32-bit color, or 2560x1600 @ 60Hz/32-bit.

I have a feeling it's taking a higher input over HDMI because it's set to down-convert automatically (like my projector, which prefers 1080p input rather than its native 720p) because it prefers having more info when converting down to 736p.
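The RAMDAC claim above is easy to sanity-check with rough arithmetic. This sketch assumes a blanking overhead of roughly 25% horizontal and 5% vertical (real GTF/CVT timings vary slightly per mode), and compares the resulting pixel clock against a 400 MHz RAMDAC:

```python
# Rough pixel-clock check: does a 400 MHz RAMDAC cover these analog (VGA) modes?
# The blanking factors are assumptions approximating GTF-style timings.

def pixel_clock_mhz(width, height, refresh_hz, h_blank=1.25, v_blank=1.05):
    """Approximate pixel clock in MHz, including blanking intervals."""
    return width * h_blank * height * v_blank * refresh_hz / 1e6

for w, h, hz in [(1920, 1080, 60), (2048, 1536, 75), (2560, 1600, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= 400 else "exceeds"
    print(f"{w}x{h}@{hz}: ~{clk:.0f} MHz -> {verdict} a 400 MHz RAMDAC")
```

Even 2560x1600 @ 60Hz lands around 320 MHz with those assumptions, comfortably under 400 MHz, which is consistent with the post's point that the analog output itself isn't what caps you at 1360x768.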
 

aylafan

Distinguished
Feb 24, 2006
540
1
19,165


OK, you guys are probably right. Maybe my hardware just doesn't like each other, but I'm telling you this because I don't want the person who posted this topic to feel cheated if he used a DVI to VGA output instead of HDMI and ended up with a lower resolution.

But back to the main topic...

I think component cables should work fine at 1920x1080.
GeForce cards can probably display 1080p, but with no audio through HDMI.
The ATI 3800 series can display 1080p and output audio through HDMI.

Like the guys mentioned before, it all depends on the setup.

Thanks for making this clear for me. I never did understand why the DVI to VGA output gave me a lower resolution than the HDMI output.
 

vic20

Distinguished
Jul 11, 2006
443
0
18,790


I don't think component has the bandwidth for 1080p, only 1080i?

I haven't gotten component to do 1080p from a PC or a PS3. Then again, it could be a limitation of whatever chip they use for component output rather than a cable bandwidth issue.

I am using VGA to run 1080p to my LCD TV because the video drops to 1360x768 if I use a DVI to HDMI adapter. No idea if it's my TV or the video driver limiting it.
 
Looking at their specs, it is a full 1080 panel (no mention of where/why/how for PC support, though), so it's screwed up that they're limiting the VGA input to a non-native resolution. It makes no sense other than to avoid signal noise over longer cables, but for a panel like that I would hope the manufacturer wouldn't dumb things down to the lowest common denominator just to protect less savvy users.

Oh well, a weird property of the panel, but it's not the VGA; it's the panel that's limiting it in that case. I'm pushing 1920x1440 @ 75Hz right now on my P260 at work, over both DVI-A and VGA/DB-15.
 

aylafan

Distinguished
Feb 24, 2006
540
1
19,165
I guess it all depends on the TV screen, the video card, and the type of input you use. It seems like everyone has different experiences with different setups: HDMI, VGA, DVI, and component cables.

I paid $2200 for this LCD TV and it won't do VGA at 1080p, but it will do HDMI at 1080p just fine. Sigh.
 


That's correct depending on how it's used: most component out is YPbPr, and the norm for those TVs is 1080i due to signal quality over longer cable lengths.
You could push 1080p over YPbPr component, but it requires a TV that supports it (similar to the issue aylafan has with VGA) and also good cable quality, because even at 6ft it is very susceptible to line noise.
The most common way to push high resolution/bandwidth over component is the RGB format, which has higher tolerances and doesn't rely on timing differences like YPbPr; it essentially mimics BNC connections, just using RCA jacks. There really isn't a hard limit to these setups, and under spec they can theoretically carry more info than HDMI 1.3 deep color (edit: which is a known spec; the cable is limited around that, though I'm not sure of the HDMI wire spec), but you need support from the panel (and from the output, of course).

That's the biggest problem with these things: the physical and spec limitations aren't always the limiting factor. It's the artificially imposed stuff (TV support, driver support, app support, crappy DRM, etc.) that usually limits this.

For HD, though, the biggest limiting factor (other than the artificial ones) is single-link DVI, and even that usually isn't an issue, since most of the TVs that accept DVI input allow for reduced blanking as well.

Anywhooo, the main thing for most people: check the specs and support for your TV. You are less likely to be limited by your PC than by your TV, although annoying things like the overlay changes (mainly for windowed/dual screens), the pixel scaling issue, and overscan are often a driver/PC limit, so if those come into play (more for off-resolution TVs) then you should look into that too.
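The single-link DVI point is easy to sanity-check with a little arithmetic. Single-link TMDS tops out at a 165 MHz pixel clock; the CEA-861 1080p60 timing uses a 2200x1125 total raster, and CVT reduced blanking uses roughly 2080x1111. Those timing totals are standard figures; the script itself is just multiplication:

```python
# Does 1080p60 fit within the 165 MHz single-link DVI pixel-clock ceiling?

DVI_SINGLE_LINK_MHZ = 165

def mode_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock for a mode, given its total (active + blanking) raster."""
    return h_total * v_total * refresh_hz / 1e6

modes = [
    ("standard blanking (2200x1125)", mode_clock_mhz(2200, 1125, 60)),
    ("reduced blanking (2080x1111)", mode_clock_mhz(2080, 1111, 60)),
]
for name, clk in modes:
    ok = "OK" if clk <= DVI_SINGLE_LINK_MHZ else "too fast"
    print(f"1080p60 {name}: {clk:.1f} MHz -> {ok} on single-link DVI")
```

Interestingly, 1080p60 squeaks under the 165 MHz cap even with standard blanking (148.5 MHz); reduced blanking mostly buys extra headroom for marginal receivers or higher refresh rates.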
 

San Pedro

Distinguished
Jul 16, 2007
1,286
12
19,295
Um, I use a DVI to VGA adapter from a 6800GT to a 42" Samsung LCD and I have no problem displaying 1920x1080. I don't know why you would say that's not possible.