Video card for TV output

jwarren

Apr 3, 2011
I'd like to upgrade my old video card so it can send output to the TV (15-pin RGB/VGA or HDMI). What are my options? I assume I would also need to upgrade the sound card?

Dell Dimension E310
Slots Information :
Slot PCI : 32-bit [5.0v, 3.3v] - Empty
Slot PCI : 32-bit [5.0v, 3.3v] - Empty
Slot PCI-Express : x1 [3.3v] - Empty
Slot PCI-Express : x16 [3.3v] - In Use

On-Board Device Information :
Device : Intel Graphics Media Accelerator (Video)
Device : Intel PRO/100 VE Network Connections (Ethernet)
Device : High Definition Audio Controller (Sound)
Embedded Controller : No

Current card:
General Information :
Manufacturer : Intel
Model : Intel(R) 82915G/GV/910GL Express Chipset Family
Bus Type : PCI
Texture Memory : 118MB
Processor : Intel(R) 82915G/GV/910GL Express Chipset
Converter : Internal
Refresh Rate (min/max) : 60/75 Hz

Power supply:
Max output: 230 W
+5 V and +3.3 V combined max output: 108 W
 

rcschmie

Mar 6, 2011
I have the same problem and also need a response. I've actually installed two cards that claim to have dual-monitor capability, a Radeon (ATI) and an Nvidia, both with HDMI, DVI, and VGA outputs and 1 GB of DDR2 memory. Both my monitor and TV are new, both have multiple inputs, and both will accept 1080p input. No matter which video card output I use or what higher resolution I set, the output of the video card (in both cases) resets to 1024x768. This is the native resolution of the Samsung plasma TV, but the TV is supposed to accept a 1920x1080 input and downscale it. Does anyone know what the problem is, and what it takes to drive two display devices simultaneously at higher resolution?
 

MrJustinMr

Feb 8, 2013
Your plasma TV is most likely reporting 1024x768 as its maximum input resolution when the graphics card asks for its EDID capabilities. This isn't altogether silly, because the picture is nicer if the TV doesn't have to downscale, but if you want to run both monitors with the same image then, as you've recognised, you need to persuade your graphics card to use 1920x1080 for the TV.
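If you want to confirm that diagnosis, you can list the modes Windows actually offers for the TV's output. Below is a minimal C sketch (not from this thread, just an illustration) that walks the mode list with the Win32 EnumDisplaySettingsA call; the device name "\\.\DISPLAY2" is an assumption, so swap in whichever output your TV is on (EnumDisplayDevices will show the real names). If 1920x1080 never appears, the driver is trusting the TV's EDID.

/* Minimal sketch: print every mode the driver exposes for one output.
   "\\.\DISPLAY2" is a placeholder; pass NULL for the primary display. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA dm;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* If the TV's EDID tops out at 1024x768, 1920x1080 won't show up here. */
    while (EnumDisplaySettingsA("\\\\.\\DISPLAY2", i, &dm)) {
        printf("%lux%lu @ %lu Hz\n",
               (unsigned long)dm.dmPelsWidth,
               (unsigned long)dm.dmPelsHeight,
               (unsigned long)dm.dmDisplayFrequency);
        i++;
    }
    return 0;
}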
You might achieve this by getting your TV to change how it replies, perhaps by updating its firmware if that's possible, so that the higher resolutions are made available. I doubt this will be possible, though, as TVs are appliances and are rarely significantly upgradeable in the field.
Otherwise you can override the resolution and monitor timings in your graphics settings manually, forcing the higher resolution into the list of those allowed. This is done in different ways in different operating systems, and also depends on the graphics card driver. Many of Intel's on-board drivers, for instance, allow this right from their "Graphics and Media Control Panel", which installs as part of the driver package. I'd suggest you check the driver's possibilities for this before resorting to the suggestions below. Look in the driver's control panel for something like "Custom Resolutions", and work on the TV only to begin with (disconnect the other monitor) until you've established working values for it. Then add the other monitor back in and it should work alongside the TV.
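For what it's worth, the switch those control panels perform boils down to a standard Win32 call. Here's a rough C sketch using ChangeDisplaySettingsExA; again "\\.\DISPLAY2" is just a placeholder for the TV's output, and the call only succeeds if the driver already lists 1920x1080 for that output, which is exactly what the custom-resolution trick is meant to arrange.

/* Minimal sketch: ask Windows to switch one output to 1920x1080.
   Fails unless the driver already lists that mode for the output. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA dm;
    LONG rc;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = 1920;
    dm.dmPelsHeight = 1080;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT;

    rc = ChangeDisplaySettingsExA("\\\\.\\DISPLAY2", &dm, NULL,
                                  CDS_UPDATEREGISTRY, NULL);
    if (rc == DISP_CHANGE_SUCCESSFUL)
        printf("Switched to 1920x1080.\n");
    else
        printf("Driver refused the mode (code %ld).\n", rc);
    return 0;
}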
There are plenty of references on the web for how to do this. Here's a good example of how to do it without needing the card manufacturer's tools:
http://hardforum.com/showthread.php?t=1605511
This article seems to describe the same process in fewer words, though possibly with a bit less helpful info:
http://www.ehow.com/how_7649449_add-custom-resolution-ati.html

Best of luck,
Justin.
 