Can I use Motherboard VGA and Graphics Card will still perform?

Status
Not open for further replies.

FoxwoodForgemoor

Hi, I am making a new build, but I have a dilemma.

Can I use the VGA port on my motherboard while my graphics card still does the work through PCIe, so the renders go out through the motherboard's VGA port to my monitor? Or do I have to buy a DVI-to-VGA adapter that goes into the GFX card, so games render to my monitor that way? I don't know; I just bought some parts today and a £10 VGA cable 😛

Thanks!
 
You can only use either the onboard graphics (connected to the mobo) or the dedicated graphics card (in the PCIe slot). You cannot use both.

Most graphics cards will come with an adapter (HDMI to VGA, DVI to VGA, etc.).
 
You can use both if you are planning on a multi-display setup, yes. Otherwise I would just use your graphics card.
http://www.youtube.com/v/N8nMShuSfgI?hl=en_US&version=3
 


That depends on whether the mobo supports it. Typically you're going to be forced to use one or the other (in my experience).

For multi display I'd still only use the graphics card.
 


DVI to VGA adapters are cheap, like $5. Run to your local Best Buy and grab one when your card comes in.
 
An adapter for VGA? Typically they come with one, but it's hard to say; you can normally pick them up cheaply anyway. I was watching a Linus Tech Tips video, and I recall him saying that not all DVI connections are capable of converting back to VGA.

http://www.youtube.com/v/xMVDejZH4kw?hl=en_US&version=3
 



I agree, it's less of a hassle honestly.
 

Great Point! +1

@OP here is a list of the different interfaces for DVI. Regarding what bigshootr is talking about: if you refer to the link, you'll see that DVI-D interfaces lack the C1, C2, C3, and C4 pins. I may or may not be correct here, but I'm assuming those would be an example of DVI that cannot convert to VGA, because of the lack of those "analog" pins. So it's worth checking first.

Again, don't take what I'm saying as 100%; a little research would be healthy for the wallet 😉.

http://en.wikipedia.org/wiki/Digital_Visual_Interface

*edit* Just found this part: "A passive DVI-to-VGA adapter. Requires the analog signals provided by DVI-I or DVI-A."
 

To my understanding, yes. But I have zero "live" testing with it. Tbh, I haven't really paid much attention to it when I have used an adapter 😛

But yeah, the DVI-D interfaces are the ones missing the analog C1, C2, C3, and C4 pins.
 
Yeah, I gotcha. VGA has been heavily phased out anyway. I'm using a DisplayPort cable now and I like it a lot so far. With the lower signal quality of VGA, it just doesn't make sense anymore, really, other than to support legacy monitors.
 


Agreed. Love DisplayPort; I think it will eventually be to HDMI what DVI was to VGA. Just wish the monitors that support DisplayPort weren't so darn expensive!

Ah and just an fyi. Found in the reading.

DVI-A = Analog only
DVI-D = Digital only
DVI-I = Integrated, supports both (love how we can't just use this as the standard and save everyone money 😉)
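The list above boils down to one rule: a passive DVI-to-VGA adapter only works if the port carries the analog pins (C1-C4), which DVI-A and DVI-I have and DVI-D lacks. A quick illustrative sketch (mine, not from the thread) encoding that rule:

```python
# Which DVI variants carry the analog C1-C4 pins that a passive
# DVI->VGA adapter needs (per the Wikipedia article linked above).
DVI_HAS_ANALOG = {
    "DVI-A": True,   # analog only
    "DVI-D": False,  # digital only -- no analog pins
    "DVI-I": True,   # integrated: digital + analog
}

def passive_vga_adapter_works(connector: str) -> bool:
    """True if a cheap passive DVI->VGA adapter can work on this port."""
    return DVI_HAS_ANALOG[connector]

for c in sorted(DVI_HAS_ANALOG):
    verdict = "passive VGA adapter OK" if passive_vga_adapter_works(c) else "needs an active converter"
    print(f"{c}: {verdict}")
```

So before buying a $5 adapter, check which DVI variant your card's port actually is; a DVI-D port would need a (pricier) active converter instead.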
 
I really hope you're right about DisplayPort. It's just really great: no royalty cost on the cables, which leads to cheaper cables. And DVI-I is great, unless you're aiming for higher-resolution displays, where it won't work.
 


Is that a bandwidth limitation? Or is it because it is limited to RGB only?
I don't know too much about graphics, but I'm always learning.

I know DisplayPort's bandwidth is something ridiculous, like over 21 Gbit/s, whereas single-link DVI is more like 4.
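To put those numbers in context, here is a rough back-of-envelope calculation (my own illustration; the specific timing and link-rate figures are assumptions based on the common published specs, not from the thread): uncompressed data rate is roughly pixel clock times bits per pixel.

```python
def data_rate_gbps(pixel_clock_mhz: float, bits_per_pixel: int = 24) -> float:
    """Uncompressed video data rate in Gbit/s: pixel clock * bits per pixel."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

# 1080p60 commonly uses a 148.5 MHz pixel clock.
rate_1080p60 = data_rate_gbps(148.5)

SINGLE_LINK_DVI_GBPS = 3.96   # 165 MHz max pixel clock * 24 bpp
DP_1_2_GBPS = 17.28           # DisplayPort 1.2 effective rate (21.6 raw, 8b/10b coded)

print(f"1080p60 needs roughly {rate_1080p60:.2f} Gbit/s")
print(f"single-link DVI tops out around {SINGLE_LINK_DVI_GBPS} Gbit/s")
print(f"DisplayPort 1.2 carries about {DP_1_2_GBPS} Gbit/s of video data")
```

So 1080p60 just about fits in single-link DVI, which is why higher resolutions or refresh rates push you to dual-link DVI or DisplayPort.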
 