Converting DVI to VGA Problem

Seyed Sadra

Prominent
Apr 14, 2017
Hello guys! I recently bought an Asus ROG STRIX-GTX1050TI-4G-GAMING graphics card for my desktop PC because my previous one was too slow. My monitor is a Samsung 1360x768 monitor that only has a VGA port, but the problem is that my new graphics card doesn't have a VGA port. It only has HDMI, DVI, and something else called DisplayPort. I wanted to buy a DVI to VGA converter, but I heard that if I use this converter, I might lose resolution. Please tell me what I should do. Will I really lose resolution and quality if I use a converter?

PS: I can't afford a new monitor at this time. Please don't suggest that I buy another monitor.
 
Solution


A user review:
"Using this adapter to connect my new Nvidia 1050 Ti video card to my NEC EA261WM monitor I was only able to achieve 1680x1050 maximum with 1280x1024 native"
Since your monitor is only 1360x768, I don't think you'll experience any issues.
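
To put rough numbers on that, here's a quick back-of-the-envelope check. It's only a sketch: the ~25% blanking overhead and 60 Hz refresh are typical assumptions, not exact CVT timings.

```python
# Rough pixel-clock estimate for an analogue VGA mode (ballpark only).
# Assumes ~25% blanking overhead and a 60 Hz refresh; real CVT/GTF
# timings differ a little, but the comparison still holds.

def approx_pixel_clock_mhz(width, height, refresh_hz=60, blanking=1.25):
    """Approximate pixel clock in MHz for a given video mode."""
    return width * height * blanking * refresh_hz / 1e6

for width, height in [(1360, 768), (1680, 1050)]:
    mhz = approx_pixel_clock_mhz(width, height)
    print(f"{width}x{height}@60 needs roughly {mhz:.0f} MHz")

# Approximate output:
#   1360x768@60 needs roughly 78 MHz
#   1680x1050@60 needs roughly 132 MHz
```

The reviewer's adapter already drove 1680x1050, which needs a noticeably higher pixel clock than 1360x768, so your native mode should be comfortably within its range.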

Seyed Sadra

Prominent
Apr 14, 2017

Are you sure that I won't lose quality and resolution with this converter?

 
''Are you sure that I won't lose quality and resolution with this converter?''

That's the chance you take until you get a monitor these cards support. A lot of guys overlook that and find these newer cards just aren't compatible with their good, solid older monitors that need analog support.

That's the reason I passed on a 10 series and got a 900 series; your monitor would have worked natively on those or older series.

Today you end up supporting them, because they're not supporting you and your needs. $$$$$

 


Like I said, whether it does or not, you're kind of stuck using it, one like it, or a new monitor.

Like I said, my 900 series card and below work with my old 1990s CRT VGA monitors right up to my newest flat screen, NATIVE, with no added cost and no good hardware left unusable. Out of the box it works as needed and expected. It also works with XP and up; can't even say that for a 10 series or AMD's latest. It fully covered ALL my needs.

See how much that card upgrade costs now? It's more than just the price of the card to get it working.

Thumbs-down list (not just Asus):

''•DVI output no longer includes analog VGA signals ''

[page 3, ''The Card'']

''Display connectivity options include two DVI ports, one HDMI port, and one DisplayPort. Unlike previous NVIDIA cards, the DVI port no longer includes an analog signal, so you'll have to use an active adapter.''

https://www.techpowerup.com/reviews/ASUS/GTX_1050_Ti_Strix_OC/34.html
 

USAFRet

Titan
Moderator


I'm not sure what the point of this rant is.
Not everything is automagically compatible with everything else, forward and/or backwards.

At some point...to use a new item...you have to upgrade other stuff.
Or, you could hang on to the old stuff forever, until it falls apart.

Oh noes...my 15 year old monitor is not compatible with the new GPU.
Or, Oh noes...I have to get new RAM to go with the new motherboard to go with the new CPU.

Times change.
Interfaces change.
Capabilities change.

Live with it or move on.
 
The rant is that if he had looked before he leaped, he would have seen this coming.

It's all right there to see and understand in the card's reviews. See, you pay $100-$400 for a card that can't do half of what the $100-$400 card from the last series can do out of the box, natively.

Get it? lol... Now add that crazy adapter, or a new monitor, to that cost to enjoy your new half-baked card, right?


Sorry, the truth hurts, but good luck.
 

USAFRet

Titan
Moderator


He already has a low end monitor.
But now, he has a good GPU that will work quite well when he gets a new monitor.

And the visual difference between that old Samsung and something new will be dramatic.
 


I'm sorry to have to say it, but you are not coming across well in this discussion. Analogue connections have been on the decline for generations, both as inputs and outputs; it's a cost for functionality that very, very few people use nowadays, and analogue into a digital display is not clear and crisp, so get rid of it.

The 1050 Ti has better performance than the 950; I'm not sure where the "1/2 the capability" comes from, unless you are talking about the analogue output being half of its capability. Gut feel, maybe 1-2% of the PC population are using analogue monitors now, so that's 98% who aren't.

Can I suggest that you take a nice long look at what you are about to post, and see if it makes you look more or less foolish, and then edit it to go down the less foolish path.

As to Seyed: going from digital to analogue will always involve some loss in image quality. With a GPU you would ideally want to start digital (inside the GPU) and keep that digital path all the way to each individual pixel. What you are going to do instead is take a digital signal, approximate it with an analogue one, and then approximate the original digital signal again once inside the monitor. These adapters all work much the same, though; no one is investing in the "best" solution to this. It'll just be a bit blurrier.
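
As a toy illustration of that round trip (just a sketch of the idea, not how any real adapter is built), you can model the DAC/ADC chain as mapping 8-bit pixel values to analogue levels, adding a little cable noise, and re-quantising at the monitor:

```python
import numpy as np

# Toy model of the digital -> analogue -> digital round trip.
# This is only an illustration: real DACs/ADCs and VGA cabling behave
# differently, but the idea is the same - each conversion and the
# analogue link in between can nudge pixel values slightly.

rng = np.random.default_rng(0)

original = rng.integers(0, 256, size=1000)              # 8-bit pixel values from the GPU
analogue = original / 255.0                             # DAC: map to analogue levels 0..1
analogue += rng.normal(0, 0.002, size=analogue.shape)   # noise picked up along the cable
recovered = np.clip(np.round(analogue * 255), 0, 255)   # monitor samples it back to digital

errors = np.abs(recovered - original)
print(f"pixels changed: {np.count_nonzero(errors)} / {original.size}")
print(f"max error: {errors.max():.0f} levels out of 255")
```

In this toy model most values come back unchanged or off by a level or so; in practice the visible effect is a slightly softer image rather than a lower resolution.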
 
''I'm sorry to have to say it, but you are not sounding too good in this discussion. Analogue outputs have been on the decline for generations, both as inputs and outputs, it's cost for functionality that very very few people use nowadays''

So I should throw a good, solid, working CRT monitor in the trash and spend hundreds on a new one that doesn't do anything more or look better?
Yeah, okay... or trash my XP and Vista rigs and spend hundreds on new Windows licences?

Dang, how much does that 10 series cost me now? $$$$$

The 900 series supports all my needs (XP, Vista, CRT monitor) natively? Yes, with no extra money spent to support the card, and it works with all my stuff. 10 series: XP support? No. Analog? No.

See, maybe you like to spend unnecessarily; maybe you like to buy a $200 card and then a $200 monitor, when with the 900 series all you need is the card?

Like I said, a Pascal card does LESS than a 900 series, support-wise.

Why do I need to throw away, and not be able to use, all my hardware over that card, when I can get a card for the same price that will work with it?


I guess for some, a fool and their money are soon parted. Let's see: he got a $200 card and now needs a $100-$200 monitor for it, or some cheap adapter that may or may not work out?

Glad to see you think all that's great. Show me how any of that adds up. See, he could have got a 900 series and everything would have been plug and play right out of the box, with no need to spend more to enjoy his card, right?

It may sound like I'm being silly, but my money is still in the bank, not going toward supporting a new card I found doesn't fully support my needs. LOL...

Defend that.

Anyway, good luck with your new card. Oh, I forgot, you can't enjoy it until you spend extra to make it work with hardware that was working fine with something that did support you.

 


Like I said, think carefully and edit; I guess you didn't. More than 95% of users will not have your problem of using hardware that should have died years ago. You happen to be lucky or have large stocks, which makes you quite rare, so why build for the rarity? Why add design elements and DACs for the 1 in 1,000 potential purchasers (if not fewer)?
 

4745454b

Titan
Moderator
Seeing as you are still using a CRT, I'd just get the adapter. Any loss in video quality isn't going to be something you'd notice, and you are unwilling or unable to buy a new monitor, so get the adapter and enjoy your new setup.

I held onto my 21" CRT monitor for the longest time because it was expensive and it had truer colors than LCD monitors. But after I set my ex-wife up with an LCD monitor, I noticed the "not true blacks" issue didn't bother me. I now run three 1080 monitors in an Eyefinity setup on my desk.

To be brutally honest, yes, I would junk the working CRT monitor and buy a new LCD monitor. The picture is so much better, not to mention cheaper to run.