[SOLVED] Can I use a brand new 4K TV as a second monitor at higher than 1080p resolution with my very old PC?

Emil3D

Distinguished
Jun 16, 2008
66
1
18,635
0
I have a very old desktop PC running Windows 7; the video card is an ATI HD 5800 Series and I'm using a relatively new computer monitor, a BenQ EW3270ZL 32-inch WQHD, without any problems at its full resolution of 2560 x 1440, connected to a DisplayPort output of the video card. The video card also has one HDMI port, which I connected to a 4K TV I recently bought, an LG 65NANO90UNA, to try to use it as a second (extended) display for the PC. The problem is that the highest (recommended) resolution shown for the TV in the Windows Display control panel is 1920 x 1080. The LG website has a TV driver, but it looks like that is just a color profile file for the TV, and installing it on my PC made no difference to the resolution problem.
I will greatly appreciate your help with this and any input.
 


Darkbreeze

Retired Mod
Drivers for TVs and monitors have nothing to do with what resolution something can be displayed at anymore. Everything that is DP or HDMI is plug and play and does not require any kind of driver for optimal configuration and performance.

The drivers for the graphics card AND what type and generation of display output are in use are what will determine the maximum possible resolution.

"ATI HD 5800 Series" doesn't really tell us all that much. What is the ACTUAL model of the graphics card? That will tell us which HDMI or DP versions it uses, and from that we can determine the maximum resolution. Although, if it SAYS the maximum resolution is 1080p, then given the age of that card, the maximum resolution over HDMI probably IS 1080p. 4K support likely didn't exist on it, since the 4K standards weren't released until 2012 and that graphics card series was released in 2009.
 
Reactions: Emil3D

Emil3D

Distinguished
Jun 16, 2008
Thank you for replying, Darkbreeze.
I don't know how to check more about the card model; it says only ATI HD 5800 Series when I go to Display > Resolution > Advanced Settings > Adapter tab. I bought this card a long time ago, second hand from someone on Kijiji, because it worked on my PC. I was hoping to get a higher resolution on the TV because the card provides a higher resolution, 2560 x 1440, for the monitor. I would be happy to get the same for the TV; it doesn't have to be 4K. The problem is that I'm losing screen space when I use the TV as an extension monitor, and worst of all, if I want to duplicate the computer monitor on the TV, both displays drop to the lower 1920 x 1080 resolution, which displays images and videos worse.
 

Darkbreeze

Retired Mod
Try the following.

  • Click Start.
  • On the Start menu, click Run.
  • In the Open box, type "dxdiag" (without the quotation marks), and then click OK.
  • The DirectX Diagnostic Tool opens. If asked whether you want to check for digitally signed drivers, click No.
  • Click the Display tab. If there are multiple Display tabs, the information will usually be on the Display 1 tab.
  • On the Display tab, information about your graphics card is shown in the Device section. You can see the name of your card, as well as how much video memory it has.
If that doesn't work, you can try downloading GPU-Z, installing it and running it, then looking on the main specs tab for the model. dxdiag should give you the model though.
 
Reactions: Emil3D

Emil3D

Distinguished
Jun 16, 2008
Thank you again Darkbreeze, and happy new year

I followed your instructions as shown on the linked image below

https://www.dropbox.com/s/l03tvlfpnyc5x49/My GPU model search.jpg?dl=0

and I ended up with this model :

Cypress XT [Radeon HD 5870]

Meanwhile, I realized that if I connect my computer monitor to the PC with the HDMI cable, it also limits its resolution to 1080p. The HDMI cable came with my monitor, so I guess it is the HDMI port on my GPU that limits the resolution. My card has one DisplayPort, one HDMI, and two dual-link DVI connectors. My previous, older monitor was also WQHD 2560 x 1440 and connected through one of the DVI connectors without a problem. I guess I need to find a DVI to HDMI converter that supports 2560 x 1440 resolution?
 
Last edited:

Emil3D

Distinguished
Jun 16, 2008
Everything looks the same, except that I can't remember whether mine has that image, logo, and model printed on it. That side is facing the bottom of my case and is hard to see, because there is very little space between the power supply and the card, and it is all jammed with many cables. I would have to disassemble it to make sure, but everything else looks the same.
 

Darkbreeze

Retired Mod
You have the same outputs on the back of the card though, right?

Looks like the HD 5870's HDMI version supports a maximum resolution of 1920x1200 (so, 1920x1080 for most displays), unlike newer HDMI versions that can support higher resolutions.

Over displayport or DVI though, that card DOES support 1440p resolutions, but if you adapt from the HDMI output on your card to anything else, it's still only going to support 1080p. You need to use either DP from your card to DP on your monitor, or DVI from your card to DVI on your monitor, and preferably if you have to buy a DVI cable you would want to get a dual link DVI-D cable so that it supports the maximum possible resolution and refresh rate for DVI on that card.
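For anyone who wants to sanity-check this, some rough pixel-clock arithmetic tells the same story. This is only a sketch: the 165/330 MHz figures are the standard single/dual-link TMDS clock limits, and the ~12% blanking overhead is my approximation, not exact CVT-RB timings.

```python
# Rough pixel-clock estimate: older single-link HDMI/DVI tops out at a
# 165 MHz TMDS clock, while dual-link DVI doubles the lanes (~330 MHz).
# The blanking overhead below is an approximation, not an exact timing.

SINGLE_LINK_MHZ = 165.0   # single-link DVI / early HDMI TMDS clock limit
DUAL_LINK_MHZ = 330.0     # dual-link DVI (two TMDS links)

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.12):
    """Approximate pixel clock: active pixels plus ~12% blanking overhead."""
    return width * height * refresh_hz * blanking / 1e6

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    clk = pixel_clock_mhz(w, h, 60)
    print(f"{w}x{h}@60: ~{clk:.0f} MHz  "
          f"single-link: {'ok' if clk <= SINGLE_LINK_MHZ else 'no'}  "
          f"dual-link: {'ok' if clk <= DUAL_LINK_MHZ else 'no'}")
```

The numbers line up with the behavior described: 1080p fits inside a single-link TMDS clock, 1440p only fits dual-link DVI (or DisplayPort), and 4K @ 60 Hz exceeds both, which is why it needs a newer HDMI version.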

Since that TV only has HDMI inputs, but they support up to 4K resolution, you will probably need to get both a DVI-D dual-link cable AND an active DVI-D to HDMI adapter, because I'm not very confident that a passive adapter is going to work in this case. It might, but I'm not positive. One of our other members, Glenwing, could tell you for certain, but I can't without looking into it a lot further.

So let me ask this, WHAT, exactly, is the GPU to TV connection going to be used FOR? Gaming? Watching movies or other multimedia? Just a second monitor for more real estate?

Because, to be honest, your best move here would REALLY be to ditch that old graphics card and move up to something newer. If this isn't primarily a gaming system, OR if this connection to the TV isn't meant for gaming, then even a cheaper graphics card like the GT 730 or GT 1030 would be perfectly capable of supporting that higher resolution, though neither is strong enough for any realistic gaming beyond very light titles.

Knowing more about the purpose of everything involved, and the full hardware specifications like motherboard, CPU, power supply model, would be really helpful in helping to find the RIGHT solution here.
 
Last edited:
Reactions: Emil3D

Emil3D

Distinguished
Jun 16, 2008
66
1
18,635
0
Thank you again for your reply,
I don't play games at all. It would be nice if I could use the TV as an extension monitor with more screen space; it is wall mounted above my desktop computer. I work from home, and in addition to the programs I use, I also monitor streaming information from the web in a desktop program (not a web browser) with many windows that would be nice to watch next to each other instead of switching between them.

After work, when I use the TV for entertainment to watch movies or photos, it would be nice to just mirror my computer screen to the TV. Currently the TV can see whatever I put in the Windows Media Player library's playlist, and it works fine; it displays photos and videos in their full resolution, but it requires the extra step of choosing Play to TV, which takes a very long time to load if I select more than a few photos or a large video. My desktop is wired to the modem while the TV is wireless; maybe that's why loading media is slow. I guess speeding it up with this setup will require another very long Ethernet cable to the TV, or an Ethernet hub (which I have somewhere at my place but haven't found yet) to split the cable, because I can't put my modem closer.
Regarding connecting the video card to the TV: my computer monitor has two DisplayPort and two HDMI connectors. From what I've checked online for DVI to HDMI adapters, it seems that none of them can do 1440p, and it looks like it's the same with DVI to DisplayPort too.
Maybe a better GPU with a higher-resolution HDMI output is the best solution, but I don't know what is available for my motherboard, which is quite old:
https://www.overclockersclub.com/reviews/asus_rampage_formula/7.htm
 

Darkbreeze

Retired Mod
This card should be fully compatible with your motherboard, and it has one HDMI and one DisplayPort output. The HDMI version is 2.0b, which supports 1440p @ 144 Hz and 4K @ 60 Hz. The DisplayPort version is 1.4a and supports pretty much anything realistic you want to throw at it.

https://www.amazon.ca/dp/B071L4VKF6?tag=pcp0f-20&linkCode=ogi&th=1&psc=1
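As a rough sanity check on those version numbers, back-of-the-envelope bandwidth arithmetic works here too. This is only a sketch: it ignores blanking overhead, and the post-encoding data-rate ceilings are approximate figures I'm assuming for each standard, not exact spec values.

```python
# Back-of-the-envelope bandwidth check for uncompressed 24-bit RGB.
# Data-rate ceilings are approximate (link rate minus encoding overhead),
# and blanking intervals are ignored, so real margins are tighter.

HDMI_20_GBPS = 14.4   # HDMI 2.0/2.0b: 18 Gbps link, ~14.4 Gbps data (8b/10b)
DP_14_GBPS = 25.92    # DisplayPort 1.4 HBR3: 32.4 Gbps link, ~25.92 Gbps data

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate for uncompressed video, ignoring blanking."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for label, (w, h, hz) in {"1440p @ 144 Hz": (2560, 1440, 144),
                          "4K @ 60 Hz": (3840, 2160, 60)}.items():
    rate = data_rate_gbps(w, h, hz)
    print(f"{label}: ~{rate:.1f} Gbps  "
          f"HDMI 2.0: {'ok' if rate <= HDMI_20_GBPS else 'no'}  "
          f"DP 1.4: {'ok' if rate <= DP_14_GBPS else 'no'}")
```

Both modes fit under the HDMI 2.0b ceiling, which is consistent with the 1440p @ 144 Hz and 4K @ 60 Hz figures quoted above; DisplayPort 1.4a has considerably more headroom still.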

If you want something with additional video out ports to add even more monitors later, the price is going to go up sharply, especially in your region.
 
Reactions: Emil3D

Emil3D

Distinguished
Jun 16, 2008
66
1
18,635
0
That's a great find and is very helpful; I really appreciate your replies.
I wonder if there is a version of the card with two HDMI ports instead of one DP and one HDMI. There is a minor but annoying bug with my current DP connection to the monitor: if I turn the monitor off and on, it moves and resizes all my currently open windows to the top left corner, as if I had switched to a monitor with lower resolution. The workaround is to turn the monitor off after minimizing all windows (I use Windows key + D to go to the desktop) or to just use a screensaver instead of powering off. It is a bug reported by many on the web, and some have managed to fix it by editing the registry, but I have been living with it; it's not a very big deal. Since I'm going to get this new GPU anyway, though, I wonder if I can clear that issue too. Maybe the card you found won't have that problem with its DisplayPort.
 
Last edited:

Darkbreeze

Retired Mod
Try DisplayFusion; the free version works for me to stop this from happening. I had the exact same problem. Also, turn off auto-arrange by right-clicking the desktop, selecting "View", and then making sure there is no checkmark next to "Auto arrange" (click it to clear it). Double-check by going back to see that there is still no checkmark next to it.

 
Reactions: Emil3D

Darkbreeze

Retired Mod
I'd ask to see the purchase receipt AND to see the card working on EACH of the video output ports before buying it. I'd also probably want to see it run FurMark or the Heaven benchmark for at least five to ten minutes, in person. There is a huge problem with people trying to pass off faulty graphics cards on other people these days, especially since so many of them have been used for mining Bitcoin and other cryptocurrency.

If it's actually that new, and is in good shape, then it's worth it. But I'd need some proof that it actually is, or I'd be VERY skeptical about it. Also, if you buy a used card, you get ZERO warranty, no matter what anybody says, unless you have the original purchase receipt AND access to the email account it was originally registered under. If you don't mind paying a little more, a new card is probably the better option: you avoid those potential pitfalls, get a warranty, and there's no chance of being stuck with a card that doesn't work and can't be replaced. With a used card, if it stops working three days later, you're just out of luck. Period.

It's significantly more expensive, but it comes with three HDMI ports and one DisplayPort. Obviously, the 1050 Ti is a better option if it's in good shape, but again, "deals" aren't much of a deal when there is so much question about condition. Graphics cards are probably the highest-failure-rate item in any PC, even before adding in the potential for user-caused damage from crypto mining, abuse such as overclocking, being paired with a cheap power supply that has bombarded the card with high levels of ripple and weakened its capacitors, or simply being gamed on hard under conditions it was never meant for.

There HAS to be a reason why somebody would buy a card like this in September and already be selling it in January, and usually that reason is that something is wrong with it.

PCPartPicker Part List

Video Card: Gigabyte GeForce GTX 1650 4 GB WINDFORCE OC Video Card ($239.00 @ Canada Computers)
Total: $239.00
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2021-01-05 20:38 EST-0500
 
Reactions: Emil3D

Emil3D

Distinguished
Jun 16, 2008
66
1
18,635
0
Thank you again for the great advice. I'm not going for the used card on Kijiji; it's too much hassle and isn't worth the risk. I'll buy a new card, but I have to decide between the 1650 and the 1030.
I installed DisplayFusion, and right now I'm trying to find the feature that will restore the windows after turning the monitor off and on, to see how that works.

Edit: following the DisplayFusion instructions from the link below now keeps my windows unchanged when turning the monitor off/on; however, the triggers I added in the setup were marked as Pro-version-only features, so I'm not sure if they will remain after the trial period.

https://www.displayfusion.com/Discussions/View/automatically-saving-and-restoring-window-positions-on-dockundock/?ID=5540c67a-5e2b-4f41-b88c-de4d9c2fa354

Edit 2: oh, it's just a US $15 lifetime license, not a problem :)
 
Last edited:

Darkbreeze

Retired Mod
Yes, it's not expensive, and all of their software is pretty useful. I use DisplayFusion and ClipboardFusion. ClipboardFusion is very useful to me for saving snippets of code and links that I reference frequently on forums like this one, but it's also great for other things, since it allows you to "pin" pretty much anything you've copied to the clipboard and give it an easy-to-remember name for future reference. Anyhow, all their software is at least half decent, and some of it is really good.

The only real problem with the GT 1030, for a non-gaming machine, is that it's usually limited to only two outputs.

Out of curiosity, what CPU are you running?
 
Reactions: Emil3D

Darkbreeze

Retired Mod
Just wanted to see if maybe you had one with integrated graphics, but that's not the case.

Yes, that's very old. Honestly, it's time to upgrade. Rather than spending money on a graphics card, you might consider doing something like the build below instead. It would give you the same solution from the integrated graphics, with extra video outputs (both HDMI and DisplayPort), and you would still be able to use your current video card alongside the integrated graphics if you wished. Modern integrated graphics are more than capable of anything you want to do aside from gaming, and even gaming would be possible at the lowest resolutions and settings for demanding games; I know you don't game, but I mention it only to point out modern iGPU capability. So there is no need to buy a video card, and you get the added benefit that your money went towards a platform change, which means you are good to go for probably another five to eight years, if not more. This i3 is very capable, with four cores plus Hyper-Threading for a total of eight threads. Very good performance for the price. It's something to think about, anyhow.

PCPartPicker Part List

CPU: Intel Core i3-10100 3.6 GHz Quad-Core Processor ($149.75 @ Vuugo)
Motherboard: ASRock B460M Pro4 Micro ATX LGA1200 Motherboard ($122.43 @ Vuugo)
Memory: G.Skill Ripjaws V Series 16 GB (2 x 8 GB) DDR4-3200 CL16 Memory ($78.99 @ Newegg Canada)
Total: $351.17
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2021-01-06 17:47 EST-0500
 
Last edited:
Reactions: Emil3D

Emil3D

Distinguished
Jun 16, 2008
Thanks again for the advice.
Yes, I will upgrade, but what is holding me back is the whole setup of programs, settings, and customizations for my work that was built up over the years. Maybe it won't be that hard, but I'm anxious about transferring everything to the new computer easily and without problems. I'm also used to Windows 7, and getting used to 10 will take some time that I don't feel like spending right now. I know I will have to upgrade eventually; I guess I just need a few more pressing issues to push me to do it :)
 

Darkbreeze

Retired Mod
I don't like Windows 10's user interface and start menu either, which is why I don't use them as-is. I use Windows 10, but I install Classic Shell, which allows me to tailor the start menu, taskbar, and much of the Windows "shell" to look and act just like Windows 7, or XP, or even much older classic Windows versions if you desire.

Transferring easily is a different story. You will likely need to be prepared to do a full clean install when you upgrade, as an existing installation built around different, older hardware is unlikely to run without problems on the new system. Most programs that run on Windows 7 will also run on Windows 10, so reinstalling them shouldn't be that big of a deal, but for any that are critical you will of course want to verify compatibility, or seek out alternatives, first.
 
Reactions: Emil3D
