Hi all,
I'm completely stumped. I built a new machine and spent hours updating everything to get it all working perfectly - which it was for two days, until, while briefly playing an old game from the secondary hard drive, the screen started flashing on and off, then went dead. The monitor is a Dell U2711 and I had been running at 2560x1440 over DisplayPort, on Windows 7. The graphics card was pulled out of the old machine - a Radeon HD 7850.
Ever since then, the maximum resolution I can get is 1920x1080 via DP or HDMI, and I'm forced to use HDMI, as DisplayPort is highly unstable (often doing the same flashing or similar during common tasks, such as viewing photos, and particularly when playing games again). I've tried swapping the graphics card for a spare Radeon HD 4670 I have lying around, but via DVI that also only runs at 1080p.
The only way to get above 1080p is DVI on the HD 7850 - BUT, although Windows suddenly allows the full 2560x1440, it really doesn't work nicely: text is unreadable, the image is stretched, etc. On closer examination, the monitor's own menu under "display info" shows it is actually displaying "1280x1440" - hence the awful image.
Any ideas, anyone? I've completely run out of fixes and attempted everything I can: installing and uninstalling drivers so often now that it's gotten painful, as well as the graphics card BIOS. I just don't understand what it can be, swapping bits in and out as much as I can...
The base spec is an i7 4790K on an Asus Z97-AR, 2x8GB Corsair Vengeance Pro 2400, and Windows 7 64-bit.
Please help if you have any ideas.