Monitor is blinking at me! weird

hartski

Feb 12, 2002
Funny subject but serious problem. My monitor just started blinking at me. It goes black, say, every 20 seconds for about 5 minutes, then it stops for 30 minutes, and then it starts blinking again... sometimes it blinks at completely random times. However, it only blinks when I'm in a game, never when I'm just on my desktop surfing the web or watching DivX, etc. I currently play Warcraft 3, Counter-Strike, and The Thing, and my monitor blinks in all of them, so it's not the games. Could it be the video card, though? I suspect the motherboard, because it only started happening after I changed to a new motherboard.

my system specs are:

HP L1520 LCD Monitor connected via DVI (old)
AMD AXP 2200+ (new, upgraded from 1800+)
Gigabyte 7VRXP (new, upgraded from Abit KR7A)
ATI Radeon 8500 64MB running on Catalyst 2.2 (old)

I also found a difference in settings: AGP Fast Write and another AGP setting that were disabled in my Abit BIOS are enabled on my Gigabyte mobo. Another problem: when I run 3DMark 2001 SE, my computer restarts during, I think, one of the last 5 tests, at around 80% of the testing progress. I really think it's the mobo! But please help too... thanks!!!
 
If your game changes the resolution and refresh rate to one that is close to or outside the monitor's range, this may happen.

Jim Witkowski
Chief Hardware Engineer
Monitorsdirect.com

<A HREF="http://www.monitorsdirect.com" target="_new">MonitorsDirect.com</A>
 
My game probably doesn't change the resolution, because my desktop is set at 1024x768 and I play all my games at the same resolution. I also doubt the refresh rate is being changed. On my old motherboard this never happened. Could it be a power issue? I have a 300W PSU from Future Power. Could the new AXP 2200+ be drawing a lot of power and sort of stealing some from the AGP slot? I'm not familiar with this, but that's one of my guesses. It could well be the new mobo... it only started happening after I switched to the new mobo and CPU... I'll try playing in analog mode instead of digital... but this shouldn't happen!
 
Almost every game I know of changes the resolution and refresh rate; this is the most likely suspect. Power is not the problem: cards cannot steal power from other parts of the machine.

 
If the problem is the games changing the resolution and refresh rate, how come it didn't happen on my old motherboard and only happens on my new one? How can you stop a game from changing the resolution? Also, why does it need to change it at all? What's annoying is that it changes several times: at the start, middle, and end of a game! Can't it choose one refresh rate and stick with it? Will this still happen if I switch to a CRT monitor?
 
The blinking starting with the motherboard change could be a simple coincidence, or there could be a problem with the video card; a likely cause is damage from static electricity. Many games default to a lower resolution in order to be compatible with the older 15” monitors that many people still use. Whether you can change it back depends on the game and the amount of control it allows.
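As a rough sanity check on the "out of the monitor's range" theory (my own back-of-the-envelope sketch, not from this thread): single-link DVI tops out around a 165 MHz pixel clock, and a mode's pixel clock is roughly horizontal total × vertical total × refresh rate, where the totals include blanking. The blanking overhead factors below are ballpark assumptions, not exact timings:

```python
# Single-link DVI TMDS pixel-clock ceiling, in MHz.
DVI_SINGLE_LINK_MHZ = 165.0

def pixel_clock_mhz(width, height, refresh_hz,
                    h_blank_factor=1.25, v_blank_factor=1.05):
    """Estimate the pixel clock (MHz) a display mode needs.

    h_blank_factor / v_blank_factor are assumed blanking overheads
    (~25% horizontal, ~5% vertical), not values from any real EDID.
    """
    h_total = width * h_blank_factor
    v_total = height * v_blank_factor
    return h_total * v_total * refresh_hz / 1e6

for mode in [(1024, 768, 60), (1024, 768, 85), (1600, 1200, 75)]:
    clk = pixel_clock_mhz(*mode)
    verdict = "OK" if clk <= DVI_SINGLE_LINK_MHZ else "over single-link DVI"
    print(f"{mode[0]}x{mode[1]} @ {mode[2]}Hz ~ {clk:.0f} MHz ({verdict})")
```

By this estimate, 1024x768 at any sane refresh rate is nowhere near the DVI bandwidth limit, so if a game's mode switch is upsetting the display, it is more likely the monitor rejecting a particular refresh rate over DVI than a bandwidth problem.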

The only way to find out for certain is to try another monitor and/or video card to isolate the problem.



 
I had planned to try a different monitor, but it will be a CRT rather than a flat panel. I also considered that it might be a video card problem, but I didn't want to think so, because it would cost me around 100 bucks.

Let's say I did damage the video card with static electricity while I was tinkering with it during the mobo change: is it dangerous to keep using it on my new mobo, or will the damaged video card not affect the rest of the system?
 
It probably won't do damage to the rest of the system, although if it should fry it may mess up your AGP port (unlikely, especially since I doubt it'll fry, even if damage is done). The other thing is that your mobo uses a VIA chipset, and Radeons are known to have problems with those. Do you have any friends you could borrow a video card from for a minute?

What if you had admin rights to life?
 
I am pretty sure that if there is damage, it's only because I gave it a shock while I was doing the upgrades... My old mobo was an Abit KR7A and the blinking never happened with that. Won't the VIA 4-in-1 drivers fix it? Though I've read you shouldn't use the 4-in-1s with Windows XP.

Well, first I'll try a 17" CRT monitor and see if it happens... I don't have friends with DVI on their cards, but I can use a very old 4MB Trident AGP card with an analog (VGA) output only. Is it advisable to use that old Trident card on a new mobo? What if the card is only AGP 1x and the mobo is AGP 2x/4x?

If things are still messed up after I try a different monitor, then I'll probably buy a new R8500...
 
Here are the results... instead of trying a different (CRT) monitor, I tried switching from digital mode to analog mode on my LCD monitor... and gee whiz, it never blinked, even after 3 hours of gaming!

It was probably the digital (DVI) side that got damaged when I spiked it... but does anyone think it's the motherboard? I will get the replacement for my old Abit motherboard and I can compare, but I don't want to risk messing up my CPU and video card, so I guess if analog mode works I'll stick with it for now...