9600XT display problem

rlong

Distinguished
Mar 23, 2004
Hey everyone,
It seems that my 9600 XT 256 is displaying games worse than my G3 Ti200 128.
I'm not sure how to describe it, but when I'm in a game and I'm moving I can see what looks like graphics from the last frame.
Like a pole seems to have a trail on it. Not sure if that explains it.

system:
P4 2.53GHz
1GB PC2700
Asus P4P800 Deluxe

Please let me know as soon as possible.
Thank you
 

rlong

Distinguished
Mar 23, 2004
The odd thing is it started after a clean install.
And I got so pissed I ran a zero-fill on the HDD and re-installed again, but it's still happening.
The only time I can remember seeing something like this is when I overclocked my G3 way too much... but I haven't OCed this card.
 

rlong

Distinguished
Mar 23, 2004
It's in all games. But I found out what it was... the 4.4 drivers have v-sync off by default. I turned it to always on and it runs so much better. No more trails or jagged edges.
Thanks for the help, guys.
 

Slava

Distinguished
Mar 6, 2002
Now, that's weird. V-sync is actually a performance killer and should be disabled. If you get "remnants" of the previous frame (especially evident when you turn around in a first-person game - am I right or am I right?), you have a refresh rate problem in the first place.

Check that your monitor is set to the maximum refresh rate supported at your desktop resolution, and make sure your adapter is set to the same refresh rate. 85Hz should be optimal for most modern monitors/adapters...

Finally, many games behave badly if the resolution at which you play the game is different from your desktop resolution. If you want to play a game at 1152x864, for example, set your desktop resolution to 1152x864.

Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year? :cool:
 

BigMac

Splendid
Nov 25, 2003
"Now, that's weird. V-sync is actually a performance killer and should be disabled."
Can you elaborate a bit on why you think V-sync is a performance killer?

V-sync means that the frame rate of the card will be bounded above by the refresh rate of your monitor (it's not going to compute more frames than needed for the monitor's refresh rate, because the monitor will not be able to display them anyway).

The problem as stated by the original poster seems to be that with V-sync off, the result is (sometimes) actually displayed in the middle of a screen refresh. By forcing V-sync on, the number of frames is bounded (as it should be) and the problem goes away; that's not a bug being worked around, it's V-sync doing exactly what it's supposed to do (at the cost of not computing frames that would never be seen anyway).
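
A minimal sketch of how that bounding works (my own illustration, not from the original thread; the example numbers are simply the ones discussed later in this topic): with v-sync a finished frame is only presented on the next vertical-refresh boundary, so the card never shows more frames than the monitor can scan out.

def simulate_vsync(render_times_s, refresh_hz=60.0, duration_s=1.0):
    """Count frames actually presented during duration_s seconds of vblanks."""
    period = 1.0 / refresh_hz
    t = 0.0                 # time at which the current frame finishes rendering
    presented = 0
    i = 0
    while True:
        t += render_times_s[i % len(render_times_s)]   # render the next frame
        next_vblank = period * (int(t / period) + 1)   # wait for the next refresh
        if next_vblank > duration_s:
            break
        t = next_vblank     # the frame is scanned out here; start the next one
        presented += 1
        i += 1
    return presented

print(simulate_vsync([1 / 112], refresh_hz=60))   # card could do 112 fps -> 60 shown
print(simulate_vsync([1 / 59], refresh_hz=60))    # card renders at 59 fps -> only 30 shown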


BigMac

New World Order: http://www.p3int.com/product_center_NWO_The_Story.asp
 

Slava

Distinguished
Mar 6, 2002
Taken from FiringSquad Forums

http://forums.firingsquad.com/firingsquad/board/message?board.id=pc&message.id=3020

Big THANKS and KUDOS to Trogdor

I hope this answers your question fully. See details below.

***
Vsync ON
2004-04-12 16:00:23 - UT2004
Frames: 4756 - Time: 93582ms - Avg: 50.822 - Min: 29 - Max: 61

Vsync OFF
2004-04-12 16:03:07 - UT2004
Frames: 6787 - Time: 88385ms - Avg: 76.789 - Min: 51 - Max: 112

Those frame drops are noticeable in Far Cry too when I look at some video-hungry scenes. They do not happen when VSYNC is off. It sucks pretty much, since VSYNC is a must. I'm using the latest ATI video drivers.
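
As a quick sanity check on the FRAPS-style numbers above (my own sketch, not part of the quoted post): the reported average is simply Frames divided by Time.

def avg_fps(frames, time_ms):
    return frames / (time_ms / 1000.0)

print(round(avg_fps(4756, 93582), 3))   # vsync on  -> 50.822
print(round(avg_fps(6787, 88385), 3))   # vsync off -> 76.789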

***
That's because vsync synchronizes the frame rate to your monitor's vertical refresh rate; that's how it prevents tearing. Capping the frame rate this way shouldn't have any noticeable effect, because you can't see more frames than your monitor can display anyway, which is determined by the refresh rate. Besides, your video card still has just as much power to prevent the frames from dipping too low.

Apparently your refresh rate is set to 60Hz in Windows (for the resolution you're playing at, anyway), and your monitor is running it a little bit high at 61Hz.

You can get any number of utilities to change the Windows refresh rate for inactive (non-desktop) resolutions, which will allow you to see more frames with vsync enabled. For those running the Nvidia drivers, you can do this from the driver utility [Slava: this is what Refresh Rate Override in the nVIDIA driver is for].

The 60Hz refresh rate is the Windows default for non-desktop resolutions.

Actually, if you're running at a 60 Hz refresh rate, there's really no point in trying to get a game to run at super high speeds. Get a new monitor first! I mean, so what if your graphics card can run at 180 fps in a game?

What that really means is that each refresh of your monitor will show information from three different frames. Yuck!

Personally, I think anyone that's serious about gaming should have a monitor that's capable of running at 85 Hz for their desired playing resolution. Then the performance hits from enabling vsync aren't as bad.

As to the numbers he lists, the 61 fps max might also be caused by low-detail situations where his system manages to render two frames in less than 1/60th of a second. (It's not that likely, with the 112 fps max he gets, but it could happen, and the game might not be tracking rendering times accurately enough to report this.) The first frame will never actually be seen if vsync is enabled! Or maybe it's the second frame that gets skipped? Anyway, you won't ever see one of those frames.

Now, for the min and avg. frame rates getting hit so hard, think about this: In an ideal situation, your system would be able to render a frame in less than 1 refresh cycle. So if you're running at 85 Hz, you want your PC to be able to render in 1/85 of a second or less. Your max frame rate would then be 85, as would your average and min rates. But that's just in an ideal world, and games aren't ideal. So what really happens?

Well, if vsync is enabled, let's say your refresh rate is 60 Hz (as his is). Now, let's say that your system is only capable of rendering 59 frames per second. In other words, each frame that could be rendered is completed just after the screen starts to refresh. Now, instead of running at 59 fps (with tearing), you will get 30 fps for min, avg, and max. Ouch! That's sort of the worst case scenario, but it's still not what really happens.

Here's what really goes on. The complexity of a given scene will change as you move around in the game. Scenes with low complexity might render at 100 fps and scenes with high complexity might render at 30 fps, but the average will be somewhere in between. Generally speaking, frame rate fluctuations between a few frames will be minimal, so if frame 555 takes 1/50 of a second to render, the preceding and following frames will probably take close to 1/50 of a second as well. However, over time you will have slowdowns, i.e. an explosion suddenly increases scene complexity for 1 or 2 seconds. Now, if your system can't render a frame in less than one refresh cycle, the minimum frame rate for a sequence of frames will drop down to 1/2 of your refresh rate (the worst case scenario). If it can't render in two refresh cycles, your minimum rate would drop to 1/3 of your refresh rate, or 1/4 for missing rendering in 3 cycles, etc.
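
That stair-step rule can be written down compactly (my own sketch, not from the quoted post; it just restates the paragraph above): a frame that isn't finished by the next vblank waits for the one after it, so the displayed rate snaps to refresh/1, refresh/2, refresh/3, and so on.

import math

def vsync_fps(render_fps, refresh_hz=60.0):
    cycles = math.ceil(refresh_hz / render_fps)   # refresh cycles each frame occupies
    return refresh_hz / cycles

print(vsync_fps(112, 60))   # faster than one cycle  -> 60.0
print(vsync_fps(59, 60))    # just misses one cycle  -> 30.0
print(vsync_fps(25, 60))    # misses two cycles      -> 20.0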

Look at his numbers, and you'll see this. With no vsync, his max rendering speed was 112 fps, or 1/112 seconds per frame. Generally speaking, then, he maxes out at rendering 1 frame in a cycle and can almost render 2, so 60 fps max (61 in his case). His max sits at his refresh rate. The min he got was 51 fps, which is now lower than his refresh rate, so with vsync he now ends up at 30 fps (29 in his case). The average will obviously be in between these two extremes, and in his case it looks like about 66% of the time he is rendering at faster than 60 fps and 33% of the time he is slower than 60 fps. (Mathematically: 2/3 * 60 + 1/3 * 30 = 150/3 = 50)

Now, let's say he bumps up to an 85 Hz refresh rate. He can now max out at 85 fps (he won't get higher than that since his system never renders at 170 fps or higher), and if he can always render at greater than 42.5 fps, his min will be 42.5. Well, looking at his non-vsync scores, he can do this, so his min fps and max fps increase by roughly 40%, and the average will likely be much closer to the 77 fps he gets with vsync disabled. If he maintained the 2-to-1 ratio of his 60 Hz rates, he would now have an average of about 71 fps. Of course, he probably won't, so his average will be more like 65 to 70 fps, but that's a big step up from 50 fps.
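
The two averages above follow from the same weighted mix (again my own arithmetic sketch, only checking the quoted reasoning): two thirds of the time at the fast tier, one third at the slow tier.

def weighted_avg(fast_fps, slow_fps, fast_share=2 / 3):
    return fast_share * fast_fps + (1 - fast_share) * slow_fps

print(weighted_avg(60, 30))     # 60 Hz case -> 50.0 (he measured 50.8)
print(weighted_avg(85, 42.5))   # 85 Hz case -> about 70.8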

So what's more important? High frame rates or high refresh rates?

Personally, I think it's a middle ground. You want frame rates of 60+ fps, but you also want the image quality that vsync gives. If you have a really nice monitor, any refresh rate of 120 Hz or more will make the question more or less meaningless. Worst case, you would get 60 fps (assuming your system is fast enough). Otherwise, you need to live with either lower frame rates or image tearing. Sorry.

Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year? :cool: