OK, I've done a couple of posts about this but haven't really gotten any helpful feedback yet. I've noticed lately that whenever I run ANYTHING in Direct3D in 32-bit I take a major performance hit that never occurred before. I don't know exactly when this changed, but I know I haven't changed my drivers or DirectX or anything. I'm running a Radeon 8500LE 128MB with a Barton 2500+ and a Leadtek nForce2 mobo.

I decided to check the performance on something I knew wouldn't stress my vid card too much, so I started up Counter-Strike at 1024x768x32 in Direct3D and found that my fps suffered HORRIBLY. If I was lucky, it would hit 60fps. I've always run Counter-Strike in OpenGL and get a constant 100fps, no matter what. It just doesn't make any sense that my system should suddenly hate Direct3D and 32-bit mode. I never thought I'd have to run anything in D3D because OpenGL worked so well for everything else, but now a lot of new games don't even support it, and a lot are permanently set to 32-bit color, which screws me over completely.
I used to get a 10,000+ 3DMark 2K1SE score at 1024x768 in 32-bit mode; now I get about 4,000. WTF? Before, the difference between 16-bit and 32-bit was tiny (maybe 100 marks), but now I score around 4,000 in 32-bit and 9,000 in 16-bit. It's like the performance hit from enabling 4xFSAA! I'm completely baffled on this one.
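In case it helps anyone point me in the right direction, here's a rough little test I sketched out (untested, and it uses the Direct3D 9 interface even though CS and 3DMark 2K1SE are D3D8-era, so treat it as a sketch rather than exactly what those apps do). All it does is ask the driver whether 32-bit and 16-bit fullscreen modes are still reported as hardware-accelerated (HAL) on the default adapter:

#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

// Quick sanity check: does the default adapter still offer hardware (HAL)
// acceleration for 32-bit and 16-bit fullscreen Direct3D modes?
int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { std::printf("Couldn't create the Direct3D9 interface\n"); return 1; }

    // Report which adapter/driver the runtime is actually using.
    D3DADAPTER_IDENTIFIER9 id;
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);
    std::printf("Adapter: %s\nDriver:  %s\n", id.Description, id.Driver);

    // X8R8G8B8 = 32-bit color, R5G6B5 = 16-bit color (fullscreen, HAL device).
    HRESULT hr32 = d3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8, D3DFMT_X8R8G8B8, FALSE);
    HRESULT hr16 = d3d->CheckDeviceType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_R5G6B5, D3DFMT_R5G6B5, FALSE);
    std::printf("32-bit HAL fullscreen: %s\n", SUCCEEDED(hr32) ? "yes" : "NO");
    std::printf("16-bit HAL fullscreen: %s\n", SUCCEEDED(hr16) ? "yes" : "NO");

    d3d->Release();
    return 0;
}

If 32-bit came back unaccelerated while 16-bit didn't, that would at least line up with the kind of drop I'm seeing, but I honestly don't know if that's what's going on.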