TabrisDarkPeace would know exactly the reason.
SIS chipset doesn't support dual channel.
Well, I wasn't expecting that 😳 - Thanks. (Should've just PM'd, e-mailed or tried to MSN me, I am often online).
I'd recommend a visit to: http://www.sis.com/download/ ; just to get the latest drivers for the SiS chipset, especially if still using the MS Windows supplied ones. (SiS are good in that the Windows default drivers generally don't give issues, but performance can be raised using the latest ones directly from SiS).
The drivers he is likely after are:
- AGP (GART), IDE or SATA/RAID *, Network *, Audio *
- * = (if SiS chipset)
Without them overall system performance, including in games, may be impaired.
================================================
Extra Notes:
================================================
But yeah, all Socket 754 systems are single channel (64-bit wide interface) to RAM. Of course, running any DDR333 (PC2700) DIMM is going to limit performance. Size-mismatched DIMMs won't hurt performance much on Socket 754 platforms, but speed-mismatched ones will, since the memory controller has to revert to the lower speed.
As most (all?) Socket 754 boards only have 2 x DIMM slots, they should be able to run at 1T Command Rate under almost every condition. The Socket 754 memory controller (in the CPU) is around 95% efficient. The Socket 939 one is only 90% efficient (or less at 2T Command Rate). However, 90% of 6.4 GB/sec (dual channel) is still far more than 95% of 3.2 GB/sec (single channel). 😛
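A quick back-of-the-envelope check of those bandwidth figures (assuming DDR400 peak rates of 3.2 GB/s single channel and 6.4 GB/s dual channel, and the rough controller efficiencies quoted above):

```python
# Rough effective-bandwidth comparison: Socket 754 vs Socket 939.
# Peak rates assume DDR400: 64-bit single channel vs 128-bit dual channel.
single_channel_peak = 3.2  # GB/s (Socket 754)
dual_channel_peak = 6.4    # GB/s (Socket 939)

effective_754 = 0.95 * single_channel_peak  # ~95% efficient controller
effective_939 = 0.90 * dual_channel_peak    # ~90% efficient controller

print(f"Socket 754 effective: {effective_754:.2f} GB/s")  # → 3.04 GB/s
print(f"Socket 939 effective: {effective_939:.2f} GB/s")  # → 5.76 GB/s
```

So even with the less efficient controller, dual channel still comes out nearly twice as fast in raw bandwidth.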
I also agree with the CPU comments on Half-Life 2. I run Half-Life 2 heaps as I enjoy the fast-paced Team and every-man-for-himself DeathMatch it offers. (Although I am not keeping pace with the younger players anymore; I used to be damn fine in the FireArms Mod in HL1).
Half-Life 2 doesn't benefit much from a second CPU core, let alone four, though they are gradually improving that with updates. However, I clocked my system from 2.0 GHz to 2.35 GHz and I sure as hell noticed it in Half-Life 2.
Coincidentally, he also runs a similar video card to mine (Radeon X800 XL: 16 pipelines at 400 MHz, 1 GHz x 256-bit video RAM, PCIe x16, 256 MB here), and someone posted a screenshot of over 5,000 3DMarks (which, if that was from 3DMark 05, is actually faster than my system in games lacking CPU-isolated threading 😛).
Just because I disagree with FutureMark not enforcing artifact detection in their 'benchmark database' (ORB) doesn't mean I don't use it:
3DMark doesn't benefit from 4 cores. 😛 (Well '06 did in the CPU test).
- http://service.futuremark.com/compare?3dm05=1656988
- http://service.futuremark.com/compare?3dm06=84767
I run my textures at 'Quality' (3/4 of the max setting), but force 4x or 8x AF in the driver, and set FSAA to 'Application Controlled' (this actually makes a difference in Half-Life 2: you need to set FSAA in HL2 and set the driver to Application Controlled, or FSAA performance in the Source engine sucks).
I can't vouch for HDR in HL2 as I don't like it, and only use Bloom, if anything at all (HDR is bad for your score 😛).
================================================
Back to easier stuff again:
================================================
HD: FSAA: Let Application decide
Why: Some applications have their own FSAA patterns that are far faster for the app in question. Forcing it here may decrease FSAA performance substantially in those apps.
HD: Anisotropic Filtering: 8X (sometimes 4X). Leave High Quality AF off unless running an X1800 series or greater card.
CATALYST A.I.: Set to Advanced or Standard. (For higher-end CPUs I recommend Advanced; for lower-end ones, Standard. Only disable it if a given application has problems with 3D.)
Mipmap Detail Level: Quality (3/4), unless you need the finer texture details for some reason. Same for texture detail, if it is still listed as a separate option in your driver.
Wait for vertical refresh (V-Sync): Off, unless the application specifies otherwise. (You can use max_fps 100 in HL2 to stop tearing anyway, and then toggle V-Sync on/off in game to taste.)
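If you want that frame-rate cap applied automatically every time the game starts, a minimal autoexec.cfg sketch is below. (Note: I'm assuming the Source console variable is spelled fps_max in your build; verify the exact cvar name in the in-game console before relying on it.)

```
// Sketch of an autoexec.cfg entry for Source games.
// Cvar names can vary between engine versions; check yours in the console.
fps_max 100    // cap the frame rate to reduce tearing with V-Sync off
```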
Adaptive Anti-Aliasing: Should be greyed out on X800 series cards. Should only really be used on X1800 series or higher cards.
'Enable geometry instancing' should be on for real SM 2.0b support.
Support DXT texture formats should always be enabled.
Triple Buffering (OpenGL games only) will put you two frames behind the game, instead of only one frame as Double Buffering does. It may feel smoother, but that extra frame (~25 ms at 40 fps) is all it takes to get a kill, or to not get killed.
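As a rough sketch of where the 25 ms figure comes from (assuming a frame rate of 40 fps, where one frame takes 1000/40 ms):

```python
def added_latency_ms(fps: float, extra_buffered_frames: int) -> float:
    """Extra display latency from additional buffered frames at a given fps."""
    return extra_buffered_frames * 1000.0 / fps

# Triple buffering queues one more completed frame than double buffering,
# so at 40 fps that is one extra frame of latency:
print(added_latency_ms(40, 1))  # → 25.0 (ms)
```

At higher frame rates the penalty shrinks (10 ms at 100 fps), which is another reason competitive players chase high fps.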
Force 24-bit Z-buffer depth may help with Z accuracy in Quake 3 engine (OpenGL) based games, such as Return to Castle Wolfenstein, which really pushed the view distance for the Quake 3 engine at times.
Alternate pixel centre: Turn it on if textures appear misaligned in a given application; it usually has no side-effects if left on. However, I leave it off, as I can then report the issue to the authors of said applications if they are still supported or in development.
================================================
In Half-Life 2 you should be able to set everything on high, but consider setting texture detail to medium for performance reasons. You should be able to run 2x FSAA (set in game, with the driver set as above) at high speeds. Also consider disabling HDR or using the bloom-only feature (the game looks fantastic without HDR IMHO).
This one is directly from Valve Tech Support:
If you need even more speed (+10% or so on high end cards) in HL2 / Source games:
- Right click the icon in Steam
- Click properties
- Set launch options
- enter "-dxlevel 81" in the launch options (parameter is passed to game exe)
- Load said game (this is per game btw)
- Set graphics to taste
- Remove "-dxlevel 81" from the launch options (the game actually remembers the setting, unless you specify "-dxlevel 90" to undo it)
The game requires DirectX 9.0 to be installed; however, it only really makes use of the DirectX 8.1 feature set on the GPU. The water, etc. still looks just as fantastic in DX8.1 mode.
Better yet, the above '-dxlevel 81' tweak, when applied per game in Steam, will double performance on lower-end cards like the GeForce FX 5200.
They may have incorporated the above into a recent update, but try it anyway and see the difference in the Counter-Strike Video Stress Test and Half-Life 2: Lost Coast Video Stress Test results for yourself, as it will differ from system to system.
================================================
It just so happens I was 'semi-pro' in FireArmsMod, and I still play Half-Life 2 / Source games frequently. As such I do everything I can to gain just 1 ms over my opponents (and in OpenGL that means not using Triple Buffering, sadly).
For example: http://users.on.net/~darkpeace/pwnage/
My favourite was this one (screenshot taken in a safe area using the F5 key, not a PhotoChop job either 8) )

Strongly recommend Firefox for image viewing, as it can zoom out to fit the image to the window, and zooming in to 1:1 detail is only a left mouse click away.
================================================
Damn I am so saving that for a FAQ 😛. (Working on getting my own website domain up, sometime in the next decade 😛).
OK, I will try out those settings. I really appreciate it. I don't want to remove the 256MB RAM chip, so I won't! ha ha ha.
I have another question. I ran the game after some minor overclocking with ATITool last night. I used FIND MAX CORE, but I cancelled it myself after I saw some artifacts, and then I used FIND MAX MEM and it jumped too.
I ended up with like 427.00 / 492.00 or so... BUT when I woke up this morning and opened ATITool, it had CHANGED, by itself, to 398 / 492!!!!
What is UP with that? Is something WRONG?
Until I hear back from someone I am going to revert it back to DEFAULT...