Finally got my new card...

TabrisDarkPeace would know the exact reason.
The SiS chipset doesn't support dual channel.

Well, I wasn't expecting that 😳 - Thanks. (You should've just PM'd, e-mailed, or tried to MSN me; I am often online.)

I'd recommend a visit to http://www.sis.com/download/ just to get the latest drivers for the SiS chipset, especially if still using the MS Windows supplied ones. (SiS are good in that the Windows default drivers generally don't give issues, but performance can be raised using the latest ones directly from SiS.)

The drivers he is likely after are:
- AGP (GART), IDE or SATA/RAID *, Network *, Audio *
- * = (if SiS chipset)
Without them overall system performance, including in games, may be impaired.

================================================
Extra Notes:
================================================

But yeah, all Socket 754 systems are single channel (64-bit wide interface) to RAM. Of course, running any DDR333 (PC2700) DIMM is going to limit performance. Size-mismatched DIMMs won't hurt performance much on Socket 754 platforms, but speed-mismatched ones will, since the controller has to revert to the lower speed.

As most (all?) Socket 754 boards only have 2 x DIMM slots, they should be able to run at a 1T command rate almost all the time, under almost every condition. The Socket 754 memory controller (in the CPU) is something like 95% efficient. The Socket 939 one is only 90% efficient (or less at a 2T command rate). However, 90% of 6.4 GB/sec (dual channel) is far more than 95% of 3.2 GB/sec (single channel). 😛
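
Rough numbers as a sanity check (the 95% / 90% efficiency figures are my own ballpark estimates, not official specs):

# Back-of-the-envelope effective memory bandwidth for DDR400 (PC3200),
# which peaks at 3.2 GB/s per 64-bit channel. Efficiency figures are
# the rough estimates quoted above, nothing official.
def effective_bandwidth(channels, efficiency, peak_per_channel_gb_s=3.2):
    return channels * peak_per_channel_gb_s * efficiency

s754 = effective_bandwidth(channels=1, efficiency=0.95)  # Socket 754, 1T
s939 = effective_bandwidth(channels=2, efficiency=0.90)  # Socket 939, 2T
print(f"Socket 754 single channel: ~{s754:.2f} GB/s")    # ~3.04 GB/s
print(f"Socket 939 dual channel:   ~{s939:.2f} GB/s")    # ~5.76 GB/s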

I also agree with the CPU comments for Half-Life 2. I run Half-Life 2 heaps, as I enjoy the fast-paced Team and Everyone-for-themselves DeathMatch it offers. (Although I am not keeping pace with the younger players anymore; I used to be damn fine in the FireArms mod for HL1.)

Half-Life 2 doesn't benefit much from a second CPU core, let alone a fourth, though they are gradually improving that with updates. However, I clocked my system from 2.0 GHz to 2.35 GHz and I sure as hell noticed it in Half-Life 2.

Coincidentally, he also runs a similar video card to me (Radeon X800 XL: 16 pipelines at 400 MHz, 1 GHz x 256-bit video RAM, PCIe x16, 256 MB here), and someone posted a screenshot of over 5,000 3DMarks (which, if that was from 3DMark05, is actually faster than my system in games lacking CPU-isolated threading 😛).

Just because I disagree with Futuremark not enforcing artifact detection in their 'benchmark database' (ORB) doesn't mean I don't use it:

3DMark doesn't benefit from 4 cores. 😛 (Well '06 did in the CPU test).
- http://service.futuremark.com/compare?3dm05=1656988
- http://service.futuremark.com/compare?3dm06=84767

I run my textures on 'Quality' (3/4 of the max setting), but force 4x or 8x AF in the driver, and set FSAA to 'Application Controlled' (this actually makes a difference in Half-Life 2: you need to set FSAA in HL2 and set the driver to Application Controlled, or FSAA performance in the Source engine sucks).

I can't vouch for HDR in HL2 as I don't like it, and only use Bloom, if anything at all (HDR is bad for your score 😛).

================================================
Back to easier stuff again:
================================================

HD: FSAA: Let Application decide
Why: Some applications have their own FSAA patterns that are far faster for the app in question. Forcing it here as well may decrease FSAA performance in those apps substantially.

HD: Anisotropic Filtering: 8X (sometimes 4X). Leave High Quality AF off unless running an X1800 series or greater card.

CATALYST A.I.: Set to Advanced or Standard. (For higher-end CPUs I recommend Advanced, for lower-end ones Standard. Only disable it if a given application has problems with 3D.)

Mipmap Detail Level: Quality (3/4), unless you need the finer texture details for some reason. Same for texture detail if it is still listed as a separate option in your driver.

Wait for vertical refresh (V-Sync): Off, unless the application specifies. (You can use fps_max 100 in HL2 to limit tearing anyway, and then toggle V-Sync on/off in game to taste.)

Adaptive Anti-Aliasing: Should be greyed out on X800 series cards. Should only really be used on X1800 series or higher cards.

Enable geometry instancing should be enabled for real SM 2.0b support.

Support DXT texture formats should always be enabled.

Triple Buffering (OpenGL games only) will put you two frames behind the game, instead of only one frame as Double Buffering does. It may feel smoother, but that extra ~25 ms is all it takes to get a kill, or to not get killed.
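
Quick numbers on what that extra buffered frame costs (the 25 ms figure corresponds to roughly 40 fps; at higher frame rates the penalty shrinks):

# Extra input-to-display lag from one additional buffered frame.
def extra_frame_lag_ms(fps):
    return 1000.0 / fps

for fps in (40, 60, 100):
    print(f"{fps} fps -> about +{extra_frame_lag_ms(fps):.0f} ms behind")
# 40 fps -> +25 ms, 60 fps -> +17 ms, 100 fps -> +10 ms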

Force 24-bit Z-buffer depth may help with Z accuracy in Quake 3 engine (OpenGL) based games, such as Return to Castle Wolfenstein, which really pushed the view distance for the Quake 3 engine at times.

Alternate pixel centre - Turn it on if textures appear misaligned in a given application; it usually has no side-effects if left on. However, I leave it off so I can report the issue to the authors of said applications if they are still supported or in development.

================================================

In Half-Life 2 you should be able to set everything to high, but consider setting texture detail to medium for performance reasons. You should be able to run 2x FSAA (set in game, with the driver set as above) at high speeds. Also consider disabling HDR or using the Bloom-only option (the game looks fantastic without HDR IMHO).

This one is directly from Valve Tech Support:

If you need even more speed (+10% or so on high end cards) in HL2 / Source games:
- Right click the icon in Steam
- Click properties
- Set launch options
- enter "-dxlevel 81" in the launch options (parameter is passed to game exe)
- Load said game (this is per game btw)
- Set graphics to taste
- Remove "-dxlevel 81" in the launch options (it actually remembers it, unless you specify "-dxlevel 90" to undo it.

The game requires DirectX 9.0 to be installed, however in this mode it only uses the DirectX 8.1 feature set on the GPU. The water, etc. still looks just as fantastic in DX8.1 mode.

Better yet, the above '-dxlevel 81' tweak, when applied per game in Steam, can roughly double performance on lower-end cards like the GeForce FX 5200.

They may have incorporated the above into a recent update, but try it anyway and compare the Counter-Strike: Source Video Stress Test and Half-Life 2: Lost Coast Video Stress Test results for yourself, as it will differ from system to system.

================================================
It just so happens I was 'semi-pro' in the FireArms mod, and still play Half-Life 2 / Source games frequently. As such I do everything I can to gain even 1 ms over my opponents (and in OpenGL that means not using Triple Buffering, sadly).

For example: http://users.on.net/~darkpeace/pwnage/

My favourite was this one (screenshot taken in safe area using F5 key, not a PhotoChop job either 8) )

dm_runoff0015.jpg


Strongly recommend Firefox for image viewing, as it can zoom out to fit the window, and zooming in to 1:1 detail is only a left mouse click away.
================================================
Damn I am so saving that for a FAQ 😛. (Working on getting my own website domain up, sometime in the next decade 😛).



OK, I will try out those settings. I really appreciate it. I don't want to remove the 256MB RAM chip, so I won't! ha ha ha.

I have another question. I ran the game after some minor overclocking with ATItool last night. I used the FIND MAX CORE, but I canceled it myself after I saw some artifacts, and then I used the FIND MAX MEM and it jumped too.

I ended up with like 427.00 / 492.00 or so... BUT when I woke up this morning and used the ATI tool, it had CHANGED, by itself, to 398 / 492 !!!!

What is UP with that? Is something WRONG?

Until I hear back from someone I am going to revert it back to DEFAULT...
 
I don't think ATI Tools loads overclock settings at login unless you set it to do that.

I recommend just picking a GPU / VRAM clock speed, then doing a manual artifact check for 60 seconds.

:LOOP
If it fails, drop it 5 MHz on each (or on just one, to isolate which is at fault).
If it passes, raise the GPU or VRAM by 3 MHz (again, one at a time to isolate).

Run the artifact check for 60 seconds again.

If it fails then GOTO LOOP, and repeat until stable. 😛

The 'autofinder' in ATI Tool only uses 10 - 20 second artifact tests, which are not long enough IMHO; it'll give different results each time on auto-find. Doing it manually is faster, as you know for sure that if the artifact test runs for 5 minutes on given settings with no artifacts, it is stable. (Rough sketch of the loop below.)
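
Something like this, in pseudo-Python (the artifact_test() call is a stand-in for running ATI Tool's artifact scan by hand for the given time and noting whether it stays clean; the 3 / 5 MHz steps are just the ones I use):

# Sketch of the manual clock search described above.
def find_stable_clock(start_mhz, artifact_test, up_step=3, down_step=5):
    clock = start_mhz
    while artifact_test(clock, seconds=60):       # clean scan: creep upward
        clock += up_step
    clock -= down_step                            # artifacts: back it off
    while not artifact_test(clock, seconds=300):  # confirm with a long run
        clock -= down_step
    return clock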

I also recommend the 'ATI Crowd' demo (search www.ati.com for it) for overclock testing, as if you go too high, even just by 3 MHz or so, it will show up in Crowd within 10 minutes of testing.
 
OK I set it at 432.5 X 519.25

That was after a few smaller tests, then manual checking for artifacts for 60 seconds.

Only problem is that it REALLY raised the temp, from around 49 degrees normally to 54 degrees just sitting there, and up to like 65 - 68 degrees during the test.

If I play Half Life for a few hours, I am afraid that could get dangerous....

What do you think?
 
FYI
Socket 754 = single channel (the type of mobo has nothing to do with dual or single channel; it's determined by the socket type..)
Yeah, you are correct. I was thinking the NF3 250Gb supported dual channel, but forgot that was one of S939's improvements. I've only used a couple S754's and haven't put much thought into them. With almost every build in recent years being i865pe, NF2, NF3U, and NF4's, Dual channel support has almost become the expected norm in my mind.

On another thought, has anyone heard of successful GTO pipe unlocks with AGP versions? I have not.
 
ATI Tool reports my GPU flat-lining around 82 C with its stock (GeCube?) heatsink + heatpipe cooling on it, when logged while gaming for a few hours. 82C is just inside the safe area, and as I can only make it hotter than 82C by raising the GPU clock, I don't raise the GPU clock. 😛

It will decrease the life of the GPU, but the GPU would last 30+ years normally anyway. The fan will fail on it first.

So long as it is not miscalculating (artifacts) it will be stable.

To be sure, run ATI Crowd on it for 30 min or so; it just loops and is good for testing both the GPU and VRAM. Also, when deciding on the final settings, run the ATI Tray artifact tester for 30 min as well.

If the card can do 30 min in each of them at those settings, you may be able to push it harder. When the VRAM ceiling is hit, just raise tRP and tRAS in the advanced timings by 1 each (and possibly a few others; when tRP and tRAS are raised you can, and should, drop some other timings while you're there, though the exact ones escape me atm).

Sure 83C for a GPU is 'bad' but I hammer the hell out of it, and I know the GPU will still last 3-5 (or more, but by then it'll be replaced 8) ) years like that. Just not sure about the other parts of the card. 😛

There are many people who will say over 70C for a GPU is too hot, and yes I agree with them, but if it is stable and passing the kinds of tests I throw at it (my card) then I know she ain't going to break.

When logging temps make sure to do it for 2+ hours in a heavy game, not while on desktop, or after 30 min, or after returning from a game (the GPU temp could drop 15C in under 10 sec, the time it takes to close most games).

Chances are you may fail artifact tests before getting to 80C flatlines during extensive testing anyway.

Overclocking is not about taking shortcuts and only doing 5 minute tests btw 8). More testing can save you a lot of money in overclocking.

PS: The NF3 250Gb with Socket 939 should support dual channel, just with AGP, as the memory controller is in the CPU; the board just needs more traces to/from the RAM. Pretty sure I've seen nForce 3 boards for S939 with dual channel.
 
Also I guess this card uses the chipset from the X850 PRO... since that's what it has listed...
But you said it had an R430 core?

Yep...

I have an X800GTO AGP

ATI says it's a R430 Core...

And 3DMark has it under the X850PRO Chipset....

I'm still lost!
 
The only risks that I am aware of with flashing the bios are:

1. A bad flash due to system crash
2. A bad flash due to incompatible bios

There is a solution that should be able to fix the problem if it occurs. You just need a PCI (not PCI Express) graphics card to boot from so that you can flash it back.

With Nvidia cards I have heard of bad pipelines that cause problems when unlocked, but I have not heard of this happening with an X800. If you do encounter glitches after the bios flash you can always flash back to your original bios. I don't think there is a risk of "frying" your card.
Yeah - I don't have a floppy disk drive.

I can email you the bios that I am using if you want. I still have a copy of the 16 pipe bios and the 12 pipe bios so you should be safe either way. You will need some way of booting into DOS though. You can try this link if you can't borrow a floppy drive from somebody

http://www.nu2.nu/bootcd/clean/
I haven't ever done it with a CD so I can't help you too much with making it.
 

It seems to be a bit too messy. I don't want to screw anything up, so I don't think I am going to bother with it.

It's not that I don't appreciate it, again, I am just afraid of what happens if I do something wrong.

Thanks for the help...
 
Well, I can understand your apprehension. I won't hold it against you. :wink:

It does make it a little more complicated since you don't have a floppy drive.
I actually found out how to modify the original bios, so I reflashed mine last night to the original bios with all pipes enabled. Now the drivers say X800GTO again instead of X800 series, still with 16 pipes enabled.

My understanding is that any defective pipes on the ATI cards are disabled permanently so the flashable ones don't have defects.

Anyway, good luck with getting your system to run better.

I'll probably keep checking this post in case you change your mind.
 
Well, if I didn't have to do it with a burnt CD, perhaps I would... but I just KNOW something will go wrong! ha ha ha. That's the last thing I need.

thanks again!
 
The X800 Pro core is 12 pipelines at 475 MHz for:
12 x 475 = 5.7 Gpixels / sec

The X800 XL core is 16 pipelines at 400 MHz for:
16 x 400 = 6.4 Gpixels / sec


However I suspect the X800 GTO, if based on the X800 Pro, would be:
12 x 400 = 4.8 Gpixels / sec

If it is based on the X800 Pro core, then the GPU has heaps of overclocking headroom (assuming it is cooled well). Since he knows how to monitor temps, it would make sense (and be far less risky) to just overclock the GPU gradually towards 475 MHz, 5 MHz at a time, testing the whole time to make sure it is safe.

The video RAM on the X800 GTO is also clocked at 980 MHz (post DDR) and has the same 256-bit wide bus, for over 30 GB/sec peak video memory throughput, just like the X800 XL. That gives it a slight edge over the standard X800 Pro if the GPU is overclocked to 475 MHz.
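
Working the theoretical numbers out, if anyone wants to check them (peak figures only; real-world throughput is lower):

# Theoretical pixel fill rate and memory bandwidth for the cards above.
def fillrate_gpixels(pipes, core_mhz):
    return pipes * core_mhz / 1000               # Gpixels/sec

def mem_bandwidth_gb(effective_mhz, bus_bits):
    return effective_mhz * bus_bits / 8 / 1000   # GB/sec

print(fillrate_gpixels(12, 475))    # X800 Pro:  5.7
print(fillrate_gpixels(16, 400))    # X800 XL:   6.4
print(fillrate_gpixels(12, 400))    # X800 GTO:  4.8
print(mem_bandwidth_gb(980, 256))   # GTO / XL VRAM: ~31.4 GB/s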

If the die has one quad disabled (assuming the extra quad is physically there), it will also run cooler at stock settings.
 

Well I have been running it all day at around 426 core... I played some serious Half Life 2 and Counter Strike, and could really hear that fan spinning. It got the temp up to 70 degrees max.

The question is: does the extra 26 MHz help with gameplay?
 
If it is based on the X800 Pro core, then the GPU has heaps of overclocking headroom
If he has an R430 core (which he probably does, judging by his posts), then his card will never reach 500 MHz core.

I would return any GTO that didn't have an R480 core, but then again, I don't put up with any shit that the vendors give me.
 
The R430 is the core used in the X800 XL, not the X800 Pro. The GTO can come with the R430, R423, or R480; there are no guarantees.

The R480 overclocks the best, then the R423.
The R430 maxes out at around a 10% overclock because it is manufactured on a different process.

But if you unlock an R430 and overclock it the little that it will go, it is about as good as having an R480 super-overclocked with only 12 pipes.

All of these chips have 16 pixel pipes, but in the GTO 4 of them are disabled. There are three basic ways that the pipes can be disabled:

1. Disabled during production.
2. Disabled after production via laser cut.
3. Masked by the bios.

As I mentioned earlier in this thread, ATITool provides information that determines if a card falls into category 3, which are the unlockable cards.

From discussing the situation with him, his card should be unlockable to 16 pipes. He just doesn't want to risk flashing the bios on his card.
 
You can set up profiles in ATItool which will allow you to only overclock when you are playing a game. I actually underclock when in 2D. You can also control the fan speed for each of the profiles if you click on settings.
 

Yeah, I am pretty happy with it. I don't know if 4628 is a decent score in 3DMark05 or not, but I also have to keep in mind it's a cheap MOBO with a cheap Sempron 3100+ running at only 1.8 GHz, so even with the best card, the performance I am going to get is only slightly better than what I have now.

I downloaded and played the multiplayer demo for F.E.A.R... (I don't think this game holds a candle to Half-Life 2 and CS: Source BTW).

It tried to get me to use all LOW settings... I was pretty disappointed. I overrode it and went to medium, and it ran fine with only slight choppiness... however, it still looked like crap, and I hate the controls, so I quickly went back to Half-Life 2 Deathmatch, and died in glorious graphics... :)

So - what do you guys think I would be able to run with this setup? I mean, would I be able to run OBLIVION at all?

My birthday is Tuesday, I am turning 27, and I want to buy myself a really good Role Playing Game... (I just quit WoW, and bored with Guild Wars...)

Any suggestions on what I can play, or what you guys like?

Thanks in advance!
 
If you want to try overclocking your processor, here is a link that I ran across that you might want to check out:

:arrow: http://www.xbitlabs.com/articles/cpu/display/sempron-3100-oc.html

It looks like these guys got a CPU just like yours up to 2.5 GHz. That would probably be nice. (Quick arithmetic on that below.)

BTW my 3D-Mark 2005 score is about 5500.
My system:
AGP X800GTO flashed to 16 pipes and clocked at 438core/528mem
Pentium 4 2.8c @ 3.64GHz
1GB OCZ PC3200 @ dual channel DDR416 & 2-3-2-5 timings
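
For reference, a rough sketch of the arithmetic behind that overclock (my assumption: the Sempron 3100+ keeps its locked 9x multiplier, so all of the gain has to come from the reference clock):

# Rough arithmetic, assuming a locked 9x multiplier on the Sempron 3100+.
multiplier = 9
stock_ref_mhz = 200

print(multiplier * stock_ref_mhz)     # 1800 MHz = the stock 1.8 GHz
target_mhz = 2500
print(target_mhz / multiplier)        # ~278 MHz reference clock needed
# At ~278 MHz the AGP/PCI buses run way out of spec unless the board
# can lock them asynchronously (66 / 33 MHz), which is the catch on
# many Socket 754 boards.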
 
My 9800 Mobility was based on the X800 architecture and it could do it in HL2: LC and DOD:S.
HL2 used a PS 2.0 form of HDR so that it was usable with ATi cards, and DOD:S was built on the HL2 engine, so of course it's the same.

The normal type of HDR, like that used in Far Cry, Serious Sam 2, and 3DMark06, uses floating point 16 (FP16) blending and filtering... something not available on the X800 architecture.

Try running Serious Sam 2, Oblivion, 3DMark06 or Far Cry with HDR and an X800 series card... you'll see that you can't.

Better yet, read this 3DMark06 overview written by a friend of mine.
http://www.elitebastards.com/page.php?pageid=13397&head=1&comments=1
 
If you want to try overclocking your processor, here is a link that I ran across that you might want to check out:

:arrow: http://www.xbitlabs.com/articles/cpu/display/sempron-3100-oc.html

It looks like these guys got a CPU just like yours up to 2.5 GHz. That would probably be nice.

BTW my 3D-Mark 2005 score is about 5500.
My system:
AGP X800GTO flashed to 16 pipes and clocked at 438core/528mem
Pentium 4 2.8c @ 3.64GHz
1GB OCZ PC3200 @ dual channel DDR416 & 2-3-2-5 timings

I can't do it with my MOBO...

" So, overclocking of Athlon 64 CPUs, and also of the Sempron 3100+, can’t be performed fully due to some problems with peripheral buses rather than to any limitations of the CPU itself. This problem was really serious until quite recently: there were no mainboards available to the masses that would clock the CPU and the AGP/PCI buses asynchronously."