I guess this means my machine is officially outdated now, if a card like mine (X800XT) is now considered "mainstream." Oh well, at least I enjoy how it runs. I could probably get vastly better performance if I ran it at settings similar to everyone else's, but as it is, most settings are above "maximum." I suspect my 6x AA may be causing part of the performance hit.
This is WAY off topic, but what is the general feeling on bloom vs. HDR (where applicable)? If one has an HDR-capable card (an X1900, for instance), will a lot of eye candy be missed by choosing bloom over HDR, if performance is better?
I haven't tried this myself, but I'm curious about it.
Well, I like the appearance of HDR, but I don't think it's worth sacrificing AA for. (I also have a lucky acquaintance with a pair of 7800GTX 256 cards who would rather play with 16x SLI AA than HDR.) For the most part, what people like about it is the "oversaturation," which can be largely mimicked by altering the "bloom" settings in the .INI file; rough sketch below.
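For anyone curious, this is roughly what the relevant block of Oblivion.ini looks like. I'm going from memory of the tweak guides, so treat the exact key names and values as approximate rather than gospel, and back the file up first:

; My Documents\My Games\Oblivion\Oblivion.ini
; [BlurShader] is the non-HDR bloom path; names from memory, may differ by version.
[BlurShader]
bUseBlurShader=1     ; make sure the bloom path is actually enabled
fBrightScale=2.5     ; raise for a stronger, more "oversaturated" glow
fBrightClamp=0.35    ; lower to let more of the scene bloom
fBlurRadius=4.5      ; wider radius = softer, larger halos
iNumBlurpasses=2     ; extra passes smooth the glow but cost some performance

Tweak one value at a time and relaunch; the game only reads the file at startup.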
Interesting, but predictable, performance from the X1600 Pro.
Sometimes terrible (just barely equal to the GF6600GT in the mountains and indoors) and sometimes stellar (like foliage, where it beats a GF6800GT).
Overall, it's great to see such a wide range of tests, settings, and cards. Too bad they didn't use any X800s in the 256MB/512MB tests, though (why not the X1600 256/512 then?). I think a lot of people are considering the GF6800GS and X800GTO for this game, since their R9600SE/FX5xxx cards kinda blow for Oblivion.
Indeed, I'm a bit disappointed to see the absence of any non-SM 3.0 cards from the list. Sure, you can't include them in the HDR tests, but if you're going to test a 6600GT, a card that's known to choke in a lot of games when set to "insane," you may as well include the whole list.
I guess Oblivion uses SM 3.0 and HDR, which should give the X1600 Pro some advantages.
It also has only twelve pixel shader processors, though, which should make it lag sometimes.
I'd only consider a 7800 or X1900 series card if you care about frame rates and HDR. I'll get the game today; then I can stop talking trash.
As GGA said,
Oblivion doesn't actually default to SM 3.0. Rather, it appears to go no higher than SM 2.0b (and yes, the files list it as SM 2.0b). There is indeed an option in the INI file for SM 3.0 usage, but as also commented, it appears to do nothing. I've yet to test it myself (I don't know why), but the logic is simple: my X800XT doesn't support SM 3.0, so if I can enable the option and run the game with no problem (without coming back to find the setting reverted), then it must do nothing.
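If anyone wants to poke at it themselves, this is the test I have in mind. Everything here is hedged: bAllow30Shaders is the key name the tweak guides mention, and I'm assuming the usual file locations, so verify against your own install:

; My Documents\My Games\Oblivion\Oblivion.ini
[Display]
bAllow30Shaders=1    ; supposedly enables the SM 3.0 shader path (default 0)

; After launching, check RendererInfo.txt in the same folder to see which
; shader package actually loaded. On an SM 2.0b card like my X800XT, if the
; game runs fine and the setting isn't reverted, the option presumably does nothing.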