Mainstream Oblivion Benchies

pauldh

Illustrious
FiringSquad has their Oblivion performance part 2 out, looking at the mainstream cards. Unfortunately, they kept the Radeon X8x0s out of the charts. The X1800 GTO dominates, especially in the foliage areas. Also interesting is the 256MB vs. 512MB comparison.

http://www.firingsquad.com/hardware/oblivion_mainstream_performance/
 
I've now had a chance to look this over more. The NV cards do well indoors and the ATI cards outdoors. But seeing as the lowest fps is usually going to be outdoors, and we have to tweak our display settings so we don't stall in those areas, winning indoors doesn't mean much since both cards do incredibly well indoors.

Second thing I wondered about earlier but have now confirmed: the X800XL is indeed running the same settings as the other cards, so we can stick those numbers into their 2xAA/8xAF charts. Doing that, the X800XL absolutely destroys the 6800GT and I'd say beats the 7600GT too. Seeing how much it crushes the 6800GT, it's safe to say it easily beats my 6800U as well when playing 2x/8x with bloom, which I tend to prefer over HDR anyway due to slow HDR performance. Of course, the X800XL doesn't have the ability to play or even try out HDR. But one thing is for sure: if the X800XL is >= the 7600GT in performance with FSAA/AF and bloom, no doubt the X850XT would be far ahead, and it looks to be a great-performing $160 PCI-e card for Oblivion, with the drawback of no HDR of course. Honestly, I personally like HDR, but man, my 6800U crawls using it. Having the option to try it is nice though, and my Oblivion time has probably been 60% 2x/8x bloom and 40% HDR.
 
I haven't seen 7800GS Oblivion results anywhere, but I wouldn't expect it to be much better than the 7600GT. Not worth the extra $100, that's for sure. Looking at the X800XL results, I bet an X850XT would be a much better 2x/8x bloom card than the 7800GS, but with no HDR support of course. I think if some details could get turned up along with FSAA and bloom, my feeling is that would be the nicer Oblivion experience. Oblivion could just be one of those games where the X850XT outperforms a 7800GT with FSAA.
 
This is WAY off topic, but what is the general feeling around bloom vs. HDR (where applicable)? If one has an HDR-capable card (an X1900, for instance), will a lot of eye candy be missed by choosing bloom over HDR, if performance is better?

I haven't tried this, but am curious on this point.
 
That reminds me.
Do you think the Oblivion graphics engine is more like B&W 2, EOA 3, COD 2, Futuremark 2005, or something completely different?
I don't even know if Oblivion uses shader model 2 or 3.
 
Interesting, but predictable, performance from the X1600 Pro.

Sometimes terrible (just barely equal to the GF6600GT in the mountains and indoors) and sometimes stellar (like foliage, where it beats a GF6800GT).

Overall it's great to see such a wide range of tests, settings, and cards. Too bad the only X800 appearance is in the 256/512MB tests though (why not use a 256/512MB X1600 for that then?). I think a lot of people are considering the GF6800GS and X800GTO for this game since their R9600SE/FX5xxx kinda blow for Oblivion.
 
I guess Oblivion uses SM 3.0 and HDR.
That should give the X1600 Pro some advantages.
It also has only twelve pixel shader processors, which should make it lag sometimes.
I would only consider a 7800 or X1900 series card if you care about frame rates and HDR. I will get the game today, then I can stop talking trash.
 
I guess Oblivion uses SM 3.0 and HDR.

Actually, Oblivion uses very, VERY little SM3.0; the primary paths, even for the new cards, are SM2.0a and SM2.0b.

There is an "enable SM3.0" string in the .ini file, but no one I know has figured out what it enables, and you don't need it for HDR (which doesn't require SM3.0, only FP16 blending) or other advanced features.
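For reference, this is the line people toggle (quoting it from memory, so double-check against your own Oblivion.ini; it sits in the [General] section):

[General]
bAllow30Shaders=0

Flipping it to 1 is supposed to allow the engine to use 3.0 shaders, but on its own it doesn't appear to change anything visually or performance-wise.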

That should give the X1600 Pro some advantages.
It also has only twelve pixel shader processors, which should make it lag sometimes.

Well, if they were full pipelines it would be faster, but for a 4-pipeline card it's pretty fast. It's a tradeoff of die space versus performance. I would still have preferred a little more in the ROP and TMU department though (see the rough peak-rate numbers at the end of this post).

I would only consider a 7800 or X1900 series card if you care about frame rates and HDR. I will get the game today, then I can stop talking trash.

Forget the GF7800, it's old news; the GF7900 is better and usually better bang/buck. The X1900 is the best for Oblivion, and luckily, unlike the X1600, the X1900 is a solid all-around card, so you don't just get performance in Oblivion, you get it in other titles too.
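To put rough numbers on that pipeline tradeoff, here's a quick back-of-the-envelope sketch in Python. These are peak theoretical rates only, and the clocks/unit counts are quoted from memory for the stock cards, so treat it as an illustration rather than a benchmark:

# Peak theoretical per-second rates; real games never hit these and shader
# efficiency differs per architecture, so this is only a rough illustration.
# Unit counts and clocks are quoted from memory for the stock cards.
cards = {
    "X1600 Pro": {"mhz": 500, "pixel_shaders": 12, "tmus": 4,  "rops": 4},
    "6600 GT":   {"mhz": 500, "pixel_shaders": 8,  "tmus": 8,  "rops": 4},
    "6800 GT":   {"mhz": 350, "pixel_shaders": 16, "tmus": 16, "rops": 16},
}

for name, c in cards.items():
    shader = c["mhz"] * c["pixel_shaders"] / 1000.0  # pixel shader ops issued per second (G/s)
    texels = c["mhz"] * c["tmus"] / 1000.0           # texels fetched per second (G/s)
    pixels = c["mhz"] * c["rops"] / 1000.0           # pixels written per second (G/s)
    print(f"{name}: ~{shader:.1f}G shader, ~{texels:.1f}G texel, ~{pixels:.1f}G fill")

Shader-heavy areas like the foliage lean on those 12 shader units, while texture- and fill-bound scenes (mountains, indoors) run into the 4 TMUs/ROPs, which fits the split FiringSquad's numbers show.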
 
I guess this means my machine is officially outdated now, if a card like mine (X800XT) is now considered "mainstream." Oh well, at least I particularly enjoy how it runs. I could probably get vastly better performance if I ran it at settings similar to everyone else's, but as it is, most settings are above "maximum." I suspect my 6x AA may be causing part of the performance hit.
This is WAY off topic, but what is the general feeling around bloom vs. HDR (where applicable)? If one has an HDR-capable card (an X1900, for instance), will a lot of eye candy be missed by choosing bloom over HDR, if performance is better?

I haven't tried this, but am curious on this point.
Well, I like the appearance of HDR, but I don't think it's worth sacrificing AA for. (I also have a lucky acquaintance with a pair of 7800GTX 256 cards who would rather play with 16x SLI AA than HDR.) For the most part, what people like about it is the "oversaturation" bits, which can also be largely mimicked by altering the "bloom" settings in the .INI file.

Interesting, but predictable, performance from the X1600 Pro.

Sometimes terrible (just barely equal to the GF6600GT in the mountains and indoors) and sometimes stellar (like foliage, where it beats a GF6800GT).

Overall it's great to see such a wide range of tests, settings, and cards. Too bad the only X800 appearance is in the 256/512MB tests though (why not use a 256/512MB X1600 for that then?). I think a lot of people are considering the GF6800GS and X800GTO for this game since their R9600SE/FX5xxx kinda blow for Oblivion.
Indeed, I'm a bit disappointed to see the absence of any non-SM 3.0 card in the list. Sure, you can't include them in the HDR tests, but if you're going to be testing a 6600GT, a card that's known to choke on a lot of games when set to "insane," you may as well include the whole list.

I guess Oblivion uses SM 3.0 and HDR.
That should give the X1600 Pro some advantages.
It also has only twelve pixel shader processors, which should make it lag sometimes.
I would only consider a 7800 or X1900 series card if you care about frame rates and HDR. I will get the game today, then I can stop talking trash.
As GGA said, Oblivion doesn't actually default to SM 3.0. Rather, it appears to go no higher than SM 2.0b (and yes, the files list it as SM 2.0b). There is indeed an option in the INI file for SM 3.0 usage, but as also commented, it appears to do nothing. Of course, I've yet to test it myself (I don't know why), but if I can enable it on my X800XT and run the game with no problem (without coming back to find the setting reverted), then it does nothing.
 
what is the general feeling around bloom vs. HDR (where applicable)? If one has an HDR-capable card (an X1900, for instance), will a lot of eye candy be missed by choosing bloom over HDR, if performance is better?
Yeah, there is a noticeable difference between bloom and HDR. Bloom still looks great, but HDR really shines :roll: in some areas. If I had an X1900XT I'd definitely run FSAA and HDR. Even 2x FSAA seems to make a very noticeable difference IMO. With a 6800U, I am not really happy with the HDR performance and think 2x FSAA and bloom make for a better gaming experience. Not sure what I'll do with a 7800GT; probably see how well it plays with HDR first, but really it's nice to smooth out the jaggies too. They kinda stink once you get fixated on them. :?
 
Try linking to the actual review next time;

http://www.bit-tech.net/gaming/2006/03/31/elder_scrolls_oblivion/5.html

BTW, we've already seen it and discussed it. The biggest drawback of that review is its lack of apples-to-apples comparisons, which is important for a game that has so many options you can change.

Gamespot's review gives a better picture than Bit-Tech's IMO;
http://www.gamespot.com/features/6147127/p-6.html

And it shows where the cards are somewhat CPU-bound across the spectrum.

FiringSquad does the best job because it covers a variety of settings and situations. If the benchies were just in the mountains, just indoors, or just in high-foliage areas, you wouldn't get anywhere near the same picture of global performance.
 
What FS failed to notice is that the X800XL 512 doesn't truly utilize 512MB of RAM. The X800 series can't even use 512MB of RAM, only 256. What it actually does is cache the extra 256 somewhere so that the main 256 can be used, hence 512... that's why the frames don't change in the benchmark. But when it comes to the 7900s, it's a whole different story.
 
What FS failed to notice is that the X800XL 512 doesn't truly utilize 512MB of RAM. The X800 series can't even use 512MB of RAM, only 256. What it actually does is cache the extra 256 somewhere so that the main 256 can be used, hence 512... that's why the frames don't change in the benchmark. But when it comes to the 7900s, it's a whole different story.

Mind explaining that? The X800 series can address 512MB, and it shows obvious gains (even over the X850XT PE) in games like HL2 at higher resolutions with AA.

I'd like to see the reasoning behind the caching you're talking about. I understand it often doesn't have the need or ability to use the full 512MB, but I'm not sure about that part.
 
Actually, the X800 series from R420 to R480 (including the R430) can address 512MB from the core.

When they launched, everyone laughed at it because we thought "the R9800 series can't use 256 now!", despite Lars showing the beginnings of some 128MB limitations. It's been there from the start, and was there on the GF6 too.

BTW, the X1K can address 1GB, and the GF7800 is 512MB and the GF7900 is 1GB.
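If it helps, here's a rough back-of-the-envelope look (in Python) at why more than 256MB can start to matter at high resolution with AA. It only counts the render targets and ignores color/Z compression, driver overhead, and all the texture and geometry data, so it's an illustration, not a measurement:

# Very rough estimate of render-target memory alone (MB). Ignores color/Z
# compression, driver overhead, and textures/geometry, so real usage is higher.
def framebuffer_mb(width, height, aa_samples, bytes_per_pixel=4):
    color   = width * height * aa_samples * bytes_per_pixel  # multisampled color buffer
    depth   = width * height * aa_samples * bytes_per_pixel  # multisampled depth/stencil
    resolve = width * height * bytes_per_pixel               # resolved back buffer
    front   = width * height * bytes_per_pixel               # front buffer
    return (color + depth + resolve + front) / (1024 ** 2)

for (w, h), aa in [((1280, 1024), 4), ((1600, 1200), 4), ((1600, 1200), 6)]:
    print(f"{w}x{h} {aa}xAA: ~{framebuffer_mb(w, h, aa):.0f} MB")

Even the 1600x1200 6xAA case leaves more than half of a 256MB card free for textures, so whether the extra 256MB actually helps comes down to how much texture data the game piles on top of that.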
 
What FS failed to notice is that the X800XL 512 doesn't truly utilize 512MB of RAM. The X800 series can't even use 512MB of RAM, only 256. What it actually does is cache the extra 256 somewhere so that the main 256 can be used, hence 512... that's why the frames don't change in the benchmark. But when it comes to the 7900s, it's a whole different story.
No, actually they show and state how in Oblivion the 512MB doesn't help the X800XL at all. BUT, remember, the 512MB X800XL beats the X850XT PE and surprisingly even a 256MB 7800GTX in COD2, showing that in the right game, the XL can benefit from 512MB of VRAM.
 
Why are the 6800 series cards performing like ass in this game?

What’s really surprising though is the GeForce 6800 GT’s showing in our performance testing today. At one point we actually loaded up Quake 4 and ran some benchmarks to make sure our card was running correctly. NVIDIA’s GeForce 7600 GT runs circles around the GeForce 6800 GT, and in some cases the 7600 GS is able to give the 6800 GT a run for its money!
 
Why are the 6800 series cards performing like *** in this game?
Because that's what I'm playing it on. :cry: :cry: :cry:


Yeah, you are right. While it's not as bad as how the FX series looked after a while, with a 9600 Pro outpacing a 5900 in some games, the 6800s seem to age quickly when a game like this comes out, while the X800 series looks a lot more powerful for it. Funny how some people went nuts claiming the 6800 series was more future-proof than the X800 by supporting SM3.0, and others said it's too weak, no biggie. I'm going to have to add running some 6800U vs. X800XT PE vs. 7800GT Oblivion benchies to my long list of projects, to see just how crappy my performance is now. (Gotta love that Uli.)