FPS on my EVGA 7900GT KO (512mb) seem sort of dismal...

Page 3 - Tom's Hardware community forum.
It would have been nice if you had been given the right advice from the start, or if links of proof had ended the conversation early. But unfortunately, that's not how things often work. Would it have been better to remain silent and let you and possibly others believe it was your CPU, so you and others could waste money upgrading mobo/CPU and still have the same Oblivion disappointment? :roll:

But again, while the X1900XT would have been much better for Oblivion, and someone gave you false hopes about how the 7900GT would do in it, I say keep your card. Just tweak Oblivion and be happy. Your card can play it just fine and rocks in many other games. But most of all, don't replace other components thinking it will help your Oblivion performance. Bad advice hurt your original decision some, and more bad advice pushing a CPU upgrade for Oblivion would, IMO, have been a shame.

Maybe you should get gaming and ignore whatever else comes up here. :wink: Even if arguments continue, I don't think anyone here wants you to pay a restocking fee and switch cards when you already have such a nice card. ........ Then again, that's coming from someone who yanked a one-month-old 7800GT CO out of his system and bought an X1800XT (the X1900XT was way more money back then), because I said there was no way I was going to put hundreds of hours into a game with such lousy performance and settings. 😛 In my defense, I always have a use for another card, so no biggie really, and I didn't RMA it or pay a restocking fee. And yeah, looking back, the X1800XT was so worth it: higher detail levels, FSAA + HDR, and better performance than the 7800GT.
 
Yeah, this is true. I just want to get the best that my computer can manage, that's all. :) I'm just debating now whether jumping to an X1900XT would even be worth the upgrade, and whether I would notice a substantial difference across the board (in all games, not just Oblivion, since I just ordered FEAR and have been looking at a few other games too).
 
I have been looking for concrete evidence online in comparison benchmarks between the 512MB 7900GT KO and the X1900XT, but all I can seem to find is the KO Superclocked or Signature Edition.

Would it be a huge, noticeable step up to go from the 7900GT I have to the X1900XT or would it not be worth the extra cash? I'm so torn. 🙁
 
Meh, screw it. The 7900GT does well enough, and the elbow room and cooling in my case, since it's so small (being a Gateway and all), are only marginal. Since the X1900 is a larger, hotter card, I think I probably (inadvertently) picked the best card for my setup.

Who knows, maybe I can do the step-up program in a bit.
 
Apparently a whole lot of people love Oblivion, but the code is bloated. Maybe one day there will be a card able to run it well, but it shouldn't have to be that way. The code is crap (but very pretty).
 
 
Yeah, what is the deal with all of the Elder Scrolls games seeming far more expansive, demanding, and buggy than they should be? Morrowind was plagued by lots of problems, and MAN, let's not even TALK about Daggerfall or Arena.

Oh for the sake of the discussion for those of you comparing fps or whatever, this is the exact card I have:

http://www.evga.com/products/moreinfo.asp?pn=512-P2-N568-AR&family=22


With stock cooling it hovers around 45C at idle. During 3DMark05 it cruised closer to 70C during some spikes. Would overclocking this card be worth it, or do you think I would need to look into more cooling solutions? Like I said, I only have the stock cooler on the card and the fan of the Gateway case, though my PSU has a fan that sucks hot air from the top and blows it out the back as well (Enermax NoiseTaker 495, the blue one). I also tried taking the side of my case off for extra cooling, and it seemed to make a couple (1-3) degrees of difference.
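For what it's worth, the temps you quoted can be turned into a rough headroom check. This is just a sketch, not a real thermal model: the 90C ceiling and the 15C margin heuristic are assumptions I made up for illustration, so check your card's actual specs before overclocking.

```python
# Illustrative only: the limit and margin below are assumptions,
# not NVIDIA specifications.

def oc_headroom(idle_c: float, load_c: float, limit_c: float = 90.0) -> dict:
    """Estimate thermal headroom from observed core temperatures (Celsius).

    limit_c is an assumed safe ceiling (hypothetical; verify for your card).
    """
    delta = load_c - idle_c       # how much load heats the core
    headroom = limit_c - load_c   # margin left under load
    return {
        "load_delta_c": delta,
        "headroom_c": headroom,
        # crude heuristic: want a comfortable margin before overclocking
        "oc_reasonable": headroom >= 15.0,
    }

# The temps from the post: ~45C idle, ~70C spikes in 3DMark05.
print(oc_headroom(45.0, 70.0))
```

By this (made-up) rule of thumb, a 70C load peak still leaves some room, but popping the side panel for only 1-3 degrees suggests case airflow, not the cooler, is the bottleneck.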

Now the topic has gone from whether to get an x1900xt or not to how to safely improve my 7900GT. :)
 
First thing: not a single card can run Oblivion at max settings right now. We'll need to wait for DX10 cards for that. I'm thinking of both the G80 from NVIDIA and the R600 from ATI.

Think of this game as the Doom 3 of this year, only worse (or better, depending on your point of view). When that game came out, nobody could reach a 100+ fps average frame rate, but today pretty much any mainstream card (think $250 and up) can easily do it. My numbers might be off a bit, but you get the point.

What's beautiful about Oblivion is that this game will look even better in a year or so with the best VPU available and all settings maxed. The engine is made to scale with faster hardware. I don't think it will rival engines like the one that'll be used for Crysis, but it will hold its own for quite some time.

And to the last few posts, I really don't think the code is crap. I personally found this game's graphics as good-looking as FEAR's, but you should consider that it involves areas about 10 times bigger (and then some) with a whole lot more people. Add to this the vegetation and HDR lighting (not sure about that one) not present in FEAR and you get the difference in performance. I can't give any numbers, but I believe the problem resides mostly with current VPUs having too few vertex units, which calculate the geometry of any scene. Considering that vegetation is VERY high on the geometry count, you get the reason it lags in performance. With the upcoming R600 with unified shader/vertex units, and the G80 with apparently 16+ vertex units (as opposed to 8 now for the high-end R580/G70), we should finally have decent performance. Games of late have had a tendency to go light on the vertex units and heavy on the pixel ones. So while we saw a doubling of vertex units going from the 9800 Pro to the X1900XT (4 to 8), we saw a 6x jump in pixel processing units (8 compared to 48) :twisted: . Correct me if I'm wrong, please. So like I said, this game engine will grow better with upcoming VPUs, in theory at least.
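The scaling claim above is easy to sanity-check with a few lines. The unit counts are just the ones quoted in the post (verify them against the actual card specs):

```python
# Shader-unit counts as quoted in the post (verify against real specs).
cards = {
    "9800 Pro": {"vertex": 4, "pixel": 8},
    "X1900XT":  {"vertex": 8, "pixel": 48},
}

vertex_scale = cards["X1900XT"]["vertex"] / cards["9800 Pro"]["vertex"]
pixel_scale = cards["X1900XT"]["pixel"] / cards["9800 Pro"]["pixel"]

print(f"vertex units grew {vertex_scale:.0f}x, pixel units grew {pixel_scale:.0f}x")
# A geometry-heavy scene (lots of vegetation) stresses the side that grew
# the least, which is the post's argument for Oblivion's slowdowns.
```

So if those counts are right, pixel throughput scaled three times faster than vertex throughput across that generation, which is the imbalance the post is pointing at.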

Don't flame me, anybody, please. I'm on break at work right now, so I don't have time to check whether my information is 100% accurate. :?

As for myself, I've put this game aside until I can get something better than my current P4 Northwood 3.0/800FSB and Radeon 9800 Pro setup. Not that it looks bad, but even at medium settings I get some skipping in the forested areas. I'll get back to it when I shell out for my new Conroe setup with either a G80 or R600 this November (most probably the G80 at that time, I guess :evil: ).

Still, your 7900GT is a good one for this game right now, if you can't afford to give an arm for an upgrade. :twisted:

My 2 cents! :wink: