Maxing Out Your Graphics Card With Tomb Raider Legend

The test was obviously done on an nVidia card?! On my X1900XT I have all the fantastic next generation water effects.

Water effects are known to be missing on nVidia cards. It's said to be a driver issue, but even the Guru3D beta drivers don't fix it.

I can even get the water effects on my old Radeon 9700 Pro with next gen enabled, using the registry hack. :wink:
 
Let me say... I'm not impressed, at all. To me, their "Next Generation Content" stuff comes across along the lines of: "HEYZ! Let's stick on simplistic specular lighting, ridiculously low-resolution shadow-mapping, normal-mapping that was done in MSPaint, and a cheap "HDR" approximation, and call it 'next-gen!' I are teh leeties!!!"

Seriously, that's how it comes across to me. The thing hardly looks impressive, and I'm serious about the low resolution: just LOOK at those shadows. It's good to see crappy stencil volumes (what "normal" mode, and most games, use) replaced by shadow-mapping (where shadows are texture-based rather than polygon-based), but by the gods, the resolution on them is worse than in UT2k4!

Then there's the normal-maps. Congratulations on using normal-maps instead of plain bump-maps. However, most games produce them by baking the differences between a low-polygon model and one with millions of polygons. Looks like they did that for Lara's body, but those floor normal-maps look like they were whipped up in a few seconds in Photoshop at best, perhaps simply by running some sort of "emboss" filter (roughly the approach sketched below).
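Just to illustrate what I mean by "emboss filter": below is a minimal numpy sketch of that cheap approach, deriving a normal map from a grayscale height map with finite differences instead of baking it from a multi-million-polygon model. The height map and strength value are invented for illustration; I'm not claiming this is literally what Crystal Dynamics did.

[code]
import numpy as np

# The "cheap" way to make a normal map: derive it from a grayscale height map
# with finite differences (essentially what an emboss-style filter does).
height = np.random.rand(256, 256).astype(np.float32)   # stand-in height map
strength = 2.0                                          # exaggerates the bumps

# Per-texel slope in x and y (differences between neighbouring texels).
dx = np.roll(height, -1, axis=1) - np.roll(height, 1, axis=1)
dy = np.roll(height, -1, axis=0) - np.roll(height, 1, axis=0)

# Build un-normalized normals (-dx, -dy, 1/strength), then normalize them.
normals = np.dstack((-dx, -dy, np.full_like(height, 1.0 / strength)))
normals /= np.linalg.norm(normals, axis=2, keepdims=True)

# Pack from [-1, 1] into the usual [0, 255] RGB encoding of a normal map.
normal_map = ((normals * 0.5 + 0.5) * 255).astype(np.uint8)
print(normal_map.shape, normal_map.dtype)   # (256, 256, 3) uint8
[/code]

Compare that to baking from real high-polygon geometry and you can see why the floors look so flat.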

Perhaps the one good thing in the "NGC" mode is the specular lighting. I can't really see that many flaws with it, aside from it perhaps being used TOO much, and not well enough, to the point where, combined with the poor normal-mapping, everything looks like plastic.

One question I have is, "what parallax effects?" I think that's a mistake on THG's part. Parallax-mapping is pretty much the only real "parallax"-related shader I know of, and I clearly don't see it at work in any of those screenshots.

Another point that draws my ire is how, in "normal" mode, they appear to be intentionally cutting out many things that are considered STANDARD, like the highest-resolution textures and, in some places, even colored lighting.

"Next-gen?" No thanks. I'll stick to Oblivion. That game looks infinitely better, and it happens to be intensely demanding for its content and features, NOT for being a PC (poorly coded) game. And perhaps best of all, quality increases in virtually every other game on the planet don't come with "counter-balance" take-aways elsewhere.
 
I'm glad there are real-time shadows at all! You may realise that if they were high resolution, the game wouldn't remain playable even on a high-end machine with an X1900XT.
 
I'm glad there are real-time shadows at all! You may realise that if they were high resolution, the game wouldn't remain playable even on a high-end machine with an X1900XT.

I believe this is the point about the bad coding... In a game like Tomb Raider, where everything else looks terrible, high-resolution shadows should still leave the game playable. But because the game is badly coded (let's face it, looking at the evidence), it looks crap and hogs more performance than the actual graphics warrant.

nottheking is right, Oblivion looks so much better. Yes, it's a killer, running at 15-30 fps on the best rigs, but I would have preferred good graphics (like Oblivion) and a low frame rate, instead of poor graphics and a low frame rate with bugs to boot.

The game looks worse than some games that came out before HL2.
 
I'm glad there are real-time shadows at all! You may realise that if they were high resolution, the game wouldn't remain playable even on a high-end machine with an X1900XT.
Not really; since the game uses shadow-maps instead of stencil-volumes, the process of making the shadows is simple, and has pretty much two steps:
- First, the game "renders" the scene from the POV of the light source. It's not done in color, but as a simple, quick Z pass, so it only records the distance to the source.
- It then compares where the previous pass hit an object, and draws shadows from there on out.
Since the result is a texture, putting it into the scene is a negligible part of performance. The main cost is calculating the shadows, but that's done using the graphics card's ROPs (raster operation pipelines), arguably the most under-utilized piece of graphics hardware.
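Since that two-pass idea is the crux of the argument, here's a toy CPU-side sketch of it in Python, for a light shining straight down. The scene data, names, and map size are all invented just to illustrate the depth comparison; this is not how Legend (or any real engine) implements it on the GPU.

[code]
import numpy as np

SHADOW_MAP_SIZE = 8            # number of shadow-map texels across the scene
SCENE_WIDTH = 8.0              # the scene spans x in [0, 8)

# Scene points as (x, height): a platform at height 3 over part of the floor.
scene_points = [
    (1.5, 3.0),   # platform, occludes the light
    (1.5, 0.0),   # floor under the platform, should be shadowed
    (5.5, 0.0),   # open floor, lit
]

def texel(x):
    """Map a world-space x position to a shadow-map texel index."""
    return min(int(x / SCENE_WIDTH * SHADOW_MAP_SIZE), SHADOW_MAP_SIZE - 1)

# Pass 1: "render" the scene from the light's point of view. No color; just
# record, per texel, the surface closest to the light (for a straight-down
# light, that's simply the highest point).
closest = np.full(SHADOW_MAP_SIZE, -np.inf)
for x, h in scene_points:
    closest[texel(x)] = max(closest[texel(x)], h)

# Pass 2: while shading, compare each point's distance to the light against the
# stored value; if something nearer to the light was recorded, it's in shadow.
BIAS = 1e-3   # small offset so a surface doesn't shadow itself ("shadow acne")
for x, h in scene_points:
    in_shadow = h < closest[texel(x)] - BIAS
    print(f"point at x={x}, height={h}: {'shadowed' if in_shadow else 'lit'}")
[/code]

The expensive part is pass 1 and the per-pixel compare in pass 2; pasting the resulting shadow texture into the frame costs next to nothing, which is the point above.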

Some other games manage to use far higher-resolution shadows; Oblivion, the game I mentioned before, is one example: character shadows are done at a 1024x1024 resolution, to the point where individual texels are indistinguishable even at very high screen resolutions. (The game's questionable method of "softening" them is another matter, though.) Likewise, other great-looking games, such as Splinter Cell III, do shadows at high resolutions as well.
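To put rough numbers on the resolution point (my own back-of-the-envelope figures, not measurements from either game):

[code]
# How much world space one shadow-map texel covers, in centimetres.
def texel_size_cm(covered_meters, map_resolution):
    return covered_meters / map_resolution * 100

# A 1024x1024 map fitted tightly around a ~2 m character:
print(f"{texel_size_cm(2.0, 1024):.2f} cm per texel")   # ~0.20 cm, invisible in practice
# A low-resolution map stretched over a ~20 m room:
print(f"{texel_size_cm(20.0, 256):.1f} cm per texel")    # ~7.8 cm, visibly blocky edges
[/code]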
 
Okay. Could it be, then, that CD made them low resolution to keep the game playable in next gen on lower-range cards with fewer ROPs? Shadows are one of the options to enable/disable in the settings.
On my X1900XT (not short on ROPs, as you know) I run the game with all settings maxed out at 1600x1200. Average fps using Fraps is about 32. Although that doesn't seem fast, the gameplay looks, runs, and feels smooth as butter.
So you're saying higher-resolution shadows would not make a difference to the fps?
 
Holy crap!.. this game BLOWS!... Why the f#$% would anyone want to play this when they have to switch from next-gen to normal mode in every friggin' level or room they walk into, just to experience the game to its fullest potential??? That's what I call CRAP coding... This is far from next-gen when compared to f%^&ing awesome games such as F.E.A.R... Don't buy this crap, people... save your hard-earned cash and maybe games like this will be a thing of the past someday.
 
It's always funny to see people bashing a game without noticing the system requirements. Read them here: http://www.tombraiderforums.com/showthread.php?t=60118.
To run next gen content at playable speeds, you need a 512 MB graphics card. I guess yours is a 256 MB GTX?

Also, you should install the latest nVidia drivers from http://www.nzone.com/object/nzone_downloads_winxp_2k_32bit_91.28.html or Guru3D. :wink:

Dude, I didn't buy the game and don't have any intention of buying this crap... My point was that in the article they said that to really get the full graphical experience you should switch between next-gen and normal mode all the time, because some areas of the game look better in normal mode than they do in next-gen... I think that when someone buys a game they shouldn't have to go through all this just to play it... I have F.E.A.R., Doom 3, Half-Life 2 and Far Cry and I don't have to keep switching my graphics settings on and off just to play them... Anyway, if switching modes on and off in EVERY room or area you get to, just to see which one looks better, doesn't bother you... then go ahead, knock yourself out.
 
But you haven't played it, you only read the article. Yet you call this game crap...
my point was that in the article they said that to really get the full graphical experience you should switch between next-gen and normal mode all the time, because some areas of the game look better in normal mode than they do in next-gen...
Yes, they said that in the article, and that is the big mistake here: it is the nVidia drivers!!!! Legend did not run fluidly and without missing shader effects (the water! and the Depth of Field artifacts!) on ANY nVidia card until very recently: the latest beta drivers seem to have finally fixed it. The review should have been done on an ATI X1900XT: no problem whatsoever, all shader effects present.
 
You guys are harsh. Most of you bash it without even playing it. I have the game and actually enjoy it on a 6600GT. I've got a 7900GT coming on Friday :) and am going to install the beta driver, which people have been claiming makes a world of difference.

The Tomb Raider games are the only adventure games I really like. I'm mostly a first-person shooter fan. Sometimes the puzzles get a bit too much, but that's okay. It's a pretty decent game compared to the last few in the series. The control of Lara is very smooth.
 
Yes! It's a very enjoyable and beautiful game. Lara's controls are very smooth indeed. I hope your 7900GT is a 512 MB one, because you really need it in next gen mode.
 
No, it's not 512. I really can't afford that. But if it's playable for me now, it should be better with the 7900GT. Maybe not with all the bells and whistles. I've got a 24" LCD, so I'm sure it will not be maxed out.

But I plan on doing the 7900GT volt mod. I've read everything I can on the subject and it shouldn't be too hard. The performance increase is amazing with the mod.
 
A volt mod? 😀 Please provide a link! I know a few nVidia users who would be interested.

But back on topic: Legend's next gen mode needs 512 MB of video RAM in order to keep the complete set of textures in memory at once; otherwise the card will start swapping them constantly, causing stutters.
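A quick back-of-the-envelope sketch of why a texture set can overflow a 256 MB card; all the counts and sizes below are made-up examples, not measurements from Legend:

[code]
def texture_mb(width, height, bytes_per_pixel=1.0, mip_overhead=1.33):
    # ~1 byte per pixel for a DXT-compressed texture; mipmaps add roughly a third.
    return width * height * bytes_per_pixel * mip_overhead / (1024 ** 2)

num_textures = 250    # hypothetical count of 1024x1024 maps loaded for a level
textures_mb = num_textures * texture_mb(1024, 1024)
other_mb = 50         # rough allowance for render targets, geometry, and so on

print(f"Textures alone: ~{textures_mb:.0f} MB")                  # roughly 330 MB
print(f"With other buffers: ~{textures_mb + other_mb:.0f} MB")   # roughly 380 MB
[/code]

That total fits comfortably in 512 MB but not in 256 MB, and whatever doesn't fit has to be streamed across the bus mid-game, which is exactly the stutter described above.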
 
Hey, is there a specific requirement to activate the next generation content in Tomb Raider Legend? I heard from my cousin that I need an nVidia 6-series or 7-series card to activate it because they have Shader Model 3. Does it really matter which shader model you have? I use an ATI Radeon 256 MB X800XL PCI Express with Shader Model 2.
 
Yes, you can enable next generation content on ANY card with Shader Model 2.0 or 2.1 support; however, on the nVidia FX 5xxx series I've heard of no good results. I had very good results on my old Radeon 9700 Pro. Your X800XL should give better results; read the details in this thread on tombraiderforums.com.
 
New patch has been released.

Fix list:

- Fix for ‘High Priority’ task manager (lock up), in addition to fixing the occasional audio stuttering issue.
- Overall improved performance.
- Fixes and improvements made to Lighting when running the NextGen version.
- Implemented a separate ‘walk’ button for Lara.
- Fixed the issues regarding the breakables when running NextGen.
- Fixes made when running the game using a secondary monitor.
- Refresh Rate option now fully operational.
- ATI CrossFire Movie problem has been fixed.

Legend patch 1.2

Patch 1.2 incorporates patch 1.1.
This patch is for all disc-based versions of Tomb Raider: Legend, Europe and US.
 
First off, it seems this article was designed to dryly ridicule the game's poor programming, as opposed to seriously suggesting that you switch graphics modes constantly mid-game.

I'm running the 91.28 drivers, 1205 BIOS, 6.85 ForceWare, the May '06 AMD patch, at 1920x1600.

Max fps constantly (60 with vsync, 100-200 without) with next gen off.

Next gen on = 640x480 stuttering.
Tried almost every graphics option, including all of the SLI profiles...

The actual game is pretty good; half of the game is getting it to run well.

Oh, and there are no wet/dirty effects with next gen on; also, there is a rainbow above the waterfall that only appears with next gen off...

An interesting thread...

http://forums.eidosgames.com/showthread.php?t=59511



Dell 2405FPW, Viewsonic VP912b
AMD A64 4400+ X2 w/ ZALMAN CNPS9500
2x EVGA 7900GTX Superclocked
Asus A8N32-SLI Deluxe
Corsair TWINX2048-3500LLPRO 2GB Kit DDR433 XMS3500
Raptor 74gig x2 raid 0 / xp pro
SB X-Fi Elite Pro,Logitech® Z-5500 Digital THX
Antec p180 / Antec TRUEPOWERII TPII-550
 
Eidos developers should be shot.

You get the same frame rate at 1280 as you do at 640.

I bet if you ran this game at 320x240 or even 160x120, you'd get the same frame rate with next gen turned on.
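That "same frame rate at every resolution" symptom is the tell-tale sign of a bottleneck somewhere other than pixel fill rate. Here's a crude frame-time model of it; every number is invented, purely to show the shape of the problem:

[code]
# Crude model: a frame takes as long as its slowest stage.
def fps(resolution_pixels, cpu_ms_per_frame, gpu_megapixels_per_ms):
    gpu_ms = resolution_pixels / 1e6 / gpu_megapixels_per_ms
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms)

cpu_ms = 40.0   # hypothetical 40 ms of CPU/engine work per frame
fill = 2.0      # hypothetical 2 Mpixel/ms of shading throughput

for w, h in [(640, 480), (1280, 1024), (1920, 1200)]:
    print(f"{w}x{h}: {fps(w * h, cpu_ms, fill):.0f} fps")
# Every resolution prints ~25 fps, because the 40 ms engine cost dominates.
[/code]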

two words:
CRAPPILY CODED

Just like FEAR was crappily coded. There is no reason why a game can't look good and play well with 1 GB of system RAM and a 256 MB graphics card.

The Doom and Far Cry engines can look great even on midrange systems.


Hitman: Blood Money is also a crappily coded game;
all resource-hogging games that look like crap are.

Oh yeah, and a 6 GB install size for this game? That's massive bloatware!!!!!!!
 
Yeah, I'm playing HL2:E1 at 1920x1200, 2xAA, 4xAF, full HDR, reflections all, vsync on... everything else maxed... and it's like butter...

Looks worlds better too... and when I lead Alyx to a colored light source, it lights her up properly, with a shadow on her back...

The HDR looks just as good as Oblivion's. GG Valve, for getting HDR + AA working for us nVidia dweebs.