Aliens Vs. Predator: DirectX 11 Game Performance Analyzed

Status
Not open for further replies.
[citation][nom]NegativeX[/nom]Why isn't the new GeForce 400 series included in this at all? Holy fail, Batman.[/citation]
The article was written before the GTX 4XX cards launched (even though it was published after). It takes time to write an article and get it out. I'm sure there will be a GTX 4XX article soon.
 

cleeve

Illustrious
[citation][nom]nukemaster[/nom]The article was written before the GTX 4XX cards launched (even though it was published after). It takes time to write an article and get it out. I'm sure there will be a GTX 4XX article soon.[/citation]

Quite right, nuke. I should have mentioned that the bulk of the testing was done quite a few weeks ago, before the new GeForces were even released.

Although, I admit I'm still waiting for a test sample...

 

n00bmaster

Distinguished
Apr 22, 2010
3
0
18,510
Hahahahaha... DX11 is marketing hype. Glad I never bought DX11 hardware. Nice to know I saved my money with a cheap $250 8800 GT. No one is going to use DX11 features. Why? Because of consoles.
 
[citation][nom]n00bmaster[/nom]Hahahahaha... DX11 is marketing hype. Glad I never bought DX11 hardware. Nice to know I saved my money with a cheap $250 8800 GT. No one is going to use DX11 features. Why? Because of consoles.[/citation]

LOL, well at least you picked the right name for your account! :p
 

n00bmaster

Distinguished
Apr 22, 2010
3
0
18,510
[citation][nom]sincreator[/nom]LOL, well at least you picked the right name for your account![/citation]
Oh really? Come back and show me a game that shows a HUGE difference in image quality.
 
[citation][nom]n00bmaster[/nom]Oh really? Come back and show me a game that shows a HUGE difference in image quality.[/citation]

More are coming, and IMO Dirt 2 is already showing it: http://www.tomshardware.com/reviews/dirt-2-performance-benchmark,2508-3.html. I like my cars with curves. I also like sharper textures. :p I'm sure tessellation, OpenCL, and DirectCompute are not just "trends". This is the way technology advances. The next-gen consoles will adopt what the then-current PCs have, like they always do.
 

JonnyDough

Distinguished
Feb 24, 2007
2,235
3
19,865
From the sound of it, the new game is more true to the Aliens, Predator, and AvP movie series. The older title was probably developed simply to be a great game. That "not quite as fun as the original" feeling you're experiencing is the same feeling most people get from playing video games based on movies, which is why original video games are virtually always better. Note to developers: if you want a hit title, stop trying to milk franchises. Come up with something new.
 

liquidsnake718

Distinguished
Jul 8, 2009
1,379
0
19,310
It's still good to see a game stress the GPU on the highest settings, like Crysis. I might just pick this game up since I can comfortably play it on its second-highest setting.
 

neiroatopelcc

Distinguished
Oct 3, 2006
3,078
0
20,810
[citation][nom]liquidsnake718[/nom]It's still good to see a game stress the GPU on the highest settings, like Crysis. I might just pick this game up since I can comfortably play it on its second-highest setting.[/citation]
Get Metro 2033 if you want to stress your GPU!
I'm guessing the drivers aren't optimized for it, but on my rig with 3x 4870 GPUs I'm scoring between 8 and 22 FPS @ 1680 with everything on max (including AA).
 

cleeve

Illustrious
[citation][nom]sincreator[/nom]More are coming, and IMO Dirt 2 is allready showing it. : http://www.tomshardware.com/review [...] 08-3.html. I like my cars with curves. I also like sharper textures. [/citation]

I wrote the Dirt 2 review. I don't think Dx11 gives the cars sharper curves or provides sharper textures. The cars are not tessellated, and the textures are the same in Dx9 and Dx11 modes.

Frankly, it's almost impossible to tell the difference between Dx11 and Dx9 when you're actually playing the game:

http://www.youtube.com/watch?v=W3zfb8lLDH0


 
[citation][nom]Cleeve[/nom]I wrote the Dirt 2 review. I don't think Dx11 gives the cars sharper curves or provides sharper textures. The cars are not tessellated, and the textures are the same in Dx9 and Dx11 modes. Frankly, it's almost impossible to tell the difference between Dx11 and Dx9 when you're actually playing the game: http://www.youtube.com/watch?v=W3zfb8lLDH0[/citation]

I got my game through Steam, and Steam updates automatically. The release notes for patch 1.1 mention having to sharpen textures in DX9 mode, plus a fix for custom settings that were not being applied while running the benchmark mode. It also improved the shadows. Is it possible that this is why I see a difference? Could some of those fixes cause your initial benches/screens/videos to not have the correct settings applied, I wonder?

http://www.bluesnews.com/s/108246/colin-mcrae-dirt-2-patches I would have just linked the Codemasters site, but I couldn't link directly to the patch download page. This patch only came out last month, and your article was done long before that. So could something have changed since then?

I checked out your video on YouTube, but it's really hard to see any difference in a 720p video. I'm playing at 1920x1080 with 4xAA on a 32" HDTV, so maybe that's why I see the differences more clearly as well.
 

cleeve

Illustrious
[citation][nom]sincreator[/nom]The release notes for patch 1.1 mention having to sharpen textures in DX9 mode, plus a fix for custom settings that were not being applied while running the benchmark mode. It also improved the shadows. Is it possible that this is why I see a difference?[/citation]

Probably not. One of the few reasons you could see a very slight difference between the Dx9 and Dx11 versions of Dirt 2 was that the DirectX 9 version had a very slight blur to it - I mention this in the review. It's something I brought up with AMD/Codemasters, and perhaps that's what the patch addresses. But if so, it will only make the Dx9 output even closer to Dx11.

Custom settings wouldn't matter because I used the recommended settings when benchmarking.

As far as shadows go, there were no notable shadow problems in either mode, so a shadow 'fix' likely has more to do with performance than quality.

I think you might be falling prey to the expectation that Dx11 must look better. Like I said in the review, the mind plays tricks, so I encourage you to take screenshots of both modes to see if you're really seeing what you think you're seeing. I'll certainly check the patches out when I have time.

[citation][nom]sincreator[/nom]I'm playing at 1920x1080 4xAA on a 32" HDTV, so maybe that's why I see differences clearer as well.[/citation]

I don't think that's to blame; I never noticed any notable differences at 2560x1600 on my 32" monitor, even when zooming in on screenshots. :)
 
G

Guest

Guest
For me, the sub par voice acting was disappointing in the new AvP as well and really detracted from the feel.
 

neiroatopelcc

Distinguished
Oct 3, 2006
3,078
0
20,810
[citation][nom]bkFusion[/nom]For me, the sub par voice acting was disappointing in the new AvP as well and really detracted from the feel.[/citation]
Want good voice acting? Play Dragon Age. Not exactly an FPS, but very good voice acting.
 
G

Guest

Guest
Very nice article, thanks Don and Tom's Hardware.

But there is some incorrect information in the article.

The first Alien vs. Predator game was released for the Atari Jaguar in 1995, not 1991.

Check here: http://www.rebellion.co.uk/#/about/

BTW, I'm waiting for an update with the GTX 4XX graphics cards. :)
 

n3ard3ath

Distinguished
Dec 11, 2008
270
0
18,780
[citation][nom]matt87_50[/nom]Good points, all true. The biggest thing with DX10 was that it was Vista-only, no XP. As people flock to Win7 with DX11 built in, that won't be such a big problem. Also, DX11 doesn't really add much; it just improves on DX10 and adds a couple of really useful things that should actually make life easier for everyone (kinda like Win7 compared to Vista, I suppose). It adds proper multithreading in the drivers and allows rendering engines to be multithreaded now. This is just a software thing too, so it's not really hardware dependent (you don't need DX11 hardware to benefit). The other really useful feature added is tessellation, which is something devs already do manually, and painfully, in DX9 games. If anything it's probably better for developers' production times than it is for the end user! No complex art pipelines or engines, and automatic performance scaling (the card knows how fast it is and can dynamically allocate the right proportion of triangles to every object in order to reach an exact total frame poly budget). It also adds the compute shader. All of these things don't really add anything new - we had tessellation and GPGPU before, but it was all third-party and more convoluted - so it's more about ease of development than new stuff. Don't take that the wrong way, though. Ease of development should lead to much bigger leaps and bounds in graphics than new features that everyone was too scared to use anyway.[/citation]

I have a question for you too. If those new APIs really make things easier for developers and make proper use of the multithreading capabilities of new hardware, then why do the actual products run so much like crap for the most part (especially console ports)? It is, again, probably because those titles aren't pure DX10/11 titles like the guy above is saying, but I'm still skeptical. The fact that those titles are ports probably doesn't help either, but that's now approximately 80% of the games we're getting on PC - damn ports. So in the end, why are we paying for DX10/11-compatible hardware year after year if it's only to run future games that will be pure DX10/11, when we all know this same hardware won't be able to keep up when that actually happens? We religiously follow the theories of technological possibility that these hardware producers put out to sell more of their stuff, but we get almost no return on investment, besides maybe higher resolutions than consoles, on what we are buying with our hard-earned money. Personally, I'm really annoyed about all this. Back around 1998-2004, you actually had a superior experience with a high-end PC compared to a console or a low-end PC. But right now? Not sure. For example, I've got an ATI 4890 - is it really worth buying an ATI 5970 just to get AA filtering in AvP (which sucks anyway)? Also not forgetting that this looks like a locked-feature gimmick more than anything... Just my 2 cents.
 

bildo123

Distinguished
Feb 6, 2007
1,599
0
19,810


Tell me about it; that's X3:TC. I was a moderate player of EVE Online, but after the smoke and mirrors cleared and I discovered the dull cycles you grind through in that game, I went looking elsewhere. I was browsing around GameStop and found X3:TC for $10. It was literally like new (I found out why later) and included the big book, which did intrigue me - and heck, it was only $10. The controls are pretty terrible; I had to keep the book in front of me to do just about everything for the first several hours. At least X3 has that SETA drive or something. Regardless, it became even more mundane than EVE, and after about 24+ hours of play time I shelved it.
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
It would be nice if this site GOT HONEST about 8xAA, 16xAA, and 16xAF, not to mention the 32xMSAA now available for Nvidia.
What we have seen for SOME TIME now here is the amazing 0xAA and 4xAA ONLY ONLY ONLY on all the reviews, since ATI loses framerates at an extraordinary rate when you crank the setting up to 8xAA and 16xAA in the vast majority of reviewed titles.
Here's what Tom's found out, and why their POLICY is now 0xAA and 4xAA only!
" All of our cards serve up what I’d consider playable performance at 1680x1050. Most interesting, perhaps, is that the GeForce GTX 480 and 470 sacrifice less of their performance in switching from 1x to 8x anti-aliasing, allowing the GeForce GTX 470 to jump in front of the Radeon HD 5870 with 8xAA enabled, even though the 5870 is faster without AA. "
THERE YOU HAVE IT STRAIGHT FROM ONE OF THE REVIEWS -
http://www.tomshardware.co.uk/gefo [...] 46-13.html
---
Let's not forget things like GOD RAYS that make NVIDIA the winner, PERIOD, as well.
http://www.bjorn3d.com/read.php?cID=1831&pageID=8802
--
What we have now, and it's not just here but all over the net at the vast majority of review sites, is 0xAA and 4xAA ONLY reviews.
The reason, of course, is that ATI LOSES MISERABLY at higher settings.
HardOCP uses higher settings, but then has its own twisted analysis that anyone in the know knows is ATI-favored, since he and Anand were denied the GTS 250 for biased reviewing.
ATI fanboys love HardOCP. Similar at AnandTech.
But the TRUTH is out there.
How about we get it HERE?
I doubt that will happen.
We got it ONCE, with 8xAA in the review I linked above from this site (UK version), but then the STANDARD was set - leave higher AA OUT, since Fermi does so much better with it.
I'm certain the reviews MUST do it this way, or their free card supply from ATI will dry up, just like ATI's frame rates when 8xAA and above are used.
So we have DISHONEST reviews all over the web now, including HERE.
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
Corrected link:
http://www.tomshardware.com/reviews/geforce-gtx-480,2585-13.html
--
Most interesting, perhaps, is that the GeForce GTX 480 and 470 sacrifice less of their performance in switching from 1x to 8x anti-aliasing, allowing the GeForce GTX 470 to jump in front of the Radeon HD 5870 with 8xAA enabled, even though the 5870 is faster without AA.
--
How much longer do we all have to pretend that only 0xAA and 4xAA exist? As long as NVIDIA has an absolute and massive advantage there?
It appears so.
(The reviewer will whine that there's no time to test anything else - so then DROP 0xAA and use 8xAA or 16xAA INSTEAD! DUH!)
--
No, we're supposed to PRETEND we spend $300, $400, $500, or $700 on one card, or double that, and then use 0xAA! ROFLMAO!!
WHHOOO! Or just 4xAA!! Hahaha!
The BLATANT BIAS is INCREDIBLE!
 

cleeve

Illustrious
[citation][nom]silicondoc[/nom]How much longer do we all have to pretend that only 0xAA and 4xAA exist? As long as NVIDIA has an absolute and massive advantage there? It appears so. (The reviewer will whine that there's no time to test anything else - so then DROP 0xAA and use 8xAA or 16xAA INSTEAD! DUH!)[/citation]

Wow. First off, take a pill, fella. Relax. You're not saving the world, and I'm not the antichrist.

Most major websites test with no AA and 4xAA. Why? Because 0xAA is relevant as a baseline, and 4xAA is ample. 2xAA is a bit of a waste.

Amping it up to 16xAA provides only one tangible benefit: you can brag to people that you run 16xAA, if you consider that a benefit. The massive hit in performance is not commensurate with the minor increase in visual fidelity.

[citation][nom]silicondoc[/nom]0xAA ! ROFLMAO !!
WHHOOO ! Or just 4xAA !! hahaha !
The BLATANT BIAS is INCREDIBLE![/citation]

What is incredible is that you actually seem to expect people to take your opinion seriously after communicating in this fashion.

If you want a serious discussion, try registering your concerns in an appropriate manner. You can leave the sensationalism at the playground where it belongs.
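[Editor's note: a rough back-of-the-envelope sketch of the diminishing-returns argument above, not from the original thread. With N-sample MSAA, an edge pixel can only resolve to about N+1 distinct blend levels, while multisample buffer storage grows roughly linearly with N; the function names are illustrative, not from any graphics API.]

```python
# Back-of-the-envelope: MSAA quality vs. cost.
# With N coverage samples, an edge pixel resolves to one of N+1 distinct
# blend levels, while color/depth buffer storage grows roughly linearly in N.

def edge_blend_levels(samples: int) -> int:
    """Distinct coverage fractions an edge pixel can take (0/N .. N/N)."""
    return samples + 1 if samples > 0 else 1

def relative_buffer_cost(samples: int) -> int:
    """Rough multisample color/depth storage relative to no AA."""
    return max(samples, 1)

for n in (0, 2, 4, 8, 16):
    print(f"{n:>2}xAA: {edge_blend_levels(n):>2} edge levels, "
          f"~{relative_buffer_cost(n)}x buffer cost")
```

Going from 4xAA to 8xAA roughly doubles the buffer cost but only adds four more blend steps per edge pixel, which is one reason many reviewers treat 4xAA as the practical sweet spot.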


 