Deus Ex: Human Revolution Performance Analysis

Page 4
Status
Not open for further replies.

we_san

Distinguished
[citation][nom]renz496[/nom]lol. if you put it like that both site are reporting the truth isn't it?[/citation]
I mean Scrum is right .. this is just marketing strategy (cherry-picking, whatever..) and which side is sponsoring which site. But then sometimes (if not often) it is too much and becomes too obvious .. and somebody gets annoyed..hehehe...
 

amk-aka-Phantom

Distinguished


Many people like to overclock for the hell of it. My i7 isn't unlocked (not into OC, anyway) and it's a rocket, and so is a 2500K on stock. It's just fun to OC.
 
[citation][nom]we_san[/nom]I mean Scrum is right .. this is just marketing strategy (cherry-picking, whatever..) and which side is sponsoring which site. But then sometimes (if not often) it is too much and becomes too obvious .. and somebody gets annoyed..hehehe...[/citation]

Honestly, the results that Tom's presents are surprising given that this game was under the 'AMD Evolved' banner. Do you know any other sites that also do performance reviews using a few cards like Tom's does?
 

verbalizer

Distinguished

we already know he is an ass..
 

verbalizer

Distinguished

:lol:


afraid to respond he is, tail between his legs in embarrassment.
 

verbalizer

Distinguished

^+1
 

dickcheney

Distinguished
[citation][nom]werner123[/nom]For the price i got it for no, and i play all my games smooth as silk at the highest details, and that's only with a hd6850. Oh and at stock 3.2ghz nonetheless. Tell me again why it sucks?If sandy brigde is such a gaming power house of a cpu why is everybody overclocking it, not good enough?[/citation]

I never thought there was such a thing as too much speed for the same price...

A 955BE at 3.2GHz is like a 2500K at 2.4GHz. And you and I both know that running a 2500K under 4.5GHz is not acceptable. AMD chips were great for the price because for the past three years all games were GPU-centric. This one is not!

Hey malmental, y so mad?
 

verbalizer

Distinguished
because I TOTALLY DISAGREE with your statement and your entire philosophy on the matter as well.
yes, we all know that an i5-2500K is better and will kick the ass of anything AMD has out.
but that's not what it's all about.
on top of that, the performance and overclocking ability of the 955/965BEs does compensate slightly, but still we know Intel's offering is better.
but to say the 955/965BE is where you say it is, is totally ignorant..

so I'm asking: what hardware do you run? Don't lie and/or exaggerate; you will be called on it and asked to show proof!
have you ever had a 955/965BE maxed out with a higher-performance GPU, and if so, with what?
just because you misinterpret some articles you misread online, or hear something from some Intel fanboi in your neighborhood, doesn't mean that's how it really is.

as for me, I can tell you and show you.
odds are I have thrown away or given away better hardware than you have now.
and to show you how confident I am about older hardware, I sold an i5-2500K build and kept my i5-760 and my 955BE builds.
955BE / GTX 460 SOC / Veloc 10K - is getting an AM3+ 990X SLI upgrade (I hope you understand the numbers)
i5-760 @ 3.4GHz / SLI GTX 460 SSC+ / 90GB SSD - runs with any Intel SB out there in gaming and whatever else.
this build can hold me over until Ivy arrives and I see what it's about.
the performance drop is of minimal effect to me (i5-760 @ 3.4GHz versus the in-progress i5-2500K build), or I can sell it for the extra cash and come back even stronger next time..
remember that BD and Ivy are coming..
what would you do if you only knew the truth?

so with that being said, prove your point...!
 

verbalizer

Distinguished

nothing is certain yet from what I understand; I might be wrong.
have you seen the system requirements page?
http://www.game-debate.com/games/index.php?g_id=1164&game=Battlefield%203
http://bf3blog.com/battlefield-3-system-requirements/

analyzing what I can see from this (and from what I already know), our 955BE set-ups are similar.
the HD 6850 is the competitor of the GTX 460.
but @ 1920x1080 like my AMD unit, we might lose a frame or two, you think?
I get the feeling that with an overclock on the GPU it can go all out,
max settings (not sure about ultra settings though), and let it fly..
the 955BE can run it at stock I bet as well, but of course put a nice clock on it (unlocked multi..) and you're set.

also depends on desired FPS..
 

dickcheney

Distinguished
[citation][nom]werner123[/nom]Off topic, can you please tell me at what details I'll be able to run BF3 with my p2 x4 955 (stock), 6850 @1080p[/citation]

I don't think your 955 will be an issue in BF3; your bottleneck will be the GPU at 1920x1080.

It'll depend on what YOU consider acceptable FPS. High with some medium settings (like AA and HDR) could probably get you acceptable frame rates (45-50 FPS being the absolute minimum).
 

verbalizer

Distinguished

if I were playing it constantly then I'd hate any lag, but of course it can't always be avoided.
but if I'm just playing a minute here and there, it's all good..

back to the thread: Deus Ex looks good in the TV commercials... :D
from the article, however, not so much.
 

cleeve

Illustrious
[citation][nom]scrumworks[/nom]At least I trust somebody else than you. You have a long track record of dissing ATI/AMD GPUs or just making nvidia cards look better regardless of situation. [/citation]

The majority of cards I recommend in the monthly best cards for the money article are Radeons right now.

The evidence suggests that your conspiracy theory is full of crap. :D
 

cleeve

Illustrious
[citation][nom]we_san[/nom]toms didn't include 2560 because it will put the red at the top.[/citation]

No. I made the call. You want to know why I didn't include 2560x1600?

I didn't include it because I was incredibly pressed for time to get this article out the door.

If I have to cut a resolution, it'll be 2560x1600. Why? Because there are a trillion monitors out there at 1080p and below, and I don't know a single person who isn't a hardware reviewer that actually owns a 2560x1600 monitor. It isn't even popular enough to register on the Steam hardware survey results. I didn't even have time to test it to find out whether or not it made any difference.

Like most things, the truth is simple and reasonable. If that doesn't make sense to you, you can convince yourself of any conspiracy theory you can come up with. Not to mention, you guys are splitting hairs. Performance is so close across the board there's no clear winner no matter whose results you're looking at.
 

dickcheney

Distinguished
[citation][nom]Cleeve[/nom]No. I made the call. You want to know why I didn't include 2560x1600? I didn't include it because I was incredibly pressed for time to get this article out the door.If I have to cut a resolution, it'll be 2560x1600. Why? Because there are a trillion monitors out there 1080p and below, and I don't know a single person who isn't a hardware reviewer that actually owns a 2560x1600 monitor. It isn't even popular enough to register on the Steam hardware survey results. I didn't even test it to find out whether or not it made any difference.Like most thinks, the truth is simple and reasonable. If that doesn't make sense to you, you can convince yourself of any conspiracy theory you can come up with. Not to mention, you guys are splitting hairs. Performance is so close across the board there's no clear winner no matter whose results you're looking at.[/citation]

I have yet to see a 2560x1600 screen too... You rarely (if ever) see anything bigger than 1920x1200. I am of the opinion that in such a competitive market you pretty much get what you pay for.

As for the conspiracy part, you are on the internet, dude. Don't expect people to use common sense when it comes to things like this.
 

drakefyre

Distinguished


The reason they drop the resolution so much is to minimize the chance of the GPU having an effect on the results. By having such a low resolution they can be assured that the cpu is the bottleneck, so they get much more accurate results.
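A rough way to picture it (a minimal sketch in Python with made-up per-frame costs, not anything measured): the frame rate is capped by whichever of the CPU or GPU takes longer per frame, and dropping the resolution shrinks only the GPU's share, so what's left on the chart is the CPU.

[code]
# Purely illustrative sketch of why low-resolution runs isolate the CPU.
# The per-frame costs below are made-up numbers, not measurements.

def estimated_fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frame rate is limited by the slower (larger) per-frame cost."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

cpu_ms = 8.0  # CPU cost per frame stays roughly constant across resolutions
for resolution, gpu_ms in [("2560x1600", 18.0), ("1920x1080", 9.0), ("1280x720", 4.0)]:
    # At the lowest resolution the GPU cost falls below the CPU cost,
    # so the measured FPS reflects the CPU rather than the graphics card.
    print(resolution, round(estimated_fps(cpu_ms, gpu_ms), 1), "fps")
[/code]

Once the GPU time drops below the fixed CPU time, further resolution cuts stop changing the result, which is exactly the regime the CPU charts are after.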
 
Guest

Guest
"The reason they drop the resolution so much is to minimize the chance of the GPU having an effect on the results. By having such a low resolution they can be assured that the cpu is the bottleneck, so they get much more accurate results."

LOL! Are you making any sense? The GPU is being tested, not the CPU; if you use such a low resolution there wouldn't be enough to stress the GPU. Higher resolution is what separates out the pure power of the GPU.
 

cleeve

Illustrious
[citation][nom]conyo985[/nom]LOL! Are you making any sense? The GPU is being tested not the CPU if you have such low resolution there wouldn't be enough to stress out the GPU. Higher resolution is what separates the pure power of the GPU.[/citation]

We're talking about the CPU benchmarks on the second-last page, Conyo985, not the GPU benchmarks.
 
[citation][nom]Gman450[/nom]This game isn't really that demanding is it ? The graphics looks a bit outdated as well.[/citation]

I believe some people agree with that as well, from discussions I've read on other forums. To me the game's graphics look a bit dated in the trailers (the ones showing gameplay), but I convinced myself it could be better in the final product; plus, you can't make an accurate judgment from trailers alone. You need to see the game on your own screen to know how it really looks :p. Graphics aside, it was the gameplay that attracted me to this game :)
 

Overclocked Toaster

Distinguished
To be honest, the graphics are really disappointing. The game looks like it is from 2005. The DirectX 11 features make a laughably trivial difference in actual image quality; they were probably tacked on for marketing purposes and to appease PC gamers, but I'm not fooled. In terms of graphics, this is just another console port. I hope the gameplay is great; I loved the original.
 

pantsu

Distinguished
The Crystal Tools engine dates back to 2006. It's the engine Square Enix uses for its current console titles; no wonder it's a bit scruffy-looking.
 