i7 920 vs Phenom II 965 with an ATI 5870 (Finally!)



I apologise.. though let's be honest, you've called me a lot worse, elmo.

Edit - where did I say I didn't like men lol? Pretty sure I *never* said that at any time. 😀
 


... Crysis was only groundbreaking in the way that it ran so poorly.

There is nothing special about Crysis. It looks as good as CoD MW/2, if not worse. It has worse gameplay. The story is worse. In no way is it groundbreaking.
 

When someone made a comment about you and me just "doing it".. you came back stating that you weren't into men. I don't recall your exact words.
 


No, I said I wasn't into *boys*.

That's a pretty big difference, elmo. :kaola:
 
The CoD series isn't innovative at all and has made tonnes more money than Crysis. To make a game like MW2 you do not need to be at the bleeding edge of technology, especially if you are going multiplatform. You can just use the same crap DX9 engine you have been using for the last few years.
 
Ok really now.

Why can't you just admit that my 'theory' has some merit? You have to admit that logically it is pretty sound, no?

The benches add up too. Can you find any grand flaw in my theory? I've basically tried to tie it all together, and this is the best I can come up with.
 


No. It's a horrible piece of garbage.



...

If that isn't flame bait I don't know what is.



Take another bong? Next time get the insult right.
 


For one, you claim that a specific CPU architecture can be specifically coded for under the same instruction set.
 


Merit? ...

Don't make me laugh... I'm taking a drink.
 

Well I suppose this is where I come in.

Crysis is based on older, more primitive approaches to polygon-based 3D rendering. The engine itself is not geared towards complex compute effects (shaders); it relies on simple shaders and enormous amounts of high-resolution textures and alpha textures.

In other words, the game is TMU-bound. We see the results of this when SLI and CrossFireX are employed with Crysis-engine-based games: there is very little scaling. This is because TMUs (Texture Mapping Units) do not scale with SLI or CFX (each GPU has its own memory pool, so each GPU's Texture Mapping Units process the textures for the entire scene and not just the textures relative to the load assigned to them).

This is also why a Radeon HD 4870X2 2GB is really only a 1GB card or why a GTX 295 1792MB is really only an 896MB card.
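To put that in plain numbers, here's a rough sketch of my own (assuming the usual AFR setup where each GPU mirrors the full set of textures in its own pool, so the usable pool is the total divided by the GPU count):

```python
# Rough sketch: effective VRAM of an AFR multi-GPU card.
# Assumption: each GPU keeps a full copy of every texture in its own
# memory pool, so the usable pool is the total divided by the GPU count.

def effective_vram(total_mb: int, gpu_count: int) -> int:
    """Usable texture memory when every GPU mirrors the same data."""
    return total_mb // gpu_count

print(effective_vram(2048, 2))   # Radeon HD 4870X2 "2GB"   -> 1024 MB usable
print(effective_vram(1792, 2))   # GeForce GTX 295 "1792MB" ->  896 MB usable
```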

Traditionally (in the DX10 era), nVIDIA cards have had far more TMUs than comparable ATi cards, so nVIDIA cards perform better under Crysis than ATi cards.

This changed recently with the release of the 5870, as it contains.. you guessed it... 80 TMUs (up from 40 on the 4870/4890), while nVIDIA's GT200/b-based derivatives (top models) also have 80 TMUs. They share the same bottleneck on that front, so shader ops and triangle throughput become the main performance differentiators.
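A back-of-the-envelope sketch of my own (clock figures from memory, treat them as approximate) of why the 5870 finally closes that texel-rate gap:

```python
# Back-of-the-envelope texel fill rate: TMUs * core clock.
# Clock figures are from memory and approximate.

def gtexels_per_sec(tmus: int, clock_mhz: float) -> float:
    """Peak bilinear texel fill rate in GTexels/s."""
    return tmus * clock_mhz / 1000.0

cards = {
    "HD 4890 (40 TMUs @ ~850 MHz)": (40, 850),
    "HD 5870 (80 TMUs @ ~850 MHz)": (80, 850),
    "GTX 285 (80 TMUs @ ~648 MHz)": (80, 648),
}

for name, (tmus, clock) in cards.items():
    print(f"{name}: ~{gtexels_per_sec(tmus, clock):.1f} GTexels/s")
```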
 
Am I the only one here who can see that both the i7 and the PhII are GPU-limited, even with the 5870?


Seriously, if you want to compare the computational limits of CPUs and how "fast" they are, you don't select a benchmark that primarily stresses another component.


/facepalm
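To illustrate with made-up numbers (a toy model only: assume CPU and GPU work overlap, so the frame rate is set by whichever side takes longer per frame):

```python
# Toy model of a GPU-bound benchmark: with CPU and GPU work overlapped,
# the frame rate is set by whichever side takes longer per frame.
# All numbers below are invented purely to illustrate the point.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frame rate when the slower of CPU or GPU sets the pace."""
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 25.0  # hypothetical 5870 frame cost at Crysis-level settings

print(f"i7 920   : {fps(10.0, gpu_ms):.1f} fps")  # faster CPU per frame (made up)
print(f"PhII 965 : {fps(13.0, gpu_ms):.1f} fps")  # slower CPU per frame (made up)
# Both land on 40.0 fps: the result measures the GPU, not the CPU.
```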
 


Amusing to see how low you would stoop, Badtrip. Tbh, I'm disappointed in you.

I wonder how many of the lesbian accusations you reported too? I'm *sure* whichever mod sees your little crybaby report will take that into consideration.
 


Yes, I'm aware of all that. Nvidia does texturing better, ergo better Crysis.
 

So Crysis is not groundbreaking if it still relies on older rendering techniques (the techniques we've actually been trying to move away from since DX8).

It looks great, but it needs ridiculous amounts of Texel/s throughput.
 




See quoted.
 


*Specifically* is the keyword.

I didn't say that the i7 was chosen specifically.

"specific, a.: Intended for, applying to, or acting on a particular thing."

It's just the way it happened. I have no doubt that Intel made sure that this happened, but that is for a future thread.