i7 920 vs Phenom II 965 with an ATI 5870 (Finally!)



Um yes?

There is a difference between being done deliberately (coded for) and being done that way because that was the available platform (coded on).

Christ almighty, why can't you figure that basic stuff out?
 

Prove it.

Prove there is a preference for virtually identical instruction sets. Jen, the difference is largely in the pipeline stages, where Intel is more capable mathematically. You can multithread games and so on, but it doesn't go much further than that.
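(For what it's worth, here's a minimal sketch in C of what "virtually identical instruction sets" means in practice, assuming GCC's <cpuid.h> helper: chips from both vendors answer the same CPUID instruction, and only the vendor string differs, so software checks feature bits rather than brands.)

```c
/* Sketch: reading the CPUID vendor string and one feature bit with
 * GCC's <cpuid.h>. Intel and AMD chips answer the same instruction;
 * only the vendor string differs, and the feature bits mean the same
 * thing on either. */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    /* The vendor string is packed into EBX, EDX, ECX in that order:
     * "GenuineIntel" or "AuthenticAMD". */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    printf("Vendor: %s\n", vendor);

    /* Leaf 1: EDX bit 26 = SSE2; the same bit on either vendor. */
    if (__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        printf("SSE2: %s\n", (edx & (1u << 26)) ? "yes" : "no");

    return 0;
}
```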
 


Do you really think they only run 1 type of rig? Or that they code on the latest platform?

So much for figuring the basics out.

There is a little thing called quality control.
 


The top-end game devs and software houses?

Yes, I'm 100% certain they use the very best available equipment at all times. Anything less would leave them open to their competitors gaining a tech advantage. We aren't talking about general businesses here with their Pentium 2s and Windows ME; we're talking cutting edge.
 


My god you have NO idea how to run a business.

You are flat out ignorant.
 



She has a point, but she doesn't fully understand it.

I need to go back a few years in this discussion, to when we were talking about the FPU/ALU and all of that. Code demands a certain amount of processor-driven performance, but all a game would expose is where AMD's architecture falls short of Intel's, like a lack of threads or virtual cores.
 
You can compile a program to use specific instruction sets. You can also code a program for a specific CPU; however, that would require something along the lines of Assembly (where you code individual instructions), and people do not write games in Assembly.
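To make that concrete, here's a rough sketch in plain C of what "compiling for an instruction set" looks like. The popcount function names are made up for illustration; the GCC built-ins and target attribute are real (GCC 4.8+). You either pass a flag like -msse4.2 at build time, or compile both paths and pick one at runtime:

```c
/* Sketch: runtime dispatch between an SSE4.2 path and a generic path,
 * using GCC's target attribute and __builtin_cpu_supports (GCC 4.8+).
 * popcount_sse42/popcount_generic are illustrative names, not from any
 * real game. Build with: gcc -O2 dispatch.c */
#include <stdio.h>

__attribute__((target("sse4.2")))
static unsigned popcount_sse42(unsigned x)
{
    return __builtin_popcount(x);   /* compiles to the POPCNT instruction */
}

static unsigned popcount_generic(unsigned x)
{
    unsigned n = 0;
    while (x) { x &= x - 1; n++; }  /* plain C fallback, no special ISA */
    return n;
}

int main(void)
{
    __builtin_cpu_init();           /* initialise the CPU feature checks */
    unsigned (*popcount)(unsigned) = __builtin_cpu_supports("sse4.2")
                                   ? popcount_sse42
                                   : popcount_generic;
    printf("%u\n", popcount(0xF0F0u));  /* prints 8 either way */
    return 0;
}
```

Either way, the target is a feature set, not a CPU brand, which is the point about "coded for" vs "coded on".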

Game performance has got nothing to do with what processor you code it on.
 
OK, let's say I'm wrong, and the i7 is just 'better'.

How come the Phenom II closed the gap to nothing with a driver update from ATI? We're talking an almost 50% FPS increase just from a driver update.

Surely that proves the problem was the initial drivers causing issues with the Phenom II? Problems that weren't affecting the i7.

What is a reasonable explanation for that if not this one?
 
If AMD had the same IPC, threads, logic stages, FPU, ALU, and so on, games would take advantage of all of it, like when 64-bit Windows would only run on AMD CPUs.
 


There's quite a stark contrast between a Pentium 2 and an i7. Secondly, saying that games could be coded on ME? I don't believe ME even supports DX9; maybe barely DX8.

It would be folly to code specifically for the most bleeding-edge hardware available. On a tangent to that thought, do you have any idea how impractical it would be to give every coder an i7 EE with two 295s, for example?
 


One Game. One Driver Update. Not Exclusive To AMD.

Get Over It.
 


Oh come on, do you all take everything absolutely literally? I was clearly exaggerating with the Pentium 2/Windows ME stuff...

It would be folly to code specifically for the most bleeding-edge hardware available. On a tangent to that thought, do you have any idea how impractical it would be to give every coder an i7 EE with two 295s, for example?

Impractical how? So it costs a few tens of thousands of dollars? How much money do you think software houses like EA and Blizzard make?
 


Get over it? 😀

You are the one who needs to get over it. All this time you believed the i7 was just superior, when in fact it just got, I dunno... lucky? Then a simple driver update fixed it.

So that brought us to the situation where the Phenom II wins on single and dual GPUs. That's pretty much where we are right now, isn't it? It must be true when even THG is showing it on benches, lol. 😀
 


Apparently not enough, considering Activision 1) tacked onto Modern Warfare 2 the $10 fee usually reserved for consoles (due to their publishing scheme, part of the game revenue goes straight to Microsoft, Sony, etc.), and 2) is considering implementing a subscription scheme for their Call of Duty franchise.

Additionally, they realize that if they designed games with a focus on bleeding-edge hardware, they would not make many game sales. I'll choose not to drag out a certain well-beaten dead horse here.
 
mcain... the games are designed on the bleeding edge. Actually, they are designed beyond the bleeding edge.

That would be why most games previously couldn't run at absolute max settings on release. Look at even DiRT 2: a 5870 won't run it on absolute max, and even a 5970 on absolute max won't run it perfectly.

Can you imagine what the original Crysis was developed on? It was WAY ahead of the generally available graphics hardware at the time.
 
And jennyh, I will not sink to the level of editing my posts to "Get the last word in." 😉


As for me taking your Pentium 2/ME comment literally, it is no different from you making your wild assumption on the basis of one game.
 
I changed the word 'coded' to 'developed' in my last post, hence the edit (btw, I just did it again, adding this). I dunno what you mean about getting the last word in; whatever works on your other forum doesn't work here.
 


... You claim one game's performance improvement due to a driver update causes the Phenom II to overtake the i7 completely?



...

Based on what bullshit?

I think you need to take a hit off a bong and relax. The LSD is getting to you.
 

Which "Original" Crysis are you referring to? The version showed during GDC '06, or the version that was released?


As to editing, I'm curious what exactly you engage in. You added an entire quote and rebuttal to your post before "mcain... the games..."
What I quoted in the post before that was all you had before you manipulated it. Sorry to get too close to your tactics. 😉
 


Crysis was a poorly coded piece of garbage that should never have been released. CryTek should be ashamed of themselves.

That is a horrible argument. Now, if you code it like Activision does with CoD... it will run on any hardware.
 


That's what the uninformed masses say.

Crysis was groundbreaking, and it was developed on 'next-gen' hardware. And it proves my point. So it was a flawed engine? Do you think they care, considering how much money it has raked in for them?

That is why software houses need to be at the bleeding edge and beyond it.
 


Sorry, but Elmo hasn't said a bloody thing in this thread in ages, and then he comes back with his 'witty' one-liner?

He deserves the contempt that my last post gave him, because that is what he gave to me.