LGA1156 really is dead.

Status
Not open for further replies.
Does the Phenom II X4 come stock clocked at 4GHz?

When you begin to remove the bottleneck from the CPU and shift it to the graphics card, you get the results I have mentioned often: once there is a graphics card bottleneck, other performance factors come into play (such as the added latency from having to tunnel data from the PCIe bus to the QPI bus).

LGA 1156 has no such bottleneck. LGA 1156's bottleneck is in the form of the PCI Express links integrated into the die.
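The bottleneck argument above can be put into a toy frame-time model (my own sketch, purely illustrative; all the millisecond numbers are made up):

```python
# Toy model: a frame costs roughly the slower of the CPU and GPU stages,
# plus any fixed per-frame bus overhead (e.g. tunneling PCIe traffic
# through an extra hop such as IOH-to-QPI).
def fps(cpu_ms, gpu_ms, bus_ms=0.0):
    """Frames per second for one frame's worth of work."""
    frame_ms = max(cpu_ms, gpu_ms) + bus_ms
    return 1000.0 / frame_ms

# GPU-bound case: GPU takes 20 ms, CPU only 8 ms. A small bus latency
# now shows up directly in the delivered frame rate.
print(round(fps(cpu_ms=8, gpu_ms=20, bus_ms=0.0), 1))  # 50.0
print(round(fps(cpu_ms=8, gpu_ms=20, bus_ms=0.5), 1))  # 48.8
```

In this toy model the latency penalty only matters while the GPU is the limit; it amounts to a frame or two, which matches the "few frames less" claim made later in the thread.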
 
I see socket 1156 being better than AM3 in games for the next couple of years if you ask me, unless AMD's next architecture (Bulldozer?) raises IPC by quite a bit. They are still barely beating Kentsfield's IPC, and that CPU is getting old as hell.

Aren't 32nm chips coming out on 1156?

And when the i5 isn't enough for you, which in gaming will be a good while, you go i7 on 1156. I call that somewhat of an upgrade path. =P

And as far as more cores go for AMD, please. Like that'll help in gaming performance.

But hey, I could be wrong! AMD's next-gen CPUs (not a chip shrink) could be badass, but at the rate of progression lately it doesn't look like it.

Don't get me wrong, AM3 will still have a better upgrade path than 1156, but I don't see its performance beating the i5 by enough by the time the i5 user is ready to buy a new mobo.
 


Phenom II > anything Intel can do in gaming.


Fact. EOD.
 

Let's wait until we get some real benchmarks, not ones at a GPU-bound resolution. :)

But hey, for all we know the Phenom II really could be a gaming gem; if you ask me it's still too hard to tell.

Unless i7 scaling is REALLY that bad. Which we all know isn't true.
 
Oh, because when YOU buy an HD 5970, you play at 800x600?

Seriously, what a joke.
Nobody cares about effing benchmarks, we want real-life performance, something Intel doesn't give us for the $300 more we pay over such a nice AMD system.

And dude, you see the i5 beating Phenom II?
Did you skip the shitload of benchmarks that have been run through Tom's?
Or are you just a fanboy who doesn't want to see the truth?
 

Really? Pay close attention to what is happening IN THE FRIGGIN' LINK YOU LINKED, YOU DOLT!

At Core i7 920 @ 2 GHz, it takes a Phenom II X4 at 2.8 GHz to match it:
Crysis_02.png


At Core i7 920 @ 2 GHz, it takes a Phenom II X4 at 3.4 GHz to match it:
BF_02.png


At Core i7 920 @ 2 GHz, it takes a Phenom II X4 at 2.6 GHz to match it:
HAWX_02.png


At Core i7 920 @ 2 GHz, it takes a Phenom II X4 at 2.6 GHz to match it:
L4D2_02.png


At Core i7 920 @ 2 GHz, it takes a Phenom II X4 at 3.0 GHz to match it:
COD_02.png



So what is going on.. why does scaling seem to stop at some point? A graphics card bottleneck forms at some point. When you hit a graphics card bottleneck, other factors come into play (processor frequency can still sometimes add 1 to 3 more frames).
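Taking the matchups above at face value, the implied per-clock gap in each title is just the ratio of the clocks needed to tie (a back-of-the-envelope sketch; the labels are the screenshot names above):

```python
# Clock speed (GHz) a Phenom II X4 needed to match a Core i7 920 at 2 GHz,
# per the screenshots quoted above. The ratio is the i7's implied per-clock
# advantage in that title, before the GPU bottleneck flattens everything.
i7_ghz = 2.0
matchups = {
    "Crysis_02": 2.8,
    "BF_02": 3.4,
    "HAWX_02": 2.6,
    "L4D2_02": 2.6,
    "COD_02": 3.0,
}
for bench, phenom_ghz in matchups.items():
    print(f"{bench}: {phenom_ghz / i7_ghz:.2f}x the clock to tie")
```

So by these numbers the Phenom II needs roughly 1.3x to 1.7x the frequency to match the i7 clock-for-clock in these titles.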

There is a process which I have explained in detail, and I LINKED you the damn patent, JennyH.. you're just too ignorant to acknowledge it (willfully ignorant on your part).

Latency comes into play. There is MORE latency in the Core i7 platform in the communications protocols between the CPU and the PCIe bus. I have explained this in great detail.

Now if you want to compare processor performance you have to compare the two architectures using applications which remove any other bottleneck from the equation.

This generally means removing games or running them at a lower resolution.
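The low-resolution testing logic can be sketched as code (my own illustrative model; all the fps numbers are hypothetical):

```python
# Delivered fps is roughly the lower of what the CPU can feed and what
# the GPU can render at a given resolution. Dropping the resolution
# inflates the GPU's ceiling, so the CPU difference becomes visible.
def delivered_fps(cpu_fps, gpu_fps_at_res):
    return min(cpu_fps, gpu_fps_at_res)

cpu_a, cpu_b = 120, 100                        # two hypothetical CPUs
gpu_ceiling = {"800x600": 400, "2560x1600": 60}  # hypothetical GPU limits

for res, gpu_fps in gpu_ceiling.items():
    print(res, delivered_fps(cpu_a, gpu_fps), delivered_fps(cpu_b, gpu_fps))
# At 800x600 the CPUs differ (120 vs 100); at 2560x1600 both read 60 - a tie.
```

Which is exactly why a high-resolution tie tells you about the graphics card, not the processors.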
 


Yeah, lol, let's run benchmarks at 800x600 0xAA, because we all know that's what real gamers play at.

It's not hard to tell; the Phenom II has been beating the i7 in gaming all year, tbh.
 
They may not beat it because they are all tied. We are talking about game software, console ports where overclocked dual-core CPUs, or overclocked cache-less tri-cores in some cases, can give the same results because the GPU is the limiting hardware. After that, it's probably driver anomalies where there is a 6 fps difference. We know that with driver updates they announce better fps in certain games, coding around the limiting hardware factor, the GPU. Is it so shocking that $600 video card setups are the deciding factor in achieving fps/eye candy when teamed with $200 CPUs?
 


Jeez, elmo, you see it from one side only, don't you?

Every Phenom II made has DDR2 support. You know how hard that is to achieve?

You can make excuses about QPI or whatever but the Phenom II has *more* excuses and *more* reasons why it should be falling behind in gaming.

It isn't - and there is no better gaming CPU than a Phenom II X4.
 
"Latency comes into play. There is MORE latency in the Corei7 platform in the communications protocols between the CPU and the PCIe bus. I have explained this in great detail."

I am pretty much just here to watch the sparks fly now. But this statement may have just added a +1 to the AMD side 😛
 


One side? I am explaining things based on the facts and evidence.. and the facts and evidence are what they are. If they end up supporting only one "side", as you put it, then so be it. I think that me telling you there is added latency in the communications protocol between the X58 IOH, the PCIe bus, the QPI link and the Core i7 CPU is me admitting a design issue on Intel's part. I fail to see how that is me taking Intel's "side".

The argument I am making is that it doesn't matter. You're facing a graphics card bottleneck. I cannot make a statement that AMD is just as good as or better than Intel in gaming, because if I added another card (for Tri- or Quad-Fire) the results would show Intel in the lead by a long shot. Just look at how much extra frequency an AMD CPU has to run to keep up with Intel when there is no graphics card bottleneck.

willful ignorance (uncountable)
(idiomatic, law) A bad-faith decision to avoid becoming informed about something so as to avoid having to make undesirable decisions that such information might prompt. It may also describe a person who has no real understanding of a decision but goes ahead with it anyway.




Not really.. because it only shows up when the gaming title in question is facing a graphics card bottleneck. In other words, the worst that can happen is a few frames less for the Core i7... which doesn't really matter, because normally it would show up as a tie. Either way it's not the CPU holding you back but the graphics card. Add another graphics card and you alleviate the bottleneck.
 


I'm sure more people play at that, or a little higher, than play at 25x16.

Let's see some benches at 1280x1024 - 1920x1200, with a 5970 and with 5970s in CrossFire. If Phenom II matches on both of those setups at those resolutions, then I will submit myself to Phenom II's gaming uberness.
 

I am afraid 1156 will fail as other lower-end Intel sockets did before it (there have been around 14 CPU sockets for Intel CPUs since 2000).
On the opposite side, AMD has made 4 sockets in the last 4 years, but 3 of them are compatible with each other (AM2, AM2+, AM3).
 
Dude, you're getting two HD 5970s and you're effing playing at 1280x1024????

This is a joke, right???
With this system you can play Crysis on Enthusiast at 2560x1600 without a single drop in frames.
 

I don't really agree, but it's obvious AMD has come close enough to make the monkeys on the web start jumping and throwing fecal matter at each other.

Fact is, performance is a somewhat negligible purchase factor now, since Phenom II can hold its own far better than the last two generations of silicon suicide issued by the now dyckless,, I mean fabless AMD. It depends more now on how you use your PC and for what; elmo, AMD just doesn't get as close except in programs and games created on AMD platforms, or ones that can use their architecture better (whatever).
 
*Sigh*

He's recommending those resolutions because, for most games, running at those resolutions puts the load on the CPU more than the GPU. Since this entire thread is about CPUs, doing so is a perfectly acceptable way to test the CPUs in question. There's a reason why graphics card tests don't usually feature those low resolutions.
 


But you don't 'win' any benchmarks with an i7 unless you play at 800x600??? Sheesh, isn't that obvious by now? 😉

Less than 6 weeks ago the argument was that the i7 would pull ahead given a faster GPU. The reason was that the i7 was still being bottlenecked by the 5870 (lol?).

So fast forward a few weeks: the 5970 is out, and the i7 is *still* losing to the Phenom II in gaming. And it always will, because the i7 is an inferior enthusiast gaming CPU.
 


That is so much garbage.

I've said it a million times, but I will say it again. The i7 is great at low-resolution gaming, when neither the GPU nor the CPU is being stressed much. Throw a difficult game at it and the Phenom II copes better.

Because the Phenom II is a better gaming CPU, and always has been.
 


If you're talking to me, you may want to get checked out for dyslexia.
 
I know all of that, thank you.
But who effing cares???
You're going to pay $300 more for more framerates at 800x600? Is that the only thing the i7 is good at???

Seriously, like JennyH said, a few weeks ago it was "Phenom II stands no chance against i7 WITH A HD 5970".

Now, since the benchies are out, it's "Phenom II stands no chance against i7 AT 800x600".

GOSH, what's with all the hate???
Why don't you just admit that AMD makes very performant CPUs and that Intel only gives us benchies, nothing real.
 


Enthusiasts talk of getting new CPUs ($150-1000) and new GPUs ($100-700) to gain or experience a new EDGE. What is so wrong in the logic of updating your motherboard/memory hardware too?
This argument is senseless. It takes a given set of facts/circumstances and turns it into some kind of advantage, positive or negative, depending on where you want to argue the point. We pull our motherboards out of our cases to put on heatsinks; how is it "bad" that Intel spends millions on engineering better, more feature-oriented motherboards?
 

God, how do you breathe without killing yourself? Your argument is laughably idiotic. If the i7 is the better chip when gaming at CPU-dependent resolutions, how does that suddenly flip when running at GPU-dependent resolutions? You do realize that a faster CPU raises the point at which a GPU becomes the limit, right? Let me spell it out for you. A faster CPU is better at lower resolutions. This means it raises the framerate and performance cap at the higher resolutions.
 


Really?

Care to enlighten us as to how much more work the CPU has to do when you increase the resolution? What does the CPU do now anyway?

Transform, lighting and clipping are now on the GPU; all means of rasterizing are now relegated to the GPU. Floating-point calculations which have to do with graphics loads are all relegated to the GPU. (GPUs are even having to do computing tasks now with DX8 and higher.)

The CPU simply feeds data, and the data doesn't really change as you increase the resolution. It's the same amount of data being fed at the same rate.

CPUs do still handle physics and AI, as well as some other integer-related tasks, though.
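The division of labor described above, as a skeleton cost model (my own illustrative sketch; the work-unit formulas are made up, not from any real engine):

```python
# One frame's work, split as the post describes: CPU-side work (input, AI,
# physics, building draw calls) depends on scene complexity but not on
# resolution; GPU-side work (transform, lighting, clipping, rasterization)
# grows with the pixel count.

def cpu_work_units(entities):
    # AI + physics + draw-call submission: per-entity, resolution-blind.
    return entities * 3

def gpu_work_units(entities, width, height):
    # Per-vertex transform/lighting plus per-pixel rasterization.
    return entities * 10 + width * height

# Same scene at two resolutions: CPU load is identical, GPU load is not.
print(cpu_work_units(1000), gpu_work_units(1000, 800, 600))
print(cpu_work_units(1000), gpu_work_units(1000, 2560, 1600))
```

The CPU numbers come out identical at both resolutions, while the GPU number grows roughly eightfold, which is the point being argued.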
 


Look, sunshine - you go test your games at 800x600, then test them at 2560x1600, and see how well your theory holds up, ok?

Or did you just miss the FACT that Phenom IIs are better at extreme gaming than i7s?

Read - http://www.legionhardware.com/document.php?id=869&p=23
 