P67, X58, And NF200: The Best Platform For CrossFire And SLI


banthracis

Distinguished
[citation][nom]ubercake[/nom] If I go with the 2% performance increase Sandy Bridge option, I'm looking at $280 for a 2600K (don't fool yourself... 4 threads / cores on a 2500k won't cut it with modern games - which use up to 8+ cores, [/citation]

False. Games do not benefit from more than four cores; in fact, most games max out at three. This has been demonstrated enough times that it's become pointless to even bring up.

The $100 spread between the Core i5-2500K and Core i7-2600K is only recommended if you want to brag, because you're probably not going to notice any appreciable frame rate difference. The Core i7's strength is only really exploited in heavily-threaded workstation applications, rather than games.
http://www.tomshardware.com/reviews/best-gaming-cpu-core-i3-2100-recommended-processor,2895-4.html

The i5-2500K and i7-2600K are essentially equivalent gaming CPUs, with slight variations depending on settings.
http://www.tomshardware.com/reviews/sandy-bridge-core-i7-2600k-core-i5-2500k,2833-18.html
 

K2N hater

Distinguished
[citation][nom]xrodney[/nom]When people finally realize that PCI-e is not only for graphic cards, you need it as well for Raid controllers, TV cards, audio/video encoders etc. Its not that much graphic cards, but other stuff that suffer when running on x4 because graphic card took over most of pci-e lanes.[/citation]
That's absolutely right, but do you ever see a gamer with any offboard PCIe peripheral besides graphics cards?
 
Well, I like seeing that x8/x8 is still viable, especially at high resolutions. My next upgrade will be triple monitors around 2560x1600 or so, with whatever GPUs come out next year... not having to upgrade from my i5-750 makes me happy.
 
[citation][nom]banthracis[/nom]False, games do not benefit from > 4 cores. In fact, most games max out at 3. This has been demonstrated enough times to have reached the point of stupidity to even bring up. http://www.tomshardware.com/review [...] 895-4.htmlI5-2500k and i7-2600k CPU's are essentially equivalent with slight variations depending on settings. http://www.tomshardware.com/review [...] 33-18.html[/citation]
It has been demonstrated that BFBC2 will use all 8 threads. It has to do with parallel processing... things being processed simultaneously. Pretty modern concept. But I guess since no one plays BFBC2 it doesn't matter?

http://www.guru3d.com/article/geforce-gtx-580-sli-review/6

(check out the middle of the page)
 

banthracis

Distinguished
1. The article you linked mentions nothing about CPU thread usage. It's a GTX 580 review.

2. There's a very simple reason why games don't benefit from multiple cores, one any comp sci student or electrical engineer can tell you.

Games are very poorly threaded. The part of gaming that is well threaded, the graphics portion, is handled by the GPU, which already has hundreds of cores.

In gaming, the most CPU-intensive task is AI. AI, by its nature, is not a parallel process, and it is extremely difficult to thread. Most games that are "multithreaded" actually keep AI on one thread and throw the rest (minor, far less intensive stuff) on the others.

Can you design a game to utilize four or more cores? Sure: you can throw AI on one thread, physics on a second, all PC background tasks on a third, and various other minor calculations onto a fourth. But until someone figures out a good way to thread nonparallel computations, the performance increase will be minimal, as the hard work is still restricted to one thread.
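
A minimal sketch of that split, with the subsystems stubbed out (hypothetical function names, just to illustrate):

[code]
#include <thread>

// Hypothetical per-frame subsystems, stubbed out for illustration.
void updateAI()        { /* heavy, serial decision making */ }
void updatePhysics()   { /* largely independent within a frame */ }
void updateAudio()     { /* minor background work */ }
void updateParticles() { /* minor background work */ }

int main() {
    // One thread per subsystem for one frame's worth of work...
    std::thread ai(updateAI);
    std::thread physics(updatePhysics);
    std::thread audio(updateAudio);
    std::thread particles(updateParticles);

    // ...but the frame can't end until the slowest thread does.
    // If updateAI() dominates, the other cores finish early and idle,
    // so a fourth, fifth, or eighth core buys you almost nothing.
    ai.join();
    physics.join();
    audio.join();
    particles.join();
}
[/code]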

This issue has been stumping programmers for decades. There are ways to do this in specific situations, but no general solution yet. See http://aigamedev.com/open/articles/hierarchical-logic-multi-threading/

A general solution allowing infinite threading of nonparallel calculations would be the programming equivalent of finding the cure for cancer.

Basically, think of it this way: on a math exam you have a three-part question in which the answer to each part depends on the previous answers, i.e.:

A. Add up 3 and 5.
B. Take the answer from part A and divide by 2.
C. Take the answer from part B and triple it.

What is the final answer? (12, but notice you can only get there one step at a time: 8, then 4, then 12.)

This is the type of thinking AI requires. Threading this is the equivalent of calculating the answers to A, B, and C simultaneously. It's not impossible, the way the mathematical equivalent is, but it's not easy.
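
Expressed as code, the dependency chain looks like this (a trivial sketch, but the shape is the point: each line has to wait for the one before it, no matter how many cores you have):

[code]
#include <iostream>

int main() {
    int a = 3 + 5;   // part A: 8
    int b = a / 2;   // part B: needs A first, gives 4
    int c = b * 3;   // part C: needs B first, gives 12
    std::cout << c << '\n'; // prints 12
}
[/code]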

For this reason, going beyond three threads has very little benefit.
 
[citation][nom]banthracis[/nom]1. Article you linked mentioned nothing about CPU thread usage. It's a GTX 580 review. ...[/citation]
Again, look at the middle of the page. It says something to the effect of: "The game has native support for DirectX 11 and on the processor testing side of things, parallelized processing supporting two to eight parallel threads, which is great if you have a quad core processor."

http://www.guru3d.com/article/geforce-gtx-580-sli-review/6

I know it's hard to see things when you're not looking for them.

Not all games are poorly threaded (e.g., BFBC2). There's more than one thing going on in the game at any given moment.

Think of it this way... eight people taking your math exam at the same time. Their answers don't depend on the answers of the other people, though they could do something to each other to disrupt them from getting the final answer. I understand if you overlooked that aspect; I think the engineers and computer science experts would not have.

Also, that was a good 2008 article about the days when the bulk of processing in games was focused on a few cores (which you correctly cite for 2008 games). Things are rapidly changing...
 

TeraMedia

Distinguished
@banthracis:

Assuming more than one AI-controlled entity in a game, AI should be the easiest work to multi-thread. Perhaps MSFT's FSX can serve as a good counter-example. If each aircraft operates as an independent entity, examining inputs (flights of nearby aircraft, location, intended destination, ATC requests and aircraft status) and making modifications to outputs (yoke, throttle, flaps, gear, radio chatter, etc.) accordingly, then each airplane being simulated could be run on a different thread. So 100 airplanes = 100 threads.
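
In sketch form (hypothetical Aircraft type; in practice you'd want a thread pool rather than 100 raw threads):

[code]
#include <thread>
#include <vector>

// Hypothetical aircraft state, just to sketch the idea.
struct Aircraft {
    double throttle = 0.0;
    void think() {
        // Examine inputs (position, nearby traffic, ATC requests)
        // and adjust outputs (yoke, throttle, flaps, gear).
        throttle = 0.75; // placeholder decision
    }
};

int main() {
    std::vector<Aircraft> fleet(100); // 100 airplanes...
    std::vector<std::thread> threads;
    threads.reserve(fleet.size());

    // ...= 100 threads: each entity's AI runs independently,
    // reading the shared world but writing only its own state.
    for (Aircraft& plane : fleet)
        threads.emplace_back([&plane] { plane.think(); });
    for (std::thread& t : threads)
        t.join();
}
[/code]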

A similar concept can be applied to RTS games. Each computer-controlled player is AI-controlled for a high-level strategy, but also each unit requires some AI (can I see an enemy? Am I healthy? Is the enemy defeatable? I should attack!). These can all run as separate threads.

For FPS games, each enemy unit requires AI, in addition to the overall "strategy" AI supplied across enemy units.

IMO, the fact that game developers don't employ more threads is not because of a lack of opportunity to do so, but rather because of a historic shortage of usable CPU cores in their target market, combined with a lack of desire to rewrite major portions of their base game engines. If they want to, they can provide interfaces to scale multi-threaded, CPU-dependent features such as AI and physics just as they provide interfaces to scale GPU features today. They just don't choose to for cost / benefit reasons.

There are programs for which the primary function cannot be threaded, but I don't think this limitation truly applies to games.
 

banthracis

Distinguished
The article you probably meant to link, ubercake, is this one:
http://www.guru3d.com/article/core-i5-2500k-and-core-i7-2600k-review/21

The article, however, demonstrates the i5-2500K and i7-2600K to be identical in performance at every level except 1024x768, though the point of playing at that resolution with no AA or other eye candy on a GTX 580 and i7-2600K is questionable.

In fact, Guru3D notes one of the big advantages of good threading in a game: the better a game is threaded, the less the CPU matters and the more the bottleneck shifts to the GPU.

Battlefield Bad Company 2 will happily use four or more cores. The result is that very quickly the CPU does not matter anymore as it's maximizing the incredible amount of processor power. As a result the GPU really quickly becomes a bottleneck, even the GeForce GTX 580 is flat out running at 100% whilst the processor has plenty of force left.

This shows how an i7-2600K is even less worth it for gaming. Indeed, as they show on that page, above 1024x768 a 4.3 GHz i7-2600K performs only 1-3 FPS faster than a stock Phenom II X4 in BFBC2.
 

arkadi

Distinguished
I am not sure that "died" is the right term for the X58 chipset. It makes sense to get the new generation of hardware for new builds, but it's far from worth upgrading existing rigs. My X58-based rig is more than two years old, and upgrading is the last thing on my mind.
 

banthracis

Distinguished
Either way you look at it, ubercake and TeraMedia, the data from every review of SB, from Tom's to AnandTech to the Guru3D article you guys are linking, concludes the same thing.

THERE IS NO GAMING ADVANTAGE IN USING AN I7-2600K OVER AN I5-2500K. The data clearly demonstrates this point. No ifs, ands, or buts. Show me one respectable reviewer who concludes the i7-2600K is a better gaming CPU than the i5-2500K.

Now, as for multithreading: multithreading of AI is still an issue.
I linked that particular article because it's still completely valid in terms of issues and solutions. In fact, at the AiGameDev conference last year, Mikael Hedberg, the lead AI programmer for BFBC2, spoke about exactly how they addressed AI implementation and multithreading in the game.

He and Alex J. Champandard talked specifically about the exact things covered in the article linked above: offloading pathfinding and sensory systems from the main AI.

AI is more than PC-controlled bots; it's the fact that all objects in the game world interact with each other. In the real world, every action you take, like throwing a ball, has its result determined by the laws of physics. In the game environment, every action MUST have its result calculated, and you can't throw those calculations onto separate threads that never interact, because in the end each decision (including the decision to throw a ball) depends on the actions of other objects in the game, which all goes back to the main AI.

BFBC2 does a good job of offloading as much as it can off the main AI, but at the end of the day, the main AI is still one thread by necessity.
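
The pattern looks roughly like this (hypothetical names, a sketch of the offloading idea rather than anything from DICE's actual code):

[code]
#include <future>
#include <vector>

// Hypothetical types, just to sketch the offloading pattern.
struct Path { /* waypoints */ };

// Expensive but independent per unit, so it can fan out to workers.
Path findPath(int unitId) { (void)unitId; return Path{}; }

int main() {
    // Offloaded work: pathfinding requests run on worker threads.
    std::vector<std::future<Path>> pending;
    for (int unit = 0; unit < 8; ++unit)
        pending.push_back(std::async(std::launch::async, findPath, unit));

    // Main AI: still one thread. Each decision can depend on every
    // other object's state, so it collects the offloaded results and
    // walks its decision tree serially on a single core.
    for (std::future<Path>& f : pending) {
        Path p = f.get();  // consume an offloaded result
        (void)p;           // decision making would happen here, serially
    }
}
[/code]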

Is this an issue? No. As the Guru3D article demonstrates, with good offloading we've reached the point where, with only a couple of threads, the CPU doesn't even become a large factor anymore.

Does this mean eight threads are useless? No, you divide tons of offloaded tasks among the threads. However, none of these tasks remotely stresses a CPU thread (which the Guru3D data demonstrates), and it makes no performance difference since everything is limited by the GPU anyway.

Hence why, in all the REAL WORLD data, there is effectively no difference between an i5-2500K and an i7-2600K, especially at higher resolutions or when the eye candy is turned on.
 

r3nik

Distinguished
This isn't really about upgrading X58 to P67 (which I did to achieve much higher clocks on air), it's about this: what route to go if buying new today.
1156 is dead, in that no new architecture will use it. It's far from bad, but the reviews show the more mainstream P67 isn't really gimped compared to X58 because of lower PCIe bandwidth or dual-channel RAM (or DMI 2.0 vs. QPI). My Gigabyte P67-UD4-B3 board was $100 cheaper than my MSI X58 Platinum when I bought it, and the CPUs were priced the same.
For me, going beyond gaming to video encoding, having a higher-clockable CPU, plus Hyper-Threading on the 2600K, made it a noticeable upgrade.
But as Tom's so excellently shows, gaming isn't too affected. Everyone rejoice with their current X58, and if you're running a Q6600 or other Core 2, now's a great time to upgrade.
 

jsowoc

Distinguished
Jul 6, 2005
32
0
18,530
Coming back to the topic of P67 vs X58 chipsets, there were two warring camps.

Camp 1: I should pay extra for an X58-based board (vs. P55 or P67) because I will be gaming and I need two x16 PCIe connectors.

Camp 2: Current video cards/games/screens do not need 2 x16 connectors to game well.

This article took a specific case of two similarly priced processors and compared SLI/CF setups to investigate whether the "camp 1" argument is valid. Based on the assumption of roughly equivalent processors (we know that SB is slightly better clock-for-clock), we find no difference (I don't call 2-4% a difference) between the two setups.

This does not mean that those with X58 setups need to switch (or vice versa). It means that if/when you buy a new computer for gaming in "real-world" SLI/CF setups, you have the option of going with one or the other depending on current pricing/preference.
 

banthracis

Distinguished


The tasks you're talking about threading are exactly those mentioned in the article above:
"each unit requires some AI (can I see an enemy? Am I healthy? Is the enemy defeatable? I should attack!)"

Those are, by definition, the pathfinding and sensory systems that get offloaded.

I'm not familiar with flight simulators' use of AI, but since those objects lack true high-level decision making (all you really need is a behavior tree and a task scheduler), they're not an issue to scale.
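
For illustration, a minimal behavior-tree sketch (made-up names; a real tree adds Running states, blackboards, and the scheduler):

[code]
#include <functional>
#include <vector>

// A node either succeeds or fails; a selector runs its children
// in order until one succeeds.
enum class Status { Success, Failure };
using Node = std::function<Status()>;

Node selector(std::vector<Node> children) {
    return [children] {
        for (const Node& child : children)
            if (child() == Status::Success) return Status::Success;
        return Status::Failure;
    };
}

int main() {
    // A "dumb" entity: fixed reactions, no coordination with other
    // entities, so each aircraft's tree can tick on its own thread.
    Node aircraft = selector({
        [] { return Status::Failure; },  // e.g. "avoid traffic" (not needed now)
        [] { return Status::Success; }   // e.g. "hold course"
    });
    aircraft(); // one tick of this entity's AI
}
[/code]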

Once again, however, your suggestions are aimed directly at tasks that are already part of the established protocol for offloading the main AI.
This works fine for "dumb" AI, but no one wants that in their games. They want their PC bots to work together as a team to flank and surround players. They want PC AIs in RTS games to work together to attack simultaneously and use abilities to boost their teammates' units like real players do. All of this involves tasks related to the main AI, which in turn requires searching the whole tree.

Am I saying it's impossible to thread AI? No; as I've mentioned, programmers have accomplished a lot with offloading, including most of the suggestions you've mentioned. However, eliminating the main AI and its decision tree would require a fundamental paradigm shift in how programmers, and indeed the scientific community, approach AI.

I'm not sure how much clearer I can make this without going into some tech jargon. But really, if you're interested and feel you can do better, go take some courses in comp sci and AI and you'll be able to really understand the issues at hand and perhaps be the one who comes up with the idea that changes everything.
 

jasonh8806

Distinguished
[citation][nom]ubercake[/nom]Seriously... I know. A 2 percent increase and in some cases the X58 still takes the cake. But we do have the give the 1155 props... I guess you do save power if you don't OC your 1155. Additionally, if you go with the NF200 option by which you can add a third card, the Sandy Bridge option becomes expensive. It's good marketing on Intel's part to get us to pay more for a motherboard while making us think the 1155 is the more cost-effective option and buy a CPU you have to void the warranty on to see real performance advantages. Warranty costs go down and executive bonuses go up.I hope AMD's bulldozer is finally an offering that really competes with Intel in the enthusiast space or we're looking at some serious price increases and another chipset that's "the best thing since sliced bread" in the LGA2011 that will give us a measly 3% performance increase (mind you it's greater than the 1155's 2%) at double the price and 10% less power than the 1366... But if you OC it and void your warranty, it'll take you all the way to 5% performance increases on air!If anything, this article shows me there is not much of a reason to consider the 1155's performance much different than the 1136. It's just a different chipset with a slight performance improvement in some cases. A consistent 15-20% performance improvement and I'd say we're looking at an upgrade between the 1366 and the 1155, but this is not the case.If you're in the market to buy right now... the 1155 is definitely the way to go, because at stock speeds it will get you higher performance (even if you have to pay more for the NF200 addition) for less money. But considering they're still selling socket 775 equipment, I wouldn't call the 1366 a "dead" socket by any means. If any socket could be considered "dead", I'd call anything pre-dozer AMD or the 1156 "dead" sockets. AMD is not even considered in any sites' reviews when benchmarking video cards, SSDs, HDDs, etc... due to architectural deficiencies... and Intel leapfrogged it's own 1156 socket only a year later with the 1155 (who does that?!). Intel is currently only competing with itself in the enthusiast space. I hope like heck it changes with the dozer release.At any rate, great article.[/citation]

It's pretty obvious that you have some kind of personal distaste for Intel, since all you did was bash the way they're advancing technology. First off, you say that 1155 isn't any better than 1366, but take into consideration that the 2600K (the most expensive 1155 processor) is $325ish, while the processor used for the 1366 tests is the 990X at $1,000. Second, the 1155 isn't a replacement for the 1366; it's a replacement for the 1156, the mainstream socket. If you want to complain that this is only a 2% increase in performance, maybe that's true, but it's a 2% gain for the current gen's mainstream socket over the previous gen's enthusiast socket. The comparison you want is between 1366 and 2011. To address the AMD Bulldozer issue, I do hope they come out with something amazing with this release to keep pressure on Intel to innovate rather than resting at the top of the hill like they can now. I kind of doubt it, but you can hope, right?
 

Hard_Rain

Distinguished
[citation][nom]jprahman[/nom]Only point for LGA1366 now is for workstation builds which need 6 cores or for $3000+ bragging rights builds with quad-SLI/quad-Crossfire. Even then, anyone in the market for such systems know that LGA2011 is still going to come out later this year and will want to wait for that, instead of buying into LGA1366.[/citation]

Your conclusion assumes steady pricing. However, pricing has been fluctuating wildly. I took advantage of February sales at Cyberpower to get an X58 system with the i7-970 for about the same price as buying a Sandy Bridge system (with a 2600K) today. I saved $100 on my HAF X case and $20 on my PSU, and saved another ~$100 over what I would have had to spend for an equivalent-performing motherboard. Then I used those savings to upgrade from the i7-950 to the i7-970. I use my system for the financial markets, so I can take advantage of multiple threads and boatloads of triple-channel memory. I also couldn't wait until later this year, because my five-year-old Dell laptop has reached the end of its useful life.
 

iceveiled

Distinguished
Wow... that was impressively comprehensive. As somebody looking to piece together a new build in the next couple of months, I'm even more confident in the P67 chipset and Sandy Bridge processor combo.
 

Crashman

Polypheme
Former Staff
[citation][nom]jtt283[/nom]Very interesting, especially for the reason outlw6669 cited, until I got to "...slams the lid on the coffin for X58 gaming." That was melodramatic BS; perhaps the silliest remark I've ever read in a Tom's article.[/citation]

Nope. I'm tired of the ancient quads Intel makes for the X58 platform. They start off slower, they overclock worse, and they produce much more heat in an effort to get less gaming performance. They were fine when they were new, but they haven't been updated. Intel could have given us an advanced LGA 1366 quad-core for enthusiasts, but instead expects them to dish out stacks of cash for six-cores.

[citation][nom]r3nik[/nom]This isn't really about upgrading X58 to P67 (which I did to achieve much higher clocks on air), it's about this: what route to go if buying new today. ...[/citation]

Bingo
 

Hard_Rain

Distinguished
[citation][nom]jprahman[/nom]But your usage falls into the 6+ core workstation category, where LGA1366 is still a good choice. So that characterization is still valid.[/citation]

Yeah, I'm just trying to alert new buyers to pay attention to actual pricing and their specific needs. With today's prices, my i7-970 system is less attractive, even to me, than it was in February, and an i7-990 system is certainly more difficult to justify, then or now. Cheers!
 
Guest
[citation][nom]amk09[/nom]There will be an astronomical number of people who will butthurt after seeing the conclusion.[/citation]
The comments to this article do indeed exhibit a greatly elevated butthurt quotient.
 

flyinfinni

Distinguished
You guys realize this review is basically empty because the resolutions are too low to really show anything informative? Honestly, without Eyefinity/Surround resolutions, you'll never see the real picture when running tri-card setups.
Tom's, get with it! You really need to be doing ALL of this testing with a triple-monitor setup. 1080p displays can be had for $150 each, so for another $300 add two more to your current setup and get some results that actually mean something.
 

xrodney

Distinguished
[citation][nom]K2N hater[/nom]That's absolutely right but do you ever see a gamer with any offboard PCIe peripheral besides graphics cards?[/citation]
Let me think: of my six gamer friends, at least half, and that's not counting me, so call it 4 out of 7. And then there are those who use their PCs mainly for work with a little gaming; for them, non-graphics use of PCIe is even more important.
 