FX Vs. Core i7: Exploring CPU Bottlenecks And AMD CrossFire


Mathos

Distinguished
The article pointed out one thing I've known since the original Phenom release: the memory controller and L3 cache being tied to the same clock is the bottleneck in most of AMD's recent and current lineup.

Most people don't realize that the IMC and L3 in AMD CPUs since Phenom have run at 2 GHz, basically the same speed as the HT bus. That means the cores themselves are being fed data much more slowly than they can process it. Back in the day, when I was tinkering with a 9600BE and later a 9850BE, I used to get performance gains equal to the OC percentage or slightly higher just by overclocking the IMC/northbridge on the CPU up to the same speed the cores were running.
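To put rough numbers on that, here's a quick Python back-of-the-envelope of how those clocks derive from the 200 MHz reference clock (the multipliers are only illustrative, not pulled from any particular board or BIOS):
[code]
# Phenom-era core and CPU-NB clocks both derive from the 200 MHz reference clock.
REF_CLOCK_MHZ = 200

def clocks(core_mult, nb_mult):
    core = REF_CLOCK_MHZ * core_mult   # core frequency in MHz
    nb = REF_CLOCK_MHZ * nb_mult       # CPU-NB (IMC + L3) frequency in MHz
    return core, nb

core, nb = clocks(12.5, 10)            # stock-ish 9850BE: 2500 MHz cores, 2000 MHz NB
print(f"stock: cores {core:.0f} MHz, CPU-NB {nb:.0f} MHz")

core, nb = clocks(12.5, 12.5)          # raise only the CPU-NB multiplier to match the cores
print(f"tuned: cores {core:.0f} MHz, CPU-NB {nb:.0f} MHz (+{(12.5 - 10) / 10:.0%} on the NB)")
[/code]
The point is just that the NB multiplier is a separate knob from the core multiplier, so you can feed the cores faster without touching the core overclock at all.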
 
Excellent article, although I found no surprises.
The AMD Zambezi and Vishera CPUs were / are not intended to be great gaming CPUs; that wasn't their focus. I see too many AMD fanboys trying to apply the wrong metric to their favorite chips. There are other, well-threaded "enterprise-type" tasks at which the AMD CPUs absolutely excel, but transferring those expectations to gaming doesn't make sense. I wouldn't bring a knife to a gunfight, but I'll take the knife every time when I want to carve a roast.
I would like to see this same type of test done with SLI, for the sake of exposing platform differences, if possible. In fact, I'm probably more interested in the SLI vs. Crossfire numbers than the AMD vs. Intel ones, on BOTH platforms.
 

f-14

Distinguished
[citation][nom]Mathos[/nom]The article pointed out one thing I've known since the original Phenom release. The Memory controller and L3 cache being tied together for clock speed, is the bottleneck in most of AMDs recent and current line. Most people don't realize that the IMC, and L3 in AMD cpus since Phenom have ran at 2ghz. Or basically around the same speed as the HT Bus. Meaning the cores themselves are being fed data much more slowly than they can process it. Back in the day, when I was tinkering with a 9600BE, or later a 9850be I use to be able to get performance gains, equal to the OC % or slightly higher, just by OCing the IMC/Northbridge on the CPU, up to the same speed as the cores were running.[/citation]

We've known that since DDR2 came out and the entire AMD product line wasn't keeping up. You did a good job, though, of explaining it clearly, showing people how to get the most improvement out of AMD's problem, and pointing out what AMD has needed to fix for the last seven years.

If AMD doesn't fix its bus speed problem, it is going to become Cyrix very, very soon. Doubling the bus speed on the Slot A Athlon is what put AMD ahead of Intel in the first place; 200 MHz vs. 100 MHz was incredibly compelling at the time.

It's the same problem with video cards right now: those 64-bit/192-bit/256-bit memory buses are killing them, but from the company's point of view it's the greatest way to avoid developing a new product and charge more for simply releasing a faster bus. We don't see 384-bit/512-bit buses on video cards any more unless they release an X90/X990/X970 model, and even now they're trying to get away with two GPUs on a 256-bit bus on one card, doing little more than slapping two cards onto one board instead of changing some chips and circuit pathways to allow for greater heat and fitting a more expensive cooling solution.
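And for anyone wondering why the width matters so much: peak memory bandwidth is basically bus width times effective memory clock. A trivial sketch (the clock here is made up for illustration, not taken from any particular card):
[code]
# Peak memory bandwidth ~= (bus width in bits / 8) * effective memory clock.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# Same (made-up) 6000 MHz effective clock, different widths:
for width in (192, 256, 384):
    print(f"{width}-bit bus: {bandwidth_gb_s(width, 6000):.0f} GB/s")
[/code]
Same memory chips, same clock; the 384-bit card still moves half again as much data as the 256-bit one.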
 
[citation][nom]eastyy[/nom]could tomshardware or if anyone knows of any other websites who have done this please recommend benchmarks comparing the phenom II series with the bulldozer and piledriver fx series (ideally with plenty of gaming benchmarks)[/citation]

Well... if you had typed "Tom's Hardware Piledriver" into Google, the first thing you should have seen is "AMD FX-8350 Review: Does Piledriver Fix Bulldozer's Flaws?", which is the review from when Piledriver first came out, comparing exactly the CPUs you're asking for.

It's been out there since October.....
 

ElMoIsEviL

Distinguished


This ^

I'm not looking to brown-nose, just pointing out the validity of this statement. It isn't said often enough, IMO.
 

upgrade_1977

Distinguished
AMD's performance in Skyrim was pretty bad. Regardless of cost, I'll stick with Intel for now. I'd hate to buy a processor just to find out I'm not happy with it because one game I love runs like crap.
 

cypeq

Distinguished
Was that a bottleneck examination? No, it was a performance comparison.
It's easy to see that in games with heavy CPU load and physics, like F1 and Skyrim, Intel is unmatched.
I was expecting an i7 with parameters changed to mimic an i3 and an i5, versus the new FX lineup.

I'd love to see Tom's do a new revision of Building A Balanced PC, which was the one and only real bottleneck examination I've seen on this site. There's a heap of new parts on the market, so it should happen.

And this is what AMD's strongest chip can do gaming-wise:
barely better than an i3 despite its 1 GHz higher stock clock
http://www.anandtech.com/bench/Product/289?vs=697&i=60.61.62.129
and a small disaster vs. the i5
http://www.anandtech.com/bench/Product/701?vs=697&i=505.506.39.60.61.62.129
 
Skyrim is built on a DX9 game engine, created when a top-end processor had two cores and most still had one.
The gaming benches here don't show the superiority of the Intel architecture. They show the age of the game engine, and it skews the results of the entire analysis. Without Skyrim, the $200 FX matches the performance of the $330 Intel very well.
Isn't it time to move on to recent game releases, even if it means backward comparisons with older CPUs stop being possible?

I don't think anyone will build a computer NOW to play Skyrim. But they might build one to play Far Cry 3.
 
[citation][nom]Outlander_04[/nom]Skyrim is built on a DX9 game engine , created when a top end processor had two cores and most still had one .The gaming benches here dont show the superiority of the intel architecture. They show the age of the game engine . And it skews the results of the entire analysis . Without skyrim the $200 FX is matching the performance of the $330 intel very well .Isnt it time to move on to recent game releases even if it means backward comparisons with older cpu's stops being possible?I dont think anyone will build a computer NOW to play skyrim. But they might build one to play Farcry 3[/citation]
That was true when Skyrim was released, but a few months later they updated the game with a patch that let it take advantage of more cores and improved performance greatly on both platforms.

It is also not so easy for games to utilize more than four cores. A big part of what holds CPU performance back is simple scheduling, which more cores can't help. Physics and AI are the parts more cores could help with, but only if the game pushes those things hard, and even then it's difficult to take advantage of the extra cores.
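A rough way to see why the extra cores stop paying off once the serial scheduling work dominates is Amdahl's law. The parallel fraction below is a made-up illustration, not a measurement of any particular game:
[code]
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel fraction.
def speedup(parallel_fraction, cores):
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# Assume only ~50% of the frame work (physics, AI) can actually run in parallel:
for cores in (2, 4, 8):
    print(f"{cores} cores: {speedup(0.5, cores):.2f}x")
# 2 cores: 1.33x, 4 cores: 1.60x, 8 cores: 1.78x -- going from 4 to 8 barely moves the needle.
[/code]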
 
Skyrim might have been released in the last two years, but the game engine is old, and the engine updates didn't make a huge difference to multithreading when they were applied.

The results show that Intel's individually stronger threads do better in games that can't multithread.
Only an Intel fanboy would think that proves Intel is superior, since the rest of this article shows that Intel does not have an advantage in most games.

 
[citation][nom]Outlander_04[/nom]Skyrim might have been released in the last two years , but the game engine is old . And the engine updates didnt make a huge difference to multithreading when they were applied .The results show that intels individually stronger threads do better in games that cant multithread . Only an intel fanboy would think that that proves intel is superior , since the rest of this article shows that intel does not have an advantage in most games[/citation]

Sorry, I forgot to add that since the FX costs $200 and the Intel costs $330, the results show pretty clearly that everyone but Skyrim fans can save a lot of money by building a new gaming rig with an FX CPU.
 
[citation][nom]Outlander_04[/nom]Sorry, I forgot to add that since the FX costs $200 and the intel costs $330 the result show pretty clearly that everyone but skyrim fans can save themselves a lot of money by building a new gamer with an FX cpu .[/citation]
You did see that they compared price-to-performance on the complete systems? While the AMD system as a whole was cheaper, the Intel system still had better value.
 

loops

Distinguished
All this did for me was make me value my i5 more. I would note that the FX can have some added value if taking the i7/i5 off the table lets you push the GPU budget higher and get into a better GPU.
 

avjguy2362

Honorable
It would really help consumers if Tom's did some blind and double-blind evaluations. Simply set up identical monitors, keyboards, and all input devices on a table with a large, thick black curtain behind it. Two computers are hooked up to setups A and B. In blind tests, the evaluator knows the test is between two CPUs, GPUs, or whatever; in double-blind tests, the evaluator doesn't know what they are testing. Then, after playing various games or using various programs, the evaluator fills out questionnaires and leaves comments regarding each setup.

This type of testing would ultimately show the consumer when benchmarks matter. Benchmarks would still be run and compared against the blind-testing results to best show real-world differences. Frequently, small differences that appear to be "wins" would be shown to be irrelevant, and other times small benchmark differences would reveal, through the blind evaluations, exactly where the line is at which people actually see or experience the differences as relevant.
 

Crashman

Polypheme
Former Staff
[citation][nom]mansfield[/nom]shouldn't they compare it to the 3570k? closer to the price range.[/citation]For the third or fourth time, the comparison should have included a theoretical $330 AMD super-gaming CPU. But that super-gaming CPU doesn't exist. A few people keep hammering on the "take the test backwards instead of forwards" drum, but that's still backwards from what a gaming enthusiast with 2x $400 graphics cards would do.

And bystander's 4-core/8-thread comment also applies: the extra integer cores handling those extra four threads give AMD a theoretical advantage over Intel's virtual cores (Hyper-Threading).
 
[citation][nom]m32[/nom]Intel fanboys keep talking about Skyrim. How many dang threads does it uses? I'm just wondering...[/citation]
Skyrim will use two to three threads and sees improvements up to that point. It is not the only game that showed advantages for Intel; look at F1 2012 as well, where the gap is even more pronounced. StarCraft II also shows large gains. Basically, any game that is CPU-bound does.

For many, the differences aren't a big deal, but I use a 120 Hz monitor, which makes gains beyond 60 FPS quite useful.

Anyway, the term "Intel fanboy" seems to have lost meaning, as I'm betting 99% of those around here would fall under that category by your definition. Just because people think Intel is better doesn't mean they are fanboys. For gaming purposes, you'd be hard pressed to find anything that gives an advantage to AMD.
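To put numbers on the 120 Hz point, the frame-time budget shrinks fast (plain arithmetic, nothing measured):
[code]
# CPU frame-time budget at different refresh rates.
for hz in (60, 120):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 60 Hz gives the CPU ~16.7 ms to prepare each frame; 120 Hz gives it ~8.3 ms.
# A CPU that needs ~12 ms per frame keeps a 60 Hz monitor fed but can't sustain 120 Hz,
# which is why CPU-bound gains past 60 FPS still matter to some of us.
[/code]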
 
[citation][nom]BigMack70[/nom]That would be true if it weren't common knowledge that the 3570k performs the exact same as the 3770k in games....[/citation]

The 3770K usually games a little better than the 3570K because of its higher clock speed.

In games that can use more than four threads, it sometimes shows advantages there too.

If this test was about showing the best Intel processor vs. the best AMD processor, why did it not include a hex-core Socket 2011 chip? So, having disproved the assertion that the best Intel went up against the best AMD regardless of price, I'm back to wondering what happens when you run this suite of tests with a $200 i5.
I would imagine that in all the close scenarios the FX would be ahead by a small margin. Intel would still have won Skyrim because of the game engine's limitations, I would guess, but not by as much.
 

Crashman

Polypheme
Former Staff
[citation][nom]Outlander_04[/nom]So , having disproved the assertion...[/citation]Except that you didn't disprove anything; you're just voicing an opinion that's contrary to the truth. Tom's Hardware has tested the 3970X against the 3770K and found that the 3770K is the better gaming processor at high resolutions, which are the baseline for testing these graphics configurations.
 

Gin Fushicho

Distinguished
[citation][nom]Crashman[/nom]I'm calling BS on this one because AMD's "eight cores" are actually four modules, on four front ends, with four FP units. Games have historically been limited by FP units specifically and front ends in general, no? What I'm seeing is that Intel's per-core IPC appears to be a little higher, when two different FOUR "full" CORE processors are compared.[/citation]

That's beside the point. NONE of these games or programs are actually using four cores. They are using two. 3DMark may be using four; I've never used the program for any reason, as I prefer real-world benchmarks.
 
[citation][nom]Gin Fushicho[/nom]That's besides the point. NONE of these games or programs actually are using four cores. They are using two. 3D mark may be using 4, I've never used the program for any reason as I like real world Benchmarks.[/citation]
3DMark 11 shows improvements with Hyper-Threading enabled, so it clearly will use more than four cores. It does so in the physics tests.

Metro 2033, when advanced DoF is on, also uses more than four cores. You'd be pleased to note that AMD fared about the same as the i7 in the Metro 2033 benchmark when using advanced DoF, but was behind when it wasn't.
 