
FX Vs. Core i7: Exploring CPU Bottlenecks And AMD CrossFire

Page 2
Status
Not open for further replies.
[citation][nom]Crashman[/nom]I'm calling BS on this one because AMD's "eight cores" are actually four modules, on four front ends, with four FP units. Games have historically been limited by FP units specifically and front ends in general, no? What I'm seeing is that Intel's per-core IPC appears to be a little higher, when two different FOUR "full" CORE processors are compared.[/citation]
Depending on the AI pathfinding algorithm, the code can use only integer math. With a CPU that has so many integer cores, the pathfinding can be more complex, allowing for some new kinds of AI behaviour.
Besides, I still wonder why no one has ever thought of developing an engine based on integers rather than floating point, abstracting the difference away from developers and users.
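To make that point concrete, here is a minimal sketch of integer-only pathfinding: A* on a 4-connected grid with a Manhattan-distance heuristic. This is illustrative Python, not code from any engine; the grid, start, and goal are made-up example data. Every quantity involved (coordinates, step costs, heuristic) is an integer, so no floating-point unit is ever touched.

```python
from heapq import heappush, heappop

def astar(grid, start, goal):
    """A* on a 4-connected grid using only integer math.
    grid: list of equal-length strings, '#' is a wall.
    start/goal: (row, col). Returns path length, or -1 if unreachable."""
    def h(p):  # Manhattan distance heuristic: pure integer arithmetic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start)]   # (f = g + h, g, position)
    best = {start: 0}                   # cheapest known cost to each cell
    while open_set:
        f, g, (r, c) = heappop(open_set)
        if (r, c) == goal:
            return g
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) \
                    and grid[nr][nc] != '#':
                ng = g + 1
                if ng < best.get((nr, nc), 1 << 30):
                    best[(nr, nc)] = ng
                    heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc)))
    return -1

grid = ["....",
        ".##.",
        "...."]
print(astar(grid, (0, 0), (2, 3)))  # 5
```

More integer cores would let you run more of these searches (or deeper ones) in parallel, which is the behaviour the parent comment is getting at.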
 
It does not matter that these cards can drive 200 fps at 2560x1600? I think someone forgot that 4K will be in the average consumer's home in four years (2017; some say 2015), and 8K is not far behind that, maybe two more years.

Yeah, these setups are overkill now, but they are not even capable of driving the displays we will be using shortly.
 
[citation][nom]BigMack70[/nom]lol, CES has people stupidly optimistic about 4k. No way is that going to be in the "average consumer's" home in 4 years. Not a snowball's chance in hell of that happening. In 10 years? Maybe. In 4?[/citation]
Agreed. How many people actually have a 1440p setup? There are a lot of people still playing at 1600x900 and 1680x1050, let alone 1080p. We still have a while before even the average enthusiast has a 4K monitor.
 
Nice article, though I'll echo the others about throwing an i5 in there too, since you ended up doing the value analysis anyway.

And of course the 4C/4T vs. 4C/8T vs. 4-module/8T comparison, which I've been dying to see, especially for newer games.
 
The vast majority of people do not build ultra-high-end PCs with $800 worth of CrossFire/SLI graphics cards. The ones that do probably will not care about a measly $180, because they obviously want to build the highest-end PC possible.

AMD's real value is for everyone else: users who would rather take the $180 saved by choosing an AMD processor and put that money toward a better graphics card. Most people build their PCs with a specific budget in mind, so they look at it this way:

1. I can have an Intel processor with a Radeon 7770 for X dollars, or I can go with an AMD processor and a Radeon 7850 for the same amount. Which one gives me the best bang for the buck in the programs I want to run?

Additionally, I will be very curious how this article's conclusions may change once the next generation of gaming consoles is introduced. From all early indications, both the PS4 and Xbox 720 will use 8-core CPUs at around 1.6 GHz. To push games to the limits, developers will be multi-threading their games across all 8 cores. My guess is that the GPU will be the real bottleneck then, no matter which processor is used. That might just change AMD's perceived value.
 
[citation][nom]BigMack70[/nom]lol, CES has people stupidly optimistic about 4k. No way is that going to be in the "average consumer's" home in 4 years. Not a snowball's chance in hell of that happening. In 10 years? Maybe. In 4?[/citation]

4K will be here sooner than you think. It will not be for the average consumer, for sure. Just look at 30-inch IPS monitors: they have been around for a decade, and maybe one in a million PC users has one. 4K will be the new enthusiast's monitor. If it sells for $2k or less, I'd buy it tomorrow, lol.

At 4K resolutions I believe there is no need for AA or MSAA, and therefore an 8970/GTX 880 or 9970/GTX 980 will be able to handle it.
 
For 99% of gamers, this article doesn't really mean anything.
Any AMD FX, Intel i5/i7, or even a high-end Phenom II or Core 2 Quad is fast enough for most gaming needs.
The graphics card is a lot more important, and very few people will be using a 7970.

 
[citation][nom]Crashman[/nom]AMD sets the standard for the elite gaming market BY MAKING THE GRAPHICS CARDS in this test, and you want someone to gimp the CPU? What gamer would do that? At $800 for graphics, these cards deserve all the performance you can throw at them. If anything else was needed, it would have been a FASTER AMD processor that's worthy of this $800 graphics array.[/citation]
Half of the games tested weren't very CPU-bound (much less able to make use of Hyper-Threading), and an i5-3570K would have destroyed the value comparison.
 
[citation][nom]salgado18[/nom]Depending on the AI pathfinding algorithm, the code can use only integer math. With a CPU that has so many integer cores, the pathfinding can be more complex, allowing for some new ways of AI behaviour. Besides, I still wonder why no one ever thought of developing an engine based on integers, not on floating point, and abstracting it to developers and users.[/citation]
That would never work. FP datatypes have around 10^300 times the range, with constant relative precision, while an integer's significant digits end at the radix point. Most trig functions have a domain and/or range that is less than one. All of the workarounds required would make the program much slower (not to mention far harder to understand) than using FP logic.
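A quick sketch of the range/precision trade-off being described, in Python (note the 64-bit figures refer to the hardware types a game engine would use; Python's own `int` is arbitrary-precision, so the integer limit shown is the type's, not the language's):

```python
import sys

# A 64-bit IEEE-754 double spans magnitudes up to ~1.8e308 ...
print(sys.float_info.max)      # 1.7976931348623157e+308

# ... but carries only ~15-16 significant decimal digits: past 2**53
# it can no longer tell consecutive integers apart.
print(2.0**53 + 1 == 2.0**53)  # True -- 9007199254740993 rounds away

# A 64-bit signed integer is exact everywhere in its range, but that
# range ends near 9.2e18, vanishingly small next to a double's.
print(2**63 - 1)               # 9223372036854775807
```

So each type trades something away: doubles give range at the cost of exactness, integers give exactness at the cost of range (and of fractions entirely, unless you layer fixed-point arithmetic on top).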
 
AMD FX processors are built differently for the sake of keeping their CPUs cool. Putting multiple cores into one module is what's called clustered cores: AMD squeezed two cores into each module instead of putting each core directly on the bus (like the Phenom I and II). So when your Windows Experience Index shows the number of cores, remember what I said. Example: FX-8350: 4 clusters times 2 cores per cluster equals 8 cores total.
 
[citation][nom]BigMack70[/nom]lol, CES has people stupidly optimistic about 4k. No way is that going to be in the "average consumer's" home in 4 years. Not a snowball's chance in hell of that happening. In 10 years? Maybe. In 4?[/citation]

I think saying ten years is optimistic. Look how long standard definition hung around: 60, almost 70 years now, and I know plenty of people still using standard CRT TVs.

Besides, 4K resolution is only good for enormous displays (50 inches and up), and I just don't see 50-inch TVs getting pushed down to the "everyman" price range, that being $300-400.
 
FX-4300 = 2 clusters × 2 native cores per cluster = 4 cores
FX-6300 = 3 × 2 = 6 cores
Forget the assumptions: AMD dropped two monikers, "64-bit" and "Black Edition (BE)", because it figured they were pointless to mention when the whole AMD lineup has them.
 
Great article; I enjoyed the read quite a bit. I especially like the fact that your benches included ultra-widescreen resolutions. Hardware, including monitors, has hit a price point where, if you're a serious PC gamer, you should be rocking three screens of at least 1080p each.

Kudos to the 8350 as well. The "everything else is free" scenario exists for many people who already have an AM3+ system. I just went from an X6 to an 8320 and could not be happier with my $180 upgrade.
 
About what I expected.

We all know Intel has better single-core performance than AMD. About the only notable thing in this whole piece was how well the FX-8350 did; I didn't expect it to go toe to toe with an Intel chip in gaming.

It seems the high-end CPUs from both companies aren't the gaming bottleneck (unless you're playing across three monitors); the GPUs are. If anything, this article made clear that as long as you're playing on one monitor at 1080p, either CPU is more than enough when coupled with a high-end GPU.
 
[citation][nom]razzb3d[/nom]No need. My sister's FX 8350 kicks my 3570k's ass at 4.2 ghz consistently in most benchmarks. We both run GTX 480's[/citation]
Assuming the same graphics setup:
Benchmarks like 3DMark 11 are a far different story; they favor more cores, but games rarely take advantage of those cores.
Also, many other parts of your system, including install issues, can make the two systems vary.
 
So basically the AMD gets 60 fps or more in games, and I need 140 fps because I can afford the expensive monitor setup and all. Bottom line: AMD = low cost, basic performance; Intel = high cost, great performance.

I see your graphs and all, but at the end of the day I spend $140 less buying the AMD processor for decent gaming performance. I get what you're trying to say, but it's not always dollars-to-performance; it's what I can afford to game at the level I need. Some of us have house payments and car payments and a family to raise. It's not all about the shiny jewel in my PC that lets me put up a wall full of monitors and play BF3.

I get it, but people need to get that you can't compare apples to prime rib. An apple will satisfy your appetite, but a prime rib will do it with a nicer flavor, in my book. But thanks for the input and all the hard work and benchmarks you all provide.
 
Overall it was a good article, and I think you have shown once again that AMD is still not quite performance-per-dollar competitive with Intel, but it would seem that they are at least making some headway.

Anywho, my only complaint is that this is not how people build systems. Yes, if you want "the best of the best" for a given platform, it is helpful to show that Intel wins (it would be more helpful to know how Nvidia cards react, though I imagine it would be much the same story). But 90+% of buyers out there have a specific, or nearly specific, dollar figure in mind when purchasing a total system, so the question becomes what is best at each price point.

In other words, I think the real question is not which platform is best but, since GPUs have vastly more influence on performance than the CPU, what level of CPU is needed for today's games?
So if I have a set budget of (let's just say) $1000, am I better off getting an i7 or 8350 with a single monster GPU? Or would I actually get better performance from a much slower/cheaper i3 or i5 (or the AMD equivalent) paired with perhaps two upper-midrange cards?
 
Please test a 3570K with the same GPU setup and the same games; at least there would be price parity between the CPUs. I own an 8350 and two 2500K rigs. My money is on the 3570K winning, but at least on a true cost basis the comparison would be nearly equal.
 


That seems like a rather convenient example. Don't forget to take power efficiency into account. To use an equally convenient example, if you save $30 on the initial purchase only to spend an extra $60 on your electricity bill over the next 3 years, you've gained nothing. (Obviously usage habits and power costs vary from person to person, locale to locale.)
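The back-of-the-envelope math for that kind of break-even looks like this. Every input below is a hypothetical placeholder, not a measured figure; plug in your own wattage gap, gaming hours, and local rate.

```python
# All inputs are illustrative assumptions, not measured values.
watts_extra   = 60     # extra draw of the hungrier CPU under load (W)
hours_per_day = 3      # gaming hours per day
rate_per_kwh  = 0.12   # electricity price ($/kWh)
years         = 3

# Energy delta over the period, converted from watt-hours to kWh.
kwh = watts_extra * hours_per_day * 365 * years / 1000
extra_cost = kwh * rate_per_kwh
print(f"${extra_cost:.2f} extra over {years} years")  # $23.65 extra over 3 years
```

With these particular assumptions the power penalty is real but modest; at higher usage or electricity rates it can indeed swallow the purchase-price savings.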

In this case, the Intel solution costs significantly more than the AMD solution (about $100 more, off the top of my head), so you have a point. But as others have pointed out, the Intel Core i5 probably provides a more apt comparison. It's easy to say that AMD is for cost-conscious consumers and Intel is for high-end consumers, but the truth is that there isn't a hard-and-fast rule. Intel has a whole range of powerful and power-efficient CPUs ranging in price from about $120 to $1,000. (For the sake of argument, I'm ignoring everything below Core i3.)

AMD wins in the extreme-budget segment of the market, I think, on the strength of its superior integrated graphics. In the low-to-mid range, Intel and AMD trade blows; whether one or the other represents the better buy will depend on the consumer's particular needs.


I'd like to think that all of that is true, but there are limits to how well certain tasks can be multi-threaded. Sometimes step 1 must finish before step 2 can start, and so on. We'll see.
 
Could Tom's Hardware (or anyone who knows of other sites that have done this) please recommend benchmarks comparing the Phenom II series with the Bulldozer and Piledriver FX series, ideally with plenty of gaming benchmarks?
 