FX 8350 or i7 4770k


ANIR0X2K00L

Honorable
Mar 10, 2013
69
0
10,640
I need help with this topic. I am getting a new PC and I can wait until the end of this year. The problem is that I will have to keep this PC for the next 5 to 6 years. I can spend extra to get a 4770K when it releases, but from what I have heard, next-gen games will run better on AMD hardware and will require more cores. I play graphically intensive games like Battlefield 3, Crysis 2, and Crysis 3, and I will be getting Battlefield 4, so this should give you an idea of the type of games I play; all of them need a good CPU, especially Crysis 3 and possibly Battlefield 4. Please just tell me whether I should go with the FX-8350 or the i7-4770K (when it releases in June), or whether I should wait for AMD's Steamroller or Intel's Broadwell. And please also suggest a good motherboard, like an Asus Maximus or Crosshair Formula, or something better. One problem I have noticed is that the AMD chipsets (i.e. the 990FX) don't have all the latest features.

Any help is greatly appreciated. Please understand that I don't want to make a mistake, because if I do I will regret it for the next 5 to 6 years.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


If you use cheating benchmarks or software optimized for Intel, run the FX-8350 with underclocked memory, make invalid power measurements, and do lots more of that stuff, then yes, Intel wins. If you run fair benchmarks (e.g. not built with ICC) and modern software that uses all the cores, run the memory at stock speed, and measure power correctly, then the FX-8350 is nearly as good as the expensive i7-3770K, beating it in some benchmarks.

 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


In that silly review they did everything possible, and more, to favour the Intel chips over the AMD ones: everything they could to decrease the performance of the FX-8350 and to inflate its power consumption. This is a summary, from memory:

- They selected one of the most power-hungry motherboards for the FX and one of the most power-saving boards for Ivy Bridge.
- They installed the buggy FX hotfixes, which decrease the performance of FX chips while increasing their power consumption.
- They selected a performance memory kit for Ivy Bridge but only an "Entertainment" kit for the FX. Moreover, the FX chips ran their memory underclocked, whereas the Intel chips ran memory at stock speed.
- They selected many applications favouring Intel or optimized for Intel chips. Skyrim and Batman are two games on the "games optimized for Intel" list; Civ V is one of the half-dozen games where the FX performs badly; Cinebench is a biased benchmark that uses the cripple_AMD function...
- Using water cooling they could only reach a 4.5 GHz overclock, when almost anyone can get past that on air.
- And so on.

 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


The hierarchy is not based on best value but on performance. Each tier groups chips of about the same average gaming performance. The i7-3770K and the FX-8350 are separated by one tier, and the future 4770K will be in the same tier as the 3770K. Tom's also says that you need to move up three or more tiers to notice a worthwhile difference in performance. If you buy an FX-8350 and upgrade to an expensive 3770K (which is only one tier above), you will not notice it.

Moreover, many people seem to miss that almost all the games used to build that hierarchy are 2-4 threaded and thus use only 25-50% of an 8-core chip. Future games will be 8-threaded thanks to the next consoles being 8-core; then most games will run better on an FX than on an i5/i7. Crysis 3 is the first example where this happens.
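The 25-50% figure above is simple arithmetic. A minimal sketch (the helper name is made up for illustration; it assumes one busy thread per physical core and ignores OS background load):

```python
def max_chip_utilization(game_threads: int, physical_cores: int) -> float:
    """Upper bound on whole-chip utilization when a game runs a fixed
    number of busy threads (one thread per core, no background load)."""
    return min(game_threads, physical_cores) / physical_cores

# A 2-4 threaded game on an 8-core FX-8350 can use at most 25-50% of the chip:
low = max_chip_utilization(2, 8)    # 0.25
high = max_chip_utilization(4, 8)   # 0.5
# An 8-threaded engine could, in principle, load all eight cores:
full = max_chip_utilization(8, 8)   # 1.0
```

This ignores shared front ends and memory bandwidth, so it is an upper bound, not a prediction of real scaling.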

I also love how people keep citing biased benchmarks such as Cinebench, which gives skewed scores that favour Intel chips.
 

tadej petric

Honorable
Feb 9, 2013
826
0
11,010


The thing is that games will be optimised for 4 cores, but AMD cores are weaker, so the work that fits on 4 Intel cores has to be spread across 8 AMD cores. Simple. Four strong men can carry a heavy table, but if the men are weak you need 8 of them to carry it.
See?
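The table-carrying analogy is just a throughput equation: aggregate work rate equals core count times per-core speed. A sketch with illustrative numbers (the 0.5 per-core ratio is invented for the example, not a measured figure):

```python
def total_throughput(cores: int, per_core_speed: float) -> float:
    """Aggregate work rate for a perfectly parallel load
    (ignores synchronization and shared-resource contention)."""
    return cores * per_core_speed

# Four "strong" cores vs. eight cores at half the per-core speed:
strong = total_throughput(4, 1.0)   # 4.0
weak = total_throughput(8, 0.5)     # 4.0 -> the same table gets carried
```

The catch, as the rest of the thread argues, is that the work only spreads across all eight cores if the game is actually written to use that many threads.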
 

8350rocks

Distinguished


Anandtech has an Intel bias... they even got called out on a few of their reviews for "discrepancies" that showed favour to Intel.

http://www.semiaccurate.com/forums/showthread.php?t=3151

Have fun with that! (I am so kind...I know)
 

8350rocks

Distinguished


Wrong... 8 AMD cores are equal to about 6.4 Intel cores...

Whereas 4 Intel cores with HTT are equal to about 4.8 AMD cores.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Nope. The reason current games use so few cores is that the Xbox 360 has only three and the PS3 has one (plus six coprocessors), and most PCs have dual- and quad-core chips.

The next-gen consoles are 8-core designs (the developers' choice), and future games will be 6-8 threaded. Modern games such as Crysis 3 are starting to show how an eight-core FX-8350 outperforms a quad-core i5 {*}.

[image: Crysis 3 CPU benchmark chart]


The first demo of Killzone already uses six cores on the PS4, and most next-gen games will be like that. All the triple-A game developers participating in the Eurogamer poll recommend the FX-8350 over the i5-3570K as the best CPU for future gaming.

{*} Before you fantasize about a future Haswell i7-4770K matching the performance of the FX-8350, let me point out that Crysis 3 is still far from exploiting the full performance of the 8-core chip. Only two of its cores are loaded at about 90% and four of them at only about 60%, whereas the quad-core chips have all their cores loaded between 94% and 98%.
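Taking the quoted per-core loads at face value, the headroom argument reduces to summing utilization fractions. A sketch (the load figures are the post's claims, and the two idle FX cores are an assumption; nothing here is measured):

```python
def used_core_equivalents(loads):
    """Sum of per-core utilization fractions: how many 'full cores'
    worth of work the chip is actually doing."""
    return sum(loads)

# FX-8350 under Crysis 3, per the post: two cores ~90%, four ~60%,
# and (assumed) two idle.
fx_loads = [0.9, 0.9, 0.6, 0.6, 0.6, 0.6, 0.0, 0.0]
fx_used = used_core_equivalents(fx_loads)     # ~4.2 of 8 cores
fx_utilization = fx_used / len(fx_loads)      # ~52% of the chip

# Quad-core i5, per the post: all four cores at 94-98% (take 96%).
i5_loads = [0.96] * 4
i5_used = used_core_equivalents(i5_loads)     # ~3.84 of 4 cores
i5_utilization = i5_used / len(i5_loads)      # ~96%: nearly tapped out
```

On these numbers the FX still has roughly half its chip idle while the i5 is saturated, which is the headroom the post is pointing at.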

 

FAMDUCK

Honorable
May 18, 2013
61
0
10,660


Yes, future games will be utilizing more threads. Or did you mean physical cores? You know Intel i7s have 8 threads, right? And the Extreme chips have 12 threads. Either way, just as future games will use more cores, they will also be able to take advantage of Hyper-Threading. By the time these games you are foreseeing come out, an Intel chip will be out with 8 or maybe more cores plus HT, and AMD will probably have a 16-core CPU that can barely keep up with it. By then, nobody will give a crap that an 8350 gets 2 frames per second more than a 3570K in a game that comes out in 2023. This is like someone in 2003 saying a Pentium 4 will perform slightly better than an Athlon 64 in Far Cry 3, even though neither CPU can run the game at even the lowest settings.

These benchmarks aren't fake and don't favor any brand. The reason sites can't test future games is that time travel hasn't been invented; they test current games and current apps. And there isn't a single app where AMD can beat any 2nd- or 3rd-gen i5 or i7. Nobody cares that an 8350 has as many cores as a PS4 (or PS3), because all the games that will come out for both next-gen consoles will run on any 4-core mid-to-top-end CPU and any high-end GPU from 2010, just like any game that came out on the 360 and PS3 runs on any dual-core CPU and any high-end GPU from 2007 and up. Hell, any title from the 8-core PS3 can run on an i5 with the HD 4000 at the same texture quality.

What really matters for future games is GPU parallel-processing cores. This is why even mid-range GPUs now have 500-600 cores, or about the same number of GPU cores rumored to be in the next-gen consoles. When these future games come out, the 4770K and 8350 will be used as file servers or spare HTPCs, which is what I'm currently using my Pentium 4 519K for, the one I paid $1,500 for in 2004.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Future games will be utilizing more threads, and those threads will use physical cores where they exist. Yes, i7s have 8 threads, but they run on a mixture of real cores (four) and virtual cores (HT), and as you can see above, the FX is faster with its 8 real cores.

Yes, the Extreme chips have 12 threads, but again they run on a mixture of real cores (six) and virtual ones. It is funny that you need an ultra-expensive Extreme Intel chip, which is not aimed at the desktop (and uses a special socket), to beat a cheap FX-8350.

Intel's top-of-the-line Haswell i7-4770K will be a four-core design and only offers about a 1-5% gain over the 3770K.

You are right that "nobody will give a crap that an 8350 gets 2 frames per second more than a 3570k in a game that comes out in 2023." Especially when today the FX already gives 8 frames more than the 3570K (16% more FPS than the i5) in a game that loads four of the FX's cores at only about 60%. With future games able to load all eight cores above 90%, the FX will destroy the i5/i7. That is why game developers have selected the FX as the best gaming CPU (see below).
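For what it's worth, the 16% figure is consistent with the 8-frame gap only if the i5's baseline is around 50 FPS; that baseline is inferred from the post's own numbers, not from a benchmark. A quick check:

```python
def relative_gain(new_fps: float, base_fps: float) -> float:
    """Fractional FPS advantage of one result over a baseline."""
    return (new_fps - base_fps) / base_fps

# 8 extra frames over a 50 FPS baseline is a 16% gain...
at_50 = relative_gain(58, 50)     # 0.16
# ...but the same 8 frames over a 200 FPS baseline is only 4%:
at_200 = relative_gain(208, 200)  # 0.04
```

The same absolute gap reads very differently depending on the baseline, which is why both sides of this thread keep talking past each other.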

You keep denying reality with your false claim that "there isn't a single app where AMD can beat any 2nd or 3rd gen i5 or i7". Not worth replying to.

You also did not read that the PS3 is not an eight-core design like the PS4. The PS3 is a one-core design assisted by six coprocessors. The PS4 has eight cores which can be assisted by up to 18 coprocessors (shared with the GPU via the HSA design).

All the triple-A game developers participating in a recent Eurogamer poll selected the FX-8350 over the i5-3570K as the best CPU for future gaming. Of course dual and quad cores will run next-gen games, much as you can run Crysis 3 on a Core 2 Duo at 15 FPS. run != play
 

tadej petric

Honorable
Feb 9, 2013
826
0
11,010


So what you're saying with that bench is that the 8350's minimum frames are about the same as the 3570K's average. And since all of you say that Anandtech is "biased", I'll show you Tom's.
[image: Tom's Hardware CPU scaling chart]

AMD wins by, OMG, a whole 0.3 frames. Point three frames. And AMD's minimum frames are about 10 FPS lower...
And some more, not from Tom's:
[images: four benchmark charts from the review quoted below]

And a quote from the same site:
Conclusions
Rory Read, CEO of AMD, has in no way ever given me any true hope for Piledriver cores being competitive with Intel on the desktop. Back in February of this year, he explained that the company was not going to be primed to compete with Intel on the high end desktop.

Rory sees AMD as being a huge player in entry and low end systems while taking a big part of the low power market as well. It seemed as though continuing trying to compete with Intel on the high end was not the end game. He referred to competition with Intel as an "unhealthy duopoly."

Since then he has gone on to explain it even further, but in no way going into this review did I have any true expectations of AMD even coming to within striking distance of Intel when it comes to computational dominance on the desktop. Piledriver and Vishera have cemented my thinking on this. To be fair to AMD, Piledriver was mostly baked when the new regime came on board at AMD and could do little to change the architecture, but I have no hope that the next core architecture will be nipping at Intel's heels either.

Is Vishera a better part than Zambezi? Yes it is. And in some areas of performance, quite a lot. AMD has done some great things with Piledriver when compared to its previous Bulldozer architecture. AMD can surely claim a victory in that arena and kudos to its engineers for doing so.

Is Vishera a better part than Intel's Ivy Bridge or previous Sandy Bridge processors? No it is not, not even close. Intel still has a healthy lead in both Performance per Watt, and Instructions per Clock. AMD cannot best or match Intel on the desktop and keep in mind that AMD brings 4 more processing cores to the table than the Intel processors compared here today.

When you look at the pricing structure laid out for Vishera parts on page 1 of this review, you will see that AMD knew its place well. AMD has not even begun to suggest it is in league with the higher clocked Ivy Bridge and Sandy Bridge parts. The FX-8350 reviewed here today is slotted to be sold at right around the $200 mark. When you look around at a comparative price point, you can purchase a 4-core Intel Ivy Bridge i5-3570K for $230. This is exactly the Intel processor that AMD is matching its FX-8350 up with. If you were to stick with the 3.4GHz/3.8GHz stock clocks of the i5-3570, the FX-8350 can be a compelling argument. But once those two processors are in the hands of an overclocker, our primary readership, the Intel processor simply cannot be beat performance-wise by the new AMD Vishera in PPW or IPC.

Is Tom's biased? Is HardOCP biased? It must have taken you hours to find a bench like yours.

And I would like to see that poll of yours.
 

FAMDUCK

Honorable
May 18, 2013
61
0
10,660


WHOA! 8 FPS in one game? That is dominating! NOBODY CARES. Eurogamer selected the 8350 as the best CPU for FUTURE gaming, but when those games actually come out, there will be an Intel CPU on the market that makes the 8350 and current i5s seem obsolete. Right now, for NOW gaming, and for the next few years, the CPU you want is the i5-3570K. Well, not you; you are a fanboy. 99.999999% of all gaming websites recommend the 3570K as the best gaming CPU. When will these "future games" that you somehow know will work better on an 8350 come out? In 5 years? 10? Do you really think people, especially PC gamers, keep their PCs that long? And if so, how much better will the 8350 be, if it isn't much better now except in one or two games, and by only, what, 8 FPS? Like I said, the 8350 isn't bad, but it isn't the beast you are making it out to be.

"The PS4 has eight-cores which can be assisted by up to 18 coprocessors"

So the 8350, having 8 cores, is supposed to be better for console ports even though it is nothing like the PS4 CPU you just described? By proving a point about the PS3, you contradict all your "the 8350's 8 cores are better" arguments about games that don't exist yet. Seriously, nobody cares. If people want all those features, they will simply purchase a PS4, which costs about the same as a GTX 680 or HD 7970. For games designed for PCs, and for other non-game apps, any quad-core or higher Intel CPU made after 2009 is better.
 

8350rocks

Distinguished




WHOA! Since when is anything over 60 FPS not playable? As a matter of fact, there aren't many setups that would even let you see 200 FPS: 120 Hz monitors cut off past 120, and 60 Hz monitors cut off at 60.

So in your examples...both systems would play equally to the user.

I am not saying AMD is superior at everything, nor am I saying that AMD is superior in all games. I am saying that it is competitive with the i5s and i7s, and it is. Additionally, I am saying you shouldn't discount them.

Thanks for proving my point: 90% of the time it's a wash between them. (The difference falls within the margin of error, or occurs past the point where your monitor can keep up anyway.)
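The cut-off described above is effectively a min(): without adaptive sync, a fixed-refresh monitor can show at most as many distinct frames per second as its refresh rate. A sketch of that argument:

```python
def displayed_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames per second the user can actually see on a fixed-refresh
    display (no adaptive sync; excess rendered frames are dropped)."""
    return min(rendered_fps, refresh_hz)

# 90 FPS vs. 70 FPS on a 60 Hz monitor: the user sees 60 either way.
a = displayed_fps(90, 60)   # 60
b = displayed_fps(70, 60)   # 60
# On a 120 Hz panel the gap would actually be visible:
c = displayed_fps(90, 120)  # 90
```

This glosses over tearing and frame pacing, but it captures why a benchmark lead above the refresh rate is invisible to the user.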

 

logainofhades

Titan
Moderator
People also often forget that the human eye cannot detect much more than around 60 FPS anyway; those higher numbers you never see. Put an Intel and an AMD rig side by side: if the Intel rig got 90 FPS and the AMD rig only managed about 70, you would not be able to tell the difference visually. Don't get me wrong, I like Intel chips and own a fair number of them, 4 if you count my laptop. But if I were to buy a new gaming rig today, I would go with an 8350. It would cost about half as much to get the features that mattered to me with an 8350 and motherboard as it did for the i5 and board I have now, and that savings would buy a much better GPU. Haswell isn't all that impressive; so much so that I know of i7 920 owners who plan on skipping it and waiting for the next generation.

The PS4 will have an 8-core AMD architecture, and games for it will be optimized for 8 AMD cores. Saying this won't translate to better performance for an 8350 in console ports is foolish. There are programs out there that heavily favor Intel because they were optimized as such; if an Intel chip had found its way into the PS4, the same things would be said for Intel chips that people are saying for the 8350 now. It is the fanboys who are in denial.
 

tadej petric

Honorable
Feb 9, 2013
826
0
11,010


I agree that the 8350 is competitive with the i5, but every race has to have a winner, and in this race the i5 is the winner.

As you may know, you have to turn down your settings for CPU benchmarks (that's why the frame rates are so high). The i5 would still win with everything maxed out, but the difference wouldn't be as big.

Direct quote from me:

AMD is a great CPU, but it loses to the Intel i5. It will be like that for at least 10 years, and after that they will be equal, since neither will run the games. At all.
 

FAMDUCK

Honorable
May 18, 2013
61
0
10,660
...any quad core or higher Intel CPU made after 2009 is better [than an 8350].

If you were to buy an 8350 system, it would actually cost the same as, or slightly more than, an i5-3570K system. A 3570K barebones PC with just a motherboard, CPU, RAM, HDD, case, PSU, and OS averages $500. An FX-8350 barebones with the same exact parts also averages $500, or about $100 less than a barebones 3770K system, since both CPUs are about $100 less than a 3770K. So I guess the only real reason to buy an 8350 is the prediction that console ports will somehow magically be better on it one day because it has the same brand of CPU as the consoles.

Sure, it might be better at medium settings, which is usually what consoles are capable of, but if you want to max out your settings, a high-end Intel + Nvidia PC will always be the better option. When a game is available on both consoles and PCs, the console version is usually preset to the equivalent of medium settings on a PC; since all PCs are different, PC gamers get to change all the settings and resolutions. If you want to play at 1080p and medium settings, a mid-range combo like the 8350 and a 7780 is perfect, since the new consoles will have about the same hardware specs as that combo. If you want to max out every game at the highest resolution, an 8350 is good, but not as good as a 3570K or higher Intel CPU plus a GTX 670 or higher GPU. All of those benchmarks where the 8350 is close or better are at 720p or even 480p, at settings nobody plays in, not even consoles.
 

logainofhades

Titan
Moderator
You forget that AMD boards tend to cost less for the same features as an Intel board. For a quick comparison, an ASRock 970 Extreme4 is similar in feature set to my Z77 Extreme4 and is about $35 less. Also, Nvidia isn't always the better option unless you like buying $1k GPUs; the best single card under $1k is still the 7970. I admit Nvidia's recent mid-range cards have done well, but AMD has the low end and most of the high end covered.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Those benchmarks, and others confirming them, are easy to find. Moreover, this is the article explaining why all the triple-A game developers polled selected the FX-8350 as the best CPU for future gaming:

http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen




8 FPS is nothing on top of 200, but it is a lot on top of 10. What matters is the relative value, and I gave it before: 16% more performance. That is much, much more than the gain Haswell will bring over Ivy Bridge.

You also seem to ignore that Crysis 3 is not using all the performance of the FX chip. As stated before, four of its cores are loaded at only about 60%, whereas the four cores of the i5 must be loaded above 95%. Develop a more advanced game engine that loads all 8 cores of the FX at 90%, and then tell me how the FX destroys the i5.

It is fine that you claim that 99.999999% of all gaming websites recommend the 3570K as the best gaming CPU. What made me laugh in the first place was your use of so many figures after the decimal point. The second thing is that you seem unaware that those websites only test existing games; they know nothing about the games being developed now. Precisely the developers who are making those next-gen games recommend the FX-8350 over the i5-3570K for them.

When will these "future games" that you somehow know will work better on an 8350 come out? 5 years? 10 years?

Imagine that. One is already here in front of your face.

Yes, the PS4 having eight cores is one of the reasons the FX-8350 is more future-proof than an i5-3570K or i7-3770K. As stated before, the i5/i7 perform better in most current games because those games were optimized for few cores, a consequence of the PS3 and Xbox 360 being designs with few cores.

This is going to change with the PS4 and the next Xbox: both are based on AMD 8-core chips. In fact, the dev kit of the PS4 uses an 8-core FX chip.
 

FAMDUCK

Honorable
May 18, 2013
61
0
10,660
The only article ever about gaming is that not-biased-at-all Eurogamer article. Eurogamer, the authority on tech... oh wait. That is where I got the 99.99999999999999999999999999999% figure that made you laugh, or whatever. The 0.0000000000000000001% of the planet that thinks the 8350 is superior is made up entirely of that single article that you love to quote and repeatedly post a link to. The rest of the planet somehow doesn't exist to you, and they think the 3570K is a good CPU for PC gamers. Find another article, just one, that is not on the eurogamer.net domain which says the 8350 is superior. We will all laugh when you find one.

The whole "the new PlayStation is going to have 8 cores and an AMD chip, so my 8-core AMD chip is now better than the Intel chip" theory is about as insane as saying that the Xbox 360 having an Apple PowerPC or an IBM Xenon CPU means that Macs and IBM Power PCs were superior for gaming, simply because they share the same CPU and have more cores than the Core 2 Duo or dual-core Athlon 64. Same argument. Same logic. Both insane.

Please explain your theory to this thread:

http://www.tomshardware.com/forum/366034-28-what-core-gaming

It seems that none of them has read that single solitary article about the "8-core console = the 8350 is better now" theory. There are 50 replies, and most of them are along the lines of "the PS4 having 8 cores means nothing for PC gaming, now or in the future; what matters is your GPU." Please, PLEASE, post your theory in that thread, AMD fanboys.
 

FAMDUCK

Honorable
May 18, 2013
61
0
10,660


The GTX 680 is $500 for the 4 GB version; the HD 7970 is $550 for the 6 GB. The only advantage of the 7970 is memory. In actual FPS tests the GTX 680 is slightly better; not dominating, but slightly, or, according to the comparison below, about 9% better in pixel and texel rate. The 7970 is not the best card under $1,000.

http://www.hwcompare.com/12351/geforce-gtx-680-vs-radeon-hd-7970/
 

tadej petric

Honorable
Feb 9, 2013
826
0
11,010


The 7970 GHz Edition is better than the 680 in most games. And there's no need to buy 6 GB of VRAM except for workstations and large Eyefinity setups...
 

tadej petric

Honorable
Feb 9, 2013
826
0
11,010


Biased.
And there's always that totally reliable (sarcasm) German website.
And "by all AAA game devs"? I only saw two or three, and even that was probably fake...

To all of you mentioning the PS4:
There is no need for 8 cores on the PC because of this. That Jaguar is weaker than, or about as good as, an i3. The thing is that today we have one core for physics, one for AI, and so on, and next year it will be the same on Intel's CPUs. But Jaguar will take 3 cores for physics... See? It is impossible to do real destruction physics on one Jaguar core, but it is more feasible on Intel. That's why games will use more cores on the PS4.
 

8350rocks

Distinguished
Actually... Jaguar cores are not Piledriver cores; they are evolved K10 cores, meaning they each have their own FPU and no shared front ends. This is a different architecture... it would be like sticking two low-end Intel i5 chips together with GDDR5 memory and a massively parallel GPU.

Now, when Steamroller arrives, it will have hUMA and HSA to seriously impact performance on many levels. This will be game-changing.
 

FAMDUCK

Honorable
May 18, 2013
61
0
10,660
hUMA and HSA are pretty cool, even though they have nothing to do with the 4770K or the 8350, which is what the OP's original question was about. They would be game-changing if games actually existed that could use them. Right now, this is about as useful as Hyper-Threading.

"The big difficulty for AMD is that merely having hUMA isn't enough. Developers actually have to write programs that take advantage of it. hUMA will certainly make developing mixed CPU/GPU software easier, but given AMD's low market share, it's not likely that developers will [be] in any great hurry to rewrite their software to take advantage of it."

http://arstechnica.com/information-technology/2013/04/amds-heterogeneous-uniform-memory-access-coming-this-year-in-kaveri/

If you (8350rocks, juanrga, or whatever your SN is tomorrow) started your own thread (instead of hijacking other people's threads), what would the title be? I'm guessing something like: "Who wants to bash every chip maker besides AMD, especially Intel?" or "Which is better, the 3570k or the 8350? Oh wait, I already know which one I think is better, but post anyway so I can troll." or "I have a CPU expert badge and would like to lecture you on what makes the AMD CPU so much better than the crappy Intel CPU. Nothing you say means anything, and if you post something, I will simply refer you to this Eurogamer article that I once read and have tattooed on my chest. [hyperlink to the article]" or "My screen name is 8350rocks and I am totally not biased at all." or "I check this thread multiple times a day, and if anyone posts anything about AMD, I have a Tourette's-like compulsion to reply every single time, even if it is irrelevant."