low budget cpu: i3-4150 vs fx-6300 vs fx-4350


ashrosene

Jun 29, 2014
Hello,
As the title says, I am trying to upgrade my CPU (and mobo, obviously).
I am stuck between the i3-4150, FX-6300 and FX-4350.
I know the i3-4150 has much better cores, albeit fewer of them, but I was wondering whether that advantage is outweighed by the FX chips' higher core count.
This PC's most CPU-intensive games will be Total War titles, the occasional Skyrim and Fallout, and Paradox Interactive games.
Post note:
How important is the mobo chipset? I am thinking of an MSI H81 board for the i3.
 
Solution
Yeah, I am such a fanboy that I own two FX 8320 rigs. I couldn't turn down the $100 Micro Center deal on Black Friday. Lay off the AMD fanboy Kool-Aid. $200 more, please... To get an overclock to even remotely touch a Xeon 1231v3, you would need exceptional cooling and a higher-end motherboard. I could pair the Xeon with any board I want and they would all perform the same. Overclocking costs more, and AMD needs to be overclocked to compete, which makes their value pretty poor.

PCPartPicker part list / Price breakdown by merchant

CPU: AMD FX-8350 4.0GHz 8-Core Processor ($159.99 @ Amazon)
CPU Cooler: Noctua NH-D14 65.0 CFM CPU Cooler ($71.74 @ OutletPC)
Motherboard: ASRock...
Dude, you can get the FX-6300 and be done with it regarding gaming! Or you can wait to get an i5. Getting an i3 is not being done with it. Look at other forums, message boards, websites and show me people who built a "GAMING RIG" in the year 2012 and up with an i3 processor? I do not know why they keep shoving an i3 down your throat.

And you can upgrade to the 8000 series or the 9000 series if you got the FX-6300. Either way, for gaming you will hit 60 FPS plus on ultra settings at 1080p with the FX-6300 when paired with a good graphics card. How many times does a person upgrade CPUs anyway? If he goes AM3+, he has two upgrade options. The 8350 and the 9000 chips will be great for at least the next 5 years easy. Maybe even more due to their multiple PHYSICAL cores. The PS3 and Xbox 360 lasted for 8 years before new systems came out. And people still play their PS3 and Xbox 360.

The new console systems have old AMD Jaguar cores, and they will be around for at least 5 years before new systems come out. To put it in perspective, the PS4 and Xbone have 8-core AMD APUs. That is why you can play Titanfall and give the system commands with the Kinect (like in the TV commercial), download games while playing games, and do all the other multi-tasking the new systems bring. I have never before seen someone recommend the i3 so strongly for anything, let alone for gaming. It is a low-end processor designed for office applications and web browsing.

If you want to go with Intel, go with Intel. Both companies make great products. But just patiently wait so you can afford an i5.

P.S. The new consoles suck in the graphics department if you were considering just getting a console! 😉
 


Many people have, and thankfully they haven't taken your advice. It has been proven here in this thread that the i3 is more suitable.



Not much of an upgrade. You get 2 more cores that aren't utilised most of the time, and that's it. The 9000 series is only an option on more expensive motherboards, so you lose the price factor there. The 8/9 series are outdated as of right now in many games (specifically the ones OP's looking at), so no, 5 years isn't an option. I can tell you from experience that frame drops are frequent on the FX platform, so a consistent 60 FPS isn't necessarily correct either.



Same old argument. There is a very large architectural difference between the Jaguar cores and the FX cores. Next.



No need.



So why bother comparing to them?
 
LOL... I have to save this thread for future reference. It is hilarious!

Anyway, the building process can be more enjoyable than actually using your new system. Enjoy the process and remember to learn from your mistakes along the way! Hopefully, you won't catch the "upgrade bug" and begin to constantly upgrade and accessorize like I did (I told my wife I need a cheaper hobby... maybe I can start building massive Lego worlds like The Lego Movie, lol)! Happy building! :)
 
Look at other forums, message boards, websites and show me people who built a "GAMING RIG" in the year 2012 and up with an i3 processor? I do not know why they keep shoving an i3 down your throat.

For one thing, this is 2014, not 2012. The FXs are old now. You can keep living in 2012 if you want. The current i3s are a lot better than the older i3s were. The Haswell i3s are more capable than ever.


And just because I know where you were going with that statement: back in 2011, I built my lil bro a super-budget gaming PC with an i3-2120 and an HD 7850, with the intent of upgrading to an i5 when he got more money. Well, little did we know that the little i3 would actually outlast the HD 7850 by a good bit. His first upgrade was a GTX 660 Ti because the 7850 was actually the bottleneck. The i3 worked pretty well with the GTX 660 Ti, and it wasn't until a few months ago that he finally upgraded the i3. It lasted a solid 3 years. Now he has a Xeon E3-1230 v2 and it's a beast.

The i3-4150 is around 25-30% stronger than the i3-2120 was.
 


LMAO. It has been formerly "proven" because two people on a forum convinced a beginner builder to buy a dual-core i3 processor for gaming! WOW, ok!

On another note, you obviously do not multi-task much, or do not know how to monitor CPU usage. All the cores are utilized when multi-tasking or playing a demanding game like BF3 or BF4. On lesser titles, the remaining cores are free for you to run other programs so your system does not crash. Have a good time trying to run two games at the same time in windowed mode, or performing other tasks in the background while heavy gaming, with the i3.
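
If you don't believe me, watch the per-core usage yourself while a game and a couple of background apps are running. Here's a minimal sketch; it assumes the third-party psutil package is installed (pip install psutil), which is not part of the standard library:

```python
# Minimal per-core CPU usage monitor. Run it in a terminal while a game
# or other workload is active to see how many logical cores are busy.
import psutil  # third-party: pip install psutil

try:
    while True:
        # percpu=True returns one utilisation percentage per logical core,
        # averaged over the preceding 1-second interval.
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        print("  ".join(f"core{i}: {p:5.1f}%" for i, p in enumerate(per_core)))
except KeyboardInterrupt:
    pass
```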
 


Where are the links to other forums of PC gamers lining up to buy an i3 for gaming? Where are those links? And you have no idea where I am going! SMH. Your presumptuousness about what you think I am thinking explains a lot about your logic!
 


Formerly?

Anyway, have you not seen any of the sources provided? In fact, just go and look at some game benchmarks yourself.

Here are a few that I grabbed just from the top sellers list on steam.

[Attached benchmark charts: Skyrim, Rome (Total War), and CPU-scaling comparisons.]


And what you need to remember is, these aren't even Haswell i3s.

Also, why are you running 2 games at the same time? If you wanted to do that, you'd definitely want a larger budget. I don't think your hypothetical situation is at all what OP is looking to achieve.

By the way, please don't get so personally offended and sarcastic.
 
I have an FX-4350 with an HD 7950 and it is an amazing CPU. BF3 on ultra. WildStar and WoW I can run on ultra, but if I lower view distances I get better FPS. At default view distance I get up to 78 FPS in both.
 


Currently I have an AMD Athlon II X3 (3 cores) unlocked into a Phenom X4 (quad core).

http://cpuboss.com/cpus/Intel-Core-i3-4130-vs-AMD-Athlon-II-X3-450

As the benchmark above shows, the dual-core i3 easily beats the AMD quad-core. A true CPU expert doesn't judge solely by core count or clock speed, but also by the features and architecture of the CPU.
 
All that debating and not a single person bothered to point out the differences in execution resources between a Haswell core and a Piledriver core, nor the differences in instruction pipeline length, nor the differences in cache performance, nor any of the facts about the actual hardware that would have made the case for the i3 far more clear. A bunch of people going back and forth: "no, this one is better because it has a bigger number"... "no, this one is better because I think it's 50% faster in single-core workloads." Sort of pathetic, to be honest.

The very first response to such a thread should have been very clearly:
"Core count and clock speeds are not measurements of execution performance."

Any AMD enthusiast should be VERY familiar with this concept. It wasn't so long ago that AMD was beating Intel at lower clock speeds and/or with fewer cores and less power dissipation. Coming onto a forum to tout core count as a measure of compute performance is just as ridiculous as trying to make the case that NetBurst was faster than AMD offerings at the time because it had more GHz.
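
To put rough numbers behind that, here is a back-of-the-envelope sketch. The 60% parallel fraction and the 1.5x per-core speed ratio below are illustrative assumptions, not measurements of any specific chip:

```latex
% Aggregate throughput only equals cores x per-core speed for perfectly
% parallel work; a game is limited by its serial/critical-path portion.
\[
  T_{\text{aggregate}} \approx N_{\text{cores}} \cdot \mathrm{IPC} \cdot f,
  \qquad
  T_{\text{per-thread}} \approx \mathrm{IPC} \cdot f
\]
% Amdahl's law for a workload with parallel fraction p on N cores:
\[
  S(N) = \frac{1}{(1-p) + p/N}
\]
% Illustrative numbers: a game that is 60% parallelisable (p = 0.6).
% Six slow cores:  S(6) = 1 / (0.4 + 0.6/6) = 2.0x a single slow core.
% Two fast cores (each assumed ~1.5x the per-core speed):
%   S(2) = 1 / (0.4 + 0.6/2) ~= 1.43x, times 1.5 per core ~= 2.1x.
% The chip with fewer, faster cores comes out level or slightly ahead.
```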

----------

So, in case anyone cares how to handle these threads on the next go around:

A Haswell core has more execution resources than an entire Piledriver module (if we look at the number of transistors allocated to non-cache functions, and perhaps more importantly, how many of those resources can be simultaneously scheduled to perform operations). Haswell also has a shorter instruction pipeline and substantially better cache performance. The result: clock for clock, a hyper-threaded Haswell core can be up to ~25% faster than a PD module when compared 2 threads vs 2 threads. If we actually look at the number of effective ALUs and FPUs and simultaneously accessible execution ports/pipes on these two designs, it becomes clear that counting cores is about as stupid as counting cylinders in an engine without ANY regard for displacement, valve-train design, fuel management, ignition systems, atmospheric modifications, etc.

When the workload is only 1 thread per module vs 1 thread per Haswell core, the PD module suffers a major penalty, as the number of execution resources it can schedule that work onto is reduced significantly: it cannot "split" the thread across the 2 cores, so the single thread only has access to as little as HALF the execution resources in some cases (ALUs, primarily). By contrast, when the workload drops to a single thread on the Haswell core, there is no reduction in the available execution resources for that thread. The out-of-order scheduler can attempt to use up to all 8 execution ports/pipes of the Haswell core to work on that thread.

The result of this is that the i3-4150 is equally fast or faster than the FX-6300 in any workload that saturates up to 5 threads. The ONLY condition where the FX-6300 can pull ahead of the i3-4150 is when the workload saturates and scales out to all 6 cores of the FX-6300, and even then the margin is not a make-or-break thing (~20%). On the other hand, in real-world workloads, especially gaming workloads that cannot and do not scale proportionally across more cores, the i3-4150 can indeed produce up to 50% higher minimum FPS in compute-bound conditions. This is because the most critical execution workloads in the game engine have access to more execution resources via the superior intra-core parallelism of the Haswell architecture.
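
If anyone reading this wants to see how far their own workload actually scales before arguing about core counts, here is a rough sketch using only the Python standard library. The busy-loop function is a stand-in workload, not anyone's real game code; where the timing curve flattens out is where extra cores stop helping:

```python
# Rough core-count scaling check: time a fixed amount of CPU-bound work
# split evenly across an increasing number of worker processes.
import time
from concurrent.futures import ProcessPoolExecutor

def busy(n: int) -> int:
    # CPU-bound stand-in workload: sum of squares up to n.
    return sum(i * i for i in range(n))

TOTAL_WORK = 4_000_000  # total iterations, divided among the workers

if __name__ == "__main__":
    for workers in (1, 2, 3, 4, 6, 8):
        chunk = TOTAL_WORK // workers
        start = time.perf_counter()
        with ProcessPoolExecutor(max_workers=workers) as pool:
            list(pool.map(busy, [chunk] * workers))
        elapsed = time.perf_counter() - start
        print(f"{workers} worker(s): {elapsed:.2f} s")
```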


-----------

Many people on the interwebs try to use the PS4 and Xbox as validation of the many-weak-core AM3+ approach for desktop gaming. They are operating on a fundamentally flawed set of assumptions and ideas. The consoles are configured with a PAIR of 4-core Jaguar modules on a proprietary back-plane for communication between modules. Inter-core performance scaling from increased thread count is poor on the Jaguar design to begin with, due to the shared L2 cache space and access, and worse still when communications from module to module are involved. The result of all this is that, unlike an FX-8350 or similar AMD desktop part, which demonstrates useful performance scaling all the way up to 8 threads in many workloads, a pair of Jaguar modules strapped together like that simply won't. In fact, there's no reason to believe that game development on the PS4 requires any hyper-focus on improving parallelism of the workload beyond a few threads, as the overhead involved with spawning and managing additional threads would quickly become counterproductive to performance on this platform, even though it has a high core count.

Consoles are a fixed hardware configuration. The games are compiled and optimized for that specific hardware, and it is a fully functioning HSA/hUMA platform. There is a dramatic performance difference between a binary that has been compiled to run on ANY configuration of desktop hardware made in the last 5-8+ years (common) and a binary that has been compiled with all of the optimization flags FOR a specific piece of hardware. Ports from these consoles will have the same problems that all other console ports have always had: they are forced to shed all of the scheduling optimizations and special instruction usage that maximize performance on the fixed hardware. In this case, they will also be forced to shed all of the advantages of being in a hUMA environment, which will add significant compute overhead to the port in some cases. To make matters worse, we won't see any developers compiling a console port with heavy use of modern instruction capabilities, as they will want to maximize their target audience. There's also no practical way for them to build a piece of custom software for every arrangement of hardware out there (though PC gamers SHOULD be encouraging developers to at least offer a FEW different binaries to choose from based on the platform in question, but that's a can of worms for another thread).
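
As a quick illustration of that instruction-set point, here is a small sketch that lists which vector extensions the local CPU advertises. It is Linux-only because it reads /proc/cpuinfo (an assumption about the reader's platform); the idea is that a "runs on any PC from the last 8 years" binary cannot assume any of these, while a fixed-hardware console build can use them all freely:

```python
# Linux-only sketch: report which SIMD/vector extensions the local CPU
# exposes, i.e. what a generic desktop binary cannot safely assume.
INTERESTING = ("sse4_1", "sse4_2", "avx", "avx2", "fma", "f16c", "aes")

def cpu_flags(path="/proc/cpuinfo"):
    # The "flags" line in /proc/cpuinfo is a space-separated feature list.
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

if __name__ == "__main__":
    flags = cpu_flags()
    for ext in INTERESTING:
        print(f"{ext:8s} {'yes' if ext in flags else 'no'}")
```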

Simple answer here is that the PS4/Xbox1 does not mean that the FX-8350 or any other FX chip with lots of weak cores is a good desktop gaming chip, but I'm sure the marketing department of AMD wouldn't mind if you kept on believing that and sharing that.
In fact, the BEST solution to dealing with the compute overhead "ballooning effect" that comes out of typical console ports that must shed all compiling/scheduling optimizations, is to have a CPU with VERY powerful cores, to overcome the losses via brute force.

---------------

Just to make it clear, I am not an i3 owner, and I am not an Intel owner. I have an FX-6300 on a UD3P as my primary desktop. I like the system very much for my needs and interests. It offers enterprise-class features that a similarly priced i3 system would not, and I was able to obtain this hardware in a Micro Center combo for a very reasonable price. It's a fantastic platform for some purposes, but it is not well suited to real-time workloads like gaming.
 


I so wish I could quote that out of context every time someone uninformed claims an FX-6300 is 3x as fast because of 3x as many cores. But they don't listen.

*Sobs*

I spent the last 2 days arguing with some guy who calls himself Random5. I was saying the i3-4330 was a stronger choice for gaming than the FX-6100, and he insisted the FX-6100 would run every game at higher FPS. I posted tests proving him wrong, so he claimed I was being paid off by Intel to trick people into buying i3s, then claimed he couldn't post counter-evidence because all the gaming sites were paid off by Intel.

*Sobs*

The morons on both sides just never learn that AMD and Intel have two entirely different, non-comparable architectures meant for totally different tasks, no matter how you explain it.
 





lol, the number of cores isn't really relevant here. Pay attention. Do you understand how software actually utilizes cores? Do you understand script optimization? If you do, you sure as hell don't act like it.

What's clear isn't that the ppl recommending dual cores (because, like it or not, they get the job done better and more efficiently in this guy's case) are fanboys. It's that you somehow talked yourself into believing that "more is always better"...

The case you make is like you riding around on a bike with 6 wheels laughing at the ppl on the bikes with 2 wheels... even though the bike with 2 wheels is designed so efficiently that it gets its job done (transportation) actually BETTER than the bike with the 4 extra wheels sitting there doing nothing.

So, you just come off like the goofball butthurt that his 6 cores continue, to this day, to get smoked by the "low end" Intel offerings. Riding around on his bike making himself feel like 4 extra wheels actually make a difference. "Have fun with your 2 core machine in 2014"... says the guy who clearly doesn't get much other than marketing when it comes to computer hardware.

The i3 is sitting in a board with enormous upgrade potential. Suppose the OP gets left 10k in his granddad's will. If he goes with AMD, he can upgrade to....? And if he goes with Intel, he can upgrade to....? Ya. Sorry, AMD's value vs the i3 is just beaten on all sides. I won't say it's "clobbered", because it is a close race. But the i3 more than edges into the lead in every way... Oh, except for cores that won't be used 90% of the time, and which, being several architecture generations behind, do less with 6 cores when they ARE being used than the i3 does with its 2 cores, or 4 threads if hyper-threaded power is needed.

It's simple. For what this guy wants to do... i.e. get the best "value for the money" in his given price range, whether it's fucking 1 AD, 2014 or 3025... with the offerings available, the i3 is the better value. And remember, we're talking VALUE here. Find me any site with gaming benchmarks that show the AMD CPU dominating, or even coming out ahead in a majority of the tests, and I might actually think you know wtf you're talking about.

 
Flogging a dead horse, yeehaw! This might show something, or maybe not, but: Shadow of Mordor can use more than 2 cores. I remember seeing someone say that, but seeing no real info about "how many cores?" and "how efficiently are they used?", so...

[GameGPU benchmark charts: Middle-earth: Shadow of Mordor CPU core usage, Intel results, and AMD results.]


Hmmm. In some cases it seems like an elderly man driving a Porsche 911 in 5th gear, doing 50 km/h because he's too weak to push the pedal to the floor. If you ask me: blame the game, not the CPUs.
 
I was taking my dog out for a walk this morning, reading up on my new i3-4150 processor, when I accidentally found this forum. I am a noob when it comes to understanding how CPUs work, but from my personal experience of using a fourth-generation i3, it can do almost anything. I've been playing games like crazy and using my PC for 12 hours straight (a little gaming, a little web browsing, etc.), and the CPU doesn't break a sweat; the temperature never goes beyond 50 degrees. If you're going for a low (or even mid) budget build, do not make the mistake of ignoring the i3. So glad I got this instead of an FX-6300.
 
i3-4150 owner here. It seems no one has mentioned the i3's TDP advantage over AMD's FX CPUs. I have one in my gaming rig paired with a GeForce 750 Ti, and it's very efficient in terms of performance per watt; an AMD CPU paired with an AMD GPU will consume about three times more power. Incidentally, lower power draw means lower heat and less fan noise. Performance-wise, the difference between the i3-4150 and the FX-6300 is negligible.
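
For a rough sense of scale on the power point, a back-of-the-envelope sketch using the published TDP ratings as a stand-in for full-load CPU draw (an approximation; real draw depends on the board, the load and the GPU):

```latex
% Published TDPs: i3-4150 = 54 W, FX-6300 = 95 W, FX-4350 = 125 W.
\[
  \Delta P_{\text{FX-6300 vs. i3-4150}} \approx 95 - 54 = 41\ \text{W}
\]
% Assuming 4 h/day of full load over a year:
\[
  41\ \text{W} \times 4\ \text{h/day} \times 365\ \text{days} \approx 60\ \text{kWh/year}
\]
% The gap widens further with an overclock or a hungrier GPU pairing.
```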
 