AMD FX 8350 vs Intel i5 6600K

atf_mart

https://www.youtube.com/watch?v=WZ_5p9wd2dk

This just shows how AMD is actually worth it.

A $150 CPU vs. a $240 CPU (prices at the time of this post).

This is why Intel fanboys look bad.

And again, this is why we need competition. Since the jump from 1st-gen Core (when AMD was still relevant in the CPU market) to Sandy Bridge, Intel CPUs have been barely faster, and with Skylake some CPUs are showing as little as a 3% increase in performance, with a HIGHER TDP.

Support AMD.
 
I have an 8350 at 4.5 GHz and a 3930K at 4.3 GHz. I tested the same 780 Ti Classified at 1080p using the Valley Extreme preset in each box, and the 8350 averaged about 10 FPS lower.

Not a big deal in the big picture, but it's the minimum frame rates where the 8350 shows its weakness, and why an Intel processor is better to have for gaming (if you can afford it). This is at 1080p, and it will probably be less of an issue when gaming at higher resolutions where the GPU is more of a bottleneck than the CPU.
 
While I agree with that, I had an FX 6300 before the i5 I have now, and it's not that noticeable. Remember, the minimum framerate is just the framerate at one particular moment, not the average. Say an AMD CPU hits 40 FPS at a certain point in a game: that's not the average, and since the average is what matters, they're still worth it. Just because an AMD CPU hits a lower framerate for one second, that one second becomes the minimum framerate, not the minimum average.
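
To put rough numbers on what I mean (completely made-up frame times below, just to illustrate average vs. minimum, nothing measured):

```python
# Made-up per-frame times (ms): ~5 seconds of smooth frames, then a brief rough patch.
# Not real benchmark data - just to show what "average" vs "minimum" FPS means.
frame_times_ms = [16.7] * 300 + [25.0] * 60

total_seconds = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_seconds          # frames rendered / time taken
min_fps = min(1000.0 / t for t in frame_times_ms)      # the single worst frame

# "1% low": average of the worst 1% of frames, a common way to describe dips
fps_sorted = sorted(1000.0 / t for t in frame_times_ms)
worst = fps_sorted[: max(1, len(fps_sorted) // 100)]
low_1pct = sum(worst) / len(worst)

print(f"avg: {avg_fps:.0f} FPS, min: {min_fps:.0f} FPS, 1% low: {low_1pct:.0f} FPS")
# -> the average stays around 55 FPS even though the min is 40 FPS for that one rough second
```

The average barely moves, which is my point; how much that one rough second actually bothers you is the part people disagree on.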
 
Average is part of the overall experience, yes, but when explosions (as one example) are going off and the 8350 is dipping into the 30s while the Intel stays in the 50s, one feels a whole lot smoother and provides a more enjoyable experience.

Granted, not all games are limited by this; as the video linked above shows, in Crysis 3 the 8350 doesn't have as big a gap (but it's still there).

Edit - if you have a G-Sync monitor with an Nvidia GPU or a FreeSync monitor with an AMD GPU, this issue goes away until you dip below 30 FPS. Personally, I REALLY notice when I'm below 45 FPS because I'm not on a G-Sync monitor.
 
It depends on the games you play and the setup. In some games it may not seem to matter, but what if I wanted to play CoD: Advanced Warfare on a 120Hz or 144Hz monitor? The FX 8350 would be a poor choice since it can't keep frame rates up. Not to mention the much lower FPS overall and the frequent drops of 20+ FPS below even the previous-gen 4690K in pretty much every game. It wasn't until the cutscene cinema shots in some of the games that the FX 8350 managed to keep up. The rest of the time it was floating along the bottom of the graph all by its lonesome, outperformed by 4c/4t CPUs. Intel's price is justified.

Refusing to fix their own booboos and relying on everyone and everything else - DX12, G-Sync, FreeSync and the rest of it - to cover their shortcomings makes AMD look bad.

Even in Crysis 3, where all the CPUs looked even, watch the frame drops: when the i5 drops to the 90s, the FX drops into the high 40s and low 50s. I'd say 40 FPS is a major difference, and it's points like these, even though brief, that can make the game feel like it's lagging in spots.

I especially like the AMD fanboyism when it comes to cherry-picking the info. The 8350 is only $150 at Micro Center, where the current 4690K is $200, not $250. As for pricing the 6600K, it's not available to purchase until 8/14, so it's not for sale just yet aside from preordering. Comparing the price of a years-old CPU that's had more price cuts than a pair of mismatched socks at a yard sale to brand new cutting-edge tech available only as a preorder is a bit apples and oranges.

The lowest price on the 8350 without going to a Micro Center (since not everyone has access to one, and that price doesn't include sales tax) is a promo at SuperBiiz for the next couple of days at $165, and the cheapest price without a promo, with free shipping, is Amazon at $170. So if I wanted to play the fanboy game of skewing information to suit my point, I could just as easily say the much better performing i5 4690K (which can actually be purchased) can be had for just $30 more than the weaker-performing FX 8350. Doesn't put AMD's price/performance in such a good light that way, though.

Looking at the pricing history, the FX 8150 was $245 when it released and the 8350 was $199. Actually, if you want to compare apples to apples, when the 8350 was available for preorder it was $253. And that was three years ago: even ignoring inflation, the i5 6600K's preorder price is lower than the FX 8350's was, and accounting for inflation only widens the gap. Ouch.
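
As a rough sanity check (assuming ~2% a year inflation as a ballpark figure, and the ~$240 6600K price quoted at the top of the thread):

```python
# Ballpark comparison of preorder prices, assuming ~2% annual inflation (illustrative only).
fx8350_preorder_2012 = 253.0    # FX 8350 preorder price mentioned above
i5_6600k_preorder_2015 = 240.0  # 6600K price quoted at the top of the thread
years, inflation = 3, 0.02      # assumed figures, not official CPI numbers

fx8350_in_2015_dollars = fx8350_preorder_2012 * (1 + inflation) ** years
print(f"FX 8350 preorder in 2015 dollars: ~${fx8350_in_2015_dollars:.0f}")  # ~$268
print(f"i5 6600K preorder price:           ${i5_6600k_preorder_2015:.0f}")  # $240
```

So in today's dollars, the new i5 comes in under what the 8350 launched at, before the 8350's price cuts are even part of the picture.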
 
Hopefully DX12 helps the AMD CPUs a decent amount, like Mantle does in games that support it. I haven't kept up with PC news like I used to; it appears Broadwell wasn't a huge splash on the desktop scene (just like I'd been saying for a while for people not to spend money on an H97 over an H81 to 'upgrade' to Broadwell for a wimpy improvement). Now we're in August with Skylake. I watched the vid, and still, years later, there's no reason to upgrade over my stock i5 3570K; somebody on Sandy Bridge would still be more than fine. Even the old 2500K was still holding up and beating the 8350 most of the time. I would have liked to see the FX 6300 in those tests; from everything I've gathered over the years, in real-world gaming and even multitasking, the 6300 really isn't that much of a drop-off from the 8350.


Electricity costs and heat are a concern for me with the FX 8350, I'll be honest, but on a budget I'd get an 8320 (or even a 6300), pair it with a decent GPU for 1080p, play at high or tweaked ultra settings, and call it a day and be happy. I've literally never seen a person with a high-end 120-144Hz monitor use an AMD CPU anyway (I could certainly be wrong, though). Most gamers are on a 1080p 60Hz monitor, and honestly quite a few are probably still on 1600x900 or even 720p.


With DX12 (although I'm wary of Windows 10 with all the spying and privacy issues), the AMD FX CPUs will age well; they'll be good for at least this entire console gen, since both consoles are using low-clocked 8-core AMD APUs.

I'm just an average person who doesn't have to have the most FPS in the world or the most ultra maxed-out graphics, and as I get older I honestly care less and less about gaming. I'm writing this from my 'sucky' first-gen AMD Llano A8 laptop with 6GB of RAM, a slow 5400 RPM HDD, and Windows 8.1, and it's speedy enough for everyday tasks and non-demanding games. My stock i5 3570K desktop with 8GB of RAM and Windows 7 on an SSD loads stuff instantly, and with a GTX 660 as the GPU I can still play most games at 1080p on high settings.


In the back of my mind I didn't like turning stuff down to play Witcher 3 at first, since my GTX 660 is the minimum GPU for that game, but after watching YouTube gameplay vids, all I did was turn AA off, set shadows to low, and turn HairWorks off, and it still looks good. Everything else is on high (not ultra) and the game plays smooth with no lag. What I'm getting at is that, contrary to online elitists, turning some settings down isn't the end of the world, and it saves a lot on your wallet. Not everybody is a tryhard MLG gamer who has to have 120+ FPS or 4K ultra graphics. Some of us grew up on the SNES and black-and-white TVs 😉 at least that kept me humble lol.

In closing, I wouldn't have any qualms getting a cheap 6300 or 8320, pairing it with 8GB of RAM and an R9 280X, and being happy playing games at 1080p high (if not tweaked ultra). I'm not super into single-thread-heavy games like MMOs or RTSes (not my thing).

The new i5 is certainly nice, but certainly not worth the money over my current 3570K, and even the 8350 in those tests stayed roughly around 60 FPS anyway.


On a side note, geez, I guess I missed it, but did all those people saying to spend more on a Z97/H97 for Broadwell support get quiet like the people who said "you'd never need more than 2GB of VRAM for 1080p"? (lol...)
 
I agree with a lot of that, WhiteSnake91. Although for those who do want to play on higher-res screens (those gaming tests the OP linked to on YouTube were using a GTX Titan, after all), Intel would be the better bet. If you're spending $700+ on a GPU, I'm not sure $50 on a CPU is a quibbling point lol. It's not just the spying either; Win10 is brand new, and we all know what that means: issues and more issues. Already I'm seeing "I can't load Win10", "Win10 messed up my PC", "installed Win10 and now CPU usage is through the roof". They may be simple fixes, settings, or just a poorly optimized brand new OS that needs a good 6-12 months to work the bugs out.

I'm still gaming on an HD 7850, so I know all about turning settings down lol. I also don't play brand new games for the most part either (see the Win10 issues above; that applies to games as well). CoD: Ghosts was one of the last games I picked up just a few weeks after it became available. That was a quick $60 down the drain for a game that lasted all of a week. Also, growing up with the NES (the SNES was an upgrade for me and didn't come until years later), the games were more involved. Zelda and the original NES Final Fantasy looked like blobs, but they played for hours, weeks, months.

In terms of Broadwell, it was no secret that Intel reluctantly put it to market. It was one headache after another trying to drop to 14nm, it made the release alongside Skylake awkward, and it was more of an efficiency bump than a performance one, like Ivy was to Sandy.

I still think there's good reason to recommend H97/Z97 over H81 or B85. For the user who already had those chipsets, being able to upgrade the BIOS and pop in a newer-gen CPU was great. For new builds that tried to go backwards to cut corners and save money, not so much. Many ran into BIOS problems where they needed an older CPU that was actually compatible just to install the BIOS update for the newer CPU they paired with the board. Some BIOSes could be updated via flash drive, but it was hit and miss. What a bunch of monkeying around to save $15-20. Plus the older boards did lack a few features found on the H97. The Z series was a good choice even for those with a locked Core i5, for the SLI support the H97 was lacking. In other words, there were more reasons than just being Broadwell-compatible.

Intel's motherboards haven't had a lengthy lifespan in quite some time: LGA 1155, 1150, 1151 have all pretty much followed the tick/tock/replace schedule. The good news is that it really doesn't matter much, since you don't have to upgrade i5s and i7s every time you turn around trying to squeeze more performance out of them either. So even though the number of chips compatible with a mobo seems sparse for 'future proofing' (which doesn't exist anyway), it will still last most users a good number of years before it becomes an issue.
 
That's true about the i5s and i7s being good for many years; there's no reason to upgrade my 3570K at all, and I even see plenty of people still on first-gen i7 920s and even first-gen i5s. I wasn't aware of all the headache involving the BIOS on H81s, etc. I'd rather spend $15-20 more purely to avoid the headache.

Reminds me of the Athlon X4 860K. Plenty of people were recommending you just plop one into an FM2+ mobo, but apparently when it released only a rather high-end Asus FM2+ mobo supported the 860K out of the box, and many had to get an older FM2 CPU just to update the BIOS.

I've personally been eyeing the new R9 390 with 8GB of VRAM purely to avoid VRAM bottlenecking; the whole 3.5GB GTX 970 thing left a bad taste in my mouth, and I feel the 970 won't age as well. It's certainly a strong GPU, but VRAM usage in games seems to have shot up due to the consoles having 8GB as well as bad ports. I've stayed away from a lot of big-name PC releases. My GTX 660 is the recommended GPU for GTA 5 but only the minimum for Witcher 3, hmm (and yet I can still play it on high just by tweaking some things).

It's just hard for me to justify $300+ on a GPU when I hardly game like I used to, and when I do game it's almost always older stuff from a few years ago or more, rarely current releases. Partly because none of my friends ever get new stuff to play together anyway; one is stuck on an old Athlon 64 X2 and the other has a Trinity A8 prebuilt using integrated graphics.
 
I'm very confused about OP.

The video shows the Intel i5 6600K getting 0-70 FPS more than the AMD FX 8350 in games, depending on the title. Far Cry 4 shows a near-constant 20-25 FPS advantage, and GTA V is also 20-25 FPS higher.

Does this not show that the $90 more is well worth it? I mean, I was planning to use it for the next 4 years. That's about $22 extra per year for a decent bump in FPS. And I play games every day, at least 1-4 hours.

The last AMD CPU I bought and was proud to use in my gaming system was a Phenom II X6. Actually, for the last 2 years, due to money issues, I'd been using an Intel i3 for gaming until I upgraded to the i5 6600K.

I came here to find useful conversation because I will soon be upgrading to the i7-6700K (because I want to) and passing this i5 to a friend who is using the FX chip. I wanted to see if they would see an improvement upgrading from the FX to the i5. Looks like a resounding yes. Thanks!

Additionally, they complain about their game getting jerky when something big happens. That low frame rate for that one second matters hugely in first-person shooters.
 
Min frame rates are what matter to me. If I get a dip to 30-40 FPS from 60, I notice it, especially in the games I play. AMD brings nothing to the table in terms of CPUs; Zen is their last shot, and if they blow it, they're out of the CPU market.
 
For those who remember the days before the Core i series came out, wasn't that just awesome? Core2 was a great platform, but for the performance-minded, the idea that I could get a Phenom II that could hang with a Core2Quad costing 2-3 times as much was the best. Ditto back in the Athlon XP vs. Pentium 4 days.

Ever since Intel started out-thinking AMD (R&D) instead of screwing around with that stupid NetBurst crap, they have run the show. AMD is still obsessed with MOAR GHZ AND CORES!!!!!!!!!!!!!!1111 To my mind this is simply because they can't compete with Intel without it.
 
Zen will finally even the field, which AMD badly needs; it's supposed to be about Sandy Bridge or Ivy Bridge level IPC per core, and if anybody says Sandy Bridge or Ivy Bridge isn't good enough to game on, they lose any credibility. Just not sure how much the 8- or 16-core Zen will cost. I'm not even sure if it's a full 8 or 16 cores or if it will use AMD's version of hyperthreading, and price is a big concern. There's no reason to upgrade from my 3570K at all; I'm just curious how Zen will do next year, and then Zen+ after that.

Power consumption is also a concern: I steered away from the 8350 due to electricity costs and the lack of integrated graphics, which I personally like having as a backup in case a GPU dies. I don't play single-thread-intensive stuff, and if I do it's only casually, so the weaker AMD cores aren't the end of the world for me, but Intel being much stronger per core is nice and means those chips are going to be good for a long time. Plenty of people still game on the old i7 920s and old i5s, and plenty are probably still on the old Phenom X4 965 for that matter.

It all comes down to the user, their budget, and what games they're into. There's such a thing as a budget in the real world, which PC enthusiasts seem to forget all too often, and if somebody gets a cheap FX 6300, a cheapo 760G mobo, 1333MHz budget RAM, and a decent GPU, games at 1080p ultra or high settings on new console releases, and is happy, there's nothing wrong with that. Not everybody is in the same income bracket, and people have other responsibilities. A young dad might think he needs the most expensive i7 but can't afford it because of the kids' food and diapers; meanwhile he'd be just as happy in real-world usage with an FX 8320, a 6300, or maybe even an Athlon X4 860K.

There's definitely diminishing returns with anything. Is a fancy sports car cool? Yes. But a much cheaper reliable Toyota will still get me from point A to B while saving lots of money and the end result is still the same.

The new i5 beats the 8350 in FPS in all the scenarios, but the 8350 is certainly still playable. The electricity costs steer me away from it though... the FX 6300 or Athlon X4 860K is a much more tantalizing offer.
 
I haven't owned an AMD system since my FX 6000+/Athlon 64 3600+/3800+, and my Athlon XP 2500+ before that. That said, come Black Friday, should the 6300 drop to the $75-80 range or the 8320 hit $100ish, despite not really having a use for one off the top of my head (I don't even have a use for the CPUs I already have lying around), that might be too good to pass up. We shall see.
 
I agree and disagree. Yeah, they're ancient, but you're talking to someone who still has Core2Duos and a Core2Quad happily in use, doing exactly what's asked of them. Realistically, the prices I mentioned should be the regular prices, at which point, given a sale, an 8320 could end up at $75. Throw in a $50 mobo and you can probably put together a pretty good number-crunching PC at 50-60% of the cost of the Intel equivalent and get a minimum of 80% of the performance of said Intel system, probably more like 90% or better, depending on the job. I'm no fan of the FX line, but they have their uses in certain situations.
 
+synphul. YOUR CPU WENT CRAZY OVER A NEW OS! My FX 8350 can get as low as 1% USAGE! Everyone who wants an FX 8350 should totally get one. Games are making use of more threads, which will hopefully make up for the weaker single-core performance of the 8350. Either way, there is hope on the AMD side with the Zen cores coming this year (2016), and as for Intel? Well... they are not going to be leading the PC industry anymore; they have their eyes on other things...