FX 8350 with GTX 980 vs i7 4790K with GTX 970 [continued]

Page 2
Status
Not open for further replies.

furiousss

Reputable · Jul 9, 2014
Hello everyone,

I am looking to build a new gaming PC. As the topic says, I have come across two different CPUs.
I know that the i7 4790K is far better, but the FX 8350 has 8 cores; would that be useful for future gaming? Also, would an FX 8350 with a GTX 980 outperform an i7 4790K with a GTX 970? [Both are the same price.] Any answer would be appreciated. [Please don't suggest the i5 4690K, as it won't match the price and I would have to buy a GTX 970.]

Thanks in advance.

This is the continuation of my previous thread: http://www.tomshardware.com/answers/id-2583796/8350-gtx-980-intel-4790k-gtx-970.html
 


As far as gaming is concerned, there is virtually no difference between a 2500K/2600K and a 4690K/4790K.
 
I wonder if you even want advice, OP. Many have told you that the i7 is superior, and that won't change. I have told you pretty much everything you can expect from DX12, yet you still keep asking whether DX12 will make the FX and the i7 equal.

The i7 is superior to the FX. In every way. Always.
The GTX 980 is superior to the GTX 970, always, too.
The i7 is 15-60% stronger than the FX.
The 980 is ~10-15% stronger than the 970; on the models you chose, possibly less.

In the end, an i5 paired with a GTX 980 would deliver the best results, and it's totally in your price range. The i7 + 970 would come after that, and the FX + 980 would come last, both now and in the future (at best equal). Unless you're talking about 4K gaming, but then again, SLI 970s would stomp a single GTX 980, and the i5/i7 would still stomp the FX.
 
This is not a battle between the FX 8350 and the i7 4790K only; it's also between the GTX 980 and the GTX 970.

Most of you say the i7 4790K with a GTX 970 is better, which is true for now. It's only because the FX 8350 bottlenecks the GTX 980 that the i7 with the 970 comes out ahead. REMEMBER, IT'S JUST BECAUSE OF THE BOTTLENECK.

Let us consider these two cases:

DIRECTX 11: i7 4790K vs FX 8350
4 strong cores vs 8 weak cores
1 strong core can communicate with the GPU vs 1 weak core can communicate with the GPU
DOESN'T RESULT IN A BOTTLENECK vs RESULTS IN A BOTTLENECK

DIRECTX 12: i7 4790K vs FX 8350
4 strong cores vs 8 weak cores
4 strong cores can communicate with the GPU vs 8 weak cores can communicate with the GPU
DOESN'T RESULT IN A BOTTLENECK vs DOESN'T RESULT IN A BOTTLENECK

DirectX 12 would remove the FX 8350's bottleneck. As a result, the FPS [minimum, maximum, average] would also get higher.
So isn't the FX 8350 with the GTX 980 indeed the better choice? REMEMBER THAT THE FX 8350 WOULDN'T RESULT IN A BOTTLENECK UNDER DIRECTX 12. AS THE PROVERB GOES, "UNITED WE STAND, DIVIDED WE FALL." [8 weak cores would be more powerful than 1 strong core.]
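The comparison above can be written as a toy throughput model. This is only a sketch under invented assumptions: the per-core draw-call rates below are illustrative numbers, not measurements, and the model covers only command submission, not the rest of a game's per-frame work.

```python
# Toy model of CPU-to-GPU draw-call submission. NOT a benchmark.
# Assumption (from the argument above): DX11 lets 1 core submit,
# DX12 lets all cores submit. Per-core rates are made up.

def submission_throughput(cores, per_core_rate, submitting_cores):
    """Draw calls per frame the CPU side can feed the GPU."""
    return min(cores, submitting_cores) * per_core_rate

# i7 4790K: 4 strong cores; FX 8350: 8 weaker cores (illustrative rates)
i7_dx11 = submission_throughput(4, 1000, submitting_cores=1)  # 1000
fx_dx11 = submission_throughput(8, 500, submitting_cores=1)   # 500
i7_dx12 = submission_throughput(4, 1000, submitting_cores=4)  # 4000
fx_dx12 = submission_throughput(8, 500, submitting_cores=8)   # 4000
```

Note the two chips tie under "DX12" only because of the invented rates; even in this best case, game logic that runs on one thread still favors the stronger cores.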

Hope this explains.


 
Assuming you’re remotely technical, the changes from DirectX 11 to DirectX 12/Mantle are obvious enough that you should be able to imagine the benefits. If before only 1 core could send jobs to your GPU, but now all your cores can send jobs at the same time, you can imagine what kinds of things become possible. Your theoretical improvement in performance is (N-1)x100%, where N is how many cores you have. That’s not what you’ll really get. No one writes perfectly parallelized code and no GPU is at 0% saturation. But you get the idea.

Source: excerpt from http://www.littletinyfrogs.com/article/460524/DirectX_11_vs_DirectX_12_oversimplified

Wow, this means the FX 8350 would have (8-1)x100% = 700%,
while the i7 4790K would have (4-1)x100% = 300%.

IT'S JUST A THEORETICAL IMPROVEMENT, NOT CONFIRMED.
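The quoted upper bound, and a more realistic one, can be checked in a few lines. The (N-1)x100% figure assumes the workload parallelizes perfectly across all cores; Amdahl's law with an assumed parallel fraction (the 0.5 below is an arbitrary example, not a measured value) shows how quickly that gain shrinks when part of the frame stays serial.

```python
def theoretical_gain_percent(n_cores):
    # (N - 1) * 100%: best case if 1 core's work spreads perfectly over N
    return (n_cores - 1) * 100

def amdahl_speedup(n_cores, parallel_fraction):
    # Amdahl's law: the serial portion caps speedup no matter the core count
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

print(theoretical_gain_percent(8))   # 700 (FX 8350, best case)
print(theoretical_gain_percent(4))   # 300 (i7 4790K, best case)
print(amdahl_speedup(8, 0.5))        # ~1.78x, i.e. ~78% gain
print(amdahl_speedup(4, 0.5))        # 1.6x, i.e. 60% gain
```

With half the frame serial, eight cores give roughly 78% over one core, nowhere near 700%, and the stronger per-core speed still dominates the serial half.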

I don't think I would be sorry with an FX 8350 and a GTX 980.

@DubbleClick
Don't think that I don't need advice; I really do need all of your advice. I have seen many articles stating that the FX 8350 improves under DirectX 12, but none with negative results. Just show me an article with the FX 8350 performing worse under DirectX 12.

The thing is, I don't want the price of the FX 8350 to increase because of DirectX 12; otherwise I would have waited for DirectX 12 benchmarks and results.


 


No TV in the entire world accepts a signal over 60Hz. There is no such thing. Computer monitors, heck yeah, but all TVs take a 60Hz signal and interpolate to provide a "fake" faster refresh rate. I repeat: your TV's input is 60Hz. At most, TVs feature a 120Hz panel, which still doesn't change the signal source (60Hz), nor does it let the screen output any game, movie, or media above 60Hz (roughly 60 FPS). The panel itself can most likely handle 120Hz, but it still takes all signals at 60Hz and below and upconverts (interpolates, so to speak) to enhance smoothness.

No TV will genuinely be 120Hz or higher for an exceptionally long time. Genuine 120Hz HDMI was only created months ago, while older HDMI cables had to use tricks to send part of the signal one moment (horizontal info) and the other part a split second later (vertical info), which is why 3D has been tricky to accomplish. So it will still be a very, very long time before TVs have any reason for this. And when that reason comes, people will still have to figure out how to send a 1080p 120Hz broadcast; right now we can barely send 1080i, let alone actual 1080p.

Computers have been able to offer 120Hz and higher for a while because computers do a million things and don't rely on compressed (interlaced) broadcasts as a signal source, so they aren't as limited. Someone somewhere might need the capability, and GPU manufacturers these days tend to cover all the bases. Computers are also naturally extremely powerful compared to a TV. Another reason computers have been able to do 120Hz for a long time is that the panels they come with are more "active" in their display tech, while TVs are passive (using special features to enhance or correct media). It's one of the reasons TVs can so often look brilliant or horrible depending on what you're doing. Recent trends seem to be bridging the features of TVs and computer monitors.

Chances are insanely high that your TV doesn't even have a real 120Hz panel, which would prevent overclocking the TV itself. Even those who manage it often burn out their TV very fast.
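The refresh-rate arithmetic behind this post is simple to check. This is a sketch of the timing only; the point is that interpolation adds computed in-between frames, so the source still delivers only 60 distinct frames per second.

```python
def frame_interval_ms(hz):
    # Time between refreshes for a panel driven at `hz`
    return 1000.0 / hz

# A 60Hz signal delivers a new frame every ~16.67 ms.
# A true 120Hz panel refreshes every ~8.33 ms, but if the input is 60Hz,
# every second refresh shows an interpolated (computed) frame,
# not a new frame from the game or broadcast.
print(frame_interval_ms(60))   # ~16.67
print(frame_interval_ms(120))  # ~8.33
```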
 
<Mod edit>

I'm outta here, too. I've brought up enough reasons why the i7 + GTX 970 are and will be superior to the FX + GTX 980, and told you enough about what DirectX 12 is and what it changes. If you still don't listen, that's your fault; I couldn't care less what you choose in the end. Go all AMD if you want, check their site as much as you want, and believe it. None of my business.

Meanwhile, there is no such limitation of only one core being able to communicate with the GPU, nor would that be what impacts performance.
 
Also, the entire FX series is garbage. I love AMD; they are the underdogs, but even AMD has pulled out of high-end CPU production while they clot the wounds Intel has been dishing out lately. Know this: if you buy AMD now, you will need a new mobo and CPU to upgrade later, because there is no freaking way AMD will roll out a new desktop CPU on the ratchet-and-clank FX socket. They are great at APUs, but APUs really just suck !@# for PC gaming; just look at how badly the consoles are performing. My friend bought the newer FX 8320E and it's meh at best: lots of stuttering under load, and even with its TDP enhancements it still gets pretty hot.
 


First off, DX12 is making a joke out of Mantle, so drop Mantle from all considerations. Secondly, you don't want to build a PC based on what might be. You have to go with what you know, and also know that computers change every 6 months or so. A game might come out optimized for AMD and do well, but most don't. AMD has been served over and over the last few years. I went through this same exact debate, and the fact is, no one likes Intel because they cost a lot and seem like bullies. Hell, I feel the same about Nvidia, but AMD hasn't been the greatest in that realm either. Over 75% of CPU market share is in Intel's hands, and AMD has been licking its wounds more than ever.

If you want an AMD GPU, I'd say you would not regret it. I can say the same about any i7 CPU from Sandy Bridge to today... but AMD has sucked at CPUs from the get-go. They are always first out of the gate, though.

1) AMD made first dual core
2) AMD made first quad core
3) AMD was the first to sell a 5GHz CPU out of the box
4) AMD made the first consumer based 8 core CPU
5) AMD loves to give handies for cheeseburgers since they are desperate for money

All sucked, and for a reason. They have to try 10x harder to sell because their track record has been pretty bad. There's a reason Intel doesn't even care what AMD news comes out: they've always won. Intel doesn't even rush out of the gate. They don't even compete. Intel. Is. Just. Plain. Superior... for now.

You want a nice PC? Here's one

Any Z97-based i5 or i7, plus a motherboard with the features you want
16GB of 1866MHz RAM, probably any brand (I usually like G.Skill, Kingston, and Corsair)
R9 280, 290, any X variant, or the GTX 980 (the 970 has memory issues)
whatever you like for your storage needs
EVGA SuperNOVA B2 850W

Drop that poo-poo screen for any 1080p one and forget using that 200Hz TV; it's trash for games. I bet the lag on that TV would be 200+ milliseconds. Good luck with Call of Duty on that.
 


haha that was Sheridan hahahaha yeah....ultra dork here. Sinclair was the best. Punch first. Talk later.
 

We simply don't know yet. But there are some instances where the R9 290X beats the GTX 980 right now: generally high resolutions and very specific games.

Btw, generally, this place is practically anti-AMD. AMD's disadvantages are exaggerated from ants to elephants, and their advantages are downplayed from a diamond to a dirt speck.
 
Yeah, with all the confusion going on, I would recommend waiting for DX12 to come out, and maybe even the R9 300 series cards from AMD. There will definitely be some market wars as prices go down, and that will be best for us consumers.
 
Why do people wait for tech to come out? It's going to be bleeding-edge expensive when it launches, and a cheapskate who wants an AMD CPU is obviously not going to splurge $900 on an AMD Bermuda for DX12. If you want a nice PC, you need tons of money. But for gaming, it honestly boils down to pure efficiency for the game, or bragging rights.

You are getting caught up in the bells and whistles. Just get an i5 and a Z97 mobo with 16GB of 1866MHz RAM and you will smoke everything on a 290 or 980. END OF DISCUSSION.

THE PHARAOH HASSSS SPOKEN!
 
To support my statement that this place hates AMD, look at this post from user cowboy44mag. I'll just copy a small part of it:



http://www.tomshardware.com/answers/id-2577037/series-cpus-gaming-2015/page-2.html#15574485
 


I had a 2600K and it was decent. Two years with it, and I was not impressed with it over my dual core; the leap ahead wasn't great. I completely believe AMD has been an underdog too long due to some mishaps... but they have quit. They literally quit. They were happy to charge people $1000 out of the gate for their FX 9590, which was only comparable to a high-end i5 or a low-end i7 (3700 series, for example). In a desperate move, they grabbed the highest frequency they could with the FX 9590, with the most cores they could fit, and made everyone believe it was the best, but it was pretty awful. They simply aren't trying anymore for CPUs, and it really shows. AMD has lost its pity points for giving up.
 
I had an FX-8350 with a Crosshair V Formula-Z motherboard. I now have a 4790K with a Maximus VII Formula motherboard. The 8350 was clocked at 4.7GHz and the 4790K is clocked the same. Each one ran with the same pair of CrossFired 290Xs. The only noticeable differences I've had: my 3DMark score went up over 4,000 points, Skyrim runs a little better, and Minecraft runs better. Most MMOs run better as well. Without a doubt the 4790K is faster than the 8350, but the 8350 is still a fantastic processor. AMD's weaknesses are often blown far out of proportion, and with the 8350 costing around $170 it's hard to tell you not to choose it. Just because a few games run "better" now doesn't mean they ran badly before. Either way, you are going to have a fantastic gaming machine that will play games well into the future.
 


Even though you kind of saved your post in the other topic, since his power consumption calculations were not off, this video is pure nonsense. Literally all reputable sources for benchmarks, including Tom's Hardware and AnandTech (where we're currently commenting; same company), show the absolute opposite. And as if that were not enough, his "results" do not even make sense from a logical (architectural) standpoint. That channel is, even more so than biased company channels (cough), the most ridiculous thing you can find on YouTube.
 


Well, to be fair, he did not report minimum framerates, only averages. An average can be higher even with lower minimums. And he never explained which portion of the game he used for benchmarking, so even across two runs, if one has more explosions than the other, the results would not be representative. Those are things we indeed don't know, but I don't immediately jump to calling it bullshit.
 


All of his videos follow that scheme, though: impossible results (without messing up BIG time, continuously and/or deliberately) that follow no logic and generally praise AMD. He keeps saying FX chips are great for gaming while Intel only shows its real advantage in benchmark scenarios, when the absolute opposite is the case. FX 8320s fare quite well against locked i7s costing $100+ more in theoretical benchmarks specifically designed to push the CPU to its maximum capabilities, but they are not even close to competing in any kind of real-time or single-thread/simulation scenario.

I'd bet my i7 that his results are all bullshit; quite sure he's either paid by AMD or just making it up (to be fair, he even laughs about what he's saying, probably exhilarated by the thought of people actually believing him).

Ivy Bridge i5s (and especially i7s) are vastly superior to FX chips not only in minimum but also in average framerates; this is shown in literally every benchmark you get when googling an i5/i7/FX review.
 