Will AMD CPUs become better for gaming than Intel with DirectX 12?

It's true that before reaching Novigrad, the FX CPUs are consistently slower. But how relevant is it really? They never dip below 60 fps. When the heavy part comes in, the FX CPUs mitigate the worst dips, while the previously superior i3 suddenly becomes more of a bottleneck than an FX-6. So in that sense, the FX chips are still the more reliable performers in this game, in kind of a weird way: they don't reach the maximum performance, but they don't dip to the minimum either.

Remember that I've been talking about the (near) future here, not about older games. Sure, in older games an FX CPU can give you dips to 25 fps while you stay above 40 with even a Pentium. That will probably continue in the very near future too, but it should slowly fade away as the roles switch. I say should, because the future is not guaranteed. But the trend is there. Also, the FX-8 outperforming the fastest i5 is not going to happen. It's going to be on par with most i5 CPUs and slightly slower than the fastest ones. The i7 is far out of reach of the FX-8 CPUs.

It's still safer to go for an i5 rather than an FX-6/8 right now; you'll get the best of both worlds. The i7 is obviously the best choice. i3 vs FX-6/8 is debatable.

You're saying it's irrelevant to look at their future, but it really isn't. At what point before The Witcher III did we see a CPU-heavy part where the i3's performance drops and the FX keeps up with an i5? The future is quite bright here for the FX. This only happened just now, and the chances of things going further in this direction are increasing. I personally upgraded from a Phenom II X3 710 with the 4th core unlocked to my current FX-8 last year. From that perspective I'm already 'behind', but really, I'm not. I simply love to squeeze every little ounce out of my spent money, and I think the FX-8 will have a longer useful life for me than the X3 710 had. Remember that a stock FX-8 was achieving 75 fps on average in a 2015 game, and it's a 2012 architecture.
 
It's true that before reaching Novigrad, the FX CPUs are consistently slower. But how relevant is it really?

Quite, since it highlights that in situations that aren't CPU-bound, Intel chips will consistently give higher performance. FX pulls ahead of the two-core chips once you hit Novigrad simply because the performance bottleneck shifts over to the CPU, due in large part to the insane amount of CPU-side processing that needs to go on. Those types of performance bottlenecks are the exception, not the rule, and exactly the type of thing DX12 is going to help with. So you are likely right back to FX matching up against the i3 and lower-tier i5s.
 
Again, why is it so significant if you're well above 60 FPS anyway? If you're getting drops into the 20s or 30s, I can understand. With the benchmarks shown here, I really don't see the issue, because, well, there isn't any. The biggest complaint about the FX CPUs has always been the low dips. Right now we have an example of those dips being eliminated in a CPU-heavy situation, and we all still think the FX CPUs will be crap forever.

Yeah... Tradition is stronger than reason and progress, apparently.
 
Yeah, it could. This has indeed been seen with quite a few recent games. I did mention that the i5 is the safer bet over the FX CPUs. But remember, this is still a 2012 CPU running a 2015 game and giving this performance. The ageing seems to be going quite well.

In the past I might have been an overly eager proponent of FX CPUs in new systems, because the worship of brands annoys the hell out of me, and the idea that 'Intel solves everything' is not really true, especially if you're on a budget. I've seen people recommend switching to Intel from an FX-4300 for someone who plays Bejeweled. Not kidding.

But I'm not recommending anyone with an Intel to 'upgrade' to an AMD CPU, as if AMD is suddenly gonna kick Intel's ass with the current FX CPUs. That includes 'upgrading' from an i3 to an FX-8. It would be a waste of money, and that's an understatement.
If people with a Piledriver FX CPU or even an FM2(+) CPU are thinking of switching to Intel, however, I'd recommend they wait, especially if they can overclock. Unless they can afford an i7 right now, because that's really the only tier that is completely out of reach of AMD's CPUs at max performance. If you have the money, it's always better to go with the fastest one, obviously. But generally, overclocking the FX CPU will be cheaper than upgrading to Intel when you already have the AMD system, and given the direction things are moving in, switching might be premature, ultimately wasting your money. Again, I'm not saying AMD will be better than Intel, or even that it's better than Intel in The Witcher III. I'm saying that labeling the FX CPUs as inadequate for future games is a premature conclusion. In the end, where the CPU mattered most in The Witcher III, the FX held up by not dropping to as low a framerate as even the 4690K, despite having a lower average fps.

When switching to Intel, you really need to consider quite a few things. Everyone decides for themselves what kind of performance is acceptable for them, and they need to know their options. But I think people exaggerate the kind of performance that's really necessary.
Do you really always need 60+ fps? It's understandable if you play multiplayer games and take them seriously. I wouldn't want any drops under 60 fps in a fighting game, not even offline. But if you mainly play single-player games like RPGs, how much will drops into the 30 fps range really affect your performance and experience? Is that worth the price difference between, say, the FX-8320 you already have and the 4690K plus motherboard you'd have to buy?
Even if you have fps drops and play multiplayer, will 45 fps really hurt your performance if you get a G-Sync/FreeSync monitor? Are you able to get such a monitor? If you're considering upgrading, getting a better monitor with G-Sync/FreeSync might be a better choice than switching to Intel.
Related to upgrading your monitor: are you going to keep playing at 1080p, or will you switch to a higher resolution soon? At 4K an FX CPU is less likely to bottleneck than at 1080p, so if you want to play at a higher res next year, upgrading the CPU might not be warranted.

There are many things to consider that I didn't mention, and I for one don't have money to waste, and I don't like recommending that people waste theirs either. I'll squeeze as much life as possible out of everything I have, and I recommend others do the same. Unless you have money growing on trees, lol. If people want to jump on the Intel bandwagon, it's their choice. But consider this:

What if future games are constantly going to have the Novigrad load? What would then be the best CPU choice?
 
Yeah, it could. This has indeed been seen with quite a few recent games. I did mention that the i5 is the safer bet over the FX CPUs. But remember, this is still a 2012 CPU running a 2015 game and giving this performance. The ageing seems to be going quite well.

The i7 920 has aged well too. Why? Because past a certain point of performance, CPU requirements largely become nil. Beyond that point, the CPU simply doesn't matter anymore. Why else do you think I'm still rocking a 2600K? There's zero reason for me to upgrade.

What if future games are constantly going to have the Novigrad load?

Unlikely, given that the CPUs in the PS4/XB1 are basically low-end APUs and not much more powerful (and in some cases, weaker) than the CPUs that were in the PS3/360. As a result, I doubt CPU-side requirements are going to budge much over the next decade or so. They'll go upward with time, but at a very slow pace.
 
I don't think folks should waste their money either, but given that there's only roughly a $50 price difference between an 8350 and even the new locked Skylake i5s, much less the cheaper Haswell i5s, there's no real reason not to spend it. It's not about brand worship, it's a matter of performance. If AMD had it, I'd be recommending them, but they don't. People are tired of hearing Intel, Intel, Intel - well then maybe AMD should do something about it instead of being constantly 2nd and 3rd tier, 2nd and 3rd tier, 2nd and 3rd tier. It is what it is. If anything, the brand loyalty is on AMD's side, when people insist others should buy into them simply to 'keep competition alive' or 'support the underdog'. No one says that about Intel; instead it's based on performance and data.

As NightAntilli mentioned, they like to squeeze every last bit of performance out of their system - and saved around $50 to do so, for a system that's lasted them a number of years. What exactly is the savings of $50-60 over 3-4 years? $15-20 a year? If someone's budget is really that tight, then I don't think gaming is one of their priorities.

AMD fans will suggest anything and everything to try and promote a struggling chip. Don't switch to Intel, instead buy into a more expensive G-Sync or FreeSync monitor. When you have to adjust the rest of your peripherals that much to compensate for weak hardware, that should be telling someone something. Over 60 fps is needed for smoother gameplay on a 60 Hz monitor that doesn't have G-Sync or FreeSync. It's also needed for games played on 120 Hz monitors. If all you're doing is playing Bejeweled at 1080p on a 60 Hz monitor, then FX chips are perfectly suitable. How many times do I see people pairing a 'bargain' AMD CPU with a $70 air cooler or $100 AIO cooler? A lot. Imagine if they'd shifted that very same budget around: they could have gotten an i5, a Xeon, or maybe even an i7 with a stock or inexpensive air cooler and gotten way more for their money. Instead of keeping up in a title here or a title there, they'd have a CPU and platform capable of any game they wanted to play.
 
Even though I agree with most of what's been written in this thread, I'll give my personal opinion.

I think the software is really outdated. For the last five years or so, the hardware has been a lot more advanced than the software.

DX9, for example, is horribly designed, dumping 90% of a process onto a single core.
DX11 is like DX9 version 2.0; it's just a revamped version of DX9, using your PC's resources in a horrible way.

How could a PS3 or Xbox 360, with vastly inferior hardware specs, run games like The Last of Us, Uncharted, Tomb Raider, and GTA V? It's easy: because of the software.

Why do the PS4 and Xbox One give us decent graphics for $350 ($250 used)? Because of the software.

If the software is old and outdated, you need to overcome that with overkill hardware; that's what PC gaming has been all about for the last 5-8 years.

For me, DX12 will be a boost for low-end and lower-mid-range gaming PCs. If you already own a high-end gaming machine, you will not feel any improvement.

Tiled resources, the feature for using RAM and video RAM together similar to what consoles do = an upgrade for laptops and lower-mid-range gaming machines.

Using more cores better = better performance for AMD's hexa- and octa-core CPUs. Yes, your FX-8320 will probably last another 4 years, giving you about 85% of the performance of an i5 under DX12.

Of course Intel will still be on top, but AMD will be much closer. Also, if AMD offers competitive prices on their new Zen CPUs, I can start showing Intel the middle finger.

$250 = 6 physical / 12 virtual cores, 4.5 GHz OC
$300 = 8 physical / 16 virtual cores, 4.5 GHz OC
$350 = 12 physical / 24 virtual cores, 4.5 GHz OC
$400 = 16 physical / 32 virtual cores, 4.5 GHz OC

PC gaming started to become super popular around what, 2000-2002? Why can't Microsoft create an operating system strictly for gaming, like a 'Windows Gamer'?

Windows Gamer features:

1. Use all of your machine's resources strictly for gaming
2. Give you a browser similar to what consoles have
3. Give you a program to stream your gameplay
4. Price: $180
5. Built on an advanced graphics engine that makes masterful use of your PC hardware

The software is the key. Maybe 15 years ago, with single-core Pentium 4 CPUs, the hardware was the problem. Not anymore.
 
Consoles are able to keep up with such weak hardware because they're only pushing 30 fps and they're streamlined, much like a bitcoin miner is. A bitcoin miner won't render video very well and would be crap for gaming, but the one thing it was designed to do, it does extremely well. If people want a one-function machine, isn't that precisely what consoles are for?

What's funny is that people who own Intel buy Intel because of its proven performance. AMD fans suggest they'll buy AMD to 'give Intel the middle finger'. Yet Intel owners are the ones accused of being fanbois; interesting. No one is preventing people from buying AMD as it is, buy it all up. I'm sure they'll appreciate the loyalty, just don't expect a thank-you card in the mail around the holidays if their loyal fans manage to keep them out of bankruptcy. They are big business after all, out to turn a profit, not to score brownie points with their customers.

It's unrealistic to list a huge column of 'wants' and then lay out a price we're willing to pay for it; this isn't an auction. I don't get to walk into Ferrari and tell them I want the latest model for, oh gee, $15k sounds like a nice price. Never going to happen. And apparently more cores aren't automatically better, or AMD would be doing twice as well with twice the hardware as Intel's i5s, which aren't even capable of more than 4 threads. That's been proven repeatedly across all fronts.

I'm not aware that AMD even has any Zen chips out of the oven for internal testing at the moment, much less a price structure, and they've been really tight-lipped about IPC performance projections. Not sure where that whole price-to-core-count 'chart' comes from, but it's in no documentation I've seen, unless it's merely a wishlist.

Doubtful that Microsoft would make a 'gamer-only' OS; it's just not feasible. They have a hard enough time making a coherent desktop and mobile version, and they've gone to the trouble of streamlining their products to fuse mainstream desktop users and corporate users onto the same OS (with potentially different features depending on your license). They're no longer running a 'home' user flavor such as Win98 and a business-segment version like Win2k. If gamers got their own version, then everyone would want their own version. Pretty soon there would be a Windows gamer edition, HTPC edition, video editing edition, etc. It just doesn't make any sense from a development or sales standpoint.

The OS hasn't been the biggest issue; the biggest issues have been the games themselves. Before even getting caught up in which DX version a game is built on, it would be novel if we had a game that just plain worked correctly when it was released. GTA was having issues, Konami just tried to make nice with their customers for all of their release boo-boos, and so on. Whether the tires a car runs on will let it corner effectively at 120 mph is irrelevant if the car is stalling at stop lights and won't even turn on half the time. That's pretty much where games are these days, they're so horribly coded. The DX version is a distant concern by comparison.

All those 'Windows gaming' features are synonymous with a console. Uses all its resources to game. Capable of streaming and connecting to various video services. An advanced graphics engine. If that's all people want, there's the Wii, Xbox and PS. The point of having a PC is better-than-console performance and/or the ability to handle other tasks a console cannot; a multipurpose device rather than a single-use device.

Instead of spending $800-1000 on a PC and software, a console runs someone $350 to $400. PS4s are going for $350-400 depending on the bundle; there's one for $380 that includes the console, a controller, a game and a Turtle Beach headset. There's an Xbox One selling for $500 that comes with the console, a controller, 3 games and a Kinect. They sound like a perfect match for someone interested in limited capabilities and gaming only.
 


DX9 was the last of the "old" DX APIs, still designed around the idea of doing the rendering process as a series of individual stages. As a result, it was built around one large thread managing the entire process.

DX10+ started to change this, and DX11 did give access to a limited form of multithreaded rendering, but this was very hard to use properly due to restrictions put on it.

DX12 is allowing much more low level control of the API, so extra performance can be squeezed out. However, this also means it falls on the software developer, not the driver OEM, to update their software for new HW architectures in order to get the most performance out of the API.
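
To make that concrete, here's a minimal C++ sketch of the submission pattern DX12 enables. It's an analogy, not real D3D12 code (the CommandList type and the function names are stand-ins): several threads record their own command lists in parallel, and one queue submits them all at the end, whereas the old model funneled all of that work through a single thread.

```cpp
// Minimal sketch of the DX12-style submission model (illustrative only,
// not real D3D12 API calls): each worker thread records its own command
// list in parallel, then the main thread submits them all to one queue.
#include <cstdio>
#include <functional>
#include <string>
#include <thread>
#include <vector>

struct CommandList {               // stand-in for a graphics command list
    std::vector<std::string> cmds; // recorded draw/state commands
};

// Each thread records draws for one slice of the scene independently.
// Under the DX9/DX11 model this recording was effectively serialized.
static void recordSlice(CommandList& list, int first, int count) {
    for (int i = first; i < first + count; ++i)
        list.cmds.push_back("draw object " + std::to_string(i));
}

int main() {
    const int numThreads = 4, objectsPerThread = 8;
    std::vector<CommandList> lists(numThreads);
    std::vector<std::thread> workers;

    // Parallel recording: the expensive CPU-side work scales across cores.
    for (int t = 0; t < numThreads; ++t)
        workers.emplace_back(recordSlice, std::ref(lists[t]),
                             t * objectsPerThread, objectsPerThread);
    for (auto& w : workers) w.join();

    // One cheap submission step at the end, analogous to executing
    // all recorded command lists on a single queue.
    std::size_t total = 0;
    for (const auto& l : lists) total += l.cmds.size();
    std::printf("submitted %d lists, %zu commands total\n", numThreads, total);
}
```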

How could a PS3 or Xbox 360, with vastly inferior hardware specs, run games like The Last of Us, Uncharted, Tomb Raider, and GTA V? It's easy: because of the software.

More like the fact that those games ran at upscaled 720p (if that), targeted 30 FPS, had no AA, and had reduced visuals compared to comparable PC games.

Also remember consoles have EXACTLY one HW specification, so you can do very low level optimizations that you simply can NOT do on a PC.

Why do the PS4 and Xbox One give us decent graphics for $350 ($250 used)? Because of the software.

PS3 GPU: 7800 GTX derivative.
PS4 GPU: AMD 7700-series derivative.

360 GPU: ATI X1950 XT derivative.
XB1 GPU: AMD 7700-series derivative.

I'm sure jumping almost 10 generations on the GPU also helped performance some.


Look guys, allow me to code to one specific GPU, say a GTX 980, and I have the capacity to triple your performance. No other GPU will work, but hey, triple the performance! That's the downside to using middleware APIs: you trade performance for HW support. Unless you want to return to the days where you needed different SW drivers for every HW device, that's the tradeoff you have to take.
 
I did say the i5 is a safer bet than an FX CPU.
I did say there is nothing AMD has that reaches the i7 CPUs.

And then I get labeled an AMD fanboy for not wanting to upgrade to Intel yet, since there are signs of the FX CPUs becoming more useful, like the example I gave in the Witcher III.

Funny how wanting to make the most of the AMD chip you already have gets labeled as fanboyish. The amount of red herrings and hypocrisy in here is cringeworthy.

I'm going to repeat myself here, so that my point is clear:

I'm not recommending anyone with an Intel to 'upgrade' to an AMD CPU, as if AMD is suddenly gonna kick Intel's ass with the current FX CPUs. That is not going to happen. That includes 'upgrading' from an i3 to an FX-8. It would be a waste of money, and that's an understatement.
If people with a Piledriver FX CPU or even an FM2(+) CPU are thinking of switching to Intel, however, I'd recommend they wait, especially if they can overclock. A $30 Hyper 212 EVO is enough; no $100 cooler required. Unless they can afford an i7 right now, because that's really the only tier that is completely out of reach of AMD's CPUs at max performance. If you have the money, it's always better to go with the fastest one, obviously. But generally, overclocking the FX CPU will be cheaper than upgrading to Intel when you already have the AMD system, and given the direction things are moving in, switching might be premature, ultimately wasting your money. Wasting money as in, you switched without ultimately really having to.
Again, I'm not saying AMD will be better than Intel, or even that it's better than Intel in The Witcher III. I'm saying that labeling the FX CPUs as inadequate for future games is a premature conclusion. In the end, where the CPU mattered most in The Witcher III, the FX held up by not dropping to as low a framerate as even the 4690K, despite having a lower average fps. If that doesn't count as evidence of the FX still having life in them, then nothing will change your mind.
 
I skimmed over this thread and I know it's been said several times, but any CPU offloading that DX12 gives to AMD it's also going to give to Intel. There's an old saying about putting lipstick on a pig (it's still a pig). A current i5 (Haswell Refresh or Skylake) is always going to be better overall than a 6- or 8-thread FX. In my opinion it's worth the extra $$ spread out over the three-to-four-year life of the system. Or you can get an i3, still match or beat the FX in most applications/games, spend no more money, and reap the benefits of much lower power consumption.
 
I would recommend people pick the FX-6 over the i3. We're at the tipping point (it's happening right now) where more cores start to outperform stronger single-core performance, so you'll benefit from it, and right now the FX-6 is enough for most older games anyway.

What I would personally recommend is:
< $80: Athlon 860k
~ $100: FX-6300
~ $135: FX-8320
~ $170: i5 4460
~ $210: i5 4690
~ $240: Xeon 1231 V3 / i5 4690k
~ $290: i7 4790
~ $320: i7 4790k
> $350: Whatever you want because you have money to waste.
 


The PS4 uses a cutdown 7870 (1152 shaders from 1280), and the Xbox One uses a cut down 7790, also known as the R7 260 (768 shaders). Both run at much lower clocks than their closest PC counterparts though.
 


And the 7850 is a cut-down 7870 too, same GPU core design just smaller, and the PS4's GPU puts out about the same performance as a 7850. And while you're right that the XB1 uses a cut-down 7790, its performance is right in line with a 7700. There really isn't anything different from what I said.
 
DX12 not a magic bullet confirmed:

http://www.techspot.com/review/1081-dx11-vs-dx12-ashes/page5.html

FX-8350 still slower than the i3-6100. As predicted. The underlying performance dynamics have not changed.

I see another, more worrying problem though:

Nvidia has heavily optimized its DX11 driver for Ashes of the Singularity while there is little it can do to optimize its driver for DX12 as the game engine communicates almost directly with the GPU. The company is limited by its Maxwell architecture which suffers from a call bottleneck due to the game being programmed for thread parallelism. Nvidia is dependent on game developers to make efficient use of the Maxwell architecture as best they can, so don't expect to see DirectX 12 driver improvements from team green.

Here's the issue: say developers tailor their code to target a specific GPU architecture to get maximum performance. Great. Fast-forward two years, and that company moves to a brand new architecture. Performance on that specific benchmark TANKS, because the code is no longer optimized for the new architecture.

The way things are currently done, NVIDIA can just do a driver update and optimize the performance of the software as best it can. But with DX12 this is no longer possible; it now falls to the DEVELOPER to determine where the problem is in the code and update the application to fix it. But do you seriously think any business is going to do that? Post-launch support multiple years down the line? I think not.

This is already showing me warning signs of problems down the road. I suspect whatever AMD/NVIDIA release to replace GCN/Maxwell will pretty much be used with only minor revisions for the next decade or so, since both companies are going to be very skittish about significant HW changes going forward, for fear of suboptimal code leading to low benchmark scores (which correlate with sales).

Hence why you don't want low-level coding on a general-purpose PC: the developer isn't going to be able to target every piece of hardware out there.
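
To illustrate the burden being described, here's a hedged C++ sketch of what 'the developer picks the fast path' ends up looking like. Every name in it (GpuInfo, renderParallelHeavy, and so on) is hypothetical; the point is only the structure: the game itself dispatches to architecture-specific code paths, and any architecture released after launch falls through to a generic path unless the game gets patched.

```cpp
// Hedged sketch: with a low-level API, picking the fast code path per
// GPU architecture falls on the game, not the driver. All names here
// (GpuInfo, renderParallelHeavy, etc.) are hypothetical illustrations.
#include <cstdio>
#include <functional>
#include <string>

struct GpuInfo { std::string vendor; int architectureYear; };

static void renderParallelHeavy(const GpuInfo&) {
    std::puts("path tuned for GPUs that like many parallel submissions");
}
static void renderSerialFriendly(const GpuInfo&) {
    std::puts("path tuned for GPUs that prefer fewer, fatter submissions");
}

// This dispatch table must be maintained by the developer; a brand-new
// architecture released after launch falls through to a generic path
// unless the game itself is patched -- exactly the concern above.
static std::function<void(const GpuInfo&)> pickRenderPath(const GpuInfo& gpu) {
    if (gpu.vendor == "AMD")    return renderParallelHeavy;
    if (gpu.vendor == "NVIDIA") return renderSerialFriendly;
    return renderSerialFriendly; // generic fallback, likely suboptimal
}

int main() {
    GpuInfo detected{"NVIDIA", 2014}; // e.g. a Maxwell-era card
    pickRenderPath(detected)(detected);
}
```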
 
Here we go again...

[benchmark chart: hitman-cpu-scaling.jpeg]
 
So again: lower-performance CPUs see the largest gain, and Intel retains the lead despite gaining less (stronger baseline CPU). So DX12 is shaping up pretty much as expected.

Now gimme i3 benchies; those are the ones I'm interested in. DX12 shines at the low end, not the top.
 


Every program in existence going back 20 years compiles with SSE2 by default. AMD and Intel share conformity to baseline SSE specifications until SSE4, where AMD started to make their own non-compatible extensions to the specification. Based on the last compiler benchmarks I looked at (shortly after SSE4 launched), performance basically peaks at the SSE2 level, with minimal gains/losses as you compile against higher targets.
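
For anyone who wants to sanity-check this claim, here's a small illustrative C++ snippet. The compile commands in the comment use real GCC/Clang flags (-msse2, -msse4.2); if the claim above holds, the two builds of a simple vectorizable loop should perform nearly identically.

```cpp
// Illustrative snippet: the SIMD baseline is a compile-time choice.
// With GCC or Clang, for example:
//   g++ -O2 -msse2   simd.cpp   // the common baseline discussed above
//   g++ -O2 -msse4.2 simd.cpp   // higher target, often little extra gain
// The compiler auto-vectorizes the loop against whichever target is set.
#include <cstdio>

int main() {
    static float a[1024], b[1024], c[1024]; // static keeps the stack small
    for (int i = 0; i < 1024; ++i) { a[i] = i * 0.5f; b[i] = i * 2.0f; }

    // Simple element-wise loop: prime auto-vectorization material.
    for (int i = 0; i < 1024; ++i)
        c[i] = a[i] + b[i];

    std::printf("c[100] = %f\n", c[100]);
}
```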

Seriously, some people will go to any lengths to blame Intel for AMD's performance.
 
OK. Remember when I was saying that an FX-6 should beat an i3, and an FX-8 will be equal to an i5, once proper multi-threading takes off? This is still DX11, but it does show the shift from a single-threaded limit to more optimal multi-threading. DX12 was the facilitator, but it has been shown that it's also possible under DX11... See here:

[benchmark chart: CPU_01.png]
 
^^ That type of graph won't see much movement; you have a clear GPU bottleneck, so the GPU is already pretty much fully loaded by anything stronger than an i3-class CPU. Your improvements in CPU performance will be directly proportional to the GPU load freed up and the reduced driver overhead, rather than any increase in CPU processing potential.
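
A back-of-envelope model makes this point concrete: treat frame time as roughly max(CPU time, GPU time). The numbers in this sketch are made up purely for illustration, but they show why a faster CPU barely moves the framerate while the GPU is the limiter, and why it suddenly matters once the GPU load drops.

```cpp
// Back-of-envelope model of the bottleneck argument above (illustrative
// numbers only): frame time is roughly max(CPU time, GPU time), so once
// the GPU is the limiter, a faster CPU barely moves the framerate.
#include <algorithm>
#include <cstdio>

static double fps(double cpuMs, double gpuMs) {
    return 1000.0 / std::max(cpuMs, gpuMs);
}

int main() {
    const double gpuMs = 16.0;  // GPU needs 16 ms per frame in this scene
    std::printf("slow CPU (12 ms): %.1f fps\n", fps(12.0, gpuMs)); // 62.5
    std::printf("fast CPU ( 6 ms): %.1f fps\n", fps(6.0, gpuMs));  // 62.5
    // Lighter GPU load: now the CPU difference actually shows.
    std::printf("GPU-light scene, slow CPU (12 ms CPU, 8 ms GPU): %.1f fps\n",
                fps(12.0, 8.0));                                   // 83.3
}
```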
 
I disagree. The difference in minimum framerate between the i3-4360 and the i7-6700K is 10 fps. That is significant. The average also has a difference of 8 fps... There's no indication of the GPU being the main bottleneck.

[benchmark chart: CPU_02.png]

[benchmark chart: CPU_03.png]