KevTheGuy :
The server had ~40 players in it and I played at 720p to remove GPU bottlenecks. It's not a perfect test, but I did it to show someone how the FX 6300 performs in BF4, so yeah :p
Others would then just say it's an unrealistic test scenario (720p??), and that the real
solution is for AMD to develop better CPUs and make better drivers for all their cards,
i.e. spend the money where it benefits everyone, not just a few users in a limited
number of games, on specific grades of hardware, for narrow use cases.
We had a similar situation back in the days of the Athlon II/Phenom II and P55. People
were asking for advice on what kind of upgrade would boost gaming performance on systems
with lesser Athlon 64 X2s, Athlon II X4s, etc. All sorts of info was posted on how an
Athlon II or Phenom II upgrade would help, especially when overclocked, but the reality
was that in just about every case a stock i5/i7 gave far better performance anyway, in
some cases as much as 40% better (e.g. Far Cry 2), so the sensible conclusion should
have been to just switch platforms, given it was already obvious AMD was falling way
behind.
Example ref.
To me, Mantle is just AMD's way of trying to make up for the fact that they have weak
CPUs, but it's such a narrow solution. I mean, you have to test at 720p to demonstrate
the difference? Really??
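For anyone wondering why 720p testing isolates the CPU at all: each frame takes roughly
the longer of the CPU time and the GPU time, and dropping the resolution mainly shrinks
the GPU time, so the CPU becomes the limit. A minimal sketch of that arithmetic (the
timings below are made-up illustration numbers, not measurements from any benchmark):

# Rough model: frame time is limited by the slower of the CPU and GPU work per frame.
# All timings are hypothetical illustration values, not benchmark data.
def fps(cpu_ms, gpu_ms):
    """Approximate FPS when CPU and GPU work largely overlap per frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms       = 12.0   # per-frame CPU cost (game logic, draw calls) - mostly resolution independent
gpu_1080p_ms = 16.0   # per-frame GPU cost at 1080p
gpu_720p_ms  = 8.0    # per-frame GPU cost at 720p (far less pixel work)

print(f"1080p: {fps(cpu_ms, gpu_1080p_ms):.0f} fps (GPU-bound, CPU differences hidden)")
print(f" 720p: {fps(cpu_ms, gpu_720p_ms):.0f} fps (CPU-bound, CPU differences show up)")

At 1080p both CPUs would sit at the same GPU-limited number, which is why the 720p run
is the one that makes the CPU gap visible at all.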
IMO they've followed up bad new CPU designs with even sillier direction decisions. Pity
nobody at AMD has the guts to say, hang it, let's go back to what we know worked well:
sort out a proper 8-core Phenom II, improve the design, shrink the process, lower the
power, improve the IPC. Then they might have had something usefully competitive, but
instead we have the 8350, which has an IPC no better than the best Phenom II released
18 months earlier, and a power consumption high enough to cook your dinner. Don't even
get me started on the FX 9Ks...
The fact that AMD is in this mess makes the following comment from a Jan 2013 Tom's
Hardware 8350 review all the more relevant: "Our benchmark results have long shown that
ATI's graphics architectures are more dependent on a strong processor than Nvidia's."
So it's not as if AMD didn't know about the issue.
Thus the answer atm is simple: if the best value/performance GPU at any one time
happens to be an AMD, then fine, get one (driver fun notwithstanding), but stick it on
an Intel board to get the most out of it, even an old one. Using it with an AMD CPU in
a new build is just wasting potential performance.
Many of my earlier builds were AMDs, and I have lots of their CPUs (3400+, 6000+, 7850,
X2 250, X4 635, X4 640, several X4 965BEs, a Phenom II 1090T), but IMO Mantle is just
diverting resources away from where they really need to go, and that's better AMD CPUs.
The longer that doesn't happen, the less competitive the CPU space becomes, and everyone
loses.
At the end of the day, Mantle is not a universal gain for all AMD users, and that's
going to hurt them in the long run if they focus on it at the expense of improving
their CPU tech.
Ian.