AMD CPU speculation... and expert conjecture



That's what I think they SHOULD be doing, and what I've been calling for for some time.
 


If the threads were getting swapped, you wouldn't get the result set I saw in my HTT-disabled run. If they were swapping often, I'd expect linear scaling up to 4 threads, but you can clearly see performance dropoffs, indicating situations where it looks like one core is handling two threads. If they were swapping, I'd expect all four threads to eventually land on four different cores.

Again though, I'd really need to sit down and profile to be sure. But eyeballing the performance metrics, that's my guess as to what's going on.
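If anyone wants to poke at this themselves, here's a minimal sketch of the kind of check I mean (my own illustration, Windows-only, not the actual test I ran): spin up four busy threads and sample which logical processor each one lands on. With HTT enabled, two threads that keep reporting the two logical CPUs of the same physical core would explain exactly that kind of dropoff.

// Minimal sketch (illustrative only): record which logical CPUs each busy thread runs on.
#include <windows.h>
#include <atomic>
#include <cstdio>
#include <set>
#include <thread>
#include <vector>

int main() {
    const int kThreads = 4;
    std::atomic<bool> stop{false};
    std::vector<std::set<DWORD>> seen(kThreads);   // logical CPUs each thread was observed on
    std::vector<std::thread> workers;

    for (int t = 0; t < kThreads; ++t) {
        workers.emplace_back([&, t] {
            volatile double x = 1.0;
            while (!stop.load(std::memory_order_relaxed)) {
                for (int i = 0; i < 100000; ++i) x *= 1.0000001;   // keep the core busy
                seen[t].insert(GetCurrentProcessorNumber());        // sample current placement
            }
        });
    }

    Sleep(5000);                      // let the scheduler shuffle things for a few seconds
    stop = true;
    for (auto& w : workers) w.join();

    for (int t = 0; t < kThreads; ++t) {
        std::printf("thread %d ran on logical CPUs:", t);
        for (DWORD cpu : seen[t]) std::printf(" %lu", cpu);
        std::printf("\n");
    }
    return 0;
}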
 
gamerk, all your predictions about MANTLE have been shown to be wrong. Why do you insist?

I've said, multiple times, the overhead involved in DX is a good 10-15% performance hit over what you could get with a lower level API. I've also pointed out that's the same reason why consoles can still pump out decent visuals, considering they are powered by 7800 GTX/2900 XT variants. The lower level you go, the more you squeeze out of the hardware. I'm not arguing this point.

What I am arguing is whether, over the long term, developers are really going to invest in two separate graphics backends just to satisfy the minority GPU maker. Remember: Mantle is GCN only, and not supported by consoles. It doesn't benefit ~65% of the dGPU market.

Also, one thing everyone here has missed: what card(s) are seeing 40% performance gains? What market is AMD talking about? APUs. I would expect Mantle improvements to have the greatest effect on the low end of the spectrum, where even small improvements in overhead can lead to significant FPS increases [GPU bottleneck]. With dGPUs, however, I expect a much, much lower benefit, as the overhead in DX becomes almost insignificant by comparison. The performance gains aren't going to be flat across the board; APUs are going to benefit more.

So call me in two years. I'll be waiting for your apology. I ALWAYS take the long view. And two years down the line, Mantle is going to be an afterthought.
 

jacobian

Honorable
Nothing that I have heard about Kaveri so far suggests that it will be a huge leap over Richland for pure CPU performance. My personal speculation is that the Kaveri A10 will trade blows with the top-of-the-range Haswell i3 for purely CPU-intensive tasks, give or take, but will fall short of the Core i5. For people who don't care about the iGPU, the more affordable and more power-efficient Core i3 will be more attractive. For those who want to use the iGPU, the A10 will make more sense. That latter demographic is quite small in the grand scheme of things IMHO. The desktop market is just half of the combined laptop/desktop market, the desktop gamer market is a fraction of the overall desktop market, and the desktop gamer with iGPU market is a fraction of the latter. This is the AMD niche now. It's tiny. AMD needs to do more to succeed IMHO. For example, they need to be more aggressive in the laptop market. AMD's low-power APUs are uncompetitive and show up mostly in garbage brands/models. AMD can't compete on price with Intel because Intel's low-end CPUs are already aggressively priced. AMD needs to crank up the performance and power efficiency to compete.

 

etayorius

Honorable


You've been shitting on AMD since day 1; we've all seen your love for Intel and your unfair comparisons over and over... all you have shown is that you are pro Intel and anti AMD.

And this claim about AMD paying Devs and Publishers, can you please post proof of this claim? Otherwise, you might as well stop all this BS.

Mantle will be around, like it or not; don't compare closed tech such as PhysX or GameWorks with MANTLE... and you seriously think AMD has the money to bribe all those Devs and Publishers? Every million counts for them, and I bet you if they were bribed, each of them would ask for at least a dozen million to support MANTLE. AMD cannot afford bribing like Intel and nVidia... to me it is clear that the Devs LOVE MANTLE. Did you not see the AMD Developer Summit where MANTLE got discussed? The Devs who were present were BLOWN AWAY by the claims of DICE.

MANTLE will be open, but it is not finished as of now, which is the reason it is only GCN at this moment; or did you expect them to make it CUDA first? Come on, I already posted several links and slides regarding MANTLE, and AMD promised to make it 100% open and no longer tied only to GCN as soon as they finish it.

http://www.dsogaming.com/wp-content/uploads/2013/11/Untitled11.jpg


45% more performance with GCN GPUs on BF4 will surely impress the heck out of any gamer, especially the budget gamers... we already saw Kaveri run at an average of 30 FPS on BF4 without MANTLE; add 45% to that 30 FPS and it becomes around 43 FPS on average. I predict AMD GCN based GPUs will start selling like hotcakes in the following months.

Your attitude is getting on my nerves... you just can't stop crapping on AMD at every turn: you crapped on Kaveri claiming only 10% more performance, you crapped on MANTLE over and over, and now you crap on TressFX... heck, why are you on this thread if you clearly can't stand AMD? At least you're not raging at every turn, but you have proven nothing except that you are clearly against anything AMD does.

I am not pro AMD; I was even accused of being an Intel fanboy when I posted the news about AMD having no plans for a Steamroller FX CPU, and most of you had diarrhea because of that.

My first PC was an Intel Pentium 2 at 266MHz, my second a Pentium 3 at 600MHz, my third a Pentium 4 at 2.2GHz, my fourth a dual-socket Xeon server at 3.8GHz with HyperThreading, and my only AMD PC is my current Phenom II 980 clocked at 4GHz. I will not support Intel any longer after I learned about all the crap they have done to hurt the industry (not only AMD), and I am also aware of the issues AMD has with their CPUs, but their CPUs get the job done anyway.

But you, sir, you can't stop worshiping Intel and crapping on AMD... and it's just getting annoying; you can't give the slightest credit to AMD... it's just impossible for you.

You, Haffijur and that other guy (can't remember his nickname) have dragged this thread over and over into comparisons of mighty Intel CPUs against the shitty AMD CPUs, and you seriously won't convince me to get an Intel CPU anytime soon; I mean, you guys try sooo hard it seems you're trying to sell all of us Intel CPUs.

 

logainofhades

Titan
Moderator


With the 760k now available, I lost interest in APU. :lol:
 

juanrga

Distinguished
BANNED
During the presentation, AMD showed a slide that explains why they are not making Steamroller/Excavator FX CPUs. The slide shows the evolution of CPU and APU sales and says that in 2013, 9 out of 10 PCs had an APU.

I predicted AMD would pair the Kaveri APU with an R9 graphics card for gaming, and they did...

A10-7850K-vs-Core-i5-4670K1.jpg


My mistake is that I expected a 290, but they used a 270.

HSA in action. In this example HSA brings up to a 5x boost over a quad-core CPU.

Kaveri-APU-HSA-performance1.jpg


Another example of MANTLE's potential. However, don't expect this kind of boost in games soon ;-)

Mantle-vs-DirectX.jpg

 
re/compilation of available kaveri promo slides and bits and pieces of other information:
http://wccftech.com/amd-kaveri-apu-a107850k-gaming-general-performance-unveiled-mantle-45-faster-directx/
http://wccftech.com/amd-kaveri-apu-architecture-specifications-prices-revealed-241-billion-transistors-245mm2-die/

CYBERPOWERPC Debuts Steam OS Powered Gaming System at CES
http://www.techpowerup.com/196490/cyberpowerpc-debuts-steam-os-powered-gaming-system-at-ces.html
the $499 iteration should give some hint on how the cheaper steam machines will be configured.

10Gbps USB3.1 up and running at speed
http://semiaccurate.com/2014/01/06/10gbps-usb3-1-running-speed/

CES 2014: New MSI AMD Kaveri A88X Motherboards and Socketed Kabini
http://www.pcper.com/news/Motherboards/CES-2014-New-MSI-AMD-Kaveri-A88X-Motherboards-and-Socketed-Kabini

i'm a bit disappointed by a10 7850k's $173 (promo slide) price. it has a rather massive price difference with a10 7700k @$119. a10 7850k woulda been a killer deal at $150 (my prediction turned out to be wrong). 7850k's "competitive positioning" looks bold too, reminds me of bulldozer. hopefully the opposite result this time.

amd seems to have started a new version of 'moar cores' with hsa underpinning. they now refer to the whole apu as 12 "compute cores" (4 cpu + 8 igpu, 7850k).
 
Slide 1... So they used a 270 instead of a 290? Then that slide doesn't say anything important, really. Why put a bottleneck in your system? The only reason I can think of right away is that they were trying to hide the CPU bottleneck for traditional games. FISHY.

And I think gamerk has a valid point. The "up to" statements are psychological marketing terms (brain farts? haha). So far, until MANTLE titles reach my own hands, I think no one has proven him wrong, but he isn't right either IMO :p

Anyway, in all fairness, nVidia's presentation was more interesting to me. And they took all the media outlets for themselves. It's really amazing how Jen has the tech news outlets in the palm of his hand.

I'll keep the salt at my side until I replace my Llano A8.

Cheers!
 

in amd's defense (full disclosure: i don't speak for amd), an a10 7850k with ddr3 2133 running an r9 270x pretty much represents a high end apu gaming machine with dgfx. it's similar to the ones current trinity/richland owners are running, so it represents a realistic configuration. it's understood that those users upgraded to 7850/70/7790 or higher after using the igpu or 6670-class gfx for a while. incidentally, this is something quite a few clueless people argued with me about 'back in teh days', claiming whoever buys an apu machine will never have funds to upgrade to gaming discrete gfx. hehe.. :ange:

on the other hand.... why introduce a gpu bottleneck in a promotional showcase event... why indeed.... :whistle: :ange: too bad pure pc didn't post the last slides (i haven't found them). the ones right before the disclaimer slide contain the real bits like full specs, software and o.s. versions etc.

 

Cazalan

Distinguished



I think they're just being realistic. Pairing a $173 CPU with a $189 GPU is far more common than people buying $500 GPUs.
 

blackkstar

Honorable


You seem to have missed the part where Mantle solves the problem of not being able to scale properly to multiple threads.

AMD showed off a 2GHz 8350 performing the same in a game as a 4670K or 4770K (I forget exactly which). But regardless, with the amount of value you place on single thread performance, that should never happen. Yet it did.

There are lots of things that aren't supported but people do anyway. One of which is installing Linux on a laptop. But my point is that just because Microsoft and Sony aren't coming straight out and saying that they support Mantle 100%, it doesn't mean it's not going to get used. It should be extremely obvious why Microsoft doesn't want to support it, as it's a direct competitor to DirectX. For Sony it's just another library for them to have to support. AMD will have to shoulder that burden, and that's the whole point of Mantle: to give API control to the people who know best the hardware that the API is going to be running on.

However, your love for GameWorks while hating Mantle is contradictory. GameWorks suffers from a very similar problem to Mantle, in that GameWorks is going to be optimized for Nvidia, yet Nvidia has no presence in consoles. So to use GameWorks in a multiplatform game, you would isolate Xbox, Playstation, Nintendo, and AMD GPU users on PC.

I said several pages ago that there's going to be a lot of dust kicked up in the transition to a weak, multi-core environment for games. And you seem to be discarding all of that and still burying your head in the sand, thinking that a Pentium G will be the best CPU for the next 50 years because it has decent single thread performance in Battlefield 3 single player.

Mantle solves problems that you seem to be overlooking:

1. Intel and AMD can't make significant single thread performance gains anymore and are relying on adding more cores
2. Game developers don't know how to scale to more cores
3. Mantle abolishes AFR multi-GPU setups

You also seem to think that PhysX and CUDA are viable solutions in gaming for this generation. Using PhysX or CUDA instantly locks you into having advanced features only available to Geforce users on PC. Meaning that all those GCN cores in the consoles, which Sony and Microsoft put there for GPGPU, are going to get wasted, and that work will instead run on an array of weak Jaguar cores.

But it should be obvious Mantle is going to scale differently on different hardware. It's going to scale better on weak CPUs, specifically ones with many weak cores. Which is what game developers need to make the jump we saw in graphics quality from the start of Xbox 360 to the end, but on Xbone and PS4.

 

griptwister

Distinguished
With certain memory brands selling 2666 and 2400 MHz kits cheaper than 1333, 1600 and 1866 memory, APUs are looking pretty good...

I'm a bit disappointed in MSI for not releasing an ATX Gamer Series FM2+ motherboard... But I think I found the mobo for my mATX build...

@logainofhades, Yeah man, the 760K is a baller for sure! Esp with a dGPU.

@gamer3k, Glad to see you agreeing with something us AMD guys are saying, and as much as I hate to admit it, I think you're right on everything. Except MANTLE. I think if MANTLE gains enough speed, Nvidia will hop on the bandwagon. And if Nvidia does, you can expect me to buy one of their GPUs. I don't play favorites. I get what I want, and what I feel best benefits me.

I'm curious to see if Micro center will carry the A10 7700K for cheaper than $120. I can't wait to pick that bad boy up!

I'm also wondering how MANTLE is going to change BF4 with memory. Will faster memory affect the FPS like it does with DX11?
 

here's one of the biggest issues with kaveri launch pricing imo - fx6300. and to a lesser extent, fx6350. if both these two were near or over $150, i wouldn't have much of an issue. however, if amd can really make their "12 compute cores" propaganda a reality for all applications including games, with proper load-balancing where the apu can juggle between cpu, igpu and heterogeneous workloads out of the box well, then 7850k will be the better buy. until then, fx6300 will remain better for price.
edit: i've considered motherboard prices as well.
 
And this claim about AMD paying Devs and Publishers, can you please post proof of this claim? Otherwise, you might as well stop all this BS.

Easily done:

http://bf4central.com/2013/10/amdamd-paid-ea-5-million-battlefield-4-deal/

And given the development time needed to put the Mantle API in place, I have no doubt that it was a part of this deal.

Mantle will be around, like it or not; don't compare closed tech such as PhysX or GameWorks with MANTLE...

DirectX is still hanging around. As is Windows.

Closed versus Open DOESN'T MATTER. All that matters is that the API in question is clean, fast, gets the job done, and is regularly updated. The idea that Open is somehow "better" is simply FUD.

and you seriously think AMD has the money to bribe all those Devs and Publishers? Every million counts for them, and I bet you if they were bribed, each of them would ask for at least a dozen million to support MANTLE,

See above. $5 million to EA. AMD views this marketing as an investment. Pay money to get your API used, which makes your product look better, which gets more people to purchase said products. Not that hard to understand.

AMD cannot afford bribing like Intel and nVidia... to me it is clear that the Devs LOVE MANTLE. Did you not see the AMD Developer Summit where MANTLE got discussed? The Devs who were present were BLOWN AWAY by the claims of DICE.

Show me numbers. Across a wide variety of hardware, from APUs to CF/SLI'd GPUs. Intel versus AMD CPUs. AMD versus NVIDIA GPUs. 40% could just as easily mean going from 10 FPS to 14 FPS on their lowest end APU, after all.

Now tell me how long and costly adding the API was, and how much more money it made for the studio. Performance is only one factor; time, money, compatibility and stability are another four. The Devs may like the increased performance, but if their managers don't see a financial benefit...

MANTLE will be open, but it is not finished as of now, which is the reason it is only GCN at this moment; or did you expect them to make it CUDA first? Come on, I already posted several links and slides regarding MANTLE, and AMD promised to make it 100% open and no longer tied only to GCN as soon as they finish it.

Right, I can easily foresee AMD helping NVIDIA get Mantle working on its cards.

http://www.dsogaming.com/wp-content/uploads/2013/11/Untitled11.jpg

45% more performance with GCN GPUs on BF4 will surely impress the heck out of any gamer, especially the budget gamers... we already saw Kaveri run at an average of 30 FPS on BF4 without MANTLE; add 45% to that 30 FPS and it becomes around 43 FPS on average. I predict AMD GCN based GPUs will start selling like hotcakes in the following months.

You assume performance increases are flat across the entire product line. 40% could just as easily mean going from 10 FPS to 14 FPS on the lowest tier APU.

Your attitude is getting on my nerves... you just can't stop crapping on AMD at every turn: you crapped on Kaveri claiming only 10% more performance, you crapped on MANTLE over and over, and now you crap on TressFX... heck, why are you on this thread if you clearly can't stand AMD? At least you're not raging at every turn, but you have proven nothing except that you are clearly against anything AMD does.

I'm sticking to my 15% CPU performance increases for Kaveri. And I personally think I'm being optimistic.

TressFX isn't used. Period. Even Eidos isn't using it anymore. It's a dead API; get over it.

I am not pro AMD; I was even accused of being an Intel fanboy when I posted the news about AMD having no plans for a Steamroller FX CPU, and most of you had diarrhea because of that.

My first PC was an Intel Pentium 2 at 266MHz, my second a Pentium 3 at 600MHz, my third a Pentium 4 at 2.2GHz, my fourth a dual-socket Xeon server at 3.8GHz with HyperThreading, and my only AMD PC is my current Phenom II 980 clocked at 4GHz. I will not support Intel any longer after I learned about all the crap they have done to hurt the industry (not only AMD), and I am also aware of the issues AMD has with their CPUs, but their CPUs get the job done anyway.

But you, sir, you can't stop worshiping Intel and crapping on AMD... and it's just getting annoying; you can't give the slightest credit to AMD... it's just impossible for you.

You, Haffijur and that other guy (can't remember his nickname) have dragged this thread over and over into comparisons of mighty Intel CPUs against the shitty AMD CPUs, and you seriously won't convince me to get an Intel CPU anytime soon; I mean, you guys try sooo hard it seems you're trying to sell all of us Intel CPUs.

I was getting plenty of these prior to the BD launch. I'm still gloating about being right on that one. And I was correct for the right reasons: knowing how SW works, BD was a poor design for desktop PCs. That's EXACTLY why AMD is moving toward APUs, which is what I said they should do almost 5 years ago.
 

griptwister

Distinguished
@de5_Roy, I agree, at the price point of the 7850K, it isn't a good buy. Yet. I really like the price on the 7700K though. Plus the newer motherboards and support for Excavator are looking quite nice.

IMO, the 7850K should be priced @ $150 and the 7700K @ $110. But I think AMD is charging $170 for i5-level CPU performance... if you think about it, it's looking like a better and better deal than an i5.
 

7700k has a better price... but look at the shader count - 384. right now i guess amd has an a10 7800 positioned between 7850k and 7700k @$130-150. the rest of the features are indeed nice, and upgrade path to carrizo is an assurance some people will seek since am3+'s is stalled (or gone).

price-wise, anything is better than an i5, especially haswell i5 lol. as long as amd delivers, people will buy for the long term. i am not talking about the initial rush.
 
You seem to have missed the part where Mantle solves the problem of not being able to scale properly to multiple threads.

RENDERING threads; CPU-based threads that don't talk to the GPU at all aren't going to be affected here. And yes, multithreaded rendering in DX needs a LOT of work, since it's a PITA to set up right. I expect this to get cleaned up in the next revision though.

There's also still the trend of using one "Big" thread to drive the rendering process, and smaller "worker" threads to handle other processing, which limits scalability.
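To make concrete what I mean by that pattern, here's a rough sketch (purely illustrative, not any real engine's code): the workers can build draw data in parallel all day, but only the one "big" thread ever submits to the API, so submission stays pinned to a single core no matter how many cores you have.

// Illustrative sketch of the "one big render thread + worker threads" pattern.
#include <cstdio>
#include <thread>
#include <vector>

struct DrawCmd { int mesh; int material; };

// Stand-in for the actual D3D/driver submission call.
static void submit(const DrawCmd& c) {
    std::printf("draw mesh %d, material %d\n", c.mesh, c.material);
}

int main() {
    const int kWorkers = 3, kObjects = 12;
    std::vector<std::vector<DrawCmd>> buckets(kWorkers);

    // Worker threads: culling, animation, command building happen in parallel.
    std::vector<std::thread> workers;
    for (int w = 0; w < kWorkers; ++w)
        workers.emplace_back([&, w] {
            for (int i = w; i < kObjects; i += kWorkers)
                buckets[w].push_back({i, i % 4});
        });
    for (auto& t : workers) t.join();

    // The "big" render thread: the only one allowed to talk to the API,
    // so this loop is serialized regardless of core count.
    for (const auto& bucket : buckets)
        for (const auto& cmd : bucket)
            submit(cmd);
    return 0;
}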

AMD showed off a 2GHz 8350 performing the same in a game as a 4670K or 4770K (I forget exactly which). But regardless, with the amount of value you place on single thread performance, that should never happen. Yet it did.

Depending on workload, I could see that. BF3/4, for instance, would tend to do better on AMD since it scales [especially in MP].

There are lots of things that aren't supported but people do anyway. One of which is installing Linux on a laptop. But my point is that just because Microsoft and Sony aren't coming straight out and saying that they support Mantle 100%, it doesn't mean it's not going to get used. It should be extremely obvious why Microsoft doesn't want to support it, as it's a direct competitor to DirectX. For Sony it's just another library for them to have to support. AMD will have to shoulder that burden, and that's the whole point of Mantle: to give API control to the people who know best the hardware that the API is going to be running on.

But it's not Sony that needs to support it, it's NVIDIA and Intel. You are now coding at a MUCH lower level, so the software MUST be optimized for a specific hardware architecture. That's why AMD is only supporting GCN. That's the penalty of removing abstraction.

And historically, developers have ALWAYS chosen ease of development over pure performance; that's why we stopped coding in Assembly.

However, your love for GameWorks while hating Mantle is contradictory. GameWorks suffers from a very similar problem to Mantle, in that GameWorks is going to be optimized for Nvidia, yet Nvidia has no presence in consoles. So to use GameWorks in a multiplatform game, you would isolate Xbox, Playstation, Nintendo, and AMD GPU users on PC.

1: I've never expressed support for GameWorks except to state that it's the same deal as Mantle, as you just stated.

2: Consoles don't support Mantle.

I said several pages ago that there's going to be a lot of dust kicked up in the transition to a weak, multi-core environment for games. And you seem to be discarding all of that and still burying your head in the sand, thinking that a Pentium G will be the best CPU for the next 50 years because it has decent single thread performance in Battlefield 3 single player.

You again assume you can trivially break work up into threads in a coherent manner so as to increase performance this way. And as someone who's actually worked on software, I can tell you this is not the case, and it will NEVER be the case for the majority of tasks out there. If it were, then we'd be running the host OS directly on the GPU by now, and the CPU as we know it would be obsolete. Some tasks simply DO NOT SCALE.
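To put rough numbers on that (just Amdahl's law worked out, not a benchmark of anything real): if only a fraction of the work parallelizes, the speedup flattens out fast no matter how many cores you throw at it.

// Amdahl's law: speedup on n cores with parallel fraction p is 1 / ((1 - p) + p / n).
#include <cstdio>

int main() {
    const double fractions[] = {0.50, 0.80, 0.95};   // share of the work that parallelizes
    const int cores[] = {2, 4, 8, 64};

    for (double p : fractions) {
        std::printf("p = %.2f:", p);
        for (int n : cores) {
            double speedup = 1.0 / ((1.0 - p) + p / n);
            std::printf("  %d cores -> %.2fx", n, speedup);
        }
        std::printf("\n");
    }
    return 0;   // e.g. with p = 0.50, even 64 cores only gets you ~1.97x
}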

Mantle solves problems that you seem to be overlooking:

You assume GPU performance is currently a problem.

1. Intel and AMD can't make significant single thread performance gains anymore and are relying on adding more cores

10% per year isn't THAT bad. And aren't some people claiming that Kaveri is getting, what, 30% more IPC?

2. Game developers don't know how to scale to more cores

Not "don't know how to". "Can't".

3. Mantle abolishes AFR multi-GPU setups

How so? Surely, with all that free processing power, devs will NEVER add a bunch of features to saturate the GPU again, making SLI/CF attractive...

You also seem to think that PhysX and CUDA are viable solutions in gaming for this generation. Using PhysX or CUDA instantly locks you into having advanced features only available to Geforce users on PC. Meaning that all those GCN cores in the consoles, which Sony and Microsoft put there for GPGPU, are going to get wasted, and that work will instead run on an array of weak Jaguar cores.

As already noted, Mantle isn't supported on consoles [which is the point EVERYONE is missing here]. So using Mantle is no different.

But it should be obvious Mantle is going to scale differently on different hardware. It's going to scale better on weak CPUs, specifically ones with many weak cores. Which is what game developers need to make the jump we saw in graphics quality from the start of Xbox 360 to the end, but on Xbone and PS4.

And we get back to my point: 40% on what hardware? 10 to 14, or 100 to 140? Big difference in how we evaluate the results, no?
 

etayorius

Honorable
You're twisting things around, as your own link claims this:

"The deal also gives AMD the right to bundle Battlefield 4 with various graphics card products"

AND

"Part of the deal included AMD giving Battlefield 4 developer DICE "EARLY" access to the new “Mantle” API technology"

See the key word EARLY? Just in case you want to pretend that was never written in the article.

So no, they did not pay EA DICE to use Mantle; they paid them to be able to BUNDLE BF4 with AMD hardware, and as a bonus AMD gave them EARLY access to the MANTLE API. See how you're a damn good expert at twisting things around?

Did you read this?

http://www.polygon.com/2013/11/11/5091726/amd-to-release-tressfx-2-0-animates-grass-and-fur

That was just posted in Nov of 2013. See how most of your claims are wrong or just plain twisted to your own bias? Perhaps it's both wrong claims and twisted facts, that's all you've got... you seem to be very good at that.

TressFX is here to stay... you know why? Because it just works on any GPU. And you know why MANTLE will be around? Because it's not being pushed by AMD (although it is being developed by them) but by DEVELOPERS themselves, those same devs that have always been begging for a low level API (though not so low level that it can't work on other vendors). Can't you remember all the times John Carmack, Johan Andersson and others begged for low level access?

DICE already claims that adding MANTLE support cost about an additional 10% of the game development budget, which Johan Andersson affirms is very affordable, fast and worth the extra effort at the end of the day. You probably did not read about all the Devs getting blown away and amazed by MANTLE at the AMD Dev Summit, but I guess you can speak in the name of a whole bunch of developers who just hate AMD for no apparent reason and would rather not use the AMD APUs inside the Xbone and PS4, haha.

Your words are coming straight from an anti-AMD fanboy, point of fact.

Whatever proof and facts are shown, you just ignore them... damn, it's like having a conversation with a wall, but with a wall you can at least pretend it's listening. Anyway, your 15% prediction on Kaveri seems to be falling apart... just saying.

Even if it's only 10% more CPU performance on average compared to Richland, that's still damn good, but for you it never seems to be enough... heck, for me, jumping from a Phenom II 980 at 4.0 to Kaveri will be a massive boost, plus a shiny iGPU equivalent to an HD7750 as a backup in case something happens to my GTX470. I also plan to OC the heck out of Kaveri as much as I can. MANTLE and HSA will be a nice extra boost in performance in the future.

People serious about the AMD Steamroller debate should really just ignore anything you write; all you have to offer this thread is bad news and negativity, which is kinda sad... one thing is bringing facts and reality about a product, but another is just trying to twist everything around and throw crap... I have yet to see you say something good about any AMD product or technology, just as most of us praise Intel and nVidia when they deserve it.

I'm putting a sticker with the word fanboy on your nickname, just so I can safely ignore anything you write from now on; others can follow suit if they so wish.


[Edit]

Yeah, I was expecting Kaveri to hit at $150-160, but $170 is still affordable, and for the damn awesome boost in performance it's very well worth it, more so if you count HSA and MANTLE as future extra performance. I'm planning on buying an R9 270X to pair with Kaveri.
 

juanrga

Distinguished
BANNED
Therefore, after months of explaining here that next gen games will be "GPU bottlenecked", how MANTLE implies a "GPU bottleneck", why the weak CPUs on the next consoles imply offload to the GPU and a "GPU bottleneck", and that the infamous slide 13 in the October OEM talk was showing AMD's strategy towards gaming in a "GPU bottleneck" situation, people here are surprised about... "GPU bottlenecks".

Funny also how the claims are changing slightly. Some time ago MANTLE BF4 would be, at best, 5% faster than the DX version, and Steamroller would be, at best, 10% faster than Piledriver. Those figures are now changing to 10-15% and 15% respectively.
 