AMD CPU speculation... and expert conjecture

AMD Expects Game Consoles to Account for Nearly 20% of Revenue This Year.
AMD May Change Business Model with Next-Generation Game Consoles
http://www.xbitlabs.com/news/multimedia/display/20130123225745_AMD_Expects_Game_Consoles_to_Account_for_20_of_Revenue_This_Year.html
They'd be selling chips, not technologies, since the x86 licence is non-transferable. So simple, and I totally forgot about it. :pt1cable:

AMD "Richland" Desktop APU Lineup Detailed
http://www.techpowerup.com/179248/AMD-quot-Richland-quot-Desktop-APU-Lineup-Detailed.html
Here it says Richland will have a GCN iGPU. WTF. I wonder if they confused it with GCN because of the 'Radeon 8000' iGPU numbering scheme, or whether there really will be a GCN iGPU... :sweat:

AMD wins two industry awards
http://www.techpowerup.com/179245/AMD-Wins-Two-Industry-Awards.html
 


Wii U sales are already slumping, now matching PS3 sales in Japan.

I'm going to say it again: Wii U = Dreamcast.
 
It appears AMD is playing the Intel game, now that they're finally selling some chips:
Chips Instead of Licenses
Based on unofficial information, both next-generation consoles from Microsoft and Sony will utilize system-on-chips with up to eight Jaguar x86 cores and custom Radeon HD graphics engines. Considering that, due to x86 license agreements with Intel, AMD may have no rights to license x86 designs to any third company and has to build third-party IP into x86-based SoCs itself, there may be reasons why AMD is going to sell chips to console developers and not license technologies.

http://www.xbitlabs.com/news/multimedia/display/20130123225745_AMD_Expects_Game_Consoles_to_Account_for_20_of_Revenue_This_Year.html

Therefore, having a healthy AMD is important to them, and R&D costs can be shared, which also keeps AMD healthy. A healthier business model, methinks, thanks to one RR (Rory Read).
 

Cazalan




The numbers were just a very rough guess; we have no way of knowing the contract details. Microsoft may have bought the designs outright, like they did with the Xbox 360. However, with 28nm capacity rising monthly and 20nm coming online, there's virtually no way the costs would rise over 3-4 years.
 

lilcinw



It was interesting to see that in some games the FX showed less variability than the i7.

What I thought was most intriguing was that Skyrim at the High preset showed higher spikes than the same resolution at the Ultra High preset for both CPUs.

[Chart: Skyrim, High preset @ 5760, CPU frame-latency spikes]

[Chart: Skyrim, Ultra preset @ 5760, CPU frame-latency spikes]
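
For anyone wanting to put a number on that kind of variability: a minimal sketch that summarizes a per-frame-time log. The file name and format are assumptions (e.g. a FRAPS-style frametimes dump, one millisecond value per line), not anything from Tom's test setup.

// Minimal sketch: summarize frame-time variability from a per-frame log.
// Assumes one frame time in milliseconds per line; file name is hypothetical.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <vector>

int main() {
    std::ifstream in("frametimes.txt");
    std::vector<double> ms;
    for (double t; in >> t; ) ms.push_back(t);
    if (ms.empty()) { std::cerr << "no samples\n"; return 1; }

    std::sort(ms.begin(), ms.end());
    auto pct = [&](double p) { return ms[static_cast<size_t>(p * (ms.size() - 1))]; };

    double mean = 0;
    for (double t : ms) mean += t;
    mean /= ms.size();

    // The gap between the 95th/99th percentile and the mean is the "spikiness"
    // the charts show: two CPUs with the same average fps can feel very different.
    std::cout << "mean " << mean << " ms, median " << pct(0.50)
              << " ms, 95th " << pct(0.95) << " ms, 99th " << pct(0.99) << " ms\n";
}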

 

wh3resmycar



Nah, it's Nintendo. Once the flurry of Pokémon/Mario/JRPG games hits that console, (nearly) all those Japanese folks will get one.

A samurai has his sword; a Japanese gamer has his Nintendo.

We can never explain "Japan".


There's a new Tom's article about bottlenecking. Obviously AMD folks will only see the tests where AMD came close, but here's the thing: a Phenom II would be close to PD performance, more or less (or maybe I'm wrong), which negates every little hope for those AM3+ adopters aside from feeding the e-peen.
 


My counter-argument: BF3 isn't THAT good a benchmark, considering how aggressively shaders are pre-loaded. As a result, you are going to see a generally flat graph for any CPU above a certain amount of processing power [with the odd spike here and there]. That matches the graphs, which indicate that, latency-wise, both are about the same.
 


Which again is easily explainable, as Bethesda games are much more strenuous on the CPU, so weaker CPUs will be exposed. This is the type of game that does NOT play nice with BD/PD's architecture.
 
But where it isn't stressed, it shows slightly better.
Also, I'd point out that OCing the Vishera wasn't much good, as the thermals are somewhat maxed out, similar to Nvidia's 6xx-series cards.
I will also say future solutions will gain, as in crashes' link dating back to December, where SR appears to address several more of the limitations both BD and PD have, again following a possible node shrink as well as a better understanding of the newer silicon/doping issues.

I would also point out that in gaming there actually is a "good enough" scenario: if it's smooth and your fps are there, it's good enough. Having 80 fps vs. 100 usually won't make a lot of difference to most gamers who want to save $200, which makes it an outlier/corner case rather than a practical one.
 

noob2222


Not really. From what I gathered, Bethesda is still using pure x87 programming with zero optimization. It helps to know what part of the CPU is being worked.

The thing is, you missed the actual comment about Intel CPUs performing better with the Ultra preset and worse on the High preset: Intel vs. Intel, not Intel vs. AMD.

Just because a GAME shows disparity between Intel and AMD doesn't automatically make it the benchmark of choice for that one reason, as if all other benches were invalid because AMD can't possibly be competitive with Intel.
 

What enhancements added in going to Ultra increase the demand on the CPU?
I'm curious about this, as it could be a number of things, like gating.
 


I'd love to see a source for that. Hard to imagine anyone would waste time using x87 instructions when any compiler (short of MSVC 6.0) would automatically drop in much faster SSE instructions. So I'm guessing you're blowing out hot air again?
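
For what it's worth, the claim is checkable with a disassembler. A minimal sketch of how the same source ends up as x87 or SSE purely via compiler flags; the function is illustrative, not Skyrim's code, and the flags are the GCC/MSVC ones as I recall them:

// Minimal sketch: identical scalar float math compiles to x87 or SSE
// depending entirely on compiler flags; no source changes needed.
float blend(float a, float b, float t) {
    return a + (b - a) * t;   // one sub, one mul, one add
}
// 32-bit GCC:  g++ -O2 -m32 -mfpmath=387        -> fld/fsub/fmul/faddp (x87 stack)
//              g++ -O2 -m32 -msse2 -mfpmath=sse -> subss/mulss/addss   (SSE scalar)
// 32-bit MSVC: cl /O2 /arch:IA32 -> x87;  cl /O2 /arch:SSE2 -> SSE scalar
// On x86-64, both compilers default to SSE, so "pure x87" would have to be a
// deliberate 32-bit build choice; worth verifying on the actual binary before
// repeating the claim.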

The thing is, you missed the actual comment about Intel CPUs performing better with the Ultra preset and worse on the High preset: Intel vs. Intel, not Intel vs. AMD.

Define "better" for me. On Ultra, you have a higher overall latency, but you don't see any of the nasty spikes high demonstrates. Its *possible* the game engine/GPU is coming into play at that setting level, causing some condition thats stalling the CPU. [Interesting thread on said topic: http://forum.beyond3d.com/showthread.php?p=1689708]. This is kinda the sort of thing you'd need real low-level analysis to see what is going on under the hood. This is the exact case I'd throw GPUView on [http://graphics.stanford.edu/~mdfisher/GPUView.html] and see whats going on. I'm suspecting the CPU is stalling due to the game engine design, but hard to say without actually looking.

Just because a GAME shows disparity between Intel and AMD doesn't automatically make it the benchmark of choice for that one reason, as if all other benches were invalid because AMD can't possibly be competitive with Intel.

The reverse is also true: just because a GAME shows the same results between Intel and AMD doesn't automatically make it the benchmark of choice either. Yet you have no trouble holding up BF3, in GPU-bound situations, as "proof" that AMD >= Intel.

The majority of benchmarks show Intel > AMD this generation; therefore, we must conclude Intel > AMD. The other benchmark results are typically explainable, though I'm sure both Intel and AMD have their weird outliers that need more formal investigation.
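
That's also why summarizing the whole suite matters more than any single title. A common way to do it is the geometric mean, which keeps one outlier benchmark from dominating; a sketch with invented scores, not Tom's actual numbers:

// Minimal sketch: geometric mean of relative benchmark scores.
// Ratios are hypothetical (>1.0 means CPU A beat CPU B in that test).
#include <cmath>
#include <iostream>
#include <vector>

int main() {
    std::vector<double> ratios = {1.25, 1.10, 0.95, 1.30, 1.05};

    double log_sum = 0;
    for (double r : ratios) log_sum += std::log(r);
    double geomean = std::exp(log_sum / ratios.size());

    // Unlike the arithmetic mean, one freak result can't drag this around much,
    // which is the point when a suite has "weird outliers".
    std::cout << "geomean speedup: " << geomean << "x\n";
}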
 

jdwii




One of the main reasons AMD is broke is GlobalFoundries, and those payments are done now. Not to mention AMD is getting the CPU+GPU deal for all the consoles, and yes, all the new consoles will pay off; even the Wii U will get decent sales (Nintendo never really needed much third-party support).

On top of that, AMD has been through worse times. The fact that Nintendo, Microsoft, and Sony all signed deals with AMD means even they think it will last at least seven years.




Want to take a bet on that? People said the same thing about the Wii, and we all know what a "failure" that console was. Starting to think your predictions are lacking. Even if they came in third place, that doesn't necessarily mean Nintendo is going to crash, not to mention their 3DS is very successful.
 


Tom's makes a great point about the "value proposition" when you take a look at the whole thing.

The price difference between the FX and the i7 (and i5, for that matter) starts to disappear once you count in all the other components, in terms of perf/price. The rule would be: if your other components are skyrocketing the overall price, then the price difference of the CPU is insignificant. Big numbers rule, I'd say.
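
To put rough numbers on that rule; all prices here are illustrative, roughly 2013 street prices, not quotes:

// Minimal sketch: CPU price delta as a share of total build cost.
#include <iostream>

int main() {
    double fx8350 = 200.0, i7_3770k = 330.0;   // hypothetical CPU prices
    double rest_of_build = 1500.0;             // GPU(s), case, PSU, RAM, storage...

    double delta = i7_3770k - fx8350;
    double share = delta / (rest_of_build + i7_3770k) * 100.0;

    // On a ~$1.8k build, a ~$130 CPU gap is only a ~7% swing in total cost:
    // the "big numbers rule" from the post above.
    std::cout << "CPU delta $" << delta << " = " << share << "% of the build\n";
}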

It's like when you buy a Lambo: if you think "aah, the shark leather seats add 10K more to the price tag" and then say "the base price is 450K anyway...", those extra 10K shouldn't sway you away from getting them, unless you're really tight on money and wanted the shark leather in a Golf GTI instead, which is around 40K (in which case, the point of a high-end build is moot).

And yes, more car analogies! Yay!

Cheers!
 

Cazalan



Not really the same if you're on a fixed budget.
$130 is the difference between a 7850 and a 7950. (1024 vs 1792 Stream Processors)

Or a 20" LCD and a 27" LCD.

It's more like: do you buy the higher-end Maxima base model, or do you get the Altima with a high-end, tricked-out package?
 


Or an Evo GSR and an MR: a big price difference, but a few perks you might not even want. Or a Camaro ZL1 and an RS? Haha.

Yes, you do make a good point as well. Still, the rule stands: if you're going with dual 7970s as a base and a $300 case, then the CPU price difference becomes less relevant, if you ask me. It depends on what you actually want to build around a certain config and set of components, I guess.

But man... Shark leather seats... Holy crap.

Cheers!
 

lilcinw

@gamerk316

That is where I was going with it: what in the engine is causing higher latency spikes with less GPU load? You won't hear me argue that an i7 isn't what you want for a top-of-the-line gaming rig/e-peen.

I do agree with Cazalan, though, that for total value in a mid-range system AMD has a very good value play. (Full disclosure: I run an 8320 with dual 560 Tis, though the build was not designed solely for gaming.)
 

jdwii

I want to see what Haswell brings to the table in benchmarks: whether Intel was being a bit generous with the "GT 650M-like" graphics performance claims, and what the CPU side looks like.

If it's past AMD levels then I'll call it game over, unless Intel puts their new graphics on the i7 only and not on an i3, which is what the A10 is usually fighting.



In the conclusion of this article:

http://www.tomshardware.com/reviews/fx-8350-core-i7-3770k-gaming-bottleneck,3407-9.html

Our benchmark results have long shown that ATI's graphics architectures are more dependent on a strong processor than Nvidia's. As a result, we usually arm our test beds with high-end Intel CPUs when it comes time to benchmark high-end GPUs, sidestepping platform issues that might adversely affect results designed to isolate graphics performance.

Why would AMD have it this way?

Honestly, I would like someone to explain how this is even possible.
 

Cazalan



GT3 will be on quad cores only (i5/i7); i3 will be limited to GT1/GT2.
 