AMD CPU speculation... and expert conjecture


If so, HEDT SR is guns-a-go-go!
Speaking of HEDT, I just went with a friend to pick up an 8320 at Microcenter for his new ROG build, and was a bit disappointed. The store display for the 8320 has the famous tin box (unlike the 8350, which had a CB box). But of course, the actual 8320s were in the CB box, so that was that. Now to wait for the other half of his parts to arrive...

Fail hour: http://www.youtube.com/watch?v=P4Hp0xQhJwg
 


Looks like my friend was lucky enough to pick up a tin! 😀
 

AWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWWW!
 


Why would I need a 290X? My two 6970s outperform one 290X.

Proof
 
^^ Those are calculated benchmarks, with a flaw.

Please note that the above 'benchmarks' are all just theoretical - the results were calculated based on the card's specifications, and real-world performance may (and probably will) vary at least a bit.
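For the curious, here's a minimal Python sketch of how such spec-derived "benchmarks" are typically produced, and why they're flawed: a peak-GFLOPS ratio from shader count and clock ignores memory bandwidth, drivers, and architecture. The shader counts and clocks below are approximate paper specs, purely for illustration:

[code]
# Naive spec-based performance estimate. Illustrative only:
# peak GFLOPS = shaders * 2 ops/clock * clock, which ignores memory
# bandwidth, driver overhead, and architectural differences.

def peak_gflops(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz / 1000.0

hd6970 = peak_gflops(shaders=1536, clock_mhz=880)    # approx. paper specs
r9_290x = peak_gflops(shaders=2816, clock_mhz=1000)  # approx. paper specs

print(f"HD 6970 peak: {hd6970:.0f} GFLOPS")
print(f"R9 290X peak: {r9_290x:.0f} GFLOPS")
print(f"Naive ratio:  {r9_290x / hd6970:.2f}x  <- not real-world performance")
[/code]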

Core speed:
6990: 830 MHz (x2)
290X: 800 MHz

6970 PCS+ cards are clocked at 970 MHz; the 290X is clocked at 1000+ MHz.

It's also 4 GB vs. 2 GB, so high resolutions will favor the 290X. Performance-wise they are near identical when CrossFire is enabled, and you don't need CrossFire to get the 290X to perform.

As long as you're happy with them, there is no reason to upgrade. I want to get my money's worth from old parts instead of waiting until their price is $20.

In the long run, if you had waited until they were $150 each to buy them, it turns out the same: I paid $300 each and can sell them for $150, so the net cost is $150 per card either way.

http://www.techpowerup.com/reviews/AMD/R9_290X/27.html

One 6970 = 52% of a 290X in quiet mode. CrossFire driver overhead puts the 290X slightly faster than 2x 6970 (real-world tested).

Company of Heroes 2 is no basis for comparison, since it doesn't support CrossFire.

Plus, in favor of 6970 CF: right now it's $300 vs. $550-580 for the same performance 80% of the time.
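A quick back-of-the-envelope check on that claim (the 52% figure and prices come from the post above, while the ~95% CrossFire scaling factor is just an assumption for games where CF works):

[code]
# Rough price/performance arithmetic for 2x 6970 CF vs. one R9 290X.
single_6970 = 0.52     # one 6970 ~= 52% of a 290X in quiet mode (from post)
cf_scaling = 0.95      # assumed scaling when CrossFire actually works

cf_pair = 2 * single_6970 * cf_scaling
print(f"2x 6970 CF ~= {cf_pair:.0%} of one 290X")   # ~99%: a near tie

# Dollars per "one 290X worth" of performance (used prices from the post):
print(f"6970 CF pair: ${300 / cf_pair:.0f}")        # ~$304
print(f"R9 290X:      ${550:.0f}")
[/code]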
 


To each his own I guess...

I'd rather sell my HD 7870 and invest in an R9 290X than do CrossFire with two GPUs with 2 GB of VRAM each. Then again, what I plan on doing with YouTube makes a GTX 770 4GB look like an enticing deal. I really don't want to go out and buy a capture card if it comes on the GPU already... but then again, Mantle would make full use of my future CPU performance. I pray AMD works an encoder into their cards soon.

My HD+ monitor demands not only GPU power, but MOAR VRAM 😀 lol

(The Q6600 inside, don't judge me. Lol, I'm working on getting a new processor now. IMO, it is worth it.)
 


First, I gave that link earlier, in the very message you are replying to. Pay attention to this part of it:



Indeed, as said above:



Also, as said above:



Moreover, I didn't say this in the message you are replying to, but I said it before in the thread: AMD is in a very different financial position than Intel. Intel can afford to make products for enthusiasts and sell them at premium prices; AMD lacks the money for that. It only returned from the red one quarter ago.

People can continue [strike]fantasizing[/strike] speculating about a 10-core 5GHz SR FX CPU, but there is none.
 


I.e., you don't comment on AMD's latest talk, you again pretend to hide what you said, and you continue doing SOI/GloFo marketing... LOL
 


Thanks for being the first to comment on AMD's latest material.

Look at the benchmark. Approximate numbers:

A10-6790K: 53 FPS
FX-6350: 55 FPS
FX-8350: 57 FPS

Do you really believe that slide means AMD will be releasing new 8- and 10-core CPUs?

A 10-core makes no sense, therefore I am ignoring it. New 8-core and 6-core CPUs are still possible... as a Piledriver refresh a la Warsaw, but if that is the plan, you don't give a talk that basically says: hey, don't buy an 'expensive' FX-8350, because a cheap A10 does almost the same work. You can obtain about 92% of the gaming performance (roughly 53 of 57 FPS above) for half the money.

Please note that the slide considers 1080p. At higher resolutions the differences will be even smaller.

For me, that slide settles the question raised here about what AMD will promote to pair with the new R9 Radeons. I said that AMD was migrating to APUs, and several people here complained that AMD needed new CPUs for the new R9 Radeons.

Look at the slide. They are pairing the new R9 280X with the new A10-6790K APU. My bet is that during the Kaveri presentation AMD will show benchmarks of the new Steamroller A10-7x00 APU with the R9 290X.
 


Go back to the original, because nobody in this thread said what you claim. It is just another instance of you making things up in your head.
 


In a game that uses a GPU-demanding engine... sure. Go ahead and benchmark Crysis 3 or BF4 multiplayer and see what it looks like.

8350 > APU

My last post was in response to your request for proof that SOI is that much better than bulk.

Since you clearly were unable to find the information on your own... (or didn't bother to look at the articles comparing the two substrates)... I posted it for you, from multiple sources.

EDIT: This bodes well for GF:

http://semiaccurate.com/2013/10/29/ana-hunter-leaves-samsung-global-foundries/
 
The 290X CrossFire does better than Titan SLI, with smoother gameplay (source: HardOCP).

Maybe not Steamroller news, but things are falling into place for AMD.

OK, great. How much money are they making per unit sold? I can't help but think AMD is sacrificing profits for market share, and let's face it, people who buy these cards buy on price/performance; whether they go NVIDIA or AMD changes on a generation-to-generation basis, making market share a useless goal.

I can guarantee NVIDIA makes more per unit sold, so even if they sell fewer units, their GPU business is doing better than AMD's.
 


You're not considering die size and yield. NVidia has to make more per GPU because yields are likely quite a bit lower on that massive GK110 die, as opposed to AMD's Hawaii XT die, which is roughly 30% smaller, meaning yields can potentially be much better.

Remember the GTX 470/480? Now NVidia is having to sell a similar part at bargain-bin pricing to compete. Analysts even predict AMD will exceed 40% market share by Christmas. Loyalty is fickle in PC gaming, but AMD will still be selling more GPUs this quarter than NVidia, and it seems like they are banking on $X × 1.5N > $Y × N: selling 1.5 times the units at a lower price nets more than fewer, pricier units. As long as the margins remain around 30-40%, they should be making money hand over fist.

Additionally, with the rest of the R series being rebadge/rebrand jobs, they are likely making good margins on those at their current price points... (which almost obliterate NVidia pricing across the board).
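To put some rough numbers behind the die-size/yield argument, here is a minimal sketch using the textbook Poisson yield model Y = exp(-D × A). The defect density is an assumption for illustration (not foundry data), and the die areas are approximate public figures:

[code]
# Poisson die-yield model: smaller dies yield disproportionately better.
from math import exp

defects_per_cm2 = 0.4   # assumed defect density, illustrative only
gk110_area = 5.61       # cm^2 (~561 mm^2, approximate public figure)
hawaii_area = 4.38      # cm^2 (~438 mm^2, approximate public figure)

y_gk110 = exp(-defects_per_cm2 * gk110_area)
y_hawaii = exp(-defects_per_cm2 * hawaii_area)

print(f"GK110 yield:  {y_gk110:.1%}")    # ~10.6%
print(f"Hawaii yield: {y_hawaii:.1%}")   # ~17.3%
print(f"Hawaii gets {y_hawaii / y_gk110:.2f}x the good dies per unit of"
      " wafer area, before even counting the extra dies per wafer")
[/code]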
 


When the 8350 suffered with older games, your typical answer was to benchmark the newest games, like Crysis 3. Now your answer is to benchmark older games, like Crysis 3... LOL

I am rather sure that AMD is relying heavily on BF4 lately, because it is the 'prototype' of next-gen games. The next consoles having weak cores forces developers to offload heavy stuff to the GPU.

The slide shows the APU giving ~92% of the FX-8350 performance in that gaming test. Tell us: what will the performance be in "multiplayer"?



There was no such request. My only request was to know if you are GF PR or something, because you look like it.

The rest of my post was a demonstration of how the 100MHz delta in OC between the FX-6 and the i5 destroys your 'theory' about how badly Intel chips OC due to choosing bulk.

I want to add that Kaveri engineering samples clocked at 3.5GHz have been shown.
 


[image: bf4_cpu_geforce.png]


Athlon X4 750K = Trinity. It's found at the bottom by the Pentium Gs: overclocked to 4.5 GHz, it still gets beaten by an FX-4350 @ 4.2 GHz and completely obliterated by the 8350.

You really need to quit trying to live by marketing slides. Best case scenario != all cases.

while (juanrga == wrong) { printf("JUANRGA IS NEVER WRONG"); }
 





GK110 is nearly 30% larger. AMD is probably making MORE off the 290X than Nvidia is making off the new GTX 780 and GTX 780 Ti, while being faster AND selling for less.

Nvidia got absolutely humiliated by Hawaii this round. It's why:

1. Important people have been fleeing Nvidia in the last month or so

2. Nvidia has been having this large, haphazard release of new products

3. Nvidia decided to play all its "non-gaming" cards at once (G-Sync, ShadowPlay, etc.)

Nvidia's only selling points right now are putting a more expensive cooler on reference models to justify the price difference and touting their software.

Nvidia is the one in massive trouble, and they don't have anything to respond with other than cherry-picked GK110 dies.



The top stock 8-core part (FX-9590) is 115% faster than the top APU part on that graph. I'm ignoring the FX 2M/4C parts because they have L3 cache.

APUs are not competitive at high end gaming. You might find situations where they work fine, but they won't work fine as often.

It's the same as why you should go FX 6300 instead of Core i3. They may tie or the i3 may win a little bit, but when the FX 6300 wins it wins by a lot.

The same holds true for FX 8-core parts. I guarantee you could find situations where the FX-9590 barely outperforms a high-end APU, but all someone has to do is find one situation where they don't perform the same, and you can conclude that they are NOT the same and that one does offer benefits.

I don't get why some of you have a hard time grasping this. If you claim that X is better than Y and show proof of it, and then someone else shows proof that X and Y perform the same in some cases, X is still better than Y. Evidence of a tie somewhere does not cancel evidence of a win elsewhere.

Think of it this way: a review has 5 gaming benchmarks. Product X wins 3/5 and ties 2/5. You would not say the products are even just because they tied 2 of 5 benchmarks.

I would not even say they were even if they tied 9/10 and product X won one!
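That tallying logic is trivial to make concrete; here is a toy Python version (the benchmark names and FPS numbers are made up purely for illustration):

[code]
# Toy win/tie/loss tally: ties don't cancel wins. If X never loses and
# wins at least once, the review as a whole favors X.
results = {                 # benchmark -> (fps_x, fps_y), made-up numbers
    "Game A": (60, 50),
    "Game B": (72, 60),
    "Game C": (45, 38),
    "Game D": (90, 90),     # tie
    "Game E": (55, 55),     # tie
}

wins = sum(x > y for x, y in results.values())
ties = sum(x == y for x, y in results.values())
losses = sum(x < y for x, y in results.values())

print(f"X: {wins} wins, {ties} ties, {losses} losses")
if losses == 0 and wins > 0:
    print("Verdict: X is better than Y, despite the ties.")
[/code]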
 
^^ The same (pclab) bench shows a 3.5 GHz IVB Core i3-3250 marginally beating a stock FX-6300, an overclocked FX-4300 (4.7 GHz), and an OC'd Phenom II X4 965 BE (4 GHz) in minimum FPS. Too bad frame-time and core/thread usage data weren't available; that would have been very interesting. Moreover, the game (MP mode) seems to cause L3 cache hits in AMD CPUs.
I consider that the baseline Kaveri has to beat (Intel) with its 4-thread execution capability.
 
AMD needs market share. With that, the money will start flowing into their R&D (I hope) and the CEO's yachts and beach houses, haha. Probably a big chunk will go to the latter, lol.

Anyway, since AMD/ATI decided to stop going big with their dies, they've been able to compete properly against nVidia (not winning all the time, mind you), so I don't think the trend will change dramatically unless nVidia goes the same route as AMD. In fact, isn't GK104 precisely that?

Cheers!
 


Crysis 3 and BF4 are not older games...are you feeling well?

92% performance in what? League of Legends? StarCraft? Diablo 2? Bioshock Infinite? MS Office? None of those is particularly demanding, and that falls directly in line with what I would expect. Let's talk about the examples where it's not 92%, where reality sets in and you realize it's really closer to 60% of the 8350's performance, when the core counts matter.

Also, if you read the post I made before, the data was extremely relevant: Intel has had to tweak the process for the bulk wafers they use, with tons of process advancements to the FinFET structures, in order to compete on performance. Even then, when they aim for lower power, they end up sacrificing performance. Industry experts say FD-SOI allows you to have both. So why not use it?

I am not GF PR, quite the opposite in fact. They have blundered many times; however, they are AMD's last hope for a decent process for HEDT. So, like it or not, we have to hope they get their act together so that AMD does well in HEDT.

The simple fact of the matter is this: bulk provides no benefits and many drawbacks, while FD-SOI provides many benefits and virtually no drawbacks. If AMD is going to compete with Intel on bulk wafers, they need to go to FinFETs with 3D transistors and all the other tweaks Intel had to put in to get their bulk wafers to do what they do now.

In fact, if you read the example I gave, they even speculated that Intel's process adjustments to bulk likely made their FinFETs more expensive to produce than if they had just gone to FD-SOI and been done with it. The truth is, Intel was likely so committed to the direction they chose that they were past the point of no return by the time they figured out the cost was roughly the same after all the process improvements needed to make bulk wafers work.

It's also the reason you see far more variability in Intel CPUs than in comparable AMD CPUs. I don't have standard deviation numbers to prove the point, but we all know the "silicon lottery" is quite a bit riskier on Intel products. What else do you attribute that to?
 


GK114 is what you're referencing, IIRC, though that project was basically derailed, from what I recall of SemiAccurate's discussion of upcoming NVidia projects.

In short, NVidia has no answer to Hawaii XT until Q4 2014, and it won't be anything impressive by the time AMD refreshes again. In fact, if AMD plays their cards right and launches a new GPU architecture in Q3 2014, then NVidia's new architecture may not even be competitive when it finally launches the next quarter.

Wouldn't that be interesting? "Look! It's the new GTX 880 benchmarks! It's... it's... quite a bit slower than the R9-390X... well... it has ShadowPlay 2.0 and G-Sync 2.0, and outdated HDMI ports though, so at least there's that... and it comes with a couple of Unreal Engine 3 games that use PhysX."

 

Fixed 😛
 