AMD CPU speculation... and expert conjecture

Status
Not open for further replies.

noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860


I'm not interested in microscopic mathematical details. I'm not interested in absolute averages. I'm interested in situations that will make you want to throw your computer out the window because it runs like crap. Yes, it may run OK when it's not being pushed to the limits, but when you need the performance and it's not there, how happy are you going to be about buying a mediocre product solely on the strength of marketing hype?

http://translate.googleusercontent.com/translate_c?act=url&depth=1&hl=en&ie=UTF8&prev=_t&rurl=translate.google.com&sl=auto&tl=en&u=http://pclab.pl/art55028-3.html&usg=ALkJrhhkFMvigVIfEg1sFfDmQQ2fdZqxsQ

http://www.anandtech.com/bench/product/675?vs=700

DAO = 82%, DoW II = 80.8%, BF4 = 84% (750K @ 4.5GHz ≈ 4300 @ 3.8GHz)



And this is supposed to perform like an FX-6xxx, in your opinion? I say Kaveri might reach the 4350 in BF4, has no chance against the 6350, and will be priced the same as the 8320.
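For what it's worth, percentages like the "DAO = 82%" figures above are just ratios of average frame rates between the two chips in a given game. A minimal sketch (the FPS numbers here are hypothetical placeholders, not taken from either linked review):

```python
# Sketch: how relative-performance percentages like "DAO = 82%" are derived.
# The FPS inputs below are hypothetical, not from the linked benchmarks.
def relative_perf(fps_a, fps_b):
    """Return chip A's average FPS as a percentage of chip B's."""
    return 100.0 * fps_a / fps_b

print(round(relative_perf(41, 50), 1))  # -> 82.0
```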

Interesting point though: so why did that other site use Dragon Age? Likely because it shows the bad side of the situation, instead of focusing all attention on "omg look how great this is, we must only show promising comparisons and ignore anything negative".
 

Ah. The 900D has absolutely no place in a regular ATX build :p Is stereoscopic rendering performance better on a 6970 or 5870 than on the 580? That might be able to hold you over, at the cost of using air cooling. Hopefully Steamroller FX will come out in the next year; they always keep quiet... In AMD news, they are getting ready for the APU13 Summit, hopefully with some inside Kaveri news :bounce: https://plus.google.com/+AMD/posts
PS: Pics of the build? The thread needs something to keep it occupied even moar while the news flows :p

 


And again, the point got missed. I was pointing out, specifically, how the CPU placement within the PCLab benchmarks flipped based on the choice of GPU. I also noted that the NVIDIA results, in terms of CPU placement, matched GameGPU's, and speculated that there was an AMD GPU driver bug, which in fact there apparently was.

I was NOT comparing the GameGPU results with PCLab's 7970 results, which is what you inferred.
 


Lost the pics in an HDD crash a while back, but I can take new ones when I get time. 3D gaming is by far best done on an NVIDIA solution. They've been doing it since the original GeForce GPUs; it was an extra piece of software you used to have to download from their site. I originally had a set of glasses that wired into the VGA connector's V-SYNC line and used a 17-inch NEC monitor at 800x600@85Hz (42.5Hz per eye). The original Unreal Tournament and AvP2 were amazing with that setup.

Their 3D Vision kit makes it much easier to get the whole thing working. Now the remaining problem is software programmers' overuse of 2D overlays for the HUD instead of rendering it as a 3D object with depth.
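As an aside, the "85Hz (42.5 per eye)" figure above falls straight out of how frame-sequential shutter glasses work: the display alternates left-eye and right-eye frames, so each eye sees half the monitor's refresh rate. A trivial sketch:

```python
# Frame-sequential (shutter-glasses) stereo alternates left/right frames,
# so each eye effectively sees half the monitor's total refresh rate.
def per_eye_refresh(total_hz):
    """Effective refresh rate per eye for frame-sequential stereo."""
    return total_hz / 2

print(per_eye_refresh(85))  # -> 42.5, matching the 800x600@85Hz example
```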
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


Well, show me the part that says the two cards were benchmarked at the exact same part of the game.

You're making the claim that the only reason these results differ is the graphics card used.

We're suggesting that the benchmarks are from different parts of the game: one that uses many threads and cores effectively, and one that depends more on single-threaded performance.

You're putting your fingers in your ears and going "ITS THE GRAPHICS CARDS!@!!!!!" after we've introduced a counter-argument to your argument.

Prove the two runs were identical and then we can talk.

[Image: bf3-avg.png — BF3 average FPS chart]


The 7970 is FASTER in BF3 than the GTX 770, yet now you are implying that these different levels of performance are strictly due to the graphics card used and not the location in the game.

If it is strictly due to the graphics card, how come the i7-4770K is 12% slower in the second benchmark, when the 7970 usually beats the GTX 770 in BF3? You think DICE is making GCN run worse this generation or something?

Find me a graphics-card-only benchmark of BF4 comparing the GTX 770 and 7970 that has the GTX 770 12% or so faster, and then we can talk.

Until then, I'm going to assume the difference in FPS comes from benchmarking a different part of the game, as that better explains both the CPU results and the graphics card results.
 

con635

Honorable
Oct 3, 2013
644
0
11,010
I find these discussions fascinating; listening to the AMD vs Intel wars gives me more knowledge of how things work nowadays, as long as they stay semi-civil, lol.
One question on that, though. I've been away from computers and PC games for about 6-7 years, so from a relative 'layman's' perspective: when comparing the two big brands, why are more expensive Intels compared against lower-priced AMDs? I mean, if I had at *most* 125 pounds to spend on a CPU, my AMD choice from the only places I can buy would be an FX-8320, and at 3 pounds cheaper the Intel would be an i3-4340. Should that not be the true 'fair' comparison? At the end of the day we buy what we can afford, regardless of core count or clock speed, because for most people overall price/performance is what dictates the purchase.
 

Go for the 8320 TBH, it is much better suited to future games and apps. The thread turns into a flame war quite a bit.
 


I'll be that guy and tell you to start a new thread for a build, but for the sake of discussion (not fights) I'll answer anyway, lol.

Intel sets its price points with a bit more freedom, so I wouldn't say the two are on equal footing. Intel has the upper hand on performance and power (higher performance at lower power, in other words), so they set the price structure for the various performance tiers.

That's why you see AMD put the 8350 in i5 K-series territory: they don't have the CPU muscle to compete directly in the higher tiers (i7 Ks and Xs).

We can spend months getting nowhere discussing what's better, but at the end of the day, for the software you have now, Intel products fare better at higher price points (the i7s particularly). But since AMD enters at such a low price point, they are indeed attractive in certain specific scenarios where they're very competitive performance-wise (on the P/P ratio, if you want).

Like you said, a locked i5 versus the 8320 at the same price point: AMD wins hands down for me. But if you want more performance and you're willing to pay for it, Intel has you covered (with an evil grin, haha) in the form of the i5 Ks or their bigger siblings, the i7 Ks and Xs. It all depends on what you want the performance for.

Going lower in price is trickier, because the APUs come into the mix. So there you have it; that's the way I see it.
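The price/performance (P/P) comparison being discussed can be sketched as a simple score-per-pound ranking at a fixed budget. All prices and benchmark scores below are hypothetical, chosen only to illustrate the idea, not taken from any review:

```python
# Sketch of a price/performance (P/P) comparison at a fixed budget.
# Prices (GBP) and benchmark scores are hypothetical placeholders.
candidates = {
    "FX-8320": {"price": 122, "score": 620},
    "i3-4340": {"price": 119, "score": 480},
}

def perf_per_pound(c):
    """Benchmark score per pound spent."""
    return c["score"] / c["price"]

# Rank the candidates by P/P ratio, best first.
ranked = sorted(candidates, key=lambda n: perf_per_pound(candidates[n]),
                reverse=True)
print(ranked[0])  # the best value chip under these made-up numbers
```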

Cheers!
 

8350rocks

Distinguished


That is what I have been saying: when you look at what you get from Intel for similar money, it isn't even funny.

Yes, the Intel crowd has to compare more expensive CPUs (typically significantly more expensive) against cheaper AMD CPUs.

They would argue the 4770K beats the 8350 at many things, never mind the 50% cost increase. For that much more money it should run away with everything in a landslide; however, it doesn't...

This may be the most astute "layman's" question I have seen in quite a while.
 

Nailed it™

 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


Good luck evaluating the gaming performance of a chip using a review that covers only _two_ games, can't round figures or compute the average of two percentages, and thinks it reviewed a "Verisha" CPU.

Good luck also with using that funny anandtech database.

:sarcastic:
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


I got your point, but it seems you missed mine. I was not replying to you; I was replying to him. He commented that one of the benchmarks looked better threaded than the other. I simply explained why: pclab.pl used the "Domination" mode, whereas GameGPU.ru used the "Conquest" mode.
 

juanrga

Distinguished
BANNED
Mar 19, 2013
5,278
0
17,790


The answer to your question: for the same reason that Intel fanboys cite single-threaded applications or outdated SuperPi scores in AMD vs Intel comparisons. A SuperPi score makes an Intel chip look better than it is in reality.

There is another, closely related question. Gaming comparisons of CPUs almost always use the same GPU for every CPU. But if you save some money with a cheaper CPU, e.g. an 8350 instead of a 4770K, you can spend more on the GPU, obtaining a better gaming machine in most cases, especially at higher resolutions dominated by GPU bottlenecks.
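The point above is just fixed-budget arithmetic: at a set total spend, a cheaper CPU leaves more money for the GPU. A minimal sketch, with the total budget and CPU prices as rough hypothetical figures:

```python
# Sketch: at a fixed total budget, a cheaper CPU frees money for the GPU,
# which matters most at GPU-bound resolutions. All dollar figures are
# hypothetical, used only to illustrate the trade-off.
TOTAL_BUDGET = 550

def gpu_budget(cpu_price, total=TOTAL_BUDGET):
    """Money left for the GPU after buying the CPU."""
    return total - cpu_price

# e.g. a ~$200 FX-8350 vs a ~$340 i7-4770K at the same total budget:
print(gpu_budget(200))  # -> 350 left for the GPU
print(gpu_budget(340))  # -> 210 left for the GPU
```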



To be fair, this happens under Windows 7, with its bad scheduler, which doesn't even understand the CMT architecture of the 8350, and with most Windows software, which is poorly threaded. Under Linux, with Linux software, the 8350 performs at the same level as the i7-3770K.

New developments such as Mantle, and the new generation of games developed for 8-core consoles, will mean the FX-8350 performs better in new games.
 


Discussed a few pages back. FM2+ is the new socket that everything is most likely going to be on. There is no reason to make another socket as FM2+ can do everything that AM3+ can do.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780




I disagree. I think it's split into two factions: those who think AMD is going APU-only, and those who think AMD is doing APUs first and high-performance desktop (HPDT) later.

A 2-module/4-core chip is not enough to push a big gaming rig, and AMD is still actively promoting FX CPUs paired with their GPUs as a gaming platform for OEMs.

I personally lean toward AMD unifying server and dCPU on a platform that supports HSA.

However, I feel AMD isn't doing anything with dCPU right now because their main goal is to drive up the APU install base so people will write HSA-enabled applications for the hardware; if there's no install base, no one will write software for it. And dCPU is too difficult a sell because AMD doesn't control the whole platform.

When AMD sells an APU they sell an HSA enabled platform.

If AMD were pushing CPUs, they'd sell a possibly HSA-enabled system, but only if the customer also bought an AMD GPU.

If AMD were pushing GPUs for HSA, they'd sell a possibly HSA-enabled system, but only if the customer also bought an AMD CPU and not an Intel.

So you see where this is going? If AMD were pushing dCPU right now, they would have to convince everyone that you need an AMD CPU + AMD GPU for HSA software that won't arrive for years.

Instead, AMD convinces people to buy APUs because the graphics performance is great, and as a side effect they create a large install base for HSA-enabled applications. Once the software is there, AMD can release an HSA-enabled dCPU configuration, and people will want it because there are useful HSA applications they can run.

At least, that's how I feel about it. But AMD has been completely silent on it for now so everyone is left guessing.

But we have gone down this road before, and it takes many pages. The TL;DR of the two viewpoints:

PRO-APU: APUs make up most of AMD's sales, so AMD is just going to abandon the markets where it isn't performing so well, which happen to be dCPU products like FX.

PRO-dCPU: AMD is focusing on APUs to grow the HSA install base, OR AMD is focusing on APUs because it has to, since a GPU + CPU in a single socket will never compare to a 300mm^2+ CPU paired with a 300mm^2+ GPU.

The choice is yours, but AMD SHOULD say something official about the dCPU FX platform between the end of October and the middle of November.

Until then, just be patient and see what AMD does. They could go either way, to be honest. No one has said anything definitive, and the rumors range from "everything other than FM2+ is dead" to "AMD is preparing 6m/12c 32nm Piledrivers with HDL libraries." I did the math, and that is entirely possible. It's also entirely possible AMD will only sell APUs from now on and settle for making GPUs to run with Intel CPUs.

But just let it go for now. The two sides will fight like crazy, and we already have a fight between someone saying it's impossible to write a game engine that properly uses 8 weak cores and those who think it is possible.
 

etayorius

Honorable
Jan 17, 2013
331
1
10,780
This Chilean site claims to have had a chat with AMD; supposedly AMD will stick with AM3+ through 2014 and release newer Piledriver CPUs:

http://www.chw.net/2013/10/amd-no-descontinuara-su-linea-de-microprocesadores-fx-series/

Good or Bad News?
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


Uh, maybe seronx was right about a PD refresh using HDL libraries.

For reference, I did the math: a 6m/12c die with HDL could end up a little bigger than the Vishera die. 4m/8c Piledriver can compete with the 3770K, so 6m/12c PD should be able to compete with the 3930K rather well, especially given the 3930K's lower clocks compared to Intel's quads. AMD could make a killing selling 3m/6c, 4m/8c, and 5m/10c harvested parts if the die size stayed similar to Vishera's. Imagine a 6m/12c part hitting the $400-$500 range and the rest trickling down.

Then AMD is making $400 or $500 off a high-end ~330mm^2 die instead of $200 on the 315mm^2 die, and the harvested parts can go for $200 instead of the $120 the FX-6300 goes for. It's a great way for AMD to make good money off existing IP, and I'm pretty sure AMD has mentioned wanting to make more money off existing IP before...
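The back-of-the-envelope math here appears to be: scale the 315mm^2 4-module Vishera die linearly up to 6 modules, then apply the density gain of the high-density libraries. A sketch under those assumptions (linear module scaling and the ~30% HDL shrink are my guesses, not stated in the post):

```python
# Rough sketch of the 6m/12c-on-HDL die-area estimate alluded to above.
# Assumptions (mine): module area scales linearly with module count, and
# high-density libraries (HDL) shrink the logic by roughly 30%.
VISHERA_AREA_MM2 = 315   # 4-module/8-core Piledriver die, from the post
HDL_SHRINK = 0.70        # hypothetical density gain from HDL

def scaled_die_area(modules, base_modules=4, base_area=VISHERA_AREA_MM2,
                    shrink=HDL_SHRINK):
    """Naively scale die area with module count, then apply the HDL shrink."""
    return base_area * (modules / base_modules) * shrink

print(round(scaled_die_area(6)))  # -> 331 mm^2: "a little bigger than Vishera"
```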

I've always contemplated this happening, and it may be why AMD has pushed for 200W+ CPU support in new motherboards. A 4GHz+ 6m/12c chip would probably make a lot of heat.

I have a few theories of where AMD is going to go but I don't have any proof to back anything up, so it's just a big old bunch of conjecture.

I'd take 6m/12c AM3+ PD if we didn't get FX SR.
 

designasaurus

Honorable
Sep 1, 2012
52
0
10,630
Hello. It sounds like bad news to me. It already looks like Warsaw is the big server part, and the FX chips are probably coming off the same production line as the server parts. I fully expect 2014 to see the FM2+ APUs as the only AMD chips with PCIe 3.0 and Steamroller cores, while we on the desktop get 4.1-4.2 GHz 8360s (or something like that) stuck with a chipset from the Bulldozer era. Intel is lazing about on the desktop front, and AMD doesn't have the resources to take advantage.

It stinks, because I want to build a new computer within the next year or two, but if I'm going to upgrade I want an actual upgrade, not a 10-15% improvement. Steamroller will have to be a monster to deliver good gaming performance from 2 modules and no L3 cache, and based on AMD's track record, I just don't see it.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


That has been the glaring problem with seronx's idea that the PD refresh will use HDL libraries: how does AMD get 8m/16c parts out of a 6m/12c die?

It is possible the 6m/12c part is the single-die part, and the 8m/16c part is actually two partially disabled 4m/8c dies on an MCM.

I have not given up hope. However, I am bored and tired and just lurking this thread.
 

BeastLeeX

Distinguished
Dec 13, 2011
431
0
18,810


I still think price/perf is not improving much. The R9 280 (I think) is the same chip as the 7970 GHz Edition, but sadly at the same price. I have also seen some R9 290X prices around $700. So we are getting stronger cards, but the price for the top-end chip continues to climb.

 


Umm, no they don't, at least not on the socket itself. Sockets are just that: standardized connectors between the CPU package and the various components on the motherboard. You could just as easily make a motherboard that didn't have any HDMI/DVI/VGA outputs while still using the FM2+ socket.

Now, it's also entirely possible for them to create a new socket, but there would be no functional difference, at least not until consumer DDR4 comes out. It's just more economical to unify everything under a single socket, which is what AMD traditionally does anyway. The only reason FM1 was created was the additional video outputs necessary for an iGPU.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780


I am hesitant about the $700 R9 290X price.

The lineup will look like this then:

R9 280X: $350 (assuming same price as the 7970, and that the 280X replaces it)
R9 290: unknown
R9 290X: $700

That's a big price gap. R9 290 at $500 maybe?
 




o_O?

You're starting to make my head hurt. Do you even read people's posts, or do you just fire back after scanning through with selective reading? Reread the post very carefully and think about it for at least an hour before posting again.

-=Edit=-

For the record, the cost of adding those ports is in the cents, due to economies of scale. Not having them would restrict the potential customer base for a particular product.
 