AMD CPU speculation... and expert conjecture

Page 338
Status
Not open for further replies.


Most of what you have been saying, both in private and in public, is wrong. The full list would be too long, but here are a few examples: your prediction that AMD was releasing an 8-core SR FX for AM3+ is gone; your prediction that the cause of the Kaveri delay was a change from bulk to FD-SOI (because GF's schedule coincided perfectly with the delay, you assured us) is gone; your prediction that someone was releasing a driver for a supposed AM4 motherboard is gone...

Let me refresh your memory with one of those:



Clearly you were wrong.

Add to that all your posts about non-Steamroller topics: from your nonsensical claim that the PS4 can use all 8 cores for games because the OS runs on a separate chip, to your later claim that the Jaguar chip is clocked at 2.75 GHz, to your recent nonsensical claims about RISC/CISC/ARM/x86.
 


Do you read?

Who told you that AMD is abandoning the gaming PC market? The above post says the contrary.

Who told you that AMD is abandoning HEDT? The above post says the contrary.

Nvidia has already announced that they are developing a high-end ARM server chip, and AMD has people working on a high-performance ARM core project. Denying it will not make those projects disappear.

Who told you that Opteron-X is a high-end part? Nobody said that. Also, are you still unaware that Opteron-X will be replaced by a 2x-4x faster Seattle server part?

What part of the goal for the ARM supercomputer to be 1000 times faster than the fastest x86 supercomputer do you still not get?

And so on and so on.
 


That Steamroller A was canceled and that Kaveri uses Steamroller B is not hot news:

http://www.fudzilla.com/home/item/29986-richland-successor-in-2014-is-kaveri

Steamroller was presented at Hot Chips 2012, before all the HSA stuff.

My bet is that Steamroller B is the HSA-enabled core. This B core appears in Berlin but not Warsaw; it appears in Kaveri but not FX.
 


The main performance advantage comes from doubling the decoders, which comes at the expense of higher power consumption. Therefore a 20% gain (from the decoder) does not translate into a 20% efficiency gain.
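A back-of-the-envelope sketch of that point. The 20% throughput figure comes from the discussion above; the 30% power cost is a made-up illustrative number, not AMD data:

```python
# Illustrative only: the power figure is an assumption, not a measurement.
base_perf = 1.00
base_power = 1.00

new_perf = base_perf * 1.20    # +20% throughput from the doubled decoders
new_power = base_power * 1.30  # assumed power cost of the wider front end

efficiency_change = (new_perf / new_power) / (base_perf / base_power) - 1
print(f"perf/watt change: {efficiency_change:+.1%}")  # prints "perf/watt change: -7.7%"
```

So a 20% throughput gain paired with any larger power increase is actually a perf/watt regression, which is the point being made.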



I suppose he knows that some people are slow to learn. In this same thread I keep saying the same thing to people who cannot learn, including the one who is no longer posting here.
 


1. I was not wrong about RISC/CISC; I had one word confused regarding CISC, but the explanation was still spot on.

2. Sony has confirmed an additional chip in the PS4 to run background processes, and the design lead for the entire project stated the OS would run primarily off the GPU, leaving 8 cores available for game development. Do you think Sony's project lead was lying? Why would he? That information will be easily verifiable...

3. I never said it would run at 2.75 GHz; I said that was the maximum frequency filed for the machine in the patents. I did say it would run faster than 1.6 GHz, and I was entirely right.

4. I did not predict a change to FD-SOI; I was speculating that, with the time frames lining up as conveniently as they did, it was possible.

5. There has been no word of any kind from AMD stating that an 8-core SR CPU is never coming. I never predicted it would come out on AM3+; I merely stated that AM3+ is officially supported through 2015 per AMD's own word, and that another chip series was supposed to come to AM3+ per AMD.

6. Foxconn's website still has those drivers posted. What they are actually for, I haven't been able to figure out; however, it clearly lists an AM4 motherboard driver update.

Any more words you'd care to put in my mouth that I never said?
 


You, specifically, said they were going all-in on APUs and ARM cores, with no FX successor coming, which amounts to abandoning HEDT and gaming desktops; or have you gotten amnesia since page 175? Crysis 3 will run on Linux before ARM gains any foothold in gaming desktops...

NVidia can work on whatever they want...Tegra has been an utter failure compared to anything from Qualcomm, or even Samsung. They are most certainly not a big player in the ARM race.

Even if the Seattle chips succeed, they're low-end parts for microservers. There is no high-end ARM plan at AMD, I can most certainly attest to that... where are you getting your information? ARMwillruletheworld.com?

You don't get it: the ARM supercomputer will have just enough processor "oomph" to keep the Tesla GPUs fed. If I had the money, I could commission someone to build an x86 supercomputer 10,000 times faster than the one you claim is 1000x faster using ARM, and all I would have to do is use more GPGPUs than they do. It's a gimmick... nothing more. It isn't superiority of technology; it's simply NVidia saying "LOOK HOW POWERFUL TEGRA 27 IS!!!!" (or whatever Tegra they're going to use), deflecting all the attention away from the fact that something like 100 ARM cores are running just enough OS to feed 10,000 Tesla GPGPUs. Are you familiar with HPCs? It doesn't seem much like you are...
 


And targeting 30 FPS on the PS4. Just like a lot of devs are doing.

So, here's my question: Why are all the devs still on 720p @ 30 FPS, even on next-gen? GPU shouldn't be the problem here...

Hmmm...
 


Using this logic, MIPS will beat everyone in the long run. MIPS has outsold ARM and x86 COMBINED by a healthy margin, and they are re-tooling the architecture to scale upward.

x86 is forever a special case, simply because no other architecture has ever had as many programs written for it, nor been as institutionalized. It's an ugly arch, but it's here to stay for a very, very long time. Until we get to the point where emulating the entire instruction set becomes computationally trivial, it will remain the dominant arch.

In the end, though, everything is going to be virtualized, making the host architecture moot. In 20 or so years, I'm sure we'll have multiple OSes built for totally different architectures running in separate windows on a third, lightweight host OS. That's where this is headed.
 


+1

*I can't believe I am doing this*
 


Again? We're agreeing again?

*head explodes*
 

And so, on that day, the balance between time and AMD was broken, causing havoc the likes of which man in all his years of existence had never seen before!
 


Why does it surprise you?

Most homes have 720p TVs, 1080p tops. And you keep the same FPS target and resolution, but add more image fidelity: for example, higher texture resolution, added particle effects (I would really hope so, lol), and advanced lighting. Good enough, IMO.

Cheers!
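For scale, the raw arithmetic behind those resolution/FPS targets (just pixel counts, with no assumptions about the console hardware):

```python
def pixels_per_second(width, height, fps):
    """Raw fill work per second at a given resolution and frame rate."""
    return width * height * fps

p720_30 = pixels_per_second(1280, 720, 30)    # 720p at 30 FPS
p1080_60 = pixels_per_second(1920, 1080, 60)  # 1080p at 60 FPS

print(p1080_60 / p720_30)  # prints 4.5
```

Holding 720p/30 and spending the budget on fidelity instead is a 4.5x cheaper target in raw pixels, before counting any per-pixel shading cost.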
 
I wonder how these new consoles will handle intense multiplayer games like BF4. The CPU cores are under 3.0 GHz with no turbo, and BF4 will likely get less IPC out of Jaguar compared to, say, an FX-6300 at 4.0 GHz (OC, no stock turbo) or a Core i5-4570 (with or without multi-core turbo)... edit2: that is, if the consoles had those CPU cores instead.
 
Off topic rambling about PS4 graphics


I have been through this before, but Jaguar is a lot stronger than you're giving it credit for.

1. It has AVX, lots of SSE extensions, etc., and it sits in a closed ecosystem, so software is free to use them without having to accommodate non-compliant hardware.

2. Console games run significantly better than PC games on equivalent hardware.

I have to rely on anecdotal evidence, but I've seen massive speedups in some things just by switching OS and recompiling.

Blender runs more than twice as fast in Gentoo as it does in Windows, LAME gets a 60%+ speedup, and nearly everything else sees at least a 10% speedup.

And that is just from recompiling generic code, written to work on a wide variety of hardware, with a tool most people don't like (GCC).

As for the PS4 running at 720p at 30 FPS, how is this a surprise to anyone? Consoles have ALWAYS started out mediocre, and towards the end of their life the games look far better.

COD2 on 360, which was a release title:
http://www.gamesradar.com/call-of-duty-2/slideshow/#screenshot/376618

Best games of 2013 (scroll past PS3 to get to Xbox 360)
http://2013bestgraphics.blogspot.com/
http://twscritic.files.wordpress.com/2013/05/crysis-3-bow.jpg

Do you see the difference? Does it surprise you that PS4 and XBone are going to be underwhelming when they first come out? Do you really think these consoles are not going to improve at all over their lifetime?

I expect the PS4 and Xbone to improve in graphics quality the same way the 360 did. It is just going to take time for people to figure out how to milk everything they can out of the Jaguar APU.

And lo and behold, whoever does that will have a winning game engine that also runs great on AMD's HEDT Bulldozer-class CPUs as a byproduct.

Meaning it all transfers over into gaming PC wins for AMD.
On topic SR:

But as for SRB being "old news": yes, but the thing is that before that slide it was mostly rumors and people coming online claiming it with no proof. Now we have an OFFICIAL slide from AMD stating that there are two versions of SR.

However, I have a really, really strong feeling that SR is going to be underwhelming while people are expecting a 30% increase in performance over PD, and that's going to be a lot of why there is no SR FX.

If you want my opinion, AMD had some choices.

1. Release PD with 15% to 25% higher clocks and charge more for a "premium" product, while still delivering a 15% to 25% increase in performance.

2. Release SR with 15% to 25% higher IPC at the same clock speeds and charge the same as they always have for Vishera parts.

The FX 9000 series lets AMD make more money off existing IP (although I can't imagine AMD has been moving a lot of these chips), as opposed to offering more performance for no additional income.

As I said before, AMD has to make more money off of PD and it's already pretty darn competitive in games like BF4 compared to Intel's offerings.

Put yourself in AMD's shoes.

Would you rather re-use existing products that are competitive where you want them to be (gaming) and create some sort of artificial value (omg, higher default clocks!), or would you rather give people that amount of performance (or even less) for the same profits?

AMD probably took a look at how many people are upgrading from a 2600K or 3770K to a 4770K (and likewise down Intel's product stack) and realized that not many people care about upgrading for a 15% increase in performance.

And for good reason: it's really not enough to make a huge difference in the real world. I would think that most people running an FX-8350 at stock wouldn't want to make the upgrade anyway.

To put this into perspective, an SR FX at 4 GHz with a 15% IPC increase would STILL be slower than a heavily overclocked PD.
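That claim checks out under a simple first-order model (performance ≈ clock × relative IPC; the clocks below are this thread's hypotheticals, not announced specs):

```python
# First-order model: perf ~ clock (GHz) * relative IPC. Numbers are hypothetical.
pd_stock = 4.0 * 1.00  # Piledriver-class FX at 4 GHz, baseline IPC
sr_4ghz = 4.0 * 1.15   # hypothetical SR FX: 4 GHz with +15% IPC
pd_oc = 5.0 * 1.00     # heavily overclocked Piledriver at 5 GHz

print(sr_4ghz, pd_oc)  # prints 4.6 5.0 -> the OCed PD still comes out ahead
```

The model ignores L3 differences, memory, and turbo behavior, but it shows why a modest IPC bump at the same clock can lose to a big overclock of the older core.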

Even I would have a hard time swallowing buying a chip like that, and I LOVE AMD.
 
The thing is, it's going to be more of a benefit than 15%, more like 25%: from adding a second decoder, they basically get an immediate 15-25% performance improvement when both cores in a module are used. Then they improved other things as well, so I actually do think it's quite possible to see performance improve 20% on average per clock, with some cases reaching 30%.

But you do have a point: AMD is pretty darn competitive. Still, if they do release these new Steamroller CPUs (not APUs), they might get some good PR if they turn out to be better than Intel's more expensive parts.
 
I think a 4 GHz 8-core Steamroller CPU with L3 would have no problem delivering more performance than the current non-extreme i7s do. The PD FX-8350 is already faster than the i7-4770K in a few workloads... However, an 8-core Steamroller would also increase in price. If they're not going to release such a high-end desktop part, it may be due to lack of demand (as mentioned above), or because they don't want to undercut their Fusion strategy with their own products.
 


Well yeah, but remember just a few months ago how we were going to be free of the shackles of 30 FPS/720p? The CPUs in these consoles are NOT going to age particularly well, even if scaling is near 100%.

Another trend that worries me: at least last generation, you saw a glut of new styles of games, even from the very beginning of the generation: the first truly open-world titles, more enemies at a time, physics processing (Advanced Warfighter, anyone?). This time? More of the same, just with better graphics!

Seriously, I know the state of the gaming industry is bad (few original ideas), but right now it's looking worse than I thought. I see nothing near release, PC or otherwise, that interests me. It's all the same exact stuff we've had for the past decade.
 
Agreed, gamer. Actually, that troll guy hafWTF was correct: it does basically equal an i3 when a game scales to all the cores. I guess you can do a lot with that, but still. Kind of funny that the Wii U has more 1080p games running at 60 FPS than the other two, even with much weaker hardware.
 


New official information will be coming from APU13 (Nov 11-14).

http://developer.amd.com/apu/

 
I would so go to APU13 if one day of it didn't cost as much as an i7. Plus I've got a race on one of those days, so I guess it'd be better just to stay home.

I look forward to the announcements though.

@gamerk316, that's exactly how I feel right now. We're hitting a wall. And APUs are... kind of lame. Tbh, the only thing that is interesting is Mantle. Let's see what the software brings...
 