AMD CPU speculation... and expert conjecture


noob2222

Distinguished
Nov 19, 2007
2,722
0
20,860


Starting September 2013, so product not until what, Q1 2014? It's also bulk and not SOI. GF offered Qualcomm a discount to get the business.

"Globalfoundries is believed to have offered price incentives, such as a reduction in photo mask charges, to win orders from Qualcomm, the sources indicated."

Seems like GF is getting desperate for some business. Good find though; trying to find anything on GF is about like trying to find something on SR.

http://semiaccurate.com/2012/01/09/global-foundries-fab-8-is-making-chips/
http://news.softpedia.com/news/AMD-and-GlobalFoundries-Interested-in-FD-SOI-for-the-20-nm-Process-267519.shtml

Mostly older articles, but if Kaveri is designed around SOI, as seems to be the case, AMD is stuck waiting for GF. We won't see any 28nm SOI until Kaveri, and Kaveri/SR won't see daylight until 28nm SOI is functional.
 

8350rocks

Distinguished


That is extremely observant, and entirely relevant.

+1
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Clovertrail (32nm) isn't an issue for Jaguar, but Silvermont (22nm) might be. It's a new architecture and it's not out yet (September/October).

AMD has had a nice long window of ZERO competition for Kabini/Temash.
 

8350rocks

Distinguished


I would be reluctant to put a great deal of weight on any speculation right now, particularly because AMD's own roadmap for desktop CPUs has not come out.

I attribute this to 2 things:

1) AMD is waiting for 28nm SOI at GF, because reviewing their process on bulk wafers in order to release on time only exposed the massive advantage that using SOI gives them.

2) AMD would be wise to make sure SOI yields are high enough to get full 6- and 8-core dies out of the wafers before committing to a release date. Note that the FX-9590 was released only once GF's process was mature; I view this as a similar situation.
 

jdwii

Splendid


If AMD gets rid of the Steamroller FX, they are years behind Intel and will NEVER catch up in the performance area, or the server area for that matter, since Intel will have 12+ core Xeons that kick butt. Why would AMD suddenly not care when they market themselves so heavily in the gaming field with SSD products, RAM, and whatnot? If this turns out to be true, and I bought a $200 board for only my Phenom, then not only am I dropping AMD CPUs and APUs, I might even drop their video cards, unless they can make a 4-core APU that has more multitasking performance than an 8-core FX and more single-core performance than an FX. It would also be a shame to see them leave the server market, since those are COMPLEX chips with LOTS of transistors.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780
Some of you folks who are underestimating Jaguar are making a massive mistake. Instructions and optimizations can make a huge difference.

I've benchmarked Blender on an FX 8350 under Gentoo, the same FX 8350 under Windows, and a 3930K under Windows. The FX was about half the speed of the 3930K when both ran Windows, but the FX actually beat the 3930K when the FX was on Gentoo and the 3930K was on Windows.

The 3930K was at 4.2GHz and the FX 8350 was at a little under 5GHz.

If you think a Jaguar with all of its instructions being used, on an optimized OS, is going to compare to an i3 running generic code built for many different types of CPUs, on an OS that's still capable of running on a Pentium 3 (which only supports SSE1 and MMX), then you're in for a really bad surprise.
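To make the "generic code" point concrete, here's a minimal C sketch (my own illustration, not anything from the benchmark above) of runtime dispatch using GCC's __builtin_cpu_supports: a binary built to run on everything has to fall back to the lowest common path unless it explicitly checks what the CPU offers.

```c
#include <stdio.h>

/* Hypothetical illustration: pick a code path based on what the CPU
 * actually supports instead of shipping one lowest-common-denominator
 * (SSE1/MMX-era) binary. GCC's __builtin_cpu_supports() does the CPUID
 * check for us. */
int main(void)
{
    __builtin_cpu_init();           /* populate the CPU feature flags */

    if (__builtin_cpu_supports("avx"))
        printf("AVX path: wider vectors, fewer loop iterations\n");
    else if (__builtin_cpu_supports("sse2"))
        printf("SSE2 path: the usual generic x86-64 baseline\n");
    else
        printf("Scalar fallback: what a 'runs on a Pentium 3' build assumes\n");

    return 0;
}
```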

The question that's been on my mind is whether AMD can put an APU into AM3+ and just use it as a GPGPU unit. There's no way you could actually use it as an iGPU, but you could at least run OpenCL and HSA on it (maybe?).
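If anyone wants to poke at that idea, the sanity check would look roughly like the C sketch below: standard OpenCL platform/device enumeration to see whether the APU's GPU is exposed as a compute device at all. Nothing here is AM3+-specific, and the array sizes are arbitrary.

```c
#include <stdio.h>
#include <CL/cl.h>

/* Minimal sketch: list every GPU device the OpenCL runtime exposes. */
int main(void)
{
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;

    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS) {
        printf("No OpenCL platforms found\n");
        return 1;
    }

    for (cl_uint p = 0; p < num_platforms; p++) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;

        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;

        for (cl_uint d = 0; d < num_devices; d++) {
            char name[256];
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            printf("GPU device usable for OpenCL: %s\n", name);
        }
    }
    return 0;
}
```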
 

jdwii

Splendid


What about being practical and thinking about all the older software that will not use Jaguar like that, i.e., every single piece of software released since x86 came out? Also, everyone uses Windows for normal use; are we going to get a big update for this on 8, 7, and Vista?
 


That is really the problem with a "one size fits all" approach: everything is a compromise. There are many different target markets out there, all of which benefit from a different (sometimes radically different) architecture, CPU floorplan, and platform architecture. My best guess is that you need 3-4 different architectures and at least as many platforms to reasonably address the stack from phone to big iron server. Even Intel, with its massive R&D and production budgets, is up against an impossible task in trying to use one architecture to serve pretty much the entire stack.

I don't see Haswell as a failure, as it serves well as a chip in the 10-50 watt range. The problem is that Intel needs something else for the 50+ watt performance segment and something else (NOT Atom!) for the under-10 watt range. AMD has the right idea with Piledriver being a decent 35+ watt arch, Jaguar being a good 5-35 watt arch, and their ARM chips taking everything below 5 watts. They may not have the absolute best in each category, but they didn't completely strike out the way Intel did by trying to shove Haswell into passively cooled applications.

The i7-4770K is about 5% faster than the i7-3770K, has some performance regressions (i.e., it is slower in places than the chip it replaces), runs hotter, and overclocks poorly.

Haswell is a laptop chip, period. Its idle power is very low, and it has low power consumption in the 2-3 GHz range where laptop chips top out. Laptop chips don't overclock very well, and that has held true for years. I wouldn't expect it to run as well as a more desktop-oriented chip such as Sandy Bridge.

Now take a look at tablets. Haswell's power consumption is too high, and Intel has tried to lie about it. In case you did not notice, Intel was caught again a couple of weeks ago with fake power consumption figures for its Haswell tablet chips.

AMD's Jaguar chip is beyond anything Intel can offer, but if AMD were to scale Jaguar above 2GHz, we would see problems similar to those Intel is experiencing with Haswell. I think AMD's dual-architecture approach, Jaguar for low frequencies and Steamroller for high frequencies, is fine.

Bingo, now you are getting it. You need multiple arches to do well in the entire product stack. I do think Intel could best Jaguar in the ULV laptop/higher-wattage embeddable segment but they would need to design a chip specifically for that rather than try to underclock the larger and more complicated Haswell. Essentially Intel needs to make a redesigned Atom but focus on a balance of performance and power consumption rather than cheapness.



The 8 core Jaguar only equals an i3 if you are looking at poorly-threaded tasks which are really more the purview of the "high-power arch" like Piledriver. Game consoles are like the computers of yesteryear. They are lower-resource but closed platforms that will get much more specific optimizations than current general-usage PC code. You will see much greater use of multithreading in the PS4 than you do in current PC games simply because the designers have a lower-clocked, 8-core CPU as a target and can knowingly optimize for it. Most of you would whine if a game designer optimized for 8-core CPUs today as you would have to own a 4-module AMD FX, i7, or a Xeon/Opteron to get full performance rather than a no-Hyperthreading i5 or a laptop.

An 8-core ATOM will embarrass the Jaguar cores. Intel may have 10x the resources, but they have most of the CPU market for a reason: they are years ahead of where AMD is. If AMD releases a CPU at the same TDP that is as fast as an 8-core Haswell CPU at 130-140W TDP by 2017, it will be a miracle. Intel being about 4 years ahead of AMD allows them to be very profitable.

Hardly. Atom is a bust. It was designed for maximum cheapness, period. Intel needs to significantly sharpen its pencil with the Atom to make it equal to Bobcat, let alone Jaguar. Atom is Intel's supposed special low power chip. But then why did Intel try to underclock Haswell to put in tablets rather than push Atom for those purposes? Think about THAT one.

Sorry, I can't understand the rest of your post. I suppose it has something to do with performance per watt, but I can't quite make it out. I understand if you don't speak English natively, as most of the people I work with are immigrants who don't speak much English at all. I am bad at even trying to spell the names of their 50-some-odd different languages, let alone speak any of them. Work is very "fun" due to this, and we spend well into the five figures a year on interpreters. But I still can't understand what you're saying, so please try to rephrase it.
 

GOM3RPLY3R

Honorable
Mar 16, 2013
658
0
11,010


Metro 2033 is one of those games that requires serious power; it is heavily multi-threaded and clearly benefits from more threads/cores. However, you forgot to mention the other games, benchmarks, and tests. On nearly every other test involving integer calculation, physics calculation, and encoding, the 4770K pulled ahead by more than a car's length. There are a few where it does not, but overall most of the scenarios were won handily by the 4770K and the Intel family.

I will give AMD props though: they are doing a great job of making processors that are, and it shows, well positioned for a multi-threaded future. However, there's a big and obvious thing that stands out that for some reason no one is acknowledging:

"CORES and THREADS"

Haven't you noticed that AMD, in their big, bad timeline for the next couple of years, have been trying to make processors with fewer but much more powerful cores? That can only mean fewer threads. It appears they are trying to get to Intel's level of per-core performance, and in its current state, that option would be suicide.

Intel is already advancing and looking closely at selling affordable 6-core (6/12-thread) CPUs, basically a 3930K with a better chip, better per-core performance, and a much lower price. Thus, in the future, they can continue making stronger processors with more and more cores and threads.

AMD's path is to make a processor with fewer cores that each perform more strongly. In a multi-threaded future, it would only make sense to take full advantage of more threads. Their problem, however, is that if they add more cores and threads, it would cost more and the heat would increase along with the TDP. Eventually they'd be building today's server processors for standard PCs.

So the obvious option is to make the cores and threads much stronger and commit to that plan so they can take a lead. The only problem is time. It is likely that by roughly Q3 2014 (maybe sooner) we will have consumer 6-core (maybe even octo-core) CPUs from Intel with stronger cores, making for an awesome and powerful CPU.

Unless AMD can either pick up the pace or find another solution, with their current plan they won't be at this level until we get into 2015.

This is just an assumption, but from the facts, this is what seems obvious.
 

mlscrow

Distinguished
Oct 15, 2010
71
0
18,640
Jaguar cores =/= Piledriver cores, but both are x86, both come in 8-core configurations, and both are made by AMD. I'm sure that will play some role in how well games run on any 8-core AMD PC.
 
Being "difficult" does not mean "impossible". Difficulty =/= Impossible.

I've said on many occasions that it's not easy to design programs to scale well (scalability is the proper term for what we're describing). It requires you to go back to the drawing board and redesign the problem into one that can be attacked that way. That is ~EXACTLY~ what the Sony representative was saying: they have to go back and redesign their engine to scale well.

This isn't optional; it's as inevitable as the upgrade to 64-bit processing and the sun rising. People can kick, scream, and cry all they want while clinging to their dual-core CPUs; no matter, it will happen without their express consent. We are now in a period of quad-core computing, and we'll be doing six/eight-core computing in about three years, give or take one. We will most likely stay around eight cores for five to six years before needing to move on, but it's really hard to predict that far out with any sort of accuracy.
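For anyone who hasn't done it, the basic shape of the redesign looks something like the toy C sketch below: carve the data into independent slices and hand one slice to each worker thread. The thread count, array size, and per-item work here are all made up for illustration.

```c
#include <pthread.h>
#include <stdio.h>

/* Toy sketch of "redesign so it scales": instead of one loop over all the
 * work, the data is split into independent chunks that N worker threads
 * grind through in parallel. */
#define NUM_THREADS 8            /* e.g. one worker per core (made-up number) */
#define WORK_ITEMS  (1 << 20)

static double results[WORK_ITEMS];

static void *worker(void *arg)
{
    long id = (long)arg;
    /* each thread owns a contiguous slice, so no locks are needed */
    long begin = id * (WORK_ITEMS / NUM_THREADS);
    long end   = begin + (WORK_ITEMS / NUM_THREADS);
    for (long i = begin; i < end; i++)
        results[i] = (double)i * 0.5;   /* stand-in for real per-item work */
    return NULL;
}

int main(void)
{
    pthread_t threads[NUM_THREADS];
    for (long t = 0; t < NUM_THREADS; t++)
        pthread_create(&threads[t], NULL, worker, (void *)t);
    for (long t = 0; t < NUM_THREADS; t++)
        pthread_join(threads[t], NULL);
    printf("done: %f\n", results[WORK_ITEMS - 1]);
    return 0;
}
```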
 
In that "new gen of games" mini talk we're having... There's a game in Steam that caught my attention. Not because of the [strike]gender[/strike] type of game (is it gender? lol), but of the specs/requirements to be run in Alpha.

http://store.steampowered.com/app/233250/

PC System Requirements (Minimum)
- OS: Windows Vista SP2 64-bit
- Processor: 32 or 64-bit Dual Core or better
- Memory: 4 GB RAM
- Graphics: Shader 3.0
- DirectX®: 9.0c
- Hard Drive: 2 GB HD space
- Additional: These specs are still being determined.

Heh, 64-bit for an alpha build... Looks like we'll start getting some new gen games sooner rather than later.

Cheers!
 

BeastLeeX

Distinguished
Dec 13, 2011
431
0
18,810


Yes, it definitely would be. I wish AMD would just come out and give us a Desktop roadmap. I wanna put my Corsair H70 to work on the Steamroller FX chip! (Don't care if I have to wait a year, my Phenom II can unlock to 6 cores :))
 

cowboy44mag

Guest
Jan 24, 2013
315
0
10,810


I'm in the same "boat" as you guys. My system's specs are way more than my Phenom II needs; I upgraded everything in preparation for Steamroller FX. If they would release a roadmap, I would know whether I should wait (I wouldn't care if I had to wait almost a year either) or get an FX-8350 and overclock the living hell out of it. Regardless of what Intel fanboys say, the FX-8350 is an awesome piece of hardware and not too far behind i7 Ivy and Haswell in a lot of benchmarks :D
 

griptwister

Distinguished
Oct 7, 2012
1,437
0
19,460


+1 Good input! You're a good addition to the community.

I recently found this article. Not AMD related, but still an interesting one.

http://www.forbes.com/sites/jasonevangelho/2013/08/07/for-struggling-pc-market-its-pc-gamers-to-the-rescue/

**EDIT** Yesh! *and the AMD users rejoice* I look forward to seeing what this guy has to say:

http://www.tomshardware.com/answers/id-1758054/amd.html#11312073
 

BeastLeeX

Distinguished
Dec 13, 2011
431
0
18,810


+1, it is good input. Also, that Forbes article is wrong: the 7990 retails for $700 now, and it comes with 8 games (with the new drivers it seems like an awesome deal).

 
AMD is doing an everything-must-go clearance sale. That's what Nvidia didn't do with the GTX 600 family; now you've got people buying GTX 670s over GTX 760s at the same price points. That's how you kill your stock.
 


Oh, I'm quite sure the XB1/PS4 will have very fine threading control. I have no doubt of that. When you have a single hardware set, you can do a lot of optimizations you otherwise couldn't do.

PCs don't have one hardware spec. An optimization on one setup will likely be a performance degradation on another. Never mind that there is no mechanism to talk directly to the hardware; you HAVE to go through the Windows driver layer. So you won't see that low a level of control.
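As a rough illustration of what "fine threading control" amounts to, here's a small C sketch (my own example, not anything from a console SDK) that asks Windows to pin the current thread to one logical core via SetThreadAffinityMask. On a console the engine can rely on that kind of placement; on a PC it's only a hint layered on top of the scheduler and driver stack.

```c
#include <windows.h>
#include <stdio.h>

/* Request that the current thread run on logical core 2 only.
 * The core number and mask here are arbitrary examples. */
int main(void)
{
    DWORD_PTR old = SetThreadAffinityMask(GetCurrentThread(),
                                          (DWORD_PTR)1 << 2);
    if (old == 0)
        printf("affinity request failed (error %lu)\n", GetLastError());
    else
        printf("thread pinned to core 2 (previous mask: 0x%llx)\n",
               (unsigned long long)old);
    return 0;
}
```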
 
Once Microsoft jumps onto the HSA bandwagon, you will start seeing closed doors open up. I think they call it progression, or evolution; it's a concept that works for those who know how to use it while the others catch up.
 


Vista/7/8 X64...and DX 9.0c.

Hence the continued downside of DX10 not being backported to XP: Even though XP is not supported here, DX10 is a dead API due to DX11 being released, and most hardware is still lacking DX11 support. Therefore, we get a DX9.0c baseline, even though the choice of OS guarantees DX10.

It will be interesting to see the overall memory usage. Specifically, why is more than 4GB of address space needed? If it's due to the improved physics, then I have a suspicion we're going to see a MASSIVELY CPU-bottlenecked game.
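For anyone wondering how the 64-bit requirement ties into the 4GB question: a 32-bit process simply can't address more than 4GB, which the toy C sketch below makes concrete. The 6GB figure is just an arbitrary example.

```c
#include <stdio.h>
#include <stdlib.h>

/* A 32-bit build can't even describe a buffer this large, while a 64-bit
 * build can at least attempt the allocation. */
int main(void)
{
    size_t want = 6ULL * 1024 * 1024 * 1024;   /* 6 GB, example figure */

    if (sizeof(size_t) < 8) {
        printf("32-bit build: a %zu-byte size_t can't address 6GB\n",
               sizeof(size_t));
        return 1;
    }

    void *buf = malloc(want);   /* plausible only with a 64-bit address space */
    printf("6GB allocation %s\n", buf ? "succeeded" : "failed");
    free(buf);
    return 0;
}
```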
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810
http://semiaccurate.com/2013/08/07/amd-to-launch-hawaii-in-hawaii/

The most important part of this article:

This is a massively updated GPU vs the minor tweaks in the last round. Given the timing, Hawaii is unquestionably a 28nm part so no shrink related performance bump but the architectural changes should more than make up for that. It is unlikely to be an incremental advance.

So according to S|A, AMD is able to deliver a "massively updated" GPU on 28nm, presumably at clock speeds similar to Tahiti's.
If true, freakin' amazing.
 


Right...............
The 965BE > 3220 overall, end of story. In properly threaded games, the 965BE is much superior to the 3220.
 