AMD CPU speculation... and expert conjecture

Ahh, the old "you only need two cores!!" argument again; I thought we put that one to rest ages ago.

It boils down to this: even if the program you're using can only make use of two cores, the OS itself, along with everything else running in the background, needs time on the CPU, and getting it will periodically evict your program for short periods. This is why you want three cores at a minimum. The Pentium was released as a benchmark queen and nothing more; any real-world use will have it crawling. The i3 or 750/860K is the lowest I would ever go for a CPU, with the i5 and FX-6300 being better choices to pair with a dGPU on a desktop.

The magic number for APUs to be worth it is $140~150 USD. The 7850K is straddling that line, and honestly I wouldn't choose it when the 7700K and 7800 are available. They're cheaper and get you the same real-world performance, since current DDR3 can't feed the 512 shader cores of the 7850K anyway. DDR3-2133 is quite cheap now and fits perfectly with the 384 shader cores you find on the 7600~7800 chips. The 7600 still provides insane bang for the buck, and it looks like the 7700K is going to enter the same value segment.
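To put rough numbers on that bandwidth point, here's a back-of-the-envelope sketch. The dual-channel, 64-bit-per-channel configuration is my assumption for a typical FM2+ board, not a measured figure:

```python
# Peak theoretical DDR3 bandwidth vs. APU shader count (illustrative only).
def ddr3_bandwidth_gbs(mt_per_s, channels=2, bytes_per_channel=8):
    """Peak bandwidth in GB/s: transfers/s * bus width * channel count."""
    return mt_per_s * bytes_per_channel * channels / 1000.0

for shaders, label in [(384, "7600/7700K/7800"), (512, "7850K")]:
    bw = ddr3_bandwidth_gbs(2133)  # DDR3-2133
    print(f"{label}: {bw:.1f} GB/s shared, ~{bw * 1000 / shaders:.0f} MB/s per shader")
```

Same ~34 GB/s either way, so the extra 128 shaders on the 7850K just dilute the bandwidth each one gets.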
 

jdwii

Splendid
It sure does, Palladin. I've noticed that in later games my old Pentium builds are getting slaughtered by cheaper A10 builds. I also agree that when it comes to true PC gaming machines, anything under an i5 or FX-6300 just isn't worth it. You can build an FX-6300 PC with a 250X for just $450 now. So IMO APU builds must compete below that price range for now.
 

abitoms

Distinguished


I agree. Reviews were only able to take this Pentium up to 4.5+ GHz with a good cooler, which definitely adds cost, and on top of that the instability vastly reduces the life of the CPU. Strangely, few reviewers (if any) seem to have mentioned this simple fact in their reviews.
 

jdwii

Splendid
^^^ It doesn't matter how high they clock a dual core; it doesn't change frame latency much. If this were an i3 we would be having a different conversation. For just $90 you can get a quad-core Steamroller, and for $100 a six-core FX. I understand that's $40 more than the Pentium, but it is worth it. Below that, an APU would probably make more sense.
 

etayorius

Honorable


Mantle is so awesome. I'll take lower FPS and smoother gameplay any time over tons of FPS with tons of lag.
 
Boy, this console gen might be worse off than even I predicted:

http://www.pcper.com/news/General-Tech/Sony-PS4-and-Microsoft-Xbox-One-Already-Hitting-Performance-Wall

Get ready for a decade of no advancement in gaming, people.

I think we're heading to a very bad place going forward. With very few independent gaming houses left (thanks, EActivision), the reliance on mega-games to generate revenue, the rise of F2P, the rise of mobile, and a very weak console generation, I predict a 1983-style collapse of the entire industry is incoming. I think major houses like THQ going down is a symptom of a larger problem, one that is going to be exacerbated by this console gen.
 

logainofhades

Titan
Moderator
I blame Ubisoft, not the consoles, for this issue. Look at the minimum requirements for the game. My system just meets those "minimum requirements". It is lousy programming, and nothing more. There is no reason an i5 and an HD 7970 should be minimum requirements for a game.

http://blog.ubi.com/assassins-creed-unity-pc-specs/
 

8350rocks

Distinguished


THQ went down for different reasons. LucasArts was their primary IP source, and they had not exactly made a spectacular game with any of that IP in quite a while, nor had LucasArts recently commissioned anything worth noting that would have fallen in THQ's sphere of influence. Frankly, they had not been relevant for quite a while...

Now, if Ubisoft or Blizzard were closing one of their studio locations...doom and gloom away. I mean...id Software has not been super successful for a while and they are still around. THQ and some other tiny studios were more a symptom of being "typecast", or under-diversified/over-specialized, however you want to look at it.
 


THQ had a LOT of IP though: all of Relic's successful RTS games. They were also the publisher that did almost all of 2K's releases (FYI: Activision is sniffing around a potential buy of 2K). Homefront did THQ in, since they put hundreds of millions into it and didn't see a return on the investment.

I blame Ubisoft, not the consoles, for this issue. Look at the minimum requirements for the game. My system just meets those "minimum requirements". It is lousy programming, and nothing more. There is no reason an i5 and an HD 7970 should be minimum requirements for a game.

Which would make sense, if other devs weren't complaining. EA, Activision, Ubisoft, CDPR. That's more than a smattering of complaints; that's pretty much the three major publishers, plus CDPR, all complaining about the same problem. When you have an integrated system, you are bound by your slowest component, and in this case it's the CPU. And the CPU is being asked to do more and more to assist the GPU, which certainly isn't helping (thanks, HSA). But hey, at least this puts the XB1 on the same level as the PS4, so the differences between consoles will eventually vanish.

As for AC:U, I'm betting they designed the game around what they thought they could do in HW, then when they found out they couldn't, rather than re-engineer everything, they scaled it back down to the point where it barely runs. I'd fully expect even the console release to be a stuttery mess if that's the case.

But hey, wasn't this generation supposed to bring significant performance improvements due to the "ease of porting due to similar architectures"? Talking heads: 0, me: 1.
 

blackkstar

Honorable
I think it's still way too early to pass judgement on the PS4 and Xbone. It's been less than a year. And remember how awful Xbox 360 games looked in this time frame compared to the end of the 360's life? I think people are prejudiced against x86 because software optimization for x86 is something we rarely ever see in Windows environments.

I don't know if I mentioned it on this forum, but the PS4 and Xbone have had to deal with a whole new range of power requirements. Someone on S|A is now claiming that there's even an extra GPU in there to do H.265 decoding and such in low-power mode so the main Jaguar SoC can go idle.

We aren't in an era anymore where you can sell a 200W PS3 in the EU and get away with it. Peak power consumption is down from 200W in the PS3 to 140W in the PS4. That's a huge drop.

I hate to always resort to AMD's fab problems here, but I really doubt 28nm TSMC was their go-to process for this SoC. But the PS4 and Xbone's problem is power consumption requirements, which led to a weaker CPU, which led to software problems. And you're not getting around it, not unless you can show me something vastly more powerful in the same power envelope, and I don't think that really exists. Maybe you can find an Intel chip that'll pull it off, but that's going to be expensive and you're going to have to go dGPU.
 

jdwii

Splendid


I really want them to scale the design up to 3.0 GHz+. If they did that they would have an Intel killer.
 

jdwii

Splendid


Like I said, when the design is so close to desktop x86 you will NOT see optimization like we did in the past for consoles. These CPUs are extremely weak; it's basically an 8-core Jaguar CPU on the desktop with a 7870 video card.
 

jdwii

Splendid


Like gamer said, all devs are complaining. The only way we will see 1080p 60 FPS on the Xbox One or PS4 is if the graphics aren't that great or the AI is more like last gen. Rendering took 50% of the CPU at 900p 30 FPS. Perhaps with Unreal Engine 4 we might see some improvements.
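For scale, here's the raw pixel-rate arithmetic behind that jump (a sketch only; real CPU and GPU costs don't scale perfectly linearly with pixel count):

```python
# How much more raw throughput 1080p/60 demands versus 900p/30.
def pixels_per_second(width, height, fps):
    return width * height * fps

base = pixels_per_second(1600, 900, 30)     # 900p at 30 FPS
target = pixels_per_second(1920, 1080, 60)  # 1080p at 60 FPS
print(f"1080p60 needs {target / base:.2f}x the pixel rate of 900p30")  # 2.88x
```

If rendering already eats half the CPU at 900p/30, nearly triple the pixel rate isn't happening without cutting something else.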

Also, don't take the developers' word on the PC version; a lot of them are listing weird requirements when the game works on less capable hardware. I'm also fairly certain Ubisoft is using the same engine Watch Dogs ran on, just more tweaked.

Also, Watch Dogs just got another update on the PC; I wonder if it has to do with this game's optimization?
http://www.dsogaming.com/news/watch_dogs-new-pc-update-released-improves-performance-on-recent-graphics-cards/
 

szatkus

Honorable


Well, Windows games aren't optimized too well. One binary must work on CPUs from the past 1-7 years, so there's no room for the kind of full optimization console binaries get. Actually, the fact that it's x86 means nothing; Xenon was also based on general-purpose cores, just with different pros and cons. There was also the Cell CPU last gen, but it looks like that architecture wasn't too good for gaming.
 


You'd still need software to be multi-threaded; if you read on to the bit about single-thread performance gains with more cores you'll see why.

Essentially what this system can do is extract the maximum *low level* threading possible from a single software thread. The issue is that, depending on the specific instructions, how much it can actually be split varies from 'lots' to 'not at all'. Their own estimates basically show they can utilise all the resources of two cores pretty well on a single thread, sometimes three, but they can't use infinite cores. The long and short of it is that this technology could double, possibly triple, current single-thread throughput, which is huge, but not enough to totally remove the need for multi-threading at the software end.
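A quick Amdahl's-law-style sketch shows why the gains cap out like that; the 70% extractable fraction below is an illustrative guess of mine, not their figure:

```python
# Speedup from splitting one software thread across n cores when only a
# fraction p of the instruction stream has extractable parallelism.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for n in (1, 2, 3, 4, 8):
    print(f"{n} cores -> {speedup(0.7, n):.2f}x")  # tops out at 1/(1-p) ~ 3.3x
```

Two cores gets you ~1.5x, three ~1.9x, and piling on more cores barely moves the needle, which matches their "two cores well, sometimes three" estimate.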
 


Cell was quite powerful, but you had to have a very low-level understanding of how it worked to get the most out of it. That's why the quality of PS3 titles lagged for years, then caught up. The PS3's bigger problem was the GPU being based on the NVIDIA 7000 series, which predates pixel and vertex pipelines being combined into a single stage. The 256 MB of RAM certainly didn't help either.

The Xbox 360's Xenon processor, a three-core six-thread PowerPC unit running at 3.2 GHz, had a theoretical peak number crunching throughput of 115 gigaflops. A contemporary Pentium 4 at 3 GHz had a theoretical peak of around 12 gigaflops when the system launched. The PlayStation 3 was in a similar situation; its Cell CPU, jointly developed by IBM, Toshiba, and Sony, had a theoretical throughput of 230 gigaflops. Contemporary Core 2 Duos were topping out at 24 gigaflops at the time—and cost many hundreds of dollars to boot.

http://arstechnica.com/gaming/2012/04/the-x86-playstation-4-signals-a-sea-change-in-the-console-industry/

By comparison, an 8-core Jaguar at 1.6 GHz gets just north of 100 GFLOPS.

In short, Jaguar is about as powerful as the 360's Xenon CPU in terms of maximum possible throughput. If there's an upgrade there, it's a very small one. For Cell, you can argue it's difficult to reach max throughput. Fine. From what I've been able to find, the "typical" throughput is around 160 GFLOPS or so, still about 60% faster than Jaguar. So for the PS4, the CPU is a DOWNGRADE from the Cell.

So yeah, there's a CPU bottleneck.
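Those peak figures line up with the usual cores × clock × FLOPs-per-cycle arithmetic. The per-cycle numbers below are inferred to match the quoted totals, not official spec-sheet values:

```python
# Theoretical peak single-precision throughput: cores * GHz * FLOPs/cycle.
def peak_gflops(cores, ghz, flops_per_cycle):
    return cores * ghz * flops_per_cycle

print(f"8-core Jaguar @ 1.6 GHz, 8 FLOPs/cycle:  {peak_gflops(8, 1.6, 8):.0f} GFLOPS")   # ~102
print(f"3-core Xenon  @ 3.2 GHz, 12 FLOPs/cycle: {peak_gflops(3, 3.2, 12):.0f} GFLOPS")  # ~115
```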
 

con635

Honorable

I don't know why people can't get their heads around this: the power limits are the reason for the underperformance, and this is the future, btw. My other hobby/interest (cars) has been completely destroyed by the same trend, to the point I've nearly lost all interest. I think there's a hidden agenda, though, hiding behind the 'save the planet' line.
 


Of course it's partly the fault of consumers as well... There is a reason the big game studios are effectively selling the same game over and over: it sells! The sad fact is many consumers *like* it, to the point that if/when something new comes along they attack it.

The reviews of Planetary Annihilation are a good example. The game has received a lot of bad press, and whilst the studio has made a few mistakes, the core concept of the game is the first real departure from 'the formula' for RTS in a long time. In this case they've changed the maps; specifically, the game is played on 'planets', as in multiple fully spherical fields of play at the same time. The underlying game is quite familiar, however the change in maps does *drastically* affect how it plays out. There's no more hiding in the corner, no more safe edges (you're not even safe from above).

The result? The IGN reviewer gave the game a dreadful score with the tag line 'avoid', and his main argument boiled down to 'planets suck, I want flat maps'... The game was slated for its very premise because it was different. The other main complaints were the lack of things that have become 'expected', like a very detailed guided tutorial, mid-game saves and so on. Whilst I think some of these would be a good idea, I also remember that when I was young most games had none of them and that didn't detract from them; modern gamers are used to having their hands held. The issue is that this greatly increases production cost and time.

In PA's case the studio is really small and they self-funded through Kickstarter plus early-access sales. Despite this they managed to write a brand new engine from scratch to support multiple playing fields at the same time, and they got a decent game together on top of it, including a superb full orchestral soundtrack and one of the cleverest UI systems I've seen (based entirely on Chrome; the entire UI can be modded and is very customisable), all within two years.

Blizzard, in comparison, played it very, very safe with StarCraft 2. They used their old existing engine from Warcraft 3 (which is ancient by modern standards). The game plays exactly the same as the original (and all other Blizzard RTS games): small flat maps, with a low unit limit. What Blizzard does well, though, is storytelling and all the 'nice' things (fully scripted tutorials/campaigns). It's a prime example of 'more of the same', and StarCraft 2 got rave reviews for essentially being the same as the original.

Sorry for the rant. My point is just that, sadly, people as a whole often like getting the same things, and despite what many *say they want*, when it comes down to it they like the niceties that come with games being backed by *big* publishers. Give people something truly new and they don't like it. The number of fanatics from SupCom or StarCraft who register on the PA forums *just* to slag off the game sums up exactly why we are where we are now.
 

You are wrong on every level.
 


He's not off the mark in saying the current consoles' CPU components are weak compared to what you can find on desktops, but if you check the perf/watt they're not that bad. The power restrictions were the killer stab for Sony and MS this generation.

You need to put a lot of context on why this gen is sucking for us, enthusiasts:

- Lower power requirements: the killer, IMO. It makes sense and all, but having to stay below 200W is tough when the previous gen was allowed a higher power output/consumption.
- Participants in the consoles' design: nVidia and AMD. Did IBM participate? What about Intel? If it were an Atom + nVidia video, I don't think the difference would be abysmal compared to what we see now. I bet they bailed out after seeing the power requirements.
- Purpose: consoles are not PC replacements! This is MS and Sony's fault; they're trying to pack too much into something that shouldn't be a complex software suite.
- Software technology: like it or not, companies are cheap bastards, so they won't spend more if the ROI is not good. Changing frameworks is expensive as hell, and using old tools for new stuff is not such a good idea.

So there; I believe there might be more, but those are the most important ones at least. Point 2 would require some digging, but I wouldn't be surprised if nVidia quit over not having a power-efficient design at the time (the Fermi -> Kepler transition, IIRC).

Cheers!
 


+1. The other thing I would say: many of my non-'enthusiast' friends have got the new consoles and are *very* happy with them coming from the last gen. The games *do* look significantly better :)

I think it's worth acknowledging that most console gamers don't game on PC, so comparing a game on the current consoles to what was achieved on the last gen, things do look quite a bit prettier. Other functionality (e.g. online capabilities) is quite a bit better too. I think part of the issue is the studios were *expecting* something more powerful than they got, so now they are complaining. At the end of the day, though, what matters to a non-enthusiast is the game, and even at 900p the new console games look significantly better than the last gen in most cases.

The final point to note: when this gen came out I remember a comment (I think from AMD) that the approach with these consoles was going to be smaller updates, more often. I can see a refresh in a couple of years' time with a process-node shrink plus some newer, more efficient cores. And as the architecture is unlikely to change, they may well keep supporting the current consoles alongside the newer ones.
 