AMD CPU speculation... and expert conjecture

Status
Not open for further replies.

con635

Honorable
Oct 3, 2013
644
0
11,010
It's not just 'any old game' though, it's looking like the new 'Crysis'. Surely the latest gen cards should comfortably beat the opposition's year-old offerings? Don't get me wrong, I'm not saying 290X > 980, just that I'm not as impressed as others. Maxwell is not the world beater it's made out to be and certainly doesn't 'make everything else obsolete'.
FWIW at 1080p, I bet if you customised the settings slightly in a lot of games on a used £100 HD 7950 to match the fps of a 970, you wouldn't really 'see' a difference between the two.
 
Jaguar isn't really weak for what it needs to do. I doubt there is even a point in putting more powerful CPUs in consoles, whose main purpose is running games. Games really aren't CPU constrained these days the way they once were in the console environment. Pretty much any CPU seems to run games at 30 - 60 fps, which is what the consoles target. The GPU, however, is where the silicon and power budget is spent.

If the consoles didn't need to reserve two cores of overhead for the OS to run background apps, they would have been fine with a 4-core Jaguar. Then they could have been designed with more GPU resources.
 
I must admit I'm with con635 on the 980 / 970. Great cards with excellent perf/W, but *not* the 'ball out of the park' performance jump everyone is heralding them as.

I remember when we went from the HD 4870 to the HD 5870, *literally 2x the performance*, or if you prefer team green, the release of the 8800 Ultra (which then became the 9800, which then became the GTS 250 or something?)...

From what I've seen, the 980 is what, 10% faster than a 290X or 780 Ti (and this number goes *down* as resolution goes up, due to the much wider memory bandwidth on the previous-gen cards). Granted, it does this at much lower power consumption, and as others have said the true *big chip* version of Maxwell will likely be very impressive. However, there is little chance of that seeing the light of day any time soon, as it will come out in pro cards first and those are nowhere to be seen.

It's an excellent bit of engineering, but if I already owned either previous-gen card I really don't see the appeal.
 

cemerian

Honorable
Jul 29, 2013
1,011
0
11,660




Quite the opposite: a lot of games are CPU bound, and while GPUs are more important, a slower CPU is a lot worse than a slower GPU. At least if the CPU is overpowered you don't ever have to worry about it bottlenecking your GPU, while the other way around you're stuck with what your CPU can handle. Although the new consoles are limited by both.
 

logainofhades

Titan
Moderator
Jaguar couldn't even beat an i3. I am glad AMD got the design win, as they needed the extra cash influx. Still, there are 15W 2c/4t mobile i5s/i7s that would have done the job just as well, if not better. ;) AMD won mostly due to price.

 

colinp

Honorable
Jun 27, 2012
217
0
10,680
It's in laptops where Maxwell will shine. The 970M is way better than the 870M, for example, in both performance and power consumption. AMD has nothing comparable right now.
 


This is not an argument you can make about consoles. The CPU is never the bottleneck on consoles because games are designed around the hardware they ship on. The GPU is needed to render the game at whatever image quality and effects you want. 'CPU bound games' just don't make sense in this context.

Also, your analogy is flawed because you value the CPU more than the GPU for some unknown reason. By your suggestion you would rather have the CPU sit idle than the GPU, and that is just stupid when talking about a console. Both parts are needed to run a game, and it's much easier to utilize GPU power than CPU power in that sense. It is why the GPU is more than 50% of the die on the console APUs.
 
AMD hasn't had proper discrete GPU design wins in laptops since they made APUs a thing. Nvidia has something like 90% of discrete laptop GPU sales. I don't think AMD even cares any more.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


Of course they won on price. Intel charges $250+ for those ULV parts. Then you've got $50 left for a GPU, lol.
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


I guess you missed the recent Apple design win: R9 M290X or R9 M295X.

http://www.extremetech.com/gaming/192305-analyzing-the-imac-5k-retina-display-how-do-you-squeeze-5k-out-of-a-last-gen-gpu
 

jdwii

Splendid


Devs would disagree with you
 

jdwii

Splendid


I'm not sure it was supposed to do that. It's not the 980 Ti either, and it was made on the same 28nm process; that's what makes it extremely impressive. I'm 70% sure Nvidia isn't done yet when it comes to making a high-end GPU. I see a 980 Ti coming, but then again AMD on 20nm, plus their new architecture, should leave Nvidia in the dust.
 

Devs always want more power; they aren't the ones designing the console to be affordable and balanced.

Also, you need to back up that argument with some quotes about how the CPU is weak compared to the GPU in the new consoles, in a game environment that doesn't involve any emulation. I would love to see some quotes on this.
 

Lastly, that isn't a laptop...
 

jdwii

Splendid


http://www.extremetech.com/gaming/191615-assassins-creed-unity-locked-to-900p-30-fps-due-xbox-one-and-ps4s-weak-cpu

This is just one case, and from it we can conclude a 7870 would be bottlenecked by even a Phenom II X6 (without optimization), let alone an 8-core Atom-class CPU or Jaguar. I understand in consoles they can make up for it, but it's still there; the weak point isn't the GPU. A 7870 is strong enough to run any PC game so far at 1080p medium, yet consoles still lack that. It's almost common sense: there is a bottleneck, and I'm putting my bets on the CPU, since the GDDR5 memory on the PS4 is more than adequate for 1080p gaming.

Basically, when I see consoles doing 1080p as a standard and not a special thing, I will then conclude that the CPU isn't the bottleneck, since a 7870 can do any PC game at 1080p medium settings. It would even be different if the games were 720p 60fps, but they're not.
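For what it's worth, the memory-bandwidth side of that claim is easy to sanity-check with rough arithmetic (my own back-of-envelope figures, a sketch rather than a measurement):

```python
# Back-of-envelope (my own numbers): merely scanning out a 32-bit 1080p
# framebuffer at 60 Hz costs about half a GB/s, a tiny slice of the PS4's
# ~176 GB/s GDDR5 bandwidth. Real rendering traffic is far higher, but this
# is the sense in which the bandwidth is "more than adequate" for 1080p.
bytes_per_pixel = 4  # 8-bit RGBA
scanout_gb_s = 1920 * 1080 * bytes_per_pixel * 60 / 1e9
print(round(scanout_gb_s, 2))  # 0.5
```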
 

Cazalan

Distinguished
Sep 4, 2011
2,672
0
20,810


It's still a mobile dGPU design win, so they do still care about them.
 

That doesn't say anything in regards to what I said.

 


Well, they haven't even officially released it as a mobile GPU; it's just inside the iMac. It's a nice win for them, but I feel like they are not going to be coming back with much in the mobile dGPU space. I don't think they willingly gave up all their market share to Nvidia, but OEMs just don't seem to like AMD GPUs in their laptops. The 7000M series were great chips, but Kepler dominated the laptop space. I feel like AMD just doesn't have very good relationships with the OEMs.
 

jdwii

Splendid


I think it's already established that the CPU is too weak for 1080p gaming, since everything else can be ruled out for the PS4: its GPU is capable of 1080p. At least with the Xbox One we can conclude the low memory bandwidth is holding it back, but with the PS4's GDDR5 memory and 7870-like GPU, all games should be playable at 1080p, as they would be on a PC with an adequate CPU.

Edit: I'm not too sure I understand what you mean by emulation. Do you mean playing PS3 or PS2 games on the PS4? If so, that's not what I mean.
 
Well, whether Jaguar is a bottleneck or not will depend on how they designed the CPU component. I don't know if they're running the full x86 ISA in the custom designs; they may be able to squeeze more performance out of it than the general-purpose CPUs we get for desktops. Remember, it is still a custom design for MS and Sony.

Now, devs using previous-gen frameworks and tools is also another BIG factor in the current "stall" in games. It's not even an "adaptation" thing, but a "publishers are being cheap bastards" thing. Well, they've always been, but there's a weird dynamic going on for MS here as well. If devs switch tools and all, maybe games on Windows will be even better than on the consoles (not that it isn't true now), and as such, they'll cannibalize their own console market.

It is a very weird position that MS has put themselves in.

Cheers!
 

wh3resmycar

Distinguished
I still don't get why console devs want it to be either 60fps or 30fps; 50fps/1080p would be a reasonable compromise.

Also, you folks are aware that these consoles are running vsync, right? I believe that puts a significant strain on that 7870-class GPU in the PS4.
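The vsync point can be made concrete. A toy sketch (my own numbers, not from the thread): with double-buffered vsync on a 60 Hz display, a frame that misses a refresh deadline waits for the next vblank, so the effective frame rate snaps down to 60/n fps:

```python
# Hypothetical model of double-buffered vsync on a 60 Hz panel: each frame
# must wait for the next refresh boundary, so effective fps is quantized
# to 60, 30, 20, 15, ... rather than the GPU's raw rate.
import math

REFRESH_HZ = 60
INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.67 ms per refresh

def vsynced_fps(render_ms: float) -> float:
    """Effective fps when each frame waits for the next vblank."""
    refreshes_needed = math.ceil(render_ms / INTERVAL_MS)
    return REFRESH_HZ / refreshes_needed

print(vsynced_fps(10))  # 60.0 -- comfortably inside one refresh
print(vsynced_fps(17))  # 30.0 -- barely misses 16.67 ms, drops a full tier
print(vsynced_fps(35))  # 20.0
```

This is why a GPU rendering at, say, 59 fps raw gets locked to 30 fps with plain vsync, unless triple buffering or an adaptive scheme is used.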
 

szatkus

Honorable
Jul 9, 2013
382
0
10,780


No, the CPU has nothing to do with resolution. It can limit FPS, but resolution depends only on the GPU and/or memory.
 

blackkstar

Honorable
Sep 30, 2012
468
0
10,780
Wow. Are we gonna spend pages arguing that generic x86 binaries, on a software platform designed to run on lots of different hardware, are proof that Jaguar is slow in an environment tailored to the hardware?

You might as well let me throw out optimized Gentoo benchmarks where I beat a 3930K and then start saying the FX 8350 is faster than the 3930K.

Also, some of you are way out of touch. When someone has to lower resolution and graphics settings, like in Watch Dogs, it's not because of the CPU. In fact, you'd see consoles hitting 1080p/60fps no problem if the GPU weren't the bottleneck, because scaling resolution upwards shifts the bottleneck away from the CPU and toward the GPU.

If the CPU were the problem, we'd be looking at big empty games with no AI and nothing going on, but they'd be 1080p/60fps games.

Unless someone wants to show me a situation where the CPU was the bottleneck and it ended up with the resolution being dropped.
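The "resolution shifts the bottleneck" argument can be sketched with a toy frame-time model (my own assumption, not something from the thread): frame time is roughly max(CPU time, GPU time), CPU work per frame is resolution-independent, and GPU time scales with pixel count:

```python
# Toy model: the slower of the two units sets the frame time. CPU cost
# (game logic, AI, draw-call submission) doesn't change with resolution;
# GPU cost scales with the number of pixels shaded.

PIXELS_1080P = 1920 * 1080

def bottleneck(cpu_ms: float, gpu_ms_1080p: float, width: int, height: int) -> str:
    """Name the unit that limits the frame at the given resolution."""
    gpu_ms = gpu_ms_1080p * (width * height) / PIXELS_1080P
    return "CPU" if cpu_ms >= gpu_ms else "GPU"

# Same hypothetical game: CPU-limited at 720p, GPU-limited at 1080p.
print(bottleneck(cpu_ms=12, gpu_ms_1080p=14, width=1280, height=720))   # CPU
print(bottleneck(cpu_ms=12, gpu_ms_1080p=14, width=1920, height=1080))  # GPU
```

Under this model, dropping resolution (as in Watch Dogs) only helps when the GPU side is the limiter, which is the post's point.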
 