AMD CPU speculation... and expert conjecture


jdwii

Splendid


Then please explain how a 7870-class GPU can't provide 1080p gameplay on the console when it can on the PC?
 

jdwii

Splendid


http://www.videogamer.com/ps4/assassins_creed_unity/news/assassins_creed_unity_is_900p_30fps_on_both_ps4_and_xbox_one.html
Technically we're CPU-bound," he said. "The GPUs are really powerful, obviously the graphics look pretty good, but it's the CPU [that] has to process the AI, the number of NPCs we have on screen, all these systems running in parallel.

"We were quickly bottlenecked by that and it was a bit frustrating, because we thought that this was going to be a tenfold improvement over everything AI-wise, and we realised it was going to be pretty hard. It's not the number of polygons that affect the framerate. We could be running at 100fps if it was just graphics, but because of AI, we're still limited to 30 frames per second."

Now, on the claim that the CPU has nothing to do with high-res gaming: why don't we all just put Intel Atom CPUs in our gaming rigs and game at 1080p with a 980?
 

jdwii

Splendid


When a GPU is so much more powerful than the Xbox One's, that won't happen; however, there are times when the Xbox One comes out ahead if you watch FPS videos comparing both.
 

blackkstar

Honorable


I'm aware of that and I'm not buying it.

Have you ever used a CPU-bottlenecked system before? You can just keep raising graphics settings and the frame rate stays the same. It happens, ironically enough, on my A4-5000 system when running LoL. I get the same FPS at the lowest settings as I do on medium. It's strictly a CPU bottleneck.

My point is that it's not happening. Yes, I do realize a CPU can bottleneck you and that's why we don't have AM1 and Atom gaming rigs. But even if that's the case, you can generally increase GPU usage by increasing resolution.

You know how, when we do CPU benchmarks at 420p, we see a CPU bottleneck between something like an FX 6300 and a 4770K? And then we go to 4K and it goes away?

Also, a 2.5GHz FX 8350 does not bottleneck BF4. http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html

Here's a review that tests a 4790K vs a Pentium G at everything from 1680x1050 to 4K.
http://www.tweaktown.com/articles/6526/intel-pentium-g3258-haswell-20th-anniversary-cpu-gaming-performance/index9.html

Please take a look at Metro Last Light http://www.tweaktown.com/articles/6526/intel-pentium-g3258-haswell-20th-anniversary-cpu-gaming-performance/index5.html

This is a CPU bottleneck. Notice how the frame rate stays nearly the same between 2560x1600 and 1680x1050, yet the faster CPUs show massive changes? My point is that this is what a CPU bottleneck looks like: you can just keep raising resolution and the frame rate does not change.

And I am asking: if Ubisoft is right about the CPU bottleneck, why is the game not 1080p/30fps? Resolution wouldn't matter in a CPU bottleneck, because the CPU will push the same FPS either way. If you were CPU bottlenecked, you could just raise the resolution and not have any problems.

A CPU bottleneck is not a reason to lower resolution from 1080p to 900p. In fact, a CPU bottleneck has never been a reason to lower resolution. Usually it is a reason to raise resolution to improve GPU utilization. It is why I have an FX 8350 + 7970 with a 1440p screen instead of an FX 8350 + 7970 with a 1280x1024 screen. The CPU would probably bottleneck at 1280x1024, but at 1440p it doesn't.
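To make that argument concrete, here's a rough sketch (not from anyone in the thread) of the frame-time model being described: frame time is roughly whichever of the CPU or GPU takes longer, so a CPU-bound game holds the same FPS as resolution climbs. All timings are invented for illustration.

```python
# Toy model of the CPU-vs-GPU bottleneck argument above.
# All timings are invented for illustration, not measurements.

def frame_time_ms(cpu_ms, gpu_ms_per_mpixel, width, height):
    """Frame time is roughly the slower of the two processors,
    since the GPU can't present a frame the CPU hasn't prepared."""
    gpu_ms = gpu_ms_per_mpixel * (width * height / 1e6)
    return max(cpu_ms, gpu_ms)

CPU_MS = 33.0             # hypothetical fixed per-frame CPU cost (AI, setup)
GPU_MS_PER_MPIXEL = 10.0  # hypothetical GPU cost that scales with resolution

for w, h in [(1280, 720), (1600, 900), (1920, 1080), (2560, 1440)]:
    t = frame_time_ms(CPU_MS, GPU_MS_PER_MPIXEL, w, h)
    print(f"{w}x{h}: {1000 / t:.0f} fps")
# With these numbers the CPU is the limiter up through 1080p, so the FPS
# readout barely moves as resolution rises -- which is the tell described above.
```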
 

jdwii

Splendid
If you are CPU bottlenecked at 720p, putting the game at 1080p won't make it perform any better. I guess this really comes down to people not believing Ubisoft when they claim they are CPU bottlenecked, which is fine; they're not an honest company anyway. It might be a cop-out excuse for why they are limiting the game to 900p.

So I ask this again: if the 7870 can provide playable frame rates at 1080p, why is the PS4 being limited to lower resolutions so often?
 

szatkus

Honorable


They're sacrificing resolution for better graphics effects. If the game looks better with more advanced GFX at 900p than at 1080p with fewer effects, then the first is the better option. On PC you can make the same trade-off yourself, but console games usually won't let you change graphics options.
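For scale, a quick back-of-the-envelope on the pixel budget being traded away (simple arithmetic, nothing more):

```python
# Pixel budget freed by dropping from 1080p to 900p (pure arithmetic).
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_900p  = 1600 * 900    # 1,440,000

print(f"900p renders {pixels_900p / pixels_1080p:.0%} of the pixels of 1080p")
# -> about 69%, so roughly a third of the per-frame shading work can be
#    redirected into heavier effects instead of raw resolution.
```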
 

jdwii

Splendid
A 7870 can even handle high graphics settings in Crysis 3 at 1080p, so I really don't see this game, or many of the others running at 900p, coming anywhere close to that. So I guess we can put the blame on horrible engines. Why not pick Unreal Engine or CryEngine for these games? It doesn't make sense to go with their own engine when those engines work so well. I noticed this with Dead Island: the engine they used sucked, and they could have used something different and maybe even saved development time.

 

jdwii

Splendid


Yeah, you know, I forgot about that article where they stated that. I bet the 7870 has a lot more left when it comes to graphics, but the CPU I'm not sure about; we might have hit our limit already when it comes to AI or lots of things going on at once. If we go by the total amount of power the PS4 has, 1.84 TFLOPS, which is roughly equivalent to a 7850, the GPU isn't the bottleneck.
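As a sanity check on that 1.84 TFLOPS figure, here's the usual single-precision estimate (shaders × 2 FLOPs per clock × clock speed), using the commonly cited shader counts and clocks for these parts; treat the results as ballpark numbers.

```python
# Single-precision estimate: shaders * 2 FLOPs/clock (FMA) * clock in GHz.
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

# Commonly cited shader counts and clocks -- ballpark figures.
gpus = {
    "PS4 GPU (18 CUs)": (1152, 0.800),
    "Radeon HD 7850":   (1024, 0.860),
    "Radeon HD 7870":   (1280, 1.000),
}
for name, (shaders, clock) in gpus.items():
    print(f"{name}: {tflops(shaders, clock):.2f} TFLOPS")
# The PS4 lands at ~1.84 TFLOPS, between the 7850 (~1.76) and the 7870 (~2.56),
# which is where the "roughly 7850-class" comparison comes from.
```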

 


You can keep saying that, but it's not true in the least. The games would all run at 1080p if there weren't a GPU resource constraint. It is very obvious that the developers aren't getting the GPU performance they want. The CPU has next to no effect on resolution, yet it is hard for developers to even hit 1080p in some games. The other thing is, the consoles are still stuck with low-end AA and medium ambient occlusion, with lots and lots of graphics effects turned down. The GPU is a huge constraint. Also, a 7850 isn't that powerful at all; it will not play new games at high settings at 1080p. I don't know what you are on about.
 

wh3resmycar

Distinguished


The 7870 ain't doing 60fps @ 1080p on the PC.
 

genz

Distinguished


The Xbox One is not identical to the PS4 in all other aspects, so it's not a fair comparison. The eSRAM plus DDR3 versus the PS4's GDDR5 makes the two very different machines, with the Xbox still having superior memory performance as well.
 

jdwii

Splendid


Well, the issue is that even 1080p/30fps seems to be impossible most of the time. I know it can't do 60fps, but I know for a fact it can play most games at even ultra settings at 30fps. But 900p?
 

jdwii

Splendid


In CPU tasks, it also has a higher clock speed too, but not enough to brag about.
 
I do believe I've discussed this before. You simply cannot lump everything into "CPU" or "GPU" tasks; there are entirely too many interdependencies involved. When you look at the "graphics options", some of those live purely inside the GPU while others require CPU work before the GPU can be fed. AA and AF are a prime example of something that is 100% GPU: all the operations take place inside the GPU and the CPU has no additional work to do. Move on to shadows, particle effects, texture and model complexity, and suddenly the CPU has to do more work for setup and geometry. It has to track more "stuff" and do more work to manage the GPU's state. This is why benchmarks are useless without context, and people would be surprised at how far a little tweaking can go.
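A toy sketch of that split, with entirely made-up per-setting costs, just to show which knobs land on which processor:

```python
# Toy cost split: some settings only add GPU work, others also add CPU-side
# setup/submission work. All per-setting costs are invented for illustration.

SETTINGS = {
    # name:                 (extra CPU ms, extra GPU ms)
    "4x MSAA":              (0.0, 4.0),  # resolved entirely on the GPU
    "16x AF":               (0.0, 1.5),  # likewise, a texture-sampling cost
    "high shadows":         (3.0, 5.0),  # extra shadow passes need CPU submission too
    "dense particles":      (2.5, 3.0),  # more objects to simulate and submit
    "more NPCs on screen":  (8.0, 2.0),  # AI + animation + draw submission hit the CPU hardest
}

def frame_time(enabled, base_cpu=10.0, base_gpu=12.0):
    cpu = base_cpu + sum(SETTINGS[s][0] for s in enabled)
    gpu = base_gpu + sum(SETTINGS[s][1] for s in enabled)
    return max(cpu, gpu), cpu, gpu   # the slower side sets the frame rate

for combo in ([], ["4x MSAA", "16x AF"],
              ["high shadows", "dense particles", "more NPCs on screen"]):
    total, cpu, gpu = frame_time(combo)
    print(f"{combo or 'baseline'}: CPU {cpu:.1f} ms, GPU {gpu:.1f} ms -> {1000 / total:.0f} fps")
# Cranking AA/AF only loads the GPU; shadows, particles and NPC count drag the
# CPU in as well, which is why they can't be treated as purely "GPU settings".
```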

On the whole 30/60 FPS thing, it has to do with traditional AV setups. An NTSC video signal is 29.97 fps at (roughly) 60Hz and PAL is 25 fps at 50Hz. So pretty much every video system in the world wants its signal to be some evenly divisible value of those, otherwise inconsistencies and artifacts can start to appear. It's what we know as screen tearing / stuttering in the PC world, but for home video systems it can be quite uncomfortable to look at from across the room. Trying to play a video game on a big-screen HDTV with a variable frame rate has a good chance of making the player feel sick.
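A minimal sketch of the divisibility point, assuming vsync'd presentation on a 60Hz panel and perfectly steady frame times:

```python
import math

REFRESH_HZ = 60  # the panel refreshes 60 times per second

def refresh_counts(fps, n_frames=9):
    """How many 60Hz refreshes each rendered frame stays on screen,
    assuming vsync and perfectly steady frame times at `fps`."""
    appears = [math.ceil(i * REFRESH_HZ / fps) for i in range(n_frames + 1)]
    return [appears[i + 1] - appears[i] for i in range(n_frames)]

for fps in (60, 30, 45):
    print(f"{fps:2d} fps on a {REFRESH_HZ}Hz display: {refresh_counts(fps)}")
# 60 fps -> [1, 1, 1, ...]          every frame shown once: smooth
# 30 fps -> [2, 2, 2, ...]          every frame shown twice: still even, still smooth
# 45 fps -> [2, 1, 1, 2, 1, 1, ...] an uneven mix of 1- and 2-refresh frames: judder
```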
 


They shouldn't; that's the place where they should be pushing the APU like mad. If the APU can't win there, it can't win anywhere.
 


Remember, this is the first time NVIDIA based the desktop chip off the mobile chip, rather than the other way around. So I find the reduced power requirements not shocking in the least. I'm more surprised at the performance, honestly.
 


It absolutely can. While the GPU does comparatively more work as resolution increases, CPU load does go up as well.
 
On the whole 30/60 FPS thing, it has to do with traditional AV setups. An NTSC video signal is 29.97 fps at (roughly) 60Hz and PAL is 25 fps at 50Hz. So pretty much every video system in the world wants its signal to be some evenly divisible value of those, otherwise inconsistencies and artifacts can start to appear. It's what we know as screen tearing / stuttering in the PC world, but for home video systems it can be quite uncomfortable to look at from across the room. Trying to play a video game on a big-screen HDTV with a variable frame rate has a good chance of making the player feel sick.

Which is why variable frame rates are such a huge thing in my mind.

Look, back in ye olden days, devs did a lot of things to keep console CPUs from stalling out. Take Sonic on the Genesis: every Sonic game ran at 60 FPS, but the physics were only calculated every other frame (30 FPS). That works 99% of the time, except at max speed, when all those wonderful odd physics bugs started to show up (dropping through floors and the like). Devs have ALWAYS had to contend with CPUs that simply can't do what they want them to do, and have come up with various workarounds to deal with it.
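The general pattern looks something like this sketch (a generic half-rate update loop, not anyone's actual engine code):

```python
# Generic half-rate update loop in the spirit of the example above: render
# every frame, but only step the expensive simulation on every other one.

RENDER_HZ = 60
PHYSICS_DIVISOR = 2              # physics runs at 60 / 2 = 30 Hz

class World:
    def __init__(self):
        self.x, self.vx = 0.0, 5.0   # position and velocity of one object

    def step(self, dt):
        self.x += self.vx * dt       # one coarse 1/30 s physics step

    def render(self, frame):
        print(f"frame {frame}: x = {self.x:.2f}")

world = World()
for frame in range(6):
    if frame % PHYSICS_DIVISOR == 0:
        # One big step covers two rendered frames; fast-moving objects can
        # cross a lot of ground in it, which is where tunnelling-style bugs
        # (falling through floors at top speed) tend to come from.
        world.step(PHYSICS_DIVISOR / RENDER_HZ)
    world.render(frame)
```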

I'm more alarmed by the fact they expected such a big jump in performance.

((I'm done now; next time, I'll try and compress into one post when I check in post-weekends, promise! :D ))
 

8350rocks

Distinguished
Had a great conversation with Richard Huddy last week.

Many good things coming. FreeSync monitors are expected to be announced by one vendor as available in Q4, and some current models from that vendor will be capable of receiving firmware updates to gain FreeSync, as I understand it.

Also, nextgen gpus are on the horizon. No hard dates but keep your eyes open.

Tonga is a "relatively weak version" of the new GPU uarch, and just the tip of the iceberg.

Last tidbit: AMD is helping Khronos design OpenGL Next. They will be working hand in hand to help optimize for the new API. AMD will continue to support Mantle as well, but they are glad to see widespread adoption of their ideas for lower-level APIs.
 

colinp

Honorable
That last "titbit" isn't exactly news.

Nor is the fact that AMD are going to release a new-gen GPU at some point, sooner or later. I mean, no s--t, Sherlock.

What would be news on that front would be whether they expect to be competitive on performance instead of just price. But he'd hardly admit it if he knew it wouldn't be, would he?

Good to see freesync being adopted though. Hope it's a reputable brand.
 