Vellinious :
Actually, running it at a lower setting would push a higher frame rate and put more pressure on the CPU. Unigine benchmarks are notorious for being lightly threaded, which is why Valley is little more than a CPU benchmark anymore. Bottlenecking the CPU intentionally by running lower settings and pushing the frame rates higher will show a bigger difference in the scores. Running a test with higher settings or a higher resolution would take the onus off the CPU and put it more on the GPU. I would think that at 1080 Extreme or 4K you'd see extremely similar scores between the two systems.
That said....it's a synthetic. Compare frame rates between a lightly threaded DX11 game, or a heavily threaded DX12 game, and you'll see a HUGE difference between them. Assuming, of course, the 1090T can run DX12.....I'm not really sure about that.
Rogue Leader :
No, running it at a higher setting would push the GPU to its maximum, making it the bottleneck, and therefore any performance difference between the two would be CPU-only (since he's using the same GPU). Your theory would be correct if the CPU were being stressed here, but it's not. Increased frame rate does not necessarily force the CPU to work harder.
There is nothing preventing a 1090T, or any other CPU made in the past 7 or 8 years, from running DX12; that has little to do with the CPU.
Vellinious :
Increased frame rates at lower resolutions WILL stress the CPU harder...especially when you consider the pretty weak IPC of an old Thuban. It will create a bottleneck on the CPU side, which WILL create a vastly different score between the two systems. Look at any high-end system on Valley in the last 3 years and you'll see exactly what I mean.
If you make the GPU the bottleneck by increasing the resolution, the scores should be very close to the same. The only difference will be the level of the tech: PCIe 2.0, iirc? Slow memory. With the GPU as the bottleneck, the CPU shouldn't matter all that much.
Rogue Leader :
No, it will not. The CPU in any GPU-bound game is doing things like telling the enemies where to shoot and laying out the map ahead. Higher frame rates do not make that part of the game run any faster or stress the CPU any more. This is clearly evidenced by the fact that he ran it again on Extreme and got the same results.
If the CPU is already too slow, then yes, a GPU capable of a higher frame rate will put out fewer FPS than it otherwise could, because the CPU can't feed it frames fast enough. But a faster GPU that can put out more FPS does not put any additional stress on the CPU. If the CPU can't keep up with the game, it can't keep up; you can throw all the GPU at it you want.
Vellinious :
And any single- or lightly-threaded application that's CPU-dependent at all will create a bottleneck with a more powerful GPU, a situation where you'll see a loss of frame rate because the CPU can't keep up.
Extreme is higher settings, btw....causing lower frame rates, and that actually supports what I've been saying. Putting more stress on the GPU takes the onus off the CPU and creates a situation where the CPU doesn't matter much.
Run it on 1080 basic or 720 low and test again. I guarantee there'll be a difference between the two systems.
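For what it's worth, both sides' claims reduce to the same simple model: the delivered frame rate is capped by whichever stage, CPU or GPU, is slower at the current settings. A minimal Python sketch of that reasoning (all frame-rate caps below are hypothetical illustrations, not benchmark data):

```python
def fps(cpu_cap, gpu_cap):
    """Delivered frame rate is limited by the slower of the two stages."""
    return min(cpu_cap, gpu_cap)

# Same GPU in both systems; two hypothetical CPU frame-rate caps.
slow_cpu, fast_cpu = 70, 200

# Low settings / low resolution: the GPU could push 300 fps, so the
# CPU is the bottleneck and the two systems score very differently.
print(fps(slow_cpu, 300), fps(fast_cpu, 300))  # 70 200

# High settings / 4K: the GPU only manages 50 fps, so both systems
# land on essentially the same score regardless of CPU.
print(fps(slow_cpu, 50), fps(fast_cpu, 50))    # 50 50
```

This is only a first-order sketch: in real games some per-frame work (draw-call submission, game logic) lands on the CPU and grows with frame rate, which is exactly the point the two posters are arguing over.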