> This is a more realistic scenario of what happens when you actually run CPU-demanding games in CPU-demanding areas.

What's with the memory configs? Shouldn't they all be the same?
> What's with the memory configs? Shouldn't they all be the same?

PCGH and ComputerBase use officially supported memory.
It's 'only' a 400 MHz difference (excluding timings), but still...
> This is a more realistic scenario of what happens when you actually run CPU-demanding games in CPU-demanding areas.

That is showing how GPU limited you are, as that is the ray tracing slide and RT takes a huge performance hit on the GPU. Once you go to the rasterization slide the numbers change A LOT. Even though the game is CPU intensive, you can see how quickly it becomes GPU bound with RT.
> That is showing how GPU limited you are, as that is the ray tracing slide and RT takes a huge performance hit on the GPU. Once you go to the rasterization slide the numbers change A LOT. Even though the game is CPU intensive, you can see how quickly it becomes GPU bound with RT.

Nope, it's CPU bound. RT is heavy on the CPU too, not just the GPU. Here, from my PC, you can clearly see I'm not GPU bound.
> This is a more realistic scenario of what happens when you actually run CPU-demanding games in CPU-demanding areas.

This is moving the bottleneck from the CPU to the GPU. The throughput is limited by the GPU in ray-traced titles. It doesn't test the CPU.
> This is moving the bottleneck from the CPU to the GPU. The throughput is limited by the GPU in ray-traced titles. It doesn't test the CPU.

No it does not. RT is incredibly heavy on the CPU, not just the GPU (BVH). It absolutely tests the CPU.
You can see the bottleneck effect in many reviews where there is a big difference at lower resolutions and next to nothing at 4K. Ray tracing exacerbates the effect: the GPU has to work harder and runs out of horsepower.
At 1080p, the use of a 4090 and a 9800X3D is, unless you are an esports gamer, a waste. You won't get the benefit of either.
As faster GPUs come out the bottleneck they cause will diminish. The processor will be able to stretch its legs and at some point the CPU will become the bottleneck. I’m not guessing when but it will happen.
> Have you actually played any game with RT?

Umm… yes, when the 3070 was new… it wasn't worth it, so the next card was the 7900 XT. That was a step up in raster.
> No it does not, RT is incredibly heavy on the CPU

Then with that Dragon Age game it must be different, as I have a friend with a 1060, and the games that support RT can't use RT at all with that card... and with my 3060 I can use RT, but it causes a noticeable performance hit, with no option to use CPU-based RT rendering...
> Umm… yes, when the 3070 was new… it wasn't worth it, so the next card was the 7900 XT. That was a step up in raster.

Preferring (or not) RT isn't the point here. I'm just saying that the graph posted was 100% CPU bound.
I prefer straight frames; upscaling and frame gen don't interest me in the slightest... personal preference. I don't stream video.
The choice of Nvidia or AMD comes down to raster fps vs price. AMD and Nvidia are equally competitive in that scenario.
BVH and BSP have been around for years; why is it suddenly so compute-intensive?
> Then with that Dragon Age game it must be different, as I have a friend with a 1060, and the games that support RT can't use RT at all with that card... and with my 3060 I can use RT, but it causes a noticeable performance hit, with no option to use CPU-based RT rendering...

Well, your friend has a 1060 and you have a 3060. Obviously with that level of GPU you are going to be GPU bound, especially when you enable RT.
Is there a setting in that Dragon Age game that allows RT to use the CPU to render it? I can't read German, and translating the page doesn't help when a window pops up that I can't read... well, because that's in German too... and from what I can tell, it looks like I have to subscribe to view it?
> I never said the rendering happens on the CPU

No, you said:

> No it does not. RT is incredibly heavy on the CPU, not just the GPU (BVH). It absolutely tests the CPU.

That insinuates RT is also done on the CPU, but everywhere I have seen, RT is a GPU-only feature and can't be done on a CPU, at least with consumer hardware... because if we can do RT on a CPU in games, then why haven't we been able to use RT until the RTX line was released?
How can you even say that the graph is GPU bottlenecked when the 7800X3D is below 70 while the 14900KS is at 76? Have you actually played the game? Have you actually played any game with RT?
This is a translation from the review:

"With ray tracing active, Dragon Age: The Veilguard takes every core and thread it can get its hands on."
Same situation in Cyberpunk; RT / PT is incredibly CPU demanding.
> Well, your friend has a 1060 and you have a 3060. Obviously with that level of GPU you are going to be GPU bound, especially when you enable RT.

I never said anything about being GPU bound. I used that as an example that without GeForce RTX (or newer Radeon cards that support RT) you can't use RT... as it's a GPU-only feature, at least from what I have read...
> No, you said... that insinuates RT is also done on the CPU, but everywhere I have seen, RT is a GPU-only feature and can't be done on a CPU, at least with consumer hardware... because if we can do RT on a CPU in games, then why haven't we been able to use RT until the RTX line was released?

> I never said anything about being GPU bound. I used that as an example that without GeForce RTX (or newer Radeon cards that support RT) you can't use RT... as it's a GPU-only feature, at least from what I have read...

Ok man
> This is a more realistic scenario of what happens when you actually run CPU-demanding games in CPU-demanding areas.

If you click the rasterization tab you should see that the 9800X3D gets 170.8 avg FPS and the 14900KS gets 140.0 avg FPS. If we are concerned about non-ray-tracing performance, the conclusion is clear. Considering what you said in follow-up posts, I am not sure how turning on RT and seeing both the 9800X3D and the 14900KS get similar FPS is indicative of a CPU bottleneck. They would have to run the same test with a weaker or more powerful graphics card to truly ascertain whether it's a CPU- or GPU-bound scenario. If the same CPUs in the same bench run got the same FPS with a less powerful GPU, that would be indicative of a CPU-bound scenario. If they did the same run with a more powerful GPU and the 9800X3D / 14900KS showed a disparity, then you would know it was GPU bound, and conversely it would be indicative of a CPU-bound scenario if their FPS remained similar.
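The swap-the-GPU elimination test described above can be sketched as a small decision helper. This is only an illustration of the reasoning: the function name and the 5% tolerance are my own, not from any benchmark suite.

```python
def classify_bottleneck(fps_baseline: float, fps_faster_gpu: float,
                        tolerance: float = 0.05) -> str:
    """Rerun the same bench scene with a faster GPU and compare FPS.

    If FPS barely moves, the GPU was not the limiter -> CPU bound.
    If FPS scales up with the faster GPU, the GPU was the limiter -> GPU bound.
    The tolerance (5% here, an arbitrary choice) allows for run-to-run variance.
    """
    if fps_baseline <= 0:
        raise ValueError("FPS must be positive")
    gain = (fps_faster_gpu - fps_baseline) / fps_baseline
    return "CPU bound" if gain <= tolerance else "GPU bound"

# A faster GPU lifts 70 fps to only 72 fps (~3% gain): the CPU was the wall.
print(classify_bottleneck(70, 72))  # CPU bound
# A faster GPU lifts 70 fps to 95 fps (~36% gain): the GPU was the wall.
print(classify_bottleneck(70, 95))  # GPU bound
```

The same helper covers the weaker-GPU direction: downgrading the GPU while FPS stays flat is the mirror image of upgrading it and seeing no gain.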
> Preferring (or not) RT isn't the point here. I'm just saying that the graph posted was 100% CPU bound.

Preferring not to is a point. You asked if I had used RT... answer: yes, and I choose not to.
Why it's heavy, I'd only assume it's a combination of Nvidia drivers offloading a lot of work to the CPU and the BVH structures for RT getting heavier with more effects or something; I don't really know.

I can tell you, though, that this particular game (and others with RT) is actually more GPU bound when you... turn RT off, because the CPU can push a lot more frames. Weird, but it's true.
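For readers wondering why the BVH part lands on the CPU: engines typically (re)build or refit the acceleration structure over dynamic geometry every frame before the GPU traces against it. A toy median-split build over 1D intervals (a pure-Python sketch of the general technique, not any engine's actual code) shows the shape of that work: a sort and split at every level, roughly n log n in object count.

```python
import random

def build_bvh(boxes):
    """Recursively build a median-split BVH over (lo, hi) intervals.

    Real engines do this in 3D over triangle AABBs; the 1D toy keeps the
    structure of the work visible: sort by centroid, split at the median,
    recurse, and union the child bounds on the way back up.
    """
    if len(boxes) == 1:
        return {"bounds": boxes[0], "leaf": True}
    boxes = sorted(boxes, key=lambda b: (b[0] + b[1]) / 2)  # sort by centroid
    mid = len(boxes) // 2
    left, right = build_bvh(boxes[:mid]), build_bvh(boxes[mid:])
    bounds = (min(left["bounds"][0], right["bounds"][0]),
              max(left["bounds"][1], right["bounds"][1]))
    return {"bounds": bounds, "leaf": False, "children": (left, right)}

random.seed(0)
boxes = [(x, x + random.random()) for x in
         [random.uniform(0, 100) for _ in range(1000)]]
root = build_bvh(boxes)
# The root node must enclose every interval that went in.
assert all(root["bounds"][0] <= lo and hi <= root["bounds"][1]
           for lo, hi in boxes)
```

Doing this (or a refit of it) per frame for thousands of animated objects is plausible CPU work, on top of whatever the driver offloads.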
> If you click the rasterization tab you should see that the 9800X3D gets 170.8 avg FPS and the 14900KS gets 140.0 avg FPS. If we are concerned about non-ray-tracing performance, the conclusion is clear. Considering what you said in follow-up posts, I am not sure how turning on RT and seeing both the 9800X3D and the 14900KS get similar FPS is indicative of a CPU bottleneck. They would have to run the same test with a weaker or more powerful graphics card to truly ascertain whether it's a CPU- or GPU-bound scenario. If they did the same run with a more powerful GPU and the 9800X3D / 14900KS showed a disparity, then you would know it was GPU bound, and conversely it would be indicative of a CPU-bound scenario if their FPS remained similar.

I have the game and the GPU, and I'm telling you. I even tested the save files from the review (they have them available for download). But don't believe me; take this guy with his 7800X3D. Look how the GPU usage DROPS when he enables RT because the CPU chokes. If this was a GPU bottleneck, his GPU usage in the left picture should be at 99% and it should be pulling ~400+ watts. Instead it's just sitting at 240, much lower than the image on the right that has no RT and low settings...
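The telemetry read in that post (GPU usage collapsing the moment RT comes on) can be stated as a simple rule of thumb. The 95% cutoff below is my own illustrative threshold, not a value from any monitoring tool:

```python
def likely_limiter(gpu_usage_pct: float) -> str:
    """Rough heuristic from overlay telemetry: a GPU pegged near 100%
    is the limiter; a GPU idling well below that while FPS is capped
    means the CPU can't feed it draw/trace work fast enough."""
    if not 0 <= gpu_usage_pct <= 100:
        raise ValueError("usage must be between 0 and 100")
    return "GPU bound" if gpu_usage_pct >= 95 else "CPU bound (GPU starved)"

print(likely_limiter(99))  # GPU bound
print(likely_limiter(60))  # CPU bound (GPU starved)
```

Power draw works the same way: a flagship card sitting far below its power limit while frame rate is capped points at the feeder, not the card.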
> Preferring not to is a point. You asked if I had used RT... answer: yes, and I choose not to.

Not what happens. The game becomes LESS GPU bound (GPU usage DROPS) when you enable RT because of how hard the CPUs choke.
It isn't weird that more frames can be displayed when RT is off; you said it yourself, RT is GPU bound.
> Not what happens. The game becomes LESS GPU bound (GPU usage DROPS) when you enable RT because of how hard the CPUs choke.

Accepting that, what have the devs done so badly to cause the CPU to tank? The techniques for placing objects have been known for many years...
> Accepting that, what have the devs done so badly to cause the CPU to tank? The techniques for placing objects have been known for many years...

I'm not sure it's a dev issue, since it only happens with RT enabled. If the devs were doing something horribly wrong, the issue would persist with RT off, yet it doesn't.
Tbh, the eye candy isn’t worth the frame rate drop.
> I have the game and the GPU, and I'm telling you. I even tested the save files from the review (they have them available for download). But don't believe me; take this guy with his 7800X3D. Look how the GPU usage DROPS when he enables RT because the CPU chokes.

> Not what happens. The game becomes LESS GPU bound (GPU usage DROPS) when you enable RT because of how hard the CPUs choke.

There are ray-traced games that show differences between 11th and 13th gen at 4K on a 4080.
> There are ray-traced games that show differences between 11th and 13th gen at 4K on a 4080.

Yeap, in Spider-Man with maxed-out RT (max object range etc.) I'm getting ~110 FPS on my 12900K. Without RT I'm getting over 200. Both CPU bound.
You can check Spider-Man in Tom’s CPU vs GPU article.
It’s probably even more CPU dependent on a 4090
Those examples are fairly rare, but they do exist
Most games show little to no difference between 11th gen, 13th gen and a 7800X3D at 4K.
It is of course possible that the game or Nvidia's drivers are just poorly optimized and that there shouldn't be a noticeable CPU dependency in ray tracing, but it does exist.
> Ok man

So you agree then. RT needs a GPU that has RT-capable hardware in order to be able to use RT, and you.... were wrong.
> So you agree then. RT needs a GPU that has RT-capable hardware in order to be able to use RT, and you.... were wrong.

Yeap, absolutely.