Link to my current build: https://pcpartpicker.com/list/Btn7tn
Hello, everyone. I am trying to get a better understanding of CPU/GPU bottlenecks. I am using the following site, though I have tried a couple of others: https://pc-builds.com/bottleneck-calculator/
First, I can't get a report for my system, since the Arc A750 isn't in the database. Aside from that, what boggles my mind is how the numbers are impacted by resolution. For example, if I pair my Ryzen 7 5700X with an RX 6950 XT at 1080p, the result says that "AMD Ryzen 7 5700X is too weak for AMD Radeon RX 6950 XT." At face value, that would make me think I need a new CPU. However, if I change the resolution to 4K, the new result is "AMD Ryzen 7 5700X and AMD Radeon RX 6950 XT will work great together."
If I am trying to use the tool to figure out what upgrade path to take, this doesn't seem very useful. How does increasing the resolution make the CPU a better match for the GPU? If the idea is that the CPU and GPU should both sit at similar utilization levels, then I can see it. However, a casual person would assume that if your CPU is at lower utilization while playing a graphically demanding game, that's a good thing. If anyone can help me better understand this whole thing, I would greatly appreciate it.
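To sanity-check my own mental model, I threw together a tiny Python sketch. It assumes (and these are just my assumptions, with completely made-up numbers) that each frame costs the CPU a fixed amount of time regardless of resolution, that the GPU's cost per frame scales with pixel count, and that whichever stage is slower sets the frame rate:

```python
# Toy bottleneck model: CPU cost per frame is fixed, GPU cost scales with
# pixel count, and the slower of the two limits FPS. All numbers are
# hypothetical, chosen only to illustrate the resolution effect.

CPU_MS_PER_FRAME = 6.0    # assumed CPU cost per frame (resolution-independent)
GPU_MS_PER_MPIXEL = 2.0   # assumed GPU cost per million pixels rendered

def fps(width: int, height: int) -> float:
    """Frames per second under a max(CPU time, GPU time) bottleneck model."""
    gpu_ms = GPU_MS_PER_MPIXEL * (width * height) / 1_000_000
    frame_ms = max(CPU_MS_PER_FRAME, gpu_ms)  # slower stage sets the pace
    return 1000.0 / frame_ms

for label, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    gpu_ms = GPU_MS_PER_MPIXEL * (w * h) / 1_000_000
    limiter = "CPU" if CPU_MS_PER_FRAME >= gpu_ms else "GPU"
    print(f"{label}: {fps(w, h):.0f} FPS, limited by the {limiter}")
```

With those made-up numbers, the 1080p case comes out CPU-limited (the GPU finishes each frame faster than the CPU can prepare the next one), while the 4K case comes out GPU-limited, which would match the calculator flipping its verdict when I change the resolution. I just don't know if that's actually how these tools think about it.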