News Cyberpunk 2077 Phantom Liberty Will Be Very CPU Intensive

Status
Not open for further replies.

gman68

Reputable
Oct 2, 2020
19
13
4,515
Not gonna upgrade my CPU or GPU for a DLC. Just turn down the settings. If the gameplay doesn't justify the cost of the DLC, no amount of light-show is going to make a difference to me. Although my system will probably handle it fine with medium/high settings anyway.
 

salgado18

Distinguished
Feb 12, 2007
981
439
19,370
On one hand, if most of the CPU usage is for RT, I'll be very disappointed. Why can't game developers push other boundaries, like NPC AI, by using more of the CPU? I hope that's what they're doing, not just "moar graphics".

On the other hand, a game that uses 90% of 8 cores should be able to use 50% of 16 cores, or even 90%. If the threading is done well, it can saturate anything. OK, I know there's a point where more cores won't help much (maybe 8 cores already cover the full feature set), but I really hope it can scale sideways too. And since they're asking for current-gen CPUs, last-gen 12- or 16-core parts should work nicely.
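
That "well-threaded work saturates whatever core count you have" idea is easy to sketch. Here's a minimal, hypothetical C++ example (not from Cyberpunk or any real engine): split the workload into far more chunks than cores, spin up one worker per hardware thread, and let the workers pull chunks until none are left. The same code fills 8 cores or 16.

```cpp
#include <algorithm>
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    // Size the worker pool to whatever the machine offers (8, 16, ... threads).
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const int total_chunks = 4096;            // far more chunks than cores
    std::atomic<int> next_chunk{0};
    std::atomic<long long> checksum{0};

    auto worker = [&] {
        long long local = 0;
        // Each worker pulls chunks from the shared counter until it runs out,
        // so the same workload saturates 8 cores or spreads over 16.
        for (int c; (c = next_chunk.fetch_add(1)) < total_chunks; ) {
            for (int i = 0; i < 100000; ++i)  // stand-in for real per-chunk work
                local += static_cast<long long>(c) * i % 7;
        }
        checksum += local;
    };

    std::vector<std::thread> pool;
    for (unsigned t = 0; t < workers; ++t) pool.emplace_back(worker);
    for (auto& t : pool) t.join();

    std::printf("%u workers processed %d chunks (checksum %lld)\n",
                workers, total_chunks, static_cast<long long>(checksum));
}
```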

We need benchmarks! Go as far back as Zen 2 if possible!
 

colossusrage

Commendable
Jun 8, 2022
65
68
1,610
Someday a developer is going to create a game that has awesome physics, awesome AI/NPCs, awesome graphics, and it will run well, scaling on various hardware.
 
Frostbite has historically been very good at this (at least in the Battlefield games); it scales to whatever you've got. That's why, back in the BF4 days, AMD CPUs weren't as far behind Intel as usual: they got much higher utilization than in most games. I would love to see more games that scale this well. I also thought the approach GPG took with Supreme Commander was great: they broke the game up into systems that would spread out across whatever cores were available. It at least seems like this sort of thing is rarely done anymore.
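
As a rough illustration of that "break the game into systems and spread them across available cores" approach, here's a hypothetical C++ sketch (not GPG's, DICE's, or CDPR's actual code): each independent system becomes its own task per frame, and the OS scheduler spreads those tasks over whatever cores the machine has.

```cpp
#include <cstdio>
#include <functional>
#include <future>
#include <vector>

struct Frame { float dt; };

// Stand-in system updates; real ones would only touch their own data.
void update_ai(const Frame& f)      { std::printf("AI tick, dt=%.4f\n", f.dt); }
void update_physics(const Frame& f) { std::printf("physics tick, dt=%.4f\n", f.dt); }
void update_audio(const Frame& f)   { std::printf("audio tick, dt=%.4f\n", f.dt); }

void run_frame(const Frame& frame) {
    // Launch each independent system as its own task; std::launch::async
    // forces a separate thread, so idle cores can pick the work up.
    std::vector<std::future<void>> tasks;
    tasks.push_back(std::async(std::launch::async, update_ai, std::cref(frame)));
    tasks.push_back(std::async(std::launch::async, update_physics, std::cref(frame)));
    tasks.push_back(std::async(std::launch::async, update_audio, std::cref(frame)));

    // Join every system before the next frame starts.
    for (auto& t : tasks) t.get();
}

int main() {
    for (int i = 0; i < 3; ++i) run_frame(Frame{1.0f / 60.0f});
}
```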

I am beginning to think game developers are in bed with GPU makers, planning how to screw... us peasants.
Can't tell if that's a joke, but Crysis.
 
  • Like
Reactions: salgado18

NeoMorpheus

Reputable
Jun 8, 2021
223
251
4,960
I truly expect at least three hit pieces every day blaming Nvidia and Intel for everything wrong with the game, just like the non-stop barrage of negative articles about Starfield and AMD.
 