Lots of people seem to get the wrong idea about this.
Here, let me explain.
There is really no such thing as "this game is optimized for 8 cores" or "this game is optimized for 4 cores". It's all just a hyper-simplified explanation for non-tech-savvy people, so that they can understand without us having to explain lots of things they wouldn't understand anyway.
However, since you're asking this question, I'm going to assume you're tech-savvy enough to understand the actual explanation.
So, here is the true explanation as to why system requirements are as they are.
First, I will tell you how the CPU and GPU work when gaming.
The CPU processes the instructions from the game; the GPU renders (visualizes) what the CPU has processed.
The more a CPU can process, the more FPS it can produce.
The same goes for the GPU: the more of the CPU's output it can render, the more FPS it can put on screen.
If a CPU can process more than the GPU can render, the CPU will not use its full power; instead it slows down to match the GPU's maximum rendering capability. That's what is called GPU bottlenecking.
If a GPU can render more than the CPU can process, the GPU will not use its full power; instead it slows down to match the CPU's maximum processing capability. That's what is called CPU bottlenecking.
An easy way to spot a bottleneck is to watch CPU and GPU usage while gaming. If GPU usage is at 100% and CPU usage is below 90%, then it's GPU bottlenecking. If it's the other way around, then it's CPU bottlenecking.
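If it helps, here is a rough way to think about it in code (a minimal sketch; the frame-rate numbers are made up purely for illustration):

```python
# Minimal sketch of the bottleneck idea. The frame rates are made-up
# example numbers, not measurements from any real CPU or GPU.

def bottleneck(cpu_max_fps: float, gpu_max_fps: float) -> None:
    """Actual FPS is limited by whichever side can deliver fewer frames."""
    actual_fps = min(cpu_max_fps, gpu_max_fps)
    cpu_usage = actual_fps / cpu_max_fps * 100  # rough, ignores background work
    gpu_usage = actual_fps / gpu_max_fps * 100
    limiter = "CPU" if cpu_max_fps < gpu_max_fps else "GPU"
    print(f"FPS: {actual_fps:.0f}  CPU usage: {cpu_usage:.0f}%  "
          f"GPU usage: {gpu_usage:.0f}%  -> {limiter} bottleneck")

bottleneck(cpu_max_fps=90, gpu_max_fps=140)   # CPU bottleneck: GPU sits below 100%
bottleneck(cpu_max_fps=160, gpu_max_fps=120)  # GPU bottleneck: CPU has headroom
```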
It is recommended to have GPU bottlenecking instead of CPU bottlenecking.
This is because if your CPU is not pegged at 100%, it still has headroom left for other things, like multitasking, playing music, and, most importantly, streaming.
Now, why does this "optimized for X cores" argument exist?
The answer is quite simple, actually. It's because GPU power increases over time.
The more frames a GPU can render, the more processing power a CPU needs to keep the GPU at 100% usage, so that CPU bottlenecking does not happen.
Back when the PS4 and Xbox One were announced, GPUs were not very powerful. Even the most powerful consumer GPU at the time, the GTX 780 Ti, is now only slightly more powerful than a GTX 1050 Ti. That's why quad-core CPUs like the i7-4790K were enough to keep a 780 Ti at 100% usage, and buying a higher core count CPU would have been a waste.
Nowadays, GPUs have gone through an insane power increase. The top-tier GPU, the RTX 2080 Ti, is around 3 times more powerful than the GTX 780 Ti and needs around 3 times the CPU processing power to be kept at 100% usage. That's why quad-cores like the 7700K are not enough for a 2080 Ti at 1080p.
Now, of course, all of that applies IF YOU PLAY AT 1080p.
If you face a CPU bottleneck with your PC at 1080p, there is a way to mitigate the problem: increase the resolution. The higher the resolution, the more work the GPU has to do to render each frame the CPU prepares, which lowers the GPU's maximum FPS.
In simpler words, if you increase the resolution, the GPU will work harder per frame, while the load on your CPU stays roughly the same.
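Using the same rough model as before (again, the numbers are made up purely for illustration), you can see the bottleneck flip from the CPU to the GPU as the resolution goes up:

```python
# Same toy model as before, with made-up numbers. Raising the resolution
# increases the per-frame GPU work, so the GPU's maximum FPS drops while
# the CPU's stays put, and the bottleneck shifts from the CPU to the GPU.

cpu_max_fps = 90  # roughly resolution-independent

# Hypothetical GPU limits at different resolutions (illustration only).
gpu_max_fps_by_resolution = {"1080p": 140, "1440p": 95, "4K": 55}

for resolution, gpu_max_fps in gpu_max_fps_by_resolution.items():
    actual_fps = min(cpu_max_fps, gpu_max_fps)
    limiter = "CPU" if cpu_max_fps < gpu_max_fps else "GPU"
    print(f"{resolution}: {actual_fps} FPS, limited by the {limiter}")
```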