The CPU pre-renders the frames according to the game code. Nothing else. If it can dish out 100 fps, it'll do so regardless of detail settings or resolution. It always works at 100% of its ability, but usage is not ability: usage is how much of the CPU's resources the game needs to deliver that ability. 100% usage means it can't handle the code, the game is too intense: too much bandwidth use, too much code, too many threads in use.
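If numbers help, here's that idea in plain Python. The 10 ms figure is made up for illustration, not a measurement from any real game:

```python
# Hypothetical: if the game code costs the CPU 10 ms per frame,
# the CPU can supply 1000 / 10 = 100 frames every second, and that
# number doesn't move when you change resolution or detail settings.
cpu_ms_per_frame = 10.0                  # assumed per-frame cost of game code
cpu_fps = 1000.0 / cpu_ms_per_frame      # frames the CPU can prepare per second
print(f"CPU can supply {cpu_fps:.0f} fps at any resolution")
```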
After the CPU is done with the pre-renders, it ships them to the GPU. That's where details and resolution change things: the higher the details and the higher the resolution, the more resources the GPU needs to put that 100 fps on screen. At 720p that's chump change, it barely uses any GPU at all to get all 100 fps up. At 4K it's going to fall short: usage climbs much higher and it still won't get all 100 fps up, probably closer to 30.
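Here's the raw pixel math behind that, just arithmetic in Python:

```python
# Pixels the GPU has to shade per frame at each resolution.
resolutions = {"720p": (1280, 720), "1080p": (1920, 1080), "4K": (3840, 2160)}
base = 1280 * 720   # 720p pixel count as the reference workload

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels (~{pixels / base:.1f}x the 720p load)")

# 4K pushes 9x the pixels of 720p, which is why the same GPU that
# coasts at 720p can run out of headroom long before 100 fps at 4K.
```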
DSR is where the GPU receives the 100 pre-rendered frames from the CPU, finish-renders them at a higher resolution, but puts each finished frame on screen at a lower resolution. The GPU works harder and needs more resources, but it still only outputs a 1920x1080 frame's worth of pixels, not the full 3840x2160 (4K) amount. So the GPU's output workload is the same, the render itself is harder, and the input from the CPU doesn't change. Your 1650 can do DSR: you'd get a sharper, clearer, cleaner look per frame, usage would go up, and whether it can still hold that 100 fps on screen depends on resolution and details.
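In numbers, for a 1080p display (Python sketch; the 4x factor here is NVIDIA's standard 4.00x DSR setting, picked as an example):

```python
# DSR: the GPU renders each frame big internally, then downsamples it
# to the resolution the display actually receives.
render_w, render_h = 3840, 2160     # internal render target (4.00x DSR on 1080p)
output_w, output_h = 1920, 1080     # what actually goes to the screen

render_pixels = render_w * render_h
output_pixels = output_w * output_h
print(f"GPU shades {render_pixels:,} pixels per frame "
      f"({render_pixels / output_pixels:.0f}x the render work)")
print(f"but only {output_pixels:,} pixels ever reach the display")
```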
Basically, if the GPU is capable of 100 fps but only receives 60 fps from the CPU, that's what you get: 60 fps. The CPU is choked by Witcher 3, and only a more capable CPU (more threads) will set the GPU free to reach its limits. Only by choking the GPU instead (higher resolution or details) will the CPU catch a break, since it's no longer forced to supply the 100 fps demand.
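Put as a formula, the on-screen frame rate is just the minimum of what each side can do. The 60/100/30 figures below are the hypothetical numbers from this thread, not benchmarks:

```python
def onscreen_fps(cpu_fps, gpu_fps):
    # The slower component sets the pace; the other one waits.
    return min(cpu_fps, gpu_fps)

print(onscreen_fps(cpu_fps=60, gpu_fps=100))   # CPU-bound: 60 fps on screen
print(onscreen_fps(cpu_fps=100, gpu_fps=30))   # GPU choked at 4K: 30 fps on screen
```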