Yes, play at your monitor's native resolution. A 1722x1080 custom resolution is weird, and I'm not sure if that was a joke or not. Were you just trying to get a ~10% pixel reduction? It's better to just reduce the game's quality settings.
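For what it's worth, the math on that custom resolution does come out to roughly 10%, assuming your native resolution is 1920x1080 (my assumption, not something you said):

```python
# Back-of-the-envelope pixel count comparison.
# 1920x1080 native is an assumption on my part.
native = 1920 * 1080          # 2,073,600 pixels
custom = 1722 * 1080          # 1,859,760 pixels
print(1 - custom / native)    # ~0.103, i.e. about 10% fewer pixels to draw
```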
Although this is a rough description, I have grown to like it because most people can understand it:
- CPU tells the GPU what is in a frame and where things are (like a rough outline/sketch).
- GPU draws the frame (final drawing) to be sent to the monitor.
- Repeat repeat repeat for each frame. (A toy version of this loop is sketched below.)
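If it helps, here's that loop as runnable Python. The millisecond timings are invented, and real engines pipeline these stages so the CPU and GPU overlap rather than taking turns, but the division of labor is the same:

```python
import time

def cpu_prepare_frame(frame_number):
    """CPU work: game logic, physics, and issuing draw commands
    (the 'rough outline/sketch')."""
    time.sleep(0.004)  # pretend the CPU takes 4 ms per frame
    return f"draw commands for frame {frame_number}"

def gpu_render_frame(draw_commands):
    """GPU work: rasterizing the actual pixels (the 'final drawing')."""
    time.sleep(0.008)  # pretend the GPU takes 8 ms per frame
    return f"rendered image for [{draw_commands}]"

for frame in range(3):
    commands = cpu_prepare_frame(frame)  # rough sketch
    image = gpu_render_frame(commands)   # final drawing
    # the image would be handed off to the monitor here
```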
If you lower resolution or detail settings (or both), the GPU doesn't have to draw as detailed a picture, so each frame takes less time to render.
Your FPS goes up. Generally in this regime, your GPU usage is at/near 100% while your CPU's isn't.
At some point (depending on the CPU and GPU combo), drawing each frame becomes so easy that the GPU finishes rendering a frame before the CPU has figured out what's in the next one, so the GPU has to sit and wait for the CPU to tell it what to draw next.
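A crude way to model this: each frame can only finish as fast as the slower of the two stages. This ignores pipelining and frame-time spikes, and the millisecond numbers below are invented for illustration, but it shows why the crossover happens:

```python
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    # The slower stage sets the frame time; FPS is its inverse.
    return 1000 / max(cpu_ms_per_frame, gpu_ms_per_frame)

print(fps(cpu_ms_per_frame=8, gpu_ms_per_frame=16))  # 62.5  -> GPU-limited
print(fps(cpu_ms_per_frame=8, gpu_ms_per_frame=8))   # 125.0 -> the crossover point
print(fps(cpu_ms_per_frame=8, gpu_ms_per_frame=4))   # 125.0 -> CPU-limited: easier frames no longer help
```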
If you continue reducing quality/resolution past this point, your FPS will cease to improve much, or at all. You've now entered a CPU-limited scenario. Here, the GPU is NOT at/near 100% usage. CPU usage is a little harder to read, because a game may not be able to use as many cores/threads as the CPU has available. For example, if only 6 of 12 threads on a CPU are being utilized, most programs will report that as 50% usage.
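To make that reporting quirk concrete (this is a simplification; real monitoring tools sample per-core load over time):

```python
def reported_cpu_usage(saturated_threads, total_threads):
    # Most tools average across all logical cores/threads, so a game
    # that maxes out 6 of 12 threads shows up as only "50% CPU usage"
    # even though it is completely CPU-limited.
    return saturated_threads / total_threads * 100

print(reported_cpu_usage(6, 12))  # 50.0
```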
*Some games are more CPU-heavy than others. For example, Battlefield games have historically produced nearly equivalent FPS across a large range of CPUs, while Civilization games show much larger performance differences across that same CPU lineup. Online games also tend to be more CPU-heavy than offline/single-player ones, and as you can imagine, online games have a number of additional "potential inconsistencies" that can affect FPS compared to offline ones.