A bottleneck is, by definition, a component that slows down the flow of data. The CPU isn't slowing down the flow; it's still pumping it out at 100%. It just happens that its 100% is lower than a faster chip's 100%. If anything is a limiting factor, it's the software: modern game code like BF1's limits how many pre-rendered frames the CPU can hand to the GPU. Minecraft or RuneScape, even CS:GO or LoL, are relatively easy by comparison.
If OP were planning on a GT 710, then yes, the GPU would bottleneck the flow of data; the monitor would get far less than what the CPU is putting out. But a 1050 Ti will still put out everything it receives, it just has the capacity to do more.
A Q9650 can handle 4K gaming: the CPU puts out the exact same fps as at 1080p, since resolution is almost entirely GPU load. At that point the 1050 Ti would be the bottleneck, as it can't handle 4K resolution worth a ****.
Imagine this scenario: an i9-9900K at 5.0 GHz putting out 500 fps in Minecraft, paired with an RTX 2080 Ti that's putting those 500 fps on screen. Switch to BF1 and now that i9-9900K is only putting out 250 fps. Is the i9-9900K bottlenecking the RTX 2080 Ti? The GPU is obviously capable of putting 500 fps on a screen. No. The GPU is capable of drawing every pre-rendered frame the 9900K gives it. The 9900K isn't a bottleneck to the GPU; it's going to give the GPU exactly 100% of the pre-rendered frames it can, and the game itself dictates what fps that'll be. Now, if the 9900K were still putting out 500 fps and you cranked the settings so only 250 fps reached the screen, the GPU just became the bottleneck, by 250 fps worth of slowdown.
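The reasoning above boils down to some simple arithmetic, which a toy sketch makes explicit. The fps numbers below are the hypothetical ones from the example, not benchmarks, and the two functions are my own illustrative names, not anything from a real engine:

```python
def on_screen_fps(cpu_fps, gpu_fps):
    """Frames that actually reach the monitor: the GPU can only draw
    what the CPU pre-renders, and drops frames when it falls behind."""
    return min(cpu_fps, gpu_fps)

def gpu_is_bottleneck(cpu_fps, gpu_fps):
    """Per the argument above, the CPU (plus the game code) sets the
    flow, so the only real bottleneck is a GPU that can't keep up
    with the frames it's handed."""
    return gpu_fps < cpu_fps

# BF1: the game caps the CPU at 250 fps, the 2080 Ti could draw more.
# The screen shows 250 fps and the GPU is NOT the bottleneck.
print(on_screen_fps(250, 400), gpu_is_bottleneck(250, 400))

# Cranked settings: the CPU still feeds 500 fps, the GPU only draws 250.
# The screen shows 250 fps and the GPU IS the bottleneck.
print(on_screen_fps(500, 250), gpu_is_bottleneck(500, 250))
```

Either way the monitor sees 250 fps; the difference is only whether any pre-rendered frames are going to waste.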
Personally, I prefer it when I can crank ultra settings and not get any fps loss as a result.