Management :
photonboy :
RENDERING the game and DISPLAYING the content are two different things.
This is true, but it is quite obvious that only 720p could actually be displayed on a 720p monitor.
I thought the question was about playing games at 4K on a 720p monitor (why someone would want to do that, I don't know, but that wasn't the question asked), not displaying a native 4K image on a 720p monitor.
Uh, no it's NOT clear at all.
You can RENDER a game at 4K, then DOWNSCALE it to 1366x768, which is completely different from rendering at 1366x768 and displaying at that same resolution.
Rendering at 4K for a downscale is just as taxing as rendering at 4K for a 4K monitor. In fact, you add a small amount of extra work for the downsample to 768p.
That is what DSR does in the NVIDIA Control Panel, as mentioned a few times already.
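For the curious, here's roughly what that downscale amounts to. This is a minimal sketch, assuming you have Pillow installed and a 3840x2160 screenshot saved as frame_4k.png (both are my assumptions, not something from this thread); with DSR the GPU does the equivalent filtering in hardware before the image ever hits the monitor.

    # Minimal sketch of a DSR-style downscale: render big, filter down small.
    # Assumes Pillow is installed and frame_4k.png is a 3840x2160 screenshot.
    from PIL import Image

    frame_4k = Image.open("frame_4k.png")                      # rendered at 3840x2160
    frame_768p = frame_4k.resize((1366, 768), Image.LANCZOS)   # filtering ~ supersampling
    frame_768p.save("frame_768p.png")                          # what the 768p panel shows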
And if you refer to my ENTIRE post it should be obvious what I was saying, especially the last line where I say ".. crappy monitor". I thought it was pretty clear I was talking about rendering at 4K and then downscaling to 720p (most likely 768p, really).
Anyway, the GTX 1080 Ti alone costs $700+ USD (too lazy to check the exact price), so you'd be far better off getting a weaker GPU and buying a good monitor (the exact solution depends on your total budget).
If it's not clear, here's the order from worst to best in terms of image quality (not smoothness of gameplay). The difference is mainly in how well the jagged edges get smoothed, i.e. anti-aliasing, since downscaling from 4K is basically supersampling (google is your friend); see the rough numbers sketched after the list:
1. 1366x768 rendered and displayed on a 1366x768 monitor
2. 3840x2160 rendered, then downscaled to a 1366x768 monitor
3. 3840x2160 rendered and displayed on a 3840x2160 monitor
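To put rough numbers on why #2 beats #1: each pixel the 1366x768 panel shows ends up as a blend of roughly eight 4K-rendered pixels, which is effectively ~8x supersampling. A quick back-of-the-envelope sketch using only the resolutions above:

    # How many rendered 4K pixels feed each displayed 768p pixel.
    scale_x = 3840 / 1366               # ~2.81
    scale_y = 2160 / 768                # ~2.81
    samples_per_pixel = scale_x * scale_y
    print(round(samples_per_pixel, 1))  # ~7.9, i.e. roughly 8x supersampling per displayed pixel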
There are other issues as well, such as HUD scaling: if you tell the game to render at 4K, you'll likely get small text/HUD elements that are not ideal for the number of pixels you actually have to work with.
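To illustrate the HUD point, here's a hypothetical sketch (the function and constants are mine, not from any particular engine) of the resolution-aware scaling a game would need to do; titles that hard-code HUD sizes in pixels are the ones that end up tiny when you force a 4K render.

    # Hypothetical example: scaling HUD text with render height so it keeps the
    # same apparent size. Games that skip this end up with tiny HUDs at 4K.
    REFERENCE_HEIGHT = 1080          # assumed baseline the HUD was designed for
    HUD_FONT_PX_AT_REFERENCE = 24    # assumed font size at that baseline

    def hud_font_px(render_height: int) -> int:
        return round(HUD_FONT_PX_AT_REFERENCE * render_height / REFERENCE_HEIGHT)

    print(hud_font_px(2160))  # 48 px at 4K -> same on-screen proportion
    print(hud_font_px(768))   # 17 px at 768p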