I'm a PC gamer who despises aliasing, but many of the games I play don't have adequate antialiasing settings. When it's bad enough, I usually set a custom resolution of 3840x2160 through the Nvidia Control Panel, which makes things much more bearable (I've been doing this with FFXIV recently). I have an RTX 2080, so it can handle the extra load.
I sometimes play on my TV (3840x2160 native) or on one of my desktop monitors (1920x1080 or 2560x1440), and I was wondering whether the ratio between the custom resolution and the native resolution should logically make a visual difference. I started wondering this when I noticed that my 1080p monitor almost looked better than my 1440p monitor while running at a 2160p custom resolution (maybe I'm wrong). I didn't know if that was because 2160p is exactly double 1080p but only 1.5x 1440p. I thought maybe the scale factor being a whole number (2x my 1080p display) rather than a fraction (1.5x my 1440p monitor) somehow made it more visually appealing? Am I making this up in my head, or do you think the ratio between native and custom resolution matters?
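To make my hunch concrete, here's a quick Python sketch of what I mean. It's just a toy box-filter model, not how the driver actually scales anything, and `box_weights` is a helper name I made up for illustration:

```python
import math

def box_weights(native_len, render_len):
    # For each display pixel, list which render pixels fall inside its
    # footprint and with what fractional weight (simple 1-D box filter).
    scale = render_len / native_len
    table = []
    for i in range(native_len):
        lo, hi = i * scale, (i + 1) * scale
        weights = []
        for s in range(math.floor(lo), math.ceil(hi)):
            overlap = min(hi, s + 1) - max(lo, s)
            weights.append((s, round(overlap / scale, 3)))
        table.append(weights)
    return table

# 2x case (like 2160p -> 1080p): every display pixel is an even
# average of exactly two render pixels per axis.
print(box_weights(4, 8))
# [[(0, 0.5), (1, 0.5)], [(2, 0.5), (3, 0.5)], ...]

# 1.5x case (like 2160p -> 1440p): the weights are uneven and the
# pattern shifts from pixel to pixel across the image.
print(box_weights(4, 6))
# [[(0, 0.667), (1, 0.333)], [(1, 0.333), (2, 0.667)], ...]
```

So at 2x every display pixel averages a clean 2x2 block of render pixels, while at 1.5x the blend is lopsided and changes phase as you move across the screen, which seems like the kind of thing that could read as extra softness. Is that actually what's going on?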