I'm not an expert in this, so if any of this sounds off, please educate me.
This is a bit complicated to explain, so kindly take some time to read and reply; I might be over-explaining simple things or under-explaining complicated ones.
I will list two scenarios and ask questions about them, since that is the easiest way for me to explain and understand. Some questions include sub-questions, so please read them all and help me understand this.
Scenario 1 (S1): Rendering a game at 900p on a 900p native-resolution monitor.
Scenario 2 (S2): Rendering the same game at 900p on a 1080p monitor.
Questions:
- Will there be "any" performance difference between S1 and S2 (CPU usage / GPU usage / frames per second)?
- In S2, will my PC render 1600x900 (900p) pixels and upscale the output to 1080p? Or will it render 1920x1080 pixels but in a down-sampled/down-scaled way? (I know this is the weird one.)
- If the game offers a pixel-density option, will playing in S1 at 144% pixel density (since 1080p has 44% more pixels than 900p) perform the same as playing at 1080p on a 1080p monitor? And vice versa: will reducing the pixel density on a 1080p monitor to match 900p produce the same output as playing at 900p on a 1080p monitor?
- What is the difference between pixel density and render resolution when it comes to game settings?
- Which output will "look" better between S1 and S2 (native vs. upscaled)? Or does the difference depend on the monitor / game settings?
- Which settings should I raise to improve quality if I'm playing as in S2 (anti-aliasing? pixel density?)? Or is there any hardware-specific feature on monitors that helps reduce the difference?
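To double-check the 144% figure I used in the pixel-density question, here is a quick sketch of the arithmetic (just counting pixels; nothing game-specific is assumed):

```python
# Pixel counts at each resolution.
pixels_900p = 1600 * 900    # 1,440,000 pixels
pixels_1080p = 1920 * 1080  # 2,073,600 pixels

# How many times more pixels 1080p has compared to 900p.
ratio = pixels_1080p / pixels_900p
print(ratio)  # 1.44 -> 1080p has exactly 44% more pixels than 900p
```

So a 144% pixel-density setting at 900p should, in pixel count at least, match rendering at 1080p.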
I'll probably ask follow-up questions if I don't understand something properly, so please help me with replies as well.
Thanks a lot in advance!