Refresh rate\FreeSync\VSR(DSR) questions.

Wrathinside

I already know that if you buy a 60hz 4k display, you are stuck with 60hz. And while being capped at 60 FPS isn't embarrassing even for a 1080ti, 4k is a bonus where it works, which is still not everywhere. So 4k is not an option.

However, I wonder if it goes the other way, sort of backwards compatible. If I get a 144hz 1440p screen, will I still have a 144hz 1080p option? And what about VSR\DSR? I've tried DSR on my current GPU and... while I've seen texture smoothness that looked better than AA, I've read that this way of emulating a higher resolution is not the same as a real resolution. And since I'm not getting a 4k display, I would never know the difference physically.

If I plan to play mostly at 1440p, with rare exceptions of 4k when a game is picturesque enough for it, and preferably no dips to 1080p unless a higher resolution isn't supported, what would be the difference between native 1440p and VSR 1440p, or between 4k VSR on a 1080p panel and on a 1440p panel?



So, if 144hz 1440p is the best of both worlds, what about 1080p 144hz? Can I get a cheap low-resolution monitor and skip the extra cost of a 1440p\4k panel by using DSR\VSR? And will that virtual resolution retain the panel's refresh rate? Of course, I'm not exactly expecting a 144hz 4k result for under 500$ instead of 5000$, but is it technically possible, with a proper GPU powering it all?
Can a 144hz 1080p panel at least deliver 1440p 144hz via VSR?


Finally, FreeSync. How does it work in relation to 144hz and Vsync? I remember reading about "FreeSync ranges", like 40-75hz. Does that mean that in a GPU-intensive game, VSR-ed to 3440x1440 and running at 40-75 FPS, it acts like automatic Vsync? And does anything above that range just require Vsync?
Basically, what is the practical application of FreeSync? Just as Vsync is disabled in almost anything where FPS matters, I assume any action game with a 144 FPS goal is out of the picture. Did I answer my own question?

To put it all into an example, I'm looking at a fairly unique model: the LG 34UC79G.

Large, IPS, 144hz. Almost touching the 1440p IPS price range, but much bigger than the 27-inch options. A perfect candidate for attempts to VSR 3440x1440 (though I have no idea how 3840x2160 (4k) would work there...)
 
That's... a lot to parse. What is your GPU? You mention a 1080ti, but not specifically that you're buying one.

And yes, 1080p/144hz looks great on a 1440p/144hz panel.

Never tried freesync but really like my gsync. Also never tried DSR.
 
Apologies, brevity is not my talent. My current GPU is irrelevant (GTX 660), and I mentioned the 1080ti mostly for the comical notion of 4k over 60 FPS.

I am trying to understand how these things work while planning a PC upgrade. On the AMD side, as I've probably hinted.

So, if I understood you correctly, the native refresh rate carries over to lower resolutions. That's one of the answers, thank you.
 
Yep. Refresh is consistent across resolutions from the monitor side.

I would look for a monitor with the widest Freesync range you can find; some of them are pretty small. I can't help on the Freesync/Vsync/out-of-range behavior. Gsync runs from 30fps all the way up to the monitor's highest refresh.
 
You can always go to a lower res than native res. Hz is not related to a specific res so it can do 144hz on any res.

Vsr/dsr works on the GPU, and the GPU will always output your set resolution. So if you have a 1440p monitor set to 1440p, no matter the dsr/vsr setting, it gets a 1440p signal; the monitor doesn't know anything is different. It's like AA, in that you never see that it is in fact computing at a higher sample rate than what's displayed. You are still seeing 1440p, just like with AA. You do not see 4k unless you get a 4k panel. Pixels are physical lights on your monitor and you can't magically create more. You've already tried dsr, so you know the difference it makes.
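If it helps to see the mechanism spelled out, here's a minimal sketch of the idea in Python/NumPy. It is not AMD's or NVIDIA's actual scaling filter, just a plain box-filter downscale with made-up numbers (a 2x factor on a 1440p panel), showing why the monitor only ever receives its native resolution:

```python
import numpy as np

# Hypothetical example: the GPU renders internally at 2x the panel's native
# resolution (5120x2880 for a 2560x1440 monitor), then averages each 2x2
# block of samples down to one pixel before scanning the frame out.
# The monitor only ever receives a plain 2560x1440 signal.
native_w, native_h = 2560, 1440
factor = 2

# Stand-in for the internally rendered frame (height, width, RGB).
rendered = np.random.rand(native_h * factor, native_w * factor, 3)

# Simple box-filter downscale: average every 2x2 block of samples.
downscaled = rendered.reshape(native_h, factor, native_w, factor, 3).mean(axis=(1, 3))

print(downscaled.shape)  # (1440, 2560, 3) -- exactly what the panel can display
```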

1080p on a 1440p panel does not look great. Upscaling never looks great unless you sit far away. It's worse than 1080p on a 1080p display, so why spend more on a higher res and then not use it?

The LG 34UC79G has a 50-144hz Freesync range; 40-75hz is for some random 75hz model. The range differs depending on the monitor. Freesync turns off outside of the range stated for the monitor. If you get tearing with Vsync off because you keep going out of the Freesync range, then you either need to turn Vsync on or deal with the tearing. The same goes for Gsync out of range.
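If it helps, the behaviour described above boils down to roughly this (a sketch only; the 50-144hz numbers are the 34UC79G's range as stated, and the function name is made up for illustration):

```python
# Rough illustration of the decision logic described above for a monitor
# with a 50-144hz Freesync range (e.g. the LG 34UC79G). Not driver code.
FREESYNC_MIN, FREESYNC_MAX = 50, 144

def sync_behaviour(fps, vsync_on):
    if FREESYNC_MIN <= fps <= FREESYNC_MAX:
        return "Freesync active: refresh tracks the frame rate, no tearing"
    # Outside the range Freesync disengages, so it's back to the usual trade-off.
    if vsync_on:
        return "Vsync: no tearing, but added latency and possible stutter"
    return "no sync: tearing is possible"

for fps in (35, 90, 160):
    print(fps, "fps ->", sync_behaviour(fps, vsync_on=False))
```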
 
Thanks, that's an even better reply; however, let me clarify my question one last time.

If I take, for example, this LG monitor and VSR it to 3440x1440, how different would it be from the real picture? I've seen DSR in action, yes, and it did perform better than plain AA, even if it's more taxing, but I haven't seen either 1440p or 4k physically. And if Google and YouTube are any indication, there is no point in trying to explain it with "pictures": you either see it with your own eyes on the screen, or you don't. So if you have seen it, can you comment on it? I understand (or at least try to) the monitor's limitations, but that doesn't make it any easier to imagine the final result.

Also, about monitor limitations: I read that the pixels of this large 34-inch 1080p monitor are big, which is understandable, since this size is mostly reserved for at least 1440p. I don't know if or how traditional AA would help here, or why people buy these screens, but if I don't get any practical benefit from VSR and those pixels remain visible and jaggy, I guess I can rest my case.

Oh, and just one more clarification about frequency, using this same monitor as the example. Say I run a game that can pull 140+ FPS at 1080p and decide to VSR it, losing about 30-40 FPS but gaining picture quality. Will I still get a picture comparable to native 1440p, but at a refresh rate that a similarly priced native 1440p monitor couldn't reach, or is this pure fantasy?
 
From my experience, on my particular monitor and GPU, with my Mark I eyeball, playing BF1 at 1080p on a 1440p panel looks great. Is it technically inferior? Yep. Does it detract from my enjoyment of the game? In no way. When I'm running, things are exploding, people are trying to kill me... I don't pay a great deal of attention to the scenery detail. But everyone has different needs there.

I'll take a lower res for higher framerate in a multiplayer shooter every time. I run all other games (Syndicate, Witcher 3, Andromeda) at 1440/ultra since they're still super smooth and it does look great.

I am with K1114 on supersampling. How can you make 1080p look better just by rendering higher and then downscaling? It's still the same pixel output. But I see more and more folks swearing by it, so I need to find some technical explanation. Research for another day :)
 
JED, I'm afraid you are misunderstanding my point. My goal is not to play 1080p on 1440p, but the other way around, through VSR. When I mentioned 1080p on 1440p, it was just as you said: for when a game demands FPS, and that's an exception, not a goal.

And, again, I understood that it's not the same. I just wanted to try to understand how it's not the same. I assume you have seen the difference between a virtual 1440p\4k and a real 1440p\4k. How would you describe it? I wouldn't know on my own; that's the last thing I'm trying to understand from my original post.

If words cannot describe it... well, okay, I guess it's time to hit the big stores with testing stands; that'd be the only way to find out.
 
Oh, no problem. It was still helpful to hear an opinion about running a lower resolution on a denser display. Like, if 1080p on a 34-inch display looks like it has huge pixels, then 1080p on a 1440p panel is probably a similar scenario.
After all, it's not a solution request, it's a discussion, and I'm interested in any detail.
 
In effect, vsr increases the sample rate and has nothing to do with a res increase. Think of it like 8x AA vs 2x AA. You're getting caught up in res when it's not about res: a 1080p monitor will always be 1080p, all day, every day, and there is nothing you can do about it. Vsr is not really supersampling, but it is similar in effect, and in essence you can say they are the same. To get technical, there are different algorithms for supersampling (which also tend to need more performance) vs the algorithm for downscaling. You could say it's like a more efficient SS.
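To put rough numbers on the "sample rate" point: rendering at a VSR resolution and downscaling to a 1080p panel just means each displayed pixel is built from more than one rendered sample, much like higher AA sample counts. A quick back-of-the-envelope sketch (the resolutions are only examples):

```python
# Back-of-the-envelope: how many rendered samples sit behind each displayed
# pixel at a couple of hypothetical VSR settings on a 1080p panel.
native_w, native_h = 1920, 1080

for vsr_w, vsr_h in [(2560, 1440), (3840, 2160)]:
    samples_per_pixel = (vsr_w * vsr_h) / (native_w * native_h)
    print(f"{vsr_w}x{vsr_h} -> {native_w}x{native_h}: "
          f"{samples_per_pixel:.2f} samples per displayed pixel")

# 2560x1440 works out to ~1.78 samples per pixel, 3840x2160 to exactly 4 --
# the same sample count per pixel as 4x supersampling, even though the
# sampling and downscaling algorithms differ as noted above.
```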

I'd always go for lowering settings before lowering res if fps is an issue. The difference that native res makes is much more noticeable than other settings.

Pixel size being noticeable is going to depend on sitting distance. But upscaling is not the same as bigger pixels natively: virtual pixels get spread over multiple physical pixels, causing blurring when upscaling, while native will not.
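A small sketch of why that blurring happens, assuming plain 1920x1080-to-2560x1440 scaling (illustrative only, not any specific monitor's scaler): the scale factor is 4/3, so most physical pixels fall between two source pixels and have to be interpolated.

```python
# Why 1080p looks soft on a 1440p panel: the scale factor (2560/1920 = 4/3)
# is not a whole number, so physical pixels don't line up 1:1 with source
# pixels and most of them have to be blended from two neighbours.
src_w, dst_w = 1920, 2560

for dst_x in range(5):
    src_x = dst_x * src_w / dst_w      # position of this physical pixel in the 1080p image
    frac = src_x - int(src_x)
    print(f"physical pixel {dst_x} samples source position {src_x:.2f} "
          f"({'exact match' if frac == 0 else 'interpolated blend'})")

# Only every fourth physical pixel lands exactly on a source pixel; the rest
# are blends of two neighbours, which is the blur you notice up close.
```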