[SOLVED] What happens if data rate exceeds maximum bandwidth?

Mar 7, 2019
I have an Asus pg258q. According to Asus, its connectivity options include DisplayPort 1.2. As I understand it, DisplayPort 1.2 does not support frame rates above 240 fps at 1080p, since that would exceed the maximum bandwidth. My GPU supports DisplayPort 1.4. So what exactly happens if my frame rate exceeds 240 fps? Does the GPU start to render at a lower resolution? Does it drop frames? Is there a bottleneck that could cause potential lag?
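For reference, a rough back-of-the-envelope check of the bandwidth figures involved (a sketch in Python; real link timing adds blanking overhead, so the actual requirement is somewhat higher than the raw pixel rate computed here):

```python
def required_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw (unblanked) pixel data rate in Gbit/s for a given mode."""
    return width * height * bits_per_pixel * refresh_hz / 1e9

# DisplayPort 1.2 (HBR2, 4 lanes) payload rate after 8b/10b line coding
DP12_EFFECTIVE_GBPS = 17.28

rate = required_gbps(1920, 1080, 240)
print(f"1080p @ 240 Hz needs about {rate:.2f} Gbit/s raw pixel data, "
      f"vs {DP12_EFFECTIVE_GBPS} Gbit/s available on DP 1.2")
```

The key point: the cable carries frames at the monitor's refresh rate, not at the game's fps, so 1080p at 240 Hz is the most DP 1.2 ever needs to transmit here, and that fits within its capacity. Frames rendered beyond 240 fps never reach the link at all.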
 
If your monitor is set to 240 Hz, then only 240 frames per second will be transmitted; excess frames will be dropped.
 
Thanks for the reply. Yes, my monitor is set to a fixed refresh rate of 240 Hz. Does that process cause any additional lag? In terms of lag, is it better to cap frame rates below, above, or exactly at the monitor's refresh rate?
 

Math Geek
it's best to sync it to the monitor refresh rate. hence things like g-sync exist to do just that. if they can stay in sync, you get fewer issues.

so an artificial cap of 240 fps would help it run and look the best it can. any extra fps are wasted anyway, so why bother having the gpu make them?
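The "artificial cap" idea can be sketched as a simple frame-time budget loop (a minimal illustration in Python; real limiters in games or RivaTuner are implemented differently, but the principle is the same):

```python
import time

def render_loop(target_fps=240, frames=5):
    """Minimal sketch of an fps cap: after each frame, sleep out the rest
    of the frame budget so no more frames are produced than the monitor
    can display."""
    frame_budget = 1.0 / target_fps  # ~4.17 ms per frame at 240 fps
    for _ in range(frames):
        start = time.perf_counter()
        # ... render the frame here (placeholder) ...
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # the "wasted" extra frames are simply never rendered
            time.sleep(frame_budget - elapsed)
```

With a cap like this, the GPU idles instead of producing frames that would only be discarded at the link.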
 
Thanks for the reply. I'm playing the new Modern Warfare. Unfortunately, the in-game limiter only seems to work at certain values. If I set it to 200 it will maintain 200, but if I set it to any value between 200 and 250 it will cap at 250. If I use G-Sync I will have to cap it with RivaTuner, but I prefer to use the in-game limiter if possible. Additionally, I noticed that turning G-Sync off gives me consistently higher and more stable frame rates. I'm not really worried about tearing at that high a refresh rate, since it's virtually unnoticeable. I'm mostly concerned with lag.
 

Math Geek
sounds like a bit of playing around will tell you the best way to go then.

cap it at 200 and then 250 and see which looks better and provides the best experience. then go with that. sounds like those are your 2 options to choose from, so it makes a nice quick test to decide which to use :)
 
Solution
Yes, that's what led me to the question above. The best experience seems to come from capping at 250. At that cap my GPU consistently maintains frame rates between 240 and 250. I was just wondering whether the process of discarding the additional frames is detrimental in any way? In particular, does it cause any additional lag? Thanks again for the replies.
 
No, it doesn't cause any lag :)