[SOLVED] Is it possible to convert a 30fps gaming video into a smoother 60fps video?

sxk1277

Mar 19, 2020
I don't want to just double frames by copying them; I want genuinely smoother video. I want to watch gaming videos where both the resolution and the fps are high. Imagine watching an 8K 144fps video. But since no PC can handle that load, is it possible to let the game play at high resolution and have the extra frames rendered afterwards instead of live on demand? That way, I could have plenty of content to watch in both 8K and high fps at the same time?!
 
Solution
I meant revolutionary as in whether there is a known ongoing effort to increase monitor refresh rates, or whether such displays already exist for some industrial use. I doubt brands are trying to create monitors that can play video at 1000fps, due to lack of demand. However, I may wait a few years instead of investing in this avenue if others are already working on it.
If you haven't seen this, it's worth a read:


It explores the issues with conventional display technologies and the benefits of various advancements. Towards the bottom, they have links to videos of Nvidia's prototype 1,700 Hz and 16,000 Hz displays.

Also, this...
As far as resolution goes, do you agree with this?

"An inch is 25,4mm, and in that inch you can fit 2190 ppi (dpi) at 0,4 arcminutes, and 876 ppi (dpi) at 1 arcminute. This means that if a healthy adult brings any display or printed paper to 4" (10cm) from his face, the maximum resolution he/she can see is 2190 ppi/dpi."

"For comparison, Sony's Z5 Premium with a 5.5-inch display and 4K resolution was 801 PPI."

What about this?

Myelinated nerves can fire between 300 to 1000 times per second in the human body and transmit information at 200 miles per hour. What matters here is how frequently these nerves can fire (or "send messages").

The nerves in your eye are not exempt from this limit. Your eyes can physiologically transmit data that quickly and your eyes/brain working together can interpret up to 1000 frames per second.
 
Some games have offline rendering, so you can record a movie file from a demo or replay. The whole process does not have to be real time, and you can use whatever settings you like. I've never tried to do it myself, but Unreal Engine lets you create movies.

With Dota 2 I used to upload the replay to a website, and after a few hours I would get the video file.
 
Some games have offline rendering, so you can record a movie file from a demo or replay. The whole process does not have to be real time, and you can use whatever settings you like. I've never tried to do it myself, but Unreal Engine lets you create movies.

With Dota 2 I used to upload the replay to a website, and after a few hours I would get the video file.
Yes.
Recording gameplay has been a thing forever. Contests for speedruns, etc.

However... interpolating new frames (30fps to 60/120/720fps) and adding pixels (upscaling)... no.

It records a movie.
 
You can convert the video from 30 fps to 60 fps.
View: https://www.youtube.com/watch?v=bbKzhJfFD4g
You can upscale at the same time. The result is a bigger file that won't equal an original 4K 60fps video in quality. Also, 4K or 8K is pointless if you don't have a display that offers that resolution. For YouTube the maximum is 4K 60fps (who has an 8K monitor?), because most graphics cards can't do 60 fps at 4K and none can at 8K.
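If you want to experiment yourself, here's a minimal sketch of the naive version of that conversion using OpenCV. The filenames are placeholders, and simple frame blending is not true motion-compensated interpolation, so fast motion will ghost:

```python
# Naive 30 -> 60 fps doubling: insert a 50/50 blend between each frame pair.
# This is plain averaging, NOT motion-compensated interpolation.
# Requires: pip install opencv-python
import cv2

cap = cv2.VideoCapture("gameplay_30fps.mp4")   # placeholder filename
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

out = cv2.VideoWriter("gameplay_60fps.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps * 2, (w, h))

ok, prev = cap.read()
while ok:
    ok, cur = cap.read()
    out.write(prev)
    if ok:
        out.write(cv2.addWeighted(prev, 0.5, cur, 0.5, 0))  # in-between frame
        prev = cur

cap.release()
out.release()
```

Dedicated tools do far better than this by estimating motion, but it shows the basic idea of synthesizing in-between frames.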
 
Ok, I've got an idea. What if we fast-forward the video? Would that increase the fps?
That's how fast forward works...

For example, take a normal video running at 30 frames per second. If you enable 2x fast forward, the video plays 2x faster.
So what's the relation with FPS then?
The original video is still recorded at 30 fps, and in order to speed it up by 2x they do the following:

Original video: 30 frames/1 sec --------> fast video: 30 frames/0.5 sec, which means 60 frames/1 sec.
(They don't double the FPS; they just divide the time.)

By doing this, the video speed increases by 2x, but at the cost that the video length is cut in half.
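To make the arithmetic concrete, here's the difference between retiming (fast forward) and interpolation in a few lines of Python (the numbers are just an example):

```python
# Fast forward keeps the same frames and halves their display time;
# interpolation adds new frames and keeps the duration.
frames = 900                 # e.g. a 30-second clip at 30 fps
fps = 30

print(frames / fps)          # 30.0 s: normal playback
print(frames / (fps * 2))    # 15.0 s: 2x fast forward, same 900 frames
print(frames * 2 - 1)        # 1799 frames: one new frame between each pair,
                             # still ~30 s of video, now at ~60 fps
```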
 
You can convert 30 fps progressive to 60 fps interlaced. That doubles the perceived frame rate on a display without consuming extra bandwidth. In the old days, graphics cards could do 1080i instead of 1080p. I think 1080i is just the lines painted on the screen in two passes of 540 lines each.
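As a rough illustration of that field splitting (assuming a frame stored as an H x W x 3 array; note that when the source is 30p, both fields come from the same instant, so no new motion is created):

```python
# Each progressive frame is split into two fields of alternating scanlines,
# which the display paints in two passes (540 lines each for 1080i).
import numpy as np

def frame_to_fields(frame: np.ndarray):
    top = frame[0::2]      # even scanlines: first pass
    bottom = frame[1::2]   # odd scanlines: second pass
    return top, bottom

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)   # dummy 1080p frame
top, bottom = frame_to_fields(frame)
print(top.shape, bottom.shape)   # (540, 1920, 3) each
```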
 
That's how fast forward works...

For example, take a normal video running at 30 frames per second. If you enable 2x fast forward, the video plays 2x faster.
So what's the relation with FPS then?
The original video is still recorded at 30 fps, and in order to speed it up by 2x they do the following:

Original video: 30 frames/1 sec --------> fast video: 30 frames/0.5 sec, which means 60 frames/1 sec.
(They don't double the FPS; they just divide the time.)

By doing this, the video speed increases by 2x, but at the cost that the video length is cut in half.

I have difficulty understanding this concept. Why don't the frames double in half the time?
 
Those evil YouTube videos haven't told you that your current display (the one you're watching the videos on) has to be capable of running those frame rates? Otherwise you will not actually see them.

For example, in that useless Minecraft video (the fps can't even be that high; YouTube won't handle it), if you are watching it on a 144Hz monitor, you will not notice anything over that rate, unless they are applying some smoothing to the video every time they say the fps goes up, to fake the effect.
Well... depending on how the vids are captured, they can get some of the motion blur from being rendered at a higher FPS.

That said, motion blur isn't a substitute for higher FPS, because what your eyes blur is whatever they're not tracking. So, a moving object that you're looking at would appear sharp, while the background and other objects are what would blur.
 
Myelinated nerves can fire between 300 to 1000 times per second in the human body and transmit information at 200 miles per hour. What matters here is how frequently these nerves can fire (or "send messages").

The nerves in your eye are not exempt from this limit. Your eyes can physiologically transmit data that quickly and your eyes/brain working together can interpret up to 1000 frames per second.
It's not only the nerves that can be a limiting factor - there's also the photochemistry of what's happening in your retina. I forget where I heard about this, but there's been some interesting research on how fast retinal cells can respond to light changes. I'll see if I can dig it up.

That said, I'm on your side with respect to high framerates. What people don't think about is how moving objects get rendered at discrete points in time. If an object flies in front of you and is visible for only 1/10th of a second, a 60 Hz display will show that object rendered in only 6 positions. Especially if your eyes aren't tracking the object, you won't perceive its motion as smooth.

In real life (or on film), that object would have some motion blur applied. To get the same effect, you either need to simulate motion blur or render the object at many more positions. However, simulating motion blur has the drawback I mentioned in my previous post: it can't be done correctly without eye-tracking. So, the most straightforward solution is simply to render & display at a higher framerate.
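To put numbers on that example (an object visible for 1/10th of a second):

```python
# How many discrete positions a briefly visible object gets drawn at.
visible_s = 0.1                      # object crosses your view in 1/10 s
for hz in (60, 144, 1000):
    print(f"{hz} Hz -> {int(visible_s * hz)} rendered positions")
# 60 Hz -> 6, 144 Hz -> 14, 1000 Hz -> 100
```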
 
Entirely possible using machine learning software. I'm sure research has already been invested in it, and it already exists.
Indeed. This is positioned as synthesizing "super slo-mo" videos, but if you played these in realtime, it would be the same as simply up-converting the framerate.


It should be noted that most modern televisions have a built-in motion interpolation capability, but I'm sure Nvidia's method is more accurate than any current TV's implementation.

I don't know if there's any software available that incorporates Nvidia's method & model.
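For the curious, here's a rough sketch of the classical, non-learned version of the idea (dense optical flow plus warping) using OpenCV. This is not Nvidia's method, just the baseline it improves on:

```python
# Estimate an in-between frame by warping f0 halfway along the dense
# optical flow from f0 to f1. Classical motion-compensated interpolation,
# NOT Nvidia's learned Super SloMo. Requires opencv-python and numpy.
import cv2
import numpy as np

def midpoint_frame(f0, f1):
    g0 = cv2.cvtColor(f0, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(f1, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(g0, g1, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = g0.shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    # Sample f0 half a flow-vector away to approximate the midpoint in time.
    map_x = (xs + 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (ys + 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(f0, map_x, map_y, cv2.INTER_LINEAR)
```

Real implementations also handle occlusions and blend warps from both neighbouring frames, which is where the ML methods shine.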
 
It's not only the nerves that can be a limiting factor - there's also the photochemistry of what's happening in your retina. I forget where I heard about this, but there's been some interesting research on how fast retinal cells can respond to light changes. I'll see if I can dig it up.

I found a few interesting links:

"The time course of the light response is faster in cones than in rods, and can inform of changes in illuminance as frequent as every 100–200 msec"
Source: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3398183/

"Here we show that humans perceive visual flicker artifacts at rates over 500 Hz when a display includes high frequency spatial edges."
Source: https://www.nature.com/articles/srep07861

"A team of neuroscientists from MIT has found that the human brain can process entire images that the eye sees for as little as 13 milliseconds"
Source: http://news.mit.edu/2014/in-the-blink-of-an-eye-0116

I feel a key difference here is that just because you can tell higher fps apart from lower fps doesn't mean we can see those individual frames. For instance, when a fan is spinning, we can see that the fan is at a higher speed, but that doesn't mean we can see the blades move. And for practical purposes, if you can't see a blade move, any frame rate above that is useless. But how do I go about converting the point at which I can no longer see the blade move into ms or fps?
 
"The time course of the light response is faster in cones than in rods, and can inform of changes in illuminance as frequent as every 100–200 msec"
Source: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3398183/
Uh... I'm not sure that's relevant to your concerns. For one thing, it corresponds to 5 - 10 Hz.

"Here we show that humans perceive visual flicker artifacts at rates over 500 Hz when a display includes high frequency spatial edges."
Source: https://www.nature.com/articles/srep07861
That seems to concern strobing vs. stable illumination, though. But, interesting.

"A team of neuroscientists from MIT has found that the human brain can process entire images that the eye sees for as little as 13 milliseconds"
Source: http://news.mit.edu/2014/in-the-blink-of-an-eye-0116
That's only ~77 Hz. However, they're talking about seeing a distinct image for only that amount of time vs. seeing a frame of video that's similar to those before/after it. So, it doesn't say that you can't notice artifacts that are visible for less time than that.

I feel a key difference here is that just because you can tell higher fps apart from lower fps doesn't mean we can see those individual frames. For instance, when a fan is spinning, we can see that the fan is at a higher speed, but that doesn't mean we can see the blades move.
Yes, that's my point about motion blur. You need higher fps to blur what should be blurred, which you otherwise couldn't know without eye-tracking.

And for practical purposes, if you can't see a blade move, any frame rate above that is useless.
Only if you can accurately simulate motion blur relative to the viewer's gaze.

But how do I go about converting the point at which I can no longer see the blade move into ms or fps?
Not sure I understand the question. There's not a single threshold where you can't see the blade - it just becomes increasingly blurry.
 
BTW, you asked a specific question that @zx128k and I have tried to answer, in posts #30 and #36, respectively. You should consider up-voting solutions that address your goals, and picking a "Best Answer" for your preferred solution.

If you just want to discuss high-fps, you can start a "discussion" thread, instead of posting a question.
 
I'll offer yet another suggestion, which would be to connect your PC to a TV with a good "motion smoothing" function (note: this tends to be disabled in "gaming" modes, so you may have to turn game mode off).

I don't think it would be as good as our other suggestions, but it'd certainly be easy and could potentially give you both 4k and >= 120 Hz.
 
BTW, you asked a specific question that @zx128k and I have tried to answer, in posts #30 and #36, respectively. You should consider up-voting solutions that address your goals, and picking a "Best Answer" for your preferred solution.
Thanks for the up-vote, @sxk1277. However, I rather thought my post about SVP (SmoothVideo Project) was more in line with your immediate goals:

 
But that's the point. The game could save the data for that pixel instead of manufacturing it while playing, so the later up-scaling would be real. Why shouldn't this be doable?

If you really think this is possible and that the programmers and engineers are just too lazy to do it, then you can try it yourself. If you don't know what needs to be done, or how it should be done, then maybe it's not possible or not worth doing. Just because you think it can or should be done does not mean it can be, or that it would be worth the time. I bet if you give a game and video card company a few million dollars to try it, they might do it for you, though. Say an average engineer or programmer works for about 80 - 150k a year; you would probably need 4-5 of those in a team, plus some hardware costs for the setup.

You need to remember that just because you want something does not mean it's worth doing for anyone else. I may want McDonald's to serve me ostrich-meat burgers on fresh-baked buns, but I doubt they will start working on it even if I really, really, really want them to.

If you want something done and nobody is doing it, that is why people start new companies or build their own ideas: to see what can be done and whether there's a viable market. I don't see why any game or hardware company would render the game twice, once at some super-high-end resolution and FPS that you'd need a 2080-level card for, and again for the screen, just to make nicer-looking recordings. Money and time are better spent in other places. Don't just say "well, I think it can be done"; go do it if you think it can be.
 
I've come up with some new ideas for calculating fps (I tried starting a new thread; it didn't work :/)

1. Maybe there is no limit to how many fps we can perceive, which would make this race pointless. For instance, when we see an asteroid fall, it looks very slow. But if the same asteroid crossed right in front of us, we would barely see it. Therefore, as fps increases, its importance becomes more noticeable with higher resolution, larger screen size, and viewing distance. A theoretical extreme to make this point would be projecting a game onto the whole sky and playing at millions of fps. A gamer with such high resolution and fps should have an edge over a gamer on a computer monitor. But the question is, where does that put VR?

2. However, to work out a custom fps limit for a computer monitor, rather than accepting the industry average, I created a methodology; let me know what you think:

Assuming this video is just as good as reality (otherwise, try this in reality), I noticed the distance the biker has to the ground is about the same as the distance we usually have from a monitor. And at 150mph, the ground becomes blurry.
View: https://www.youtube.com/watch?v=qaBH-GGP8Pg


I also realize that when I'm razor-focused, I can only cover about 3 inches of area with my eyes.

So, here is the conversion math: 150 mph ≈ 67.06 m/s ≈ 2640 inches/sec
2640 / 3 = 880 three-inch spans per second
1 / 880 ≈ 0.001136 sec per 3 inches, i.e. about 1.136 ms per span
1000 / 1.136 ≈ 880 fps
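A quick sanity check of those numbers in Python:

```python
# Verify the rough "blur threshold" estimate above.
MPH_TO_MPS = 0.44704          # miles/hour -> metres/second
INCHES_PER_M = 1 / 0.0254

speed = 150 * MPH_TO_MPS * INCHES_PER_M   # ~2640 inches/sec
spans = speed / 3                          # 3-inch focus spans per second
print(round(speed), round(spans))          # 2640, 880 -> ~880 fps
```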

Btw, would it be possible to create hundreds of identical simulations and merge them to create a file with a higher fps?
 
I didn't see any proof of its effectiveness though. 🙁
They have 4 youtube videos, right on their homepage (which I linked). None are gaming, but they have two anime, one CGI, and a music video. As far as I can tell, they look perfect. Definitely better than the motion smoother from my 2013 HDTV, which is pretty good.

The page also says:
SVP features
  • Frame rate conversion up to 60/120/144+ fps
  • GPU acceleration, including NVIDIA Optical Flow support
  • Most video players, including VLC
  • HDR support (in selected players)
  • VR and BD3D support (in selected players)
  • Play, convert, stream
  • Regular updates
  • No ads

Aside from not being free, it seems like basically everything you wanted.

Btw, is there any hope of a revolutionary technology that could accelerate fps production?
They list GPU acceleration. Here's the GPU Compatibility page:

 
to work out a custom fps limit for a computer monitor, rather than accepting the industry average, I created a methodology; let me know what you think:

Assuming this video is just as good as reality (otherwise, try this in reality), I noticed the distance the biker has to the ground is about the same as the distance we usually have from a monitor. And at 150mph, the ground becomes blurry.
I don't want to be a wet blanket, but that approach has several significant flaws. First, when using a video or film clip, its motion blur is dictated by the exposure period of the camera.

Second, the video was compressed at least twice - once as the camera recorded it, and again when Youtube re-encoded it for storage. Possibly again, upon transmission to you. Each time it's compressed, subtle details like texture can get increasingly blurred out.

There's probably an easier & safer experiment you could try: simply go outside, lay a bicycle on its side so that one wheel is in the air, and spin that wheel at a known speed. Then either watch the spokes, or try taping various images and patterns to the wheel.

I wouldn't try this with any artificial lighting, because the light source might flicker - even if you're not immediately aware of it. For example, fluorescent lights tend to change color during different parts of the 60 Hz AC waveform period.
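If you try it, here's a hypothetical little helper for working out how fast a mark on the wheel is actually moving (the radius and rpm are whatever you measure):

```python
# Linear speed of a mark taped at radius r on a wheel spinning at a given rpm.
import math

def rim_speed_mps(rpm: float, radius_m: float) -> float:
    return 2 * math.pi * radius_m * rpm / 60

# e.g. a mark at 0.31 m (roughly a 700c rim) spun at 300 rpm:
print(round(rim_speed_mps(300, 0.31), 2), "m/s")   # ~9.74 m/s
```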
 