1984miroslav

Distinguished
Feb 1, 2017
Can you see a difference in video quality between a ~1000 kbps bitrate (720p) and a ~2000 kbps bitrate (1080p) when viewing it on a 40" display (playback, not streaming)?
 

1984miroslav

Distinguished
Feb 1, 2017
It depends on the video codec used and what settings were used for encoding (somewhere between ultrafast and slowest).
Thank you for your answer.
So if I understood correctly, it doesn't mean that with more kbps I will get better video?
Because I can't see a difference between 1000 and 2500 kbps bitrate video, and the difference in file size is double...
 
So if I understood correctly, it doesn't mean that with more kbps I will get better video?
Like I said, it depends. Say you encode the same input (raw file) twice, AND use the same encoder both times, AND the only thing you adjust is the target bitrate - for most encoders you should expect visible changes.
BUT - this also depends. In a very detailed scene (e.g. a forest in a storm) you are more likely to see a difference between bitrates than in a stationary scene with little to no motion.
This is also the reason why, rather than a fixed bitrate, you would set a target for output visual quality; the bitrate then depends on how much detail is in the scenes, as sketched below.
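For illustration, here is a minimal sketch of those two approaches, fixed bitrate versus constant quality (CRF), written in Python around ffmpeg with libx264; the file names are placeholders and the exact preset/CRF values are just example choices:

```python
# Minimal sketch: encode the same source twice with ffmpeg/libx264.
# Assumes ffmpeg is on PATH; "input.mkv" and the output names are placeholders.
import subprocess

SOURCE = "input.mkv"

# 1) Fixed target bitrate: file size is predictable, but quality varies with
#    scene complexity (detailed scenes get starved, static scenes waste bits).
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-c:v", "libx264", "-preset", "slow", "-b:v", "2500k",
    "fixed_bitrate.mkv",
], check=True)

# 2) Constant quality (CRF): you pick a visual-quality target and the encoder
#    spends as many bits as each scene needs, so the file size varies instead.
subprocess.run([
    "ffmpeg", "-i", SOURCE,
    "-c:v", "libx264", "-preset", "slow", "-crf", "21",
    "constant_quality.mkv",
], check=True)
```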

Because I can't see a difference between 1000 and 2500 kbps bitrate video, and the difference in file size is double...
If there is little detail in the source video, this may be expected - you may simply be putting more data into the video stream than is needed to represent it on screen (approaching lossless quality in parts that add no new detail).
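As a rough check on the size side, the video stream is approximately bitrate × duration, so a higher target bitrate buys a proportionally bigger file whether or not the extra bits are visible. A back-of-the-envelope sketch (Python; a hypothetical 120-minute runtime, audio and container overhead ignored):

```python
# Rough video stream size: bitrate (kbps) * duration, converted to decimal GB.
# Ignores audio tracks and container overhead.
def video_size_gb(bitrate_kbps: float, duration_minutes: float) -> float:
    bits = bitrate_kbps * 1000 * duration_minutes * 60
    return bits / 8 / 1e9

for kbps in (1000, 2500, 5000):
    print(f"{kbps} kbps over 120 min -> ~{video_size_gb(kbps, 120):.1f} GB")
# 1000 kbps over 120 min -> ~0.9 GB
# 2500 kbps over 120 min -> ~2.2 GB
# 5000 kbps over 120 min -> ~4.5 GB
```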
 

Math Geek

Titan
Ambassador
i can barely tell the difference between SD and HD content when they are both done correctly.

like most of these types of things, those folks who swear they can tell can't when faced with a blind test. if it looks good to you, then that's all you need. a 1.5 GB file is much easier to store than a 40 GB Blu-ray when you can't even tell the difference :)

i don't even want 720p if i can make it SD and save half the space again..
 
Solution

1984miroslav

Distinguished
Feb 1, 2017
i can barely tell the difference between SD and HD content when they are both done correctly.

like most of these types of things, those folks who swear they can tell can't when faced with a blind test. if it looks good to you, then that's all you need. a 1.5 GB file is much easier to store than a 40 GB Blu-ray when you can't even tell the difference :)

i don't even want 720p if i can make it SD and save half the space again..
Thank you Math for understanding the point of my question.
It's obviously a problem with the rip from the original file, because I found that the optimal video bitrates for 720p/1080p are much higher.
Speaking of that, I've watched a video at ~5000 kbps, and that one can give a smoother image, but the file is more than 6 GB.
As for the difference between the most common ~1200 kbps (720p) and ~2500 kbps (1080p), I realized that on 22"-43" screens there is no improvement in video quality when watching from a normal distance.
I'm just waiting for someone to confirm that for me 🙂.
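One rough way to sanity-check that impression is bits per pixel: the bitrate divided by the number of pixels delivered per second. At the bitrates mentioned in this thread, the usual 720p and 1080p encodes get almost the same per-pixel budget, which is consistent with them looking similar at normal viewing distance. A small sketch (Python; 24 fps is an assumption, real releases vary):

```python
# Ballpark "bits per pixel" for the bitrates discussed in this thread.
# Assumes 24 fps; treat the results as rough comparisons only.
def bits_per_pixel(bitrate_kbps: float, width: int, height: int, fps: float = 24.0) -> float:
    return bitrate_kbps * 1000 / (width * height * fps)

print(f"720p  @ 1200 kbps: {bits_per_pixel(1200, 1280, 720):.3f} bpp")
print(f"1080p @ 2500 kbps: {bits_per_pixel(2500, 1920, 1080):.3f} bpp")
# 720p  @ 1200 kbps: 0.054 bpp
# 1080p @ 2500 kbps: 0.050 bpp
```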