Can you see a difference in video quality between ~1000 kbps bitrate (720p) and ~2000 kbps bitrate (1080p) when viewing it on a 40" display max (playback, not streaming)?
Thank you for your answer.It depends on the video codec used and what settings were used for encoding (somewhere between ultrafast and slowest).
Like I said, it depends. Say you encode the same input (raw file) twice, AND use the same encoder both times, AND the only thing you adjust is the target bitrate: for most encoders, one should expect visible changes.So if I understood correctly, it doesn't mean that with more kbps I will get better video?
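The experiment described above (same source, same encoder, only the target bitrate changed) can be sketched with ffmpeg; the filenames and the libx264 preset here are assumptions for illustration, not from the thread:

```shell
# Encode the same source twice with identical settings except the target bitrate.
ffmpeg -i source.mkv -c:v libx264 -preset medium -b:v 1000k -c:a copy out_1000k.mp4
ffmpeg -i source.mkv -c:v libx264 -preset medium -b:v 2500k -c:a copy out_2500k.mp4
```

Comparing the two outputs side by side, on the same display and the same scenes, is the fairest way to judge whether the extra bitrate is actually visible.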
If there is little detail in the source video, this may be expected: you may simply be putting more data into the video stream than is necessary to represent it on screen (approaching lossless quality in the parts that add no new detail).Because I can't see a difference between 1000 and 2500 kbps video, and the difference in file size is double...
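To put numbers on the file-size point above: a video stream's size is essentially bitrate times duration. A quick sketch (the helper name is mine, not from the thread; audio and container overhead are ignored):

```python
def video_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Approximate video stream size in MB: kbps -> bytes/sec -> total bytes -> MB."""
    return bitrate_kbps * 1000 / 8 * duration_s / 1_000_000

# A 90-minute film:
print(video_size_mb(1000, 90 * 60))  # 675.0 MB at 1000 kbps
print(video_size_mb(2500, 90 * 60))  # 1687.5 MB at 2500 kbps
```

So the 2500 kbps encode costs 2.5x the storage; if the source has little fine detail, that extra data buys no visible improvement.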
Thank you, Math, for understanding the point of my question.I can barely tell the difference between SD and HD content when they are both done correctly.
Like most things of this type, the folks who swear they can tell the difference often can't when faced with a blind test. If it looks good to you, then that's all you need. A 1.5 GB file is much easier to store than a 40 GB Blu-ray when you can't even tell the difference.
I don't even want 720p if I can make it SD and save half the space again.