Adobe Flash: A Look At Browsers, Codecs, And System Performance


acku



I kind of explained this on the last page, but maybe I wasn't quite clear. You can install 10.2, no problem. The thing is, there is no performance benefit from 10.2 because it hasn't been implemented on the server side. You can have the software installed on the client, but you won't see any improvement over 10.1 because no Flash site has currently been optimized for 10.2. The features touted in 10.2 don't just appear enabled when you install the software; there is more to it.

I think my point on the hardware is valid, but you make a good point: AMD is under-represented. I gave an explanation as to why that was in an earlier comment. Next time we will try to get a higher-end AMD system represented. That said, the fixed-function decoder's performance is not going to differ on a faster card. On a desktop system, you may see 10-15%, but with a comparable Nvidia card you would likely see something closer to 3-5%. The point I was making was that UVD2 isn't taking nearly as much of the load off the CPU as some of the other solutions available. This matters less on the desktop, but it matters a lot on lower-end hardware and much of the mobile landscape.

Andrew Ku
TomsHardware
 

acku



Understandably, this involved quite a bit of testing. We are talking about more than 30 cumulative hours of testing spread out over more than a single week, so at the time Opera 11 was not gold. Obviously, we can't keep restarting our benchmark session every time there is a new patch; we were testing more than one browser and multiple variables. But point taken: look for more information in an update when we cover 10.2.

Andrew Ku
TomsHardware
 
So I do wonder why they didn't try it with the new Sandy Bridge CPUs to see if there is any difference, and why didn't you include IE9 and FF 4.0?

Sure, they are betas, but they are probably much better than the current finalized browsers.
 

caparc

This article reminds me of the hard drive manufacturing article some time back. This is the Tom's mojo at its best. (The person who wrote this might be able to follow up with a how-to on the best recipe for putting videos on YouTube that look as good as the best of them.) I learned something from the hard drive article, and I'm learning from this one. Thanks.
 

acku



Not necessarily, according to our initial tests. To be fair, keep in mind we can't really test every browser and every possible new hardware configuration. This was mainly about what you should expect from the current generation of hardware.

If you read the whole article, then you know the scale of testing. We tested every browser mentioned with Aero on/off, with Flash hardware acceleration enabled/disabled, and in fullscreen/windowed mode, at every YouTube video resolution available, along with every possible combination of the aforementioned variables. That is why it adds up to over 400 one-minute individual tests. Adding beta browsers would simply have increased testing by a substantial margin. If this were one setting, it would be easy, but we had to make conscious choices about our testing so that this didn't turn into a month-long project. As it stands, we were watching and loading movie trailers (we had to test once the movie was fully buffered, with Wi-Fi off), so that is something like two weeks of endless back-to-back testing and roughly 30 hours of pure video trailers, 20 hours of which were just Harry Potter. If we had more people and more systems, that might have been possible (along with lots of caffeine)....
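To give a feel for how quickly those combinations multiply, here is a minimal sketch of that kind of test matrix. The browser, setting, and clip lists below are placeholders rather than our exact configuration; it's only meant to show why the run count climbs into the hundreds.

[code]
from itertools import product

# Hypothetical test matrix -- the browsers, settings, and clip count below are
# placeholders for illustration, not the article's exact configuration.
browsers    = ["Chrome", "Firefox", "IE", "Opera", "Safari"]
aero        = ["Aero on", "Aero off"]
hw_accel    = ["HW accel on", "HW accel off"]
display     = ["fullscreen", "windowed"]
resolutions = ["240p", "360p", "480p", "720p", "1080p"]

runs = list(product(browsers, aero, hw_accel, display, resolutions))
print(len(runs), "one-minute runs per clip")   # 5 * 2 * 2 * 2 * 5 = 200

# With just two trailers per configuration, the total passes 400 runs.
print(len(runs) * 2, "runs for two clips")
[/code]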



Very interesting. Nice writeup.

Thanks for all the kudos. Chris and I will try to deliver more "mojo." I'm glad you got something more than a buying recommendation out of our article. Not that we don't enjoy doing those articles as well.

 

doped

It's all fine and dandy, but Adobe should still get Linux support for accelerated H.264 on the road. I would like to see small embedded systems really chewing through those HD Flash videos, or just to be able to decode Flash on a "normal" home PC :(
 
Guest
What about HTML5-enabled browsers and YouTube in HTML5 mode, then?
 

wizardprang

I find it amazing that with all of this information, security was never mentioned.

After Windows and Acrobat, Flash is the most common security problem on Windows. During 2010, it seemed that I was updating Flash every other week. It's nice to see all these new features, but the real question is "why is Flash so damn buggy?"

And why does Adobe make it so difficult to upgrade/install in IE without using a totally unnecessary download manager (DLM)?
 

xer0

When are they going to fix the Flash player POS so multi-monitor users can actually watch something full-screen on one monitor and "do" something on another screen, without Flash dropping back to a tiny size when you click outside the browser window? Someone had a hack for a while that allowed it on earlier versions of Flash, but none of the current releases seem to allow it. Adobe had some BS excuse about security for this behavior, but in light of similar apps like Silverlight doing the multi-monitor full-screen thing just fine, Adobe's previous BS doesn't seem to hold any water.
 

xer0

Nice ^ didn't see the info earlier. Too bad it only took Adobe 3+ years to add such basic functionality...
 
Guest
"However, when people talk about HD today, we spend so much time bickering about resolution that it seems foolish. For those of us that actually create 2D/3D content, it's the bit rate and codec efficiency that matter, not how many pixels run across the screen"

That is completely, totally, and utterly wrong IF you're talking about films or HD video recorded in either 720p or 1080p. For those, having a display resolution of EXACTLY 720p or 1080p, or an exact multiple of 720p or 1080p, is critical. The reason is scaling. Look it up if you don't understand. You can say I'm wrong until you are blue in the face, but I will still be right. I don't think laptop makers understand this either, because it's hard to find laptops with true 1080p displays these days.
 

acku

[citation][nom]jdjdjdjd[/nom]"However, when people talk about HD today, we spend so much time bickering about resolution that it seems foolish. For those of us that actually create 2D/3D content, it's the bit rate and codec efficiency that matter, not how many pixels run across the screen"That is completely, totally and utterly wrong IF your talking about films or (HD video recorded in either 720p or 1080p). For those,having a display resolution of EXACTLY 720p OR 1080p OR an exact multiple of 720p or 1080p is critical. The reason is scaling. Look it up if you don't understand. You can say I'm wrong until you are blue in the face but I will still be right. I don't think laptop makers understand this either because it's hard to find laptops with true 1080p resolution displays these days.[/citation]

You're kind of right, but you're missing the point. I suggest you read Trevor Greenfield's blog post; he makes a wonderful point about how "HD" is misused in the context of streaming video, which is ever so pervasive today. Read the AppleInsider story on the low-bitrate "HD" video that iTunes is spitting out. Technically, that is 720p. Frankly, I think everyone would rather have a 12 Mb/s 1080p movie file than a 4 Mb/s one. HD is a loosely used term, and it doesn't help that bitrates, even for HD, are governed only by typical industry practice.

Trevor Greenfield
Most of us filmmakers/editors who spend a great deal of time encoding and transcoding videos will tell you that the resolution, ie. 480p, 720p, or 1080p, has very little to do with the overall picture quality. What matters most is bitrate and codec efficiency, not the number of pixels.

http://trevorgreenfield.com/rants-and-raves/youtubes-1080p-failure-depends-on-how-you-look-at-it/
AppleInsider
A standard 720p file downloaded either through iTunes or an Apple TV consumes about 4Mbps of data, or just a tenth the total bit transfer rate of the optical format and a fifth of the nearly 20Mbps for over-the-air HDTV; even Xbox Video Marketplace video affords more, at 6.8Mbps. Some of this shrink in file size can be attributed to features left out of Apple's encoding, such as the 1080p resolution or 7.1-channel surround audio, but much of it is attributed to compression that can degrade the final picture quality significantly from the reference image.
http://www.appleinsider.com/articles/08/09/10/itunes_hd_videos_low_bitrate_include_ipod_ready_versions.html

George Ou at ZDNet even went on a rant back in 2008:
As I’ve tried to educate my readers last year with my blog “Why HD movie downloads are a big lie“, these so-called HD movies use very low bit-rates compared to even standard definition DVDs let alone something like HD DVD or Blu-ray DVD. Raw uncompressed 1080p video at 60 frames per second is about 3000 mbps so even HD DVD’s 28 mbps needs to be compressed about 107 to 1 with the H.264 or VC-1 codec. By all reasonable standards this needs to be the minimum bit-rate for acceptable loss in quality on 1080p video.
http://www.zdnet.com/blog/ou/dont-believe-the-low-bit-rate-hd-lie/959
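To put rough numbers behind those quotes, here is my own back-of-the-envelope arithmetic using the figures George Ou cites (24-bit color at 60 fps is an assumption on my part to reproduce his ~3,000 Mb/s raw figure):

[code]
# Back-of-the-envelope check of the figures quoted above.
# Assumes 24 bits per pixel at 60 fps, which reproduces the ~3,000 Mb/s
# "raw 1080p" number George Ou uses.
width, height  = 1920, 1080
bits_per_pixel = 24
fps            = 60

raw_mbps = width * height * bits_per_pixel * fps / 1_000_000
print(f"Raw 1080p60: ~{raw_mbps:,.0f} Mb/s")                 # ~2,986 Mb/s

hd_dvd_mbps = 28
print(f"HD DVD at 28 Mb/s: ~{raw_mbps / hd_dvd_mbps:.0f}:1") # ~107:1 compression
[/code]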


 
Guest
Selected quality | Resolution | Bitrate (Kb/s)        | Codec
YouTube 240p     | 400x170    | 253                   | Sorenson H.263
YouTube 360p     | 640x272    | 449                   | H.264
YouTube 480p     | 854x362    | 791                   | H.264
YouTube 720p     | 1280x544   | 2016 (max: 11.3 Mb/s) | H.264
YouTube 1080p    | 1920x816   |                       |

640x272 is NOT 360p.
854x362 is NOT 480p.
1280x544 is NOT 720p.
1920x816 is NOT 1080p.

You HAVE to be accurate with these figures. You can't just say, "Well, I have the correct number of horizontal pixels, and I don't give a crap what the number of vertical pixels is," and then call it one of the standards. If broadcasters played so fast and loose with the standards, there would be an unwatchable mess on TV. If you're going to talk about or deal with digital video, you HAVE to be accurate. Otherwise people will NEVER get this stuff correct. It will just be a never-ending circle of falseness.
 

acku

[citation][nom]bbcfdvdvdvdbbdggdggdhjghgdhg[/nom]YouTube 240p 400x170 253 Sorenson H.263; YouTube 360p 640x272 449 H.264; YouTube 480p 854x362 791 H.264; YouTube 720p 1280x544 2016, max: 11.3 Mb/s H.264; YouTube 1080p 1920x816. 640x272 is NOT 360p, 854x362 is NOT 480p, 1280x544 is NOT 720p, 1920x816 is NOT 1080p. You HAVE to be accurate with these figures. You can't just say well I have the correct number of horizontal pixels and I don't give a crap what the number of vertical pixels are and then call it one of the standards. If broadcasters played so fast and loose with the standards there would be an unwatchable mess on TV. If you're going to talk about or deal with digital video you HAVE to be accurate. Otherwise people will NEVER get this stuff correct. It will just be a never-ending circle of falseness.[/citation]

True, but don't take that up with me; talk to YouTube or Hulu. Please read the whole page. Use a Flash downloader and you can confirm these values for yourself. Those are the resolutions you are watching when you click on a specific quality setting on YouTube!
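If you want to verify this yourself, one way (my suggestion, not something from the article) is to save the stream locally and read back what the file actually contains with ffprobe, assuming you have FFmpeg installed:

[code]
import json
import subprocess

# Inspect a locally saved Flash video with ffprobe (part of FFmpeg) and report
# the resolution, codec, and bitrate the server actually delivered.
def probe(path: str) -> dict:
    out = subprocess.check_output([
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=width,height,bit_rate,codec_name",
        "-of", "json",
        path,
    ])
    return json.loads(out)["streams"][0]

info = probe("trailer_1080p.flv")   # hypothetical filename
print(info["codec_name"], f'{info["width"]}x{info["height"]}',
      info.get("bit_rate", "bitrate not reported"))
[/code]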

And to be fair, broadcasters don't play fast and loose. There are industry conventions on bitrate. That isn't so in the streaming business. Read my post above.

Furthermore, there is some video manipulation going on. Someone asked how 1080p becomes 1920x816 on YouTube's 1080p setting. One of our community members on the UK site hit the nail on the head: "I think this is only for when it's displaying 2.35:1 aspect ratio content, where the black bars are added player-side, rather than wasting bandwidth encoding them into the original video. However, videos such as this are 16:9, and show up as 1920x1080 when you right-click on the video and go to 'show video settings'." The only part he had wrong is that the black bars are added on the server side.
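As a quick sanity check on that explanation (my arithmetic, not the article's), the odd-looking heights line up with wider-than-16:9 aspect ratios once the letterbox bars are cropped off:

[code]
# Quick aspect-ratio check on the odd-looking "1080p" and "720p" frame sizes.
for w, h in [(1920, 1080), (1920, 816), (1280, 544)]:
    print(f"{w}x{h} -> {w / h:.2f}:1")

# 1920x1080 -> 1.78:1 (16:9)
# 1920x816  -> 2.35:1 (scope-format content with the black bars cropped)
# 1280x544  -> 2.35:1
[/code]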
 
Guest
"You're kind of right but you're missing the point. I suggest you read Trevor Greenfield's blog post. And he makes a wonderful point of how "HD" is misused within the context of streaming video that is ever so pervasive today. Read the Apple Insider story on the low HD bit rate video that iTunes is spitting out. Technically that is 720p. Frankly I think everyone would rather have a 12Mbps 1080p movie file than a 4Mbps one. HD is a loosely used term and it doesn't help that we only have typical industry practices on bitrates even for HD."


No, I'm not kind of right; I'm exactly right. The problem today is that there is not enough Internet connection speed to the home to do HD video in real time, and that is where the term HD gets misused. BUT if people are to learn the right stuff, you HAVE to be accurate with the terms when you speak. You can't take shortcuts with the language just because the people who post the videos take shortcuts with the language. If you do, people will NEVER learn the correct thing. It would be like teaching math and saying 0.98 is close enough to 1 to be correct. NO!
 

acku

Again, my reply would be: take it up with YouTube and Hulu. If you read the table, then you understand that I am simply listing the quality setting at which the video was selected. We clearly state that it is "selected quality." All of these Flash videos were pulled directly from the streaming servers. It's hard to describe exactly what we are seeing if we can't tell the reader what quality we selected. And again, this article was about Flash; it wasn't an outlet to rant about HD video bitrate conventions, or the lack of them. As you mentioned, we would be wrong if we were talking about film, but we are not. We are talking about streaming video. Furthermore, talking about streaming video is going to get a whole lot more complicated if you are going to forbid sites like AppleInsider, ZDNet, and CNET from talking about, say, the "720p" video that iTunes is selling. That is what it is branded and advertised as. We are simply making comparisons and pointing out the differences.
 
Guest
"We clearly state that it is 'selected quality'."

If you were someone who was trying to learn about this stuff and didn't know much right now, would you interpret "selected quality" to mean "this is what Hulu says it is," or would you think "this is the quality level that was selected"? You can't determine which one the reader would take from that, can you? And that is my point. You have to be accurate with technical language or people can't learn the correct things. Consistency in language is the KEY to learning. I don't care what the article was specifically written about; any article can and will be used to learn about other things that are referenced in it, and if the language is not correct, it leads to mass confusion. What I would like to see is a note under the table that says something like "720p is 1280x720 pixels and 1080p is 1920x1080 pixels. Those are the only two resolutions that can officially be used with the terms 720p and 1080p. Any other resolution referred to with those terms is incorrect."
 
A very complex article indeed. I lost track of it several times along the way, but I still don't regret reading it.

The framerate drops in the three browsers came as a surprise, however. Especially Chrome; considering that it's the fastest-growing browser, you'd think Google would have solved this issue.
 

acku



Thanks! I'm glad you enjoyed it. Chris and I thought it was warranted given the many misconceptions about Flash.

But on your point about Chrome and the others, yeah, that was a surprise to us too! We basically saw this occur on all the systems we tested. I'm not sure exactly how each browser handles video within its application or GUI pipeline, so I can't say for sure; the overhead that those three browsers share might indicate some similar code. I haven't done all the research on browser internals (I'm not a browser-wars type of guy), though I know that Chrome and Safari are both based on WebKit... I'm so busy writing code for some of our other benchmarks that I'll have to take a second look when I do an HTML5 vs. Flash article.
 

mariushm

If you really want to be pedantic, YouTube doesn't actually have 720p and 1080p at all, because those generally mean progressive content at 50 or 60 frames per second, and YouTube's content is 25-30 frames per second.

1920x816 is 1080p in the most commonly understood sense - YouTube just went ahead and cut the black bars from the 1920x1080 rectangle to improve compression, because video encoders often have issues with the sharp transition between black bars and actual video content, and that consumes precious bits.

It would have been a problem if the content were the full 1080 lines, resized vertically to half and then stretched back vertically in the Flash player. That would be a legitimate reason to complain.

The issue is moot anyway - the hardware decoders on the cards don't have problems decoding any height, as long as it meets certain minimums, like being at least 48 pixels and a multiple of 4.
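Here's a tiny sketch of that constraint check; the 48-pixel minimum and multiple-of-4 rule come from the post above, not from any decoder's documentation, so treat the numbers as illustrative:

[code]
# Illustrative check of the decoder constraints described above
# (at least 48 px, height a multiple of 4) -- numbers from the post, not a spec.
def decoder_friendly(width: int, height: int) -> bool:
    return width >= 48 and height >= 48 and height % 4 == 0

for w, h in [(1920, 1080), (1920, 816), (1280, 544), (854, 362)]:
    print(f"{w}x{h}: {'OK' if decoder_friendly(w, h) else 'problematic'}")

# Only 854x362 trips the multiple-of-4 rule (362 % 4 == 2); the rest pass.
[/code]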

And I really don't know what the hell you're talking about with 4 Mb/s 1080p video on YouTube...

Search for 1080p videos, right-click on the Flash object, select "Show video info," and you'll see how much it uses:


e.g.:

http://www.youtube.com/watch?v=_i2RCBa3l-g - 1920x800, 2.6 Mb/s min, 5.8 Mb/s max

http://www.youtube.com/watch?v=yQ5U8suTUw0&feature=fvsr - 1920x816, 5 Mb/s min, seen up to 13 Mb/s

http://www.youtube.com/watch?v=XSGBVzeBUbk - 1920x1080, 4 Mb/s average, 6 Mb/s max (it's a cartoon, so that's enough)

 