Hello Everyone,
Sorry I'm late, this publishes in the middle of the night where I am, so I'll try to catch up on everything in one post.
@swilhelm - The article you linked states that in XP, Windows Task Manager (WTM) double-counts memory with Chrome/Chromium, but that in Vista it undercounts. That means, if anything, the Chrome memory usage numbers in our article (we used Win7, which is closer to Vista than XP) are lower than they should be, not higher. It looks kosher to me, though; Chrome's total is right in the middle of the others' usage.
@Mark Heath - I don't know if monthly is warranted just yet, but if these companies accelerate releases ANY MORE, it just might be.
RE: placing - You're pretty much on the money. For the past few months, ever since Opera 10.50 really, it's been Opera and Chrome fighting for the top spot. Less than a week after the first Web Browser Grand Prix, Opera pushed a minor version change that did just enough to push it past Chrome in a few tests. Google is known for doing the same (especially after a Firefox release, almost like clockwork); they always seem to have a minor version change waiting in the wings in case somebody else pulls ahead on speed. I wouldn't be surprised if a minor version change came along for Chrome that just happens to boost performance in the next week or so.
RE: In-Development Browsers - See Page 2. That being said (and read), do you guys REALLY want to see how unstable apps perform? One of the draws of this article is that anyone can grab these apps RIGHT NOW - stable, done, ready to go. If I were to test in-development browsers, I wouldn't compare them to the stable ones, just to the other in-development apps. I also wouldn't be giving out any awards or naming a 'winner'. But y'all are the boss, so if you want to see what's on the bleeding edge, sound off and we'll see what we can do in a different kind of piece.
RE: The Charts - The charts in this article use version 1 of an OpenOffice.org template; v2 (not yet used live) corrects some placement issues. Whenever an outcome is very close, OpenOffice automatically re-adjusts the scale to emphasize differences (Excel does this too, though it's probably been disabled in our Excel chart template). If y'all want a static scale beginning at zero for ALL charts, I'll change that setting for the next one.
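To show why that setting matters, here's a quick sketch (with made-up numbers, not the article's actual scores) of how an auto-adjusted baseline exaggerates a small gap compared to a static scale starting at zero:

```python
def bar_heights(values, baseline):
    """Visual height of each bar when the chart's axis starts at `baseline`."""
    return [v - baseline for v in values]

# Two hypothetical scores that are only about 2% apart.
scores = [95, 97]

# Static scale starting at zero: the bars look nearly identical.
zero_based = bar_heights(scores, baseline=0)   # [95, 97]

# Auto-adjusted scale starting near the minimum: the same 2-point
# gap now makes one bar 40% taller than the other.
zoomed = bar_heights(scores, baseline=90)      # [5, 7]

print(zero_based[1] / zero_based[0])  # about 1.02
print(zoomed[1] / zoomed[0])          # 1.4
```

Same data, very different visual impression, which is exactly the trade-off between emphasizing differences and keeping proportions honest.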
RE: Final Results - As a few of you noticed, the final placements didn't add up. I believe I counted Acid3 twice, once with all five browsers and once with just three, which would explain the odd totals. I have since corrected the tables; the final outcome and placing remain unchanged, since this was only a tallying typo. Chrome still has the edge on Opera across all tests, but Opera owns the speed-only category. Firefox is just barely ahead of Safari in third, and IE8 is last. When totaling, keep in mind that one Opera result was thrown out, so Opera will have one less score, which also means one less 5th-place entry. Also, Acid3 is ONE score, not two as I accidentally counted: the Pass/Fail portion puts IE in 5th and Firefox in 4th, and the speed results place the top three. There is also a tie on the CSS3 Selectors test - two 3rd-place finishers, no 5th.
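For anyone re-checking the math, here's a minimal sketch of how per-test finishes roll up. The scores are made up (not the article's data), and it assumes tied browsers share a place with the count continuing directly afterward, which is what produces a "two 3rds, no 5th" situation among five entries:

```python
from collections import Counter

def place_finishers(results):
    """Map {browser: score} for one test (higher is better) to
    {browser: place}. Tied browsers share a place, and the next
    distinct score takes the next place number."""
    ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)
    places, place, prev_score = {}, 0, None
    for browser, score in ranked:
        if score != prev_score:
            place += 1
        places[browser] = place
        prev_score = score
    return places

def tally(tests):
    """Count how many 1st/2nd/... finishes each browser racked up.
    A browser missing from a test (a thrown-out result) simply ends
    up with one fewer score overall."""
    counts = {}
    for results in tests:
        for browser, place in place_finishers(results).items():
            counts.setdefault(browser, Counter())[place] += 1
    return counts

# Hypothetical results; the second test's Opera run was thrown out.
tests = [
    {"Chrome": 100, "Opera": 98, "Firefox": 90, "Safari": 90, "IE8": 60},
    {"Chrome": 50, "Firefox": 40, "Safari": 30, "IE8": 20},
]
print(tally(tests))
```

In the first hypothetical test, Firefox and Safari tie for 3rd and IE8 takes 4th, so nobody finishes 5th; Opera ends up with one fewer score than the rest because its second run was discarded.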
@cadder RE: Security Tests - We're looking into it, got any good suggestions?