It's unclear where your frame rate numbers come from. The game engine has at least two different ways to give players frame rate information, and they often contradict each other. When that happens, /fpsgraph is right and /showfps is wrong. It really is about that simple.
Going from one area to another or tweaking settings can often make a huge, qualitative difference in frame rates. /fpsgraph picks this up quite well, and the data it outputs corresponds closely with the intuitive feel of how good the frame rate seems. Sometimes /showfps picks up the changes and sometimes it doesn't. That makes /fpsgraph by far the superior tool for a player wanting to monitor his frame rates as he tweaks settings, while /showfps is nearly worthless for the purpose. I'm not sure whether /showfps is simply bugged, or whether it's tracking something that doesn't correlate well with how the frame rate "feels".
The trouble with /fpsgraph, from a benchmarking perspective, is that it doesn't give numerical output. It draws bars showing how long each frame took to render, and they slide by and off the graph within seconds. One could probably get meaningful numbers out of /fpsgraph by taking lots of screenshots and writing a program to count the blue and purple pixels in the bottom right corner of the screen, but that would be a major pain, and the article gives no indication of having done it.
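For what it's worth, that screenshot-scraping idea might look something like the sketch below. It's purely illustrative: the crop region, the colour thresholds, and the screenshots/ folder are all assumptions, since the actual screen position and bar colours of /fpsgraph would have to be sampled from a real capture first.

```python
# Hypothetical sketch: count "blue" and "purple" pixels in the bottom-right
# corner of each screenshot. The region and colour thresholds are guesses and
# would need to be calibrated against a real /fpsgraph screenshot.
import glob
from PIL import Image

def count_graph_pixels(path, region=(1600, 980, 1920, 1080)):
    """Return (blue, purple) pixel counts inside the assumed graph region."""
    img = Image.open(path).convert("RGB").crop(region)
    blue = purple = 0
    for r, g, b in img.getdata():
        if b > 150 and g < 100:      # strongly blue channel, weak green
            if r < 100:
                blue += 1            # low red -> call it "blue"
            else:
                purple += 1          # high red + high blue -> call it "purple"
    return blue, purple

for shot in sorted(glob.glob("screenshots/*.png")):
    print(shot, count_graph_pixels(shot))
```

Run over a few hundred screenshots, the counts could then be turned into rough frame-time estimates, but that's a lot of effort to recover numbers the game could just report directly.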
It's more likely that the article used /showfps, in which case the numbers presented are flatly wrong and indicative of nothing but the idiosyncrasies of a peculiar command. That there's virtually no difference reported between the average and minimum frame rates also points toward /showfps, since that command tends not to pick up frame rate dips even when they're glaringly obvious to a player.
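To illustrate why that discrepancy matters: with genuine per-frame data (which is what /fpsgraph is drawing), average and minimum frame rates should diverge noticeably whenever the scene stutters. The frame times below are invented purely for illustration.

```python
# Invented frame-time sample: mostly 60 fps with a handful of 15 fps hitches.
frame_times_ms = [16.7] * 95 + [66.7] * 5

average_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
minimum_fps = min(1000.0 / t for t in frame_times_ms)

print(f"average: {average_fps:.1f} fps, minimum: {minimum_fps:.1f} fps")
# Prints roughly "average: 52.1 fps, minimum: 15.0 fps" -- a tool that reports
# these two numbers as nearly identical probably isn't sampling per-frame data.
```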
-----
Also, trying to pin down "maximum" settings can be awkward. There is a main video settings slider that adjusts a lot of video settings at once, but it doesn't affect certain other settings; anti-aliasing, for example, is untouched by the slider. Presumably the author of the article did put that slider at its rightmost position, which the game describes as "maximum", but that leaves it unclear what the other settings were.
But turning everything to maximum is just a silly thing to do, no matter what hardware you have. The difference in image quality between maximum and one notch to the left (which shows for me as "recommended", though the recommendation may be tailored to a player's hardware) is basically zilch, while the difference in frame rate is considerable. Moving two or three notches to the left rather than just one does produce glaringly obvious differences in image quality (compared to each other, or to recommended/maximum), so the slider does work.