So you're basing your 'good' and 'bad' framerates on two single runs? The numbers you quoted amount to a 'drop' of 8% on the minimum (3 fps - hardly worth crying over), 6% on the average (a 5 fps drop and you think there's a problem?) and 7% on the maximum (a 10 fps difference - where's the problem?). That's not a catastrophic framerate reduction in my mind, and those framerates are still pretty good for FEAR with everything dialed up all the way. I don't see the problem. If you run something like FRAPS you'll see that your framerate varies dynamically with the environment being rendered: large outdoor scenes will naturally run slower than indoor scenes with less to draw. The variation you're describing sounds like the normal statistical spread you get from running the same graphics test only twice. Run the same test 12 times and see how the numbers stack up; you'll find your two results sit well within the average range, and are nothing to be ashamed of, I might add.
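If you want to see that for yourself, here's a quick throwaway sketch (Python, with made-up numbers - substitute your own FRAPS readings) of what a dozen runs of the same benchmark typically look like:

```python
from statistics import mean, stdev

# Hypothetical average-FPS results from 12 runs of the same FEAR benchmark.
# These numbers are made up for illustration -- plug in your own readings.
runs_avg_fps = [83, 81, 85, 79, 84, 82, 80, 86, 83, 78, 84, 82]

avg = mean(runs_avg_fps)
spread = stdev(runs_avg_fps)

print(f"average of the 12 runs: {avg:.1f} fps")
print(f"run-to-run std dev:     {spread:.1f} fps")
print(f"a 5 fps difference is {5 / avg * 100:.1f}% of that average")
```

With a spread like that, a 5 fps swing between any two runs is just noise, not a fault.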
Start adding physics and all of the other stuff in modern games and video cards run into trouble in a hurry.
No one seems to stop and think that even the grandest video card on the planet will not run games like FEAR or Oblivion at 2048x1024 with HDR, 4xAA and 8xAF at playable framerates. Right now, game vendors have pushed their settings beyond what even the fastest hardware can handle completely maxed out.
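To put a rough number on it, here's a back-of-the-envelope sketch (Python; it assumes per-frame work scales roughly with pixels times AA samples, which is a big simplification) of how fast the workload balloons when you max everything:

```python
# Rough proxy: per-frame pixel work ~ (pixels rendered) x (AA samples).
# This ignores shaders, HDR, AF and everything else, so treat it as a
# lower bound on how much harder the maxed-out case is.
def relative_load(width, height, aa_samples):
    return width * height * aa_samples

modest = relative_load(1024, 768, 1)    # a typical 'sane' setting
maxed  = relative_load(2048, 1024, 4)   # the maxed-out example above

print(f"maxed-out settings push roughly {maxed / modest:.0f}x the pixel work")
```

That's roughly 11x the pixel work before HDR, AF or physics even enter the picture.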
You'll find with most PC games that the extra 'HIGHEST' settings for things like AA and AF, or even resolution increases, don't make a tangible difference in how good the game looks. It looks its best somewhere in the 'medium' to 'high' range of settings, and going the extra mile to 'highest' on everything only hurts performance with little or no improvement in image quality. There has always been a tradeoff between image quality and framerate, and STILL people think they can have EVERYTHING with BOTH - it simply doesn't work that way.
As for consoles - that's easily explained: they have custom dedicated hardware that is still less powerful than most modern graphics cards, but the key is that it's DEDICATED. All the game developers write for that one fixed set of hardware, so their code paths can be optimized for the specific console, which isn't possible with the myriad of GPUs out there. Additionally, most console titles' graphics are dumbed down to live within the limitations of the console hardware.

The X360 is capable of 'high-def' gaming - ok, so what is that? For most titles it means 720p, i.e. roughly 1280x720 on a TV screen - hardly high resolution, and if you turned your games down to that resolution on your PC you would probably find you COULD turn everything up and get good framerates. Additionally, if you compared two EXACT scenes from the same game on a PC and a console, you could probably tell which is which. The difference is in the tiny details that are added for PCs and deleted for consoles, and in actual gameplay you probably don't even notice that the graphics are 'dumbed down', because the detail added in PC games doesn't make that much difference - again, the difference between 'high' and 'highest'.
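And just to line up the raw pixel counts (Python again; this assumes 720p as the typical console render target, and the PC resolutions are just common examples):

```python
# Raw pixel counts, to put 'high definition' in perspective.
base_w, base_h = 1280, 720            # 720p, the usual X360 render target
displays = {
    "X360 at 720p": (1280, 720),
    "PC 1280x1024": (1280, 1024),
    "PC 1600x1200": (1600, 1200),
    "PC 2048x1024": (2048, 1024),
}

base_pixels = base_w * base_h
for name, (w, h) in displays.items():
    print(f"{name:14s} {w * h:>9,} pixels ({w * h / base_pixels:.1f}x the console)")
```

Your PC is being asked to push two or more times the pixels the console ever renders, which is exactly why the console can look 'maxed out' at settings a PC gamer would call low.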
So be happy with your framerate, or turn down the resolution/settings if you can't be, but it sounds to me like your PC is working flawlessly.