anon1239:
Hmm... so basically Minecraft creates a bottleneck at my CPU, because it relies on single-core performance, yet my CPU doesn't reach its full potential because the other cores simply can't be used?
Yeah, pretty much.
In single-core games, 5 of your cores will sit mostly idle (just helping a little by running Windows and such). In dual-core games, 4 cores will sit idle, and so on. AMD CPUs suffer really badly from this in games that use few cores, since they have much weaker single-core performance, but in rare cases it can become evident on Intel CPUs as well. Since you're getting ~1000 fps anyway in a flat world, it doesn't really matter. But that's the general idea of what might be happening if other people with similar or weaker rigs are getting better performance. Ofc I have no idea whether the people who get ~4000 fps were exaggerating.
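To put a rough number on the idle-cores point above, here's a tiny sketch (function name `max_utilization` is just illustrative, not anything from Minecraft) of how much of a CPU's total capacity a game can occupy when it only runs on some of the cores:

```python
# Illustrative only: upper bound on overall CPU utilization when a game
# uses fewer threads than the CPU has cores.
def max_utilization(cores_used: int, cores_total: int) -> float:
    """Fraction of total CPU capacity the game can possibly occupy."""
    return min(cores_used, cores_total) / cores_total

# A single-threaded game on a 6-core CPU tops out around 1/6 of capacity:
print(round(max_utilization(1, 6) * 100))  # -> 17 (percent)
```

So even when the game is fully CPU-bound, a utilization graph would show the CPU "only" ~17% busy, which is why this bottleneck is easy to miss.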
It's also worth noting that HDD speed can cause sudden temporary framerate drops in Minecraft as new areas load. My i3-4360 and GTX 750 Ti average 300 fps in Minecraft (normal world), but my HDD is quite ancient and slow, so when a new chunk loads it's common for the framerate to fall to ~150 fps for a second. Not that it matters.
These limitations are partly due to the fact that Minecraft's core mechanics were programmed by one guy with a small budget and little time (he expanded the team later, but only after the base game was already working), and partly because Java doesn't handle large games too well.
Finally, the "chunks" (render distance) setting is a relatively recent addition to Minecraft. It's been in for several updates now, but there was at least a year or two when it didn't exist. There's always a chance that people getting higher framerates were playing before it was added, as the default of 8 is much, much easier to run than 16.
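The gap between 8 and 16 is bigger than it looks, because the loaded area grows with the square of the render distance. A quick back-of-the-envelope sketch (assuming a square loaded region of (2r+1) chunks per side, which is roughly how it works, though the exact shape varies by version):

```python
def loaded_chunks(render_distance: int) -> int:
    # Rough model: a square of chunks centered on the player,
    # (2r + 1) chunks on a side. The exact shape varies by version.
    side = 2 * render_distance + 1
    return side * side

print(loaded_chunks(8))   # 289 chunks
print(loaded_chunks(16))  # 1089 chunks -- nearly 4x as many to load and render
```

So doubling the setting roughly quadruples the work, which is why the default of 8 is so much lighter.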
If I were in your position, I would try some other games or some artificial benchmarks to make sure your PC is performing as well as it should. Testing just one game doesn't tell you much.