MINECRAFT... 200fps(older Java version)
MINECRAFT... 1000+fps(updated Windows version)
I have an Intel i-9 9900k and an RTX 2080 TI.
DUDE... SIR/MADAM...
You're still getting over 120fps in the game, so why does this matter? Besides, 1000+fps on a monitor without G-Sync or FreeSync means a crap-ton of screen tearing, so you're probably using an adaptive sync monitor anyway.
It would be pretty crazy not to with that kind of hardware...
Do you know what happens on an adaptive sync monitor when your PC renders frames faster than the monitor can display them?
THEY ARE DISCARDED.
Assuming you have a 144Hz adaptive sync monitor:
-that's 56 frames per second thrown away on Java Minecraft (200 − 144)
-that's 856+ frames per second thrown away on Windows 10 Minecraft (1000+ − 144)
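The arithmetic above is just FPS minus refresh rate; here's a minimal sketch of it, assuming the 144Hz monitor and the FPS figures quoted from the original post:

```python
REFRESH_HZ = 144  # assumed 144Hz adaptive sync monitor

def wasted_frames(fps: int, refresh_hz: int = REFRESH_HZ) -> int:
    """Frames per second rendered that the monitor can never display."""
    return max(fps - refresh_hz, 0)

print(wasted_frames(200))   # Java version at 200fps -> 56 frames/s wasted
print(wasted_frames(1000))  # Windows 10 version at 1000fps -> 856 frames/s wasted
```

Anything at or below the refresh rate wastes nothing, which is why capping FPS near 144 costs you nothing visible on that monitor.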
Those extra numbers are superficial, and only serve e-peen purposes.
On a non-adaptive-sync monitor, the GPU will push all those extra frames to the screen anyway, and you'll get severe screen tearing.
The Windows 10 version is using an updated, more efficient graphics engine, which is exactly why your numbers there are so much higher than what you got in the Java version.