I'm currently running some Time Spy Extreme tests and I see a 56.4 ms average simulation time per frame, but I can't find any simple explanation on Google. Is 56.4 ms a good time, and how does this affect my scores? This is on an RTX 3090 from Micron.
I'm answering from the title only, as I have no experience with that particular piece of software.
The "56.4 ms" figure is simply a measurement of how long it takes to process one frame. It's the reciprocal* of FPS. Say you want to encode a video with the x265 (HEVC) encoder: the processing time will vary widely from frame to frame, because it depends on the complexity of the video stream, the encoder parameters, and the type of frame. That's why it makes more sense to report an average.
* Most software uses FPS (average) as a measure of progress, so if you want to convert, it goes: FPS = 1 / t = 1 / 0.0564 s ≈ 17.7 FPS
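If it helps, here's a minimal sketch of that conversion in Python (the function name `frame_time_to_fps` is just mine, nothing standard):

```python
def frame_time_to_fps(frame_time_ms: float) -> float:
    """Convert an average frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

# Your 56.4 ms average simulation time works out to roughly:
print(round(frame_time_to_fps(56.4), 1))  # 17.7
```

The same formula works in reverse: a 60 FPS target means each frame must finish in 1000 / 60 ≈ 16.7 ms.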