We call them fake frames because they are fake frames.
There are no special quantum blockchain AI crystal chemtrails happening here. It's frame interpolation, pure and simple: render one frame, store it, render a second frame, store that, then interpolate 1~3 intermediate frames (essentially motion blur), push the whole thing to the output buffer, and reuse the last frame as the first frame of the next set.
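The loop above can be sketched in a few lines. This is a deliberately naive illustration with hypothetical helper names: real frame generation uses motion vectors and optical flow, not plain pixel blends, but the store/interpolate/present/reuse structure is the same.

```python
def blend(frame_a, frame_b, t):
    """Linearly blend two frames; each frame is a flat list of pixel values."""
    return [a + (b - a) * t for a, b in zip(frame_a, frame_b)]

def interpolate(frame_a, frame_b, n_intermediate):
    """Produce n_intermediate frames evenly spaced between frame_a and frame_b."""
    step = 1.0 / (n_intermediate + 1)
    return [blend(frame_a, frame_b, step * (i + 1)) for i in range(n_intermediate)]

def present_sequence(rendered_frames, n_intermediate=1):
    """Interleave rendered frames with interpolated ones. The last rendered
    frame of each pair becomes the first frame of the next pair."""
    output = [rendered_frames[0]]
    for prev, curr in zip(rendered_frames, rendered_frames[1:]):
        output.extend(interpolate(prev, curr, n_intermediate))
        output.append(curr)
    return output

# Three "rendered" frames (two pixels each) yield five presented "frames".
frames = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]
out = present_sequence(frames, n_intermediate=1)
```

Note that only three of the five presented frames were ever actually rendered; the other two are guesses blended from their neighbors.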
The only reason each of these is even recognized as a "frame" is that the driver makes a present call for it, and that present call is what benchmarking programs count as a unit of "FPS". It would be like arguing that upscaling and native rendering should be treated as "equal"; they are not.
Now, if we want to discuss these features as QoL enhancements for performance-impaired situations, then great. Interpolation can help make a low-FPS situation appear more "smooth". Upscaling can help a low-FPS situation feel "better" by rendering at a lower internal resolution. Combining them means rendering at a lower resolution and interpolating the results to produce a smoother experience than would otherwise be possible. Discussing these technologies in the context of a Titan or high-end 80 model is kind of silly; they mainly help the 50, 60, and 70 models punch above their weight class.
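The arithmetic behind that combo is simple enough to write down. The numbers below are illustrative placeholders, not benchmarks of any real card; the point is only that upscaling raises the rendered rate while interpolation multiplies the presented rate.

```python
def presented_fps(rendered_fps, interpolated_per_pair):
    """Each rendered frame interval gains interpolated_per_pair extra frames,
    so the presented rate is rendered_fps * (1 + interpolated_per_pair)."""
    return rendered_fps * (1 + interpolated_per_pair)

native_fps = 30        # hypothetical: struggling at native resolution
upscaled_fps = 45      # hypothetical: gain from rendering at lower resolution

# Upscale first, then interpolate one frame per pair: the counter reads
# double the upscaled rate, but only upscaled_fps frames were rendered.
combined = presented_fps(upscaled_fps, 1)
```

With the 1~3 intermediate frames mentioned earlier, the counter can read anywhere from 2x to 4x the rendered rate, which is exactly why treating that number as equal to native FPS is misleading.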