Shared GPU memory is basically a page file for VRAM, except it lives in system RAM rather than on a storage drive. When the video card runs out of dedicated VRAM, allocations start spilling over into system RAM that is charged to the application itself. So if, say, RDR2 needs 8GB of video memory on a card with only 4GB of VRAM, the overflow lands in RAM and the game's apparent memory usage grows by 4GB.
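To make the spillover arithmetic above concrete, here is a minimal illustrative sketch. The function name and the GB figures are hypothetical, not a real Windows API; it just models "whatever doesn't fit in dedicated VRAM shows up as shared system RAM usage":

```python
def split_memory(demand_gb: float, dedicated_vram_gb: float) -> tuple[float, float]:
    """Return (vram_used_gb, shared_ram_used_gb) for a given allocation demand.

    Toy model of WDDM spillover: allocations fill dedicated VRAM first,
    and anything beyond that is backed by shared system memory.
    """
    vram_used = min(demand_gb, dedicated_vram_gb)
    shared_ram_used = max(0.0, demand_gb - dedicated_vram_gb)
    return vram_used, shared_ram_used

# A game asking for 8 GB on a 4 GB card: 4 GB lands in VRAM,
# the other 4 GB shows up as extra system RAM usage.
print(split_memory(8.0, 4.0))  # (4.0, 4.0)
```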
Applications don't see this directly because Windows doesn't report it to them; the memory manager handles the spillover transparently and the allocations simply appear to succeed. It's not something you should be designing games around.