How useful is the extra VRAM, though, given the bandwidth constraint? At 384 GB/s it would take about 0.03s to transfer 12GB -- roughly 32 full passes per second, which isn't anywhere near enough for an acceptable frame rate if the GPU had to touch all of it every frame. I suppose it could act as a buffer on lower-end systems incapable of streaming textures on demand due to SATA drives and/or limited system RAM.
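The back-of-the-envelope math above can be sketched out like this (using the figures quoted in the post, not measured values):

```python
# Transfer time for a full VRAM-sized read at peak memory bandwidth.
vram_gb = 12          # GB to move (full VRAM capacity)
bandwidth_gbps = 384  # GB/s quoted peak memory bandwidth

transfer_s = vram_gb / bandwidth_gbps      # time for one full pass
full_passes_per_s = 1 / transfer_s         # upper bound on full refreshes

print(f"~{transfer_s:.3f} s per full 12GB transfer")  # ~0.031 s
print(f"~{full_passes_per_s:.0f} full passes/s")      # ~32
```

Of course a real frame only touches a fraction of VRAM, so this is a worst-case bound, not a frame-rate prediction.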
It's more about the capacity and all the various parts of the GPU that can ask for data. Some of it will be satisfied by the caches, and what isn't goes to the VRAM -- and if it's not in VRAM, the data has to come over PCIe from system RAM. What's the real-world throughput of the VRAM? I doubt it's actually 384 GB/s -- that's the best-case burst speed. The bigger problem if you don't have enough VRAM is that anything that has to come over the PCIe bus gets a fraction of the throughput (16 GB/s peak for PCIe 3.0 x16 -- technically 15.75 GB/s).
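For a rough sense of the gap, here's where the 15.75 GB/s figure comes from and how it compares to the quoted VRAM bandwidth. PCIe 3.0 signals at 8 GT/s per lane with 128b/130b line encoding, so a x16 link peaks at about 15.75 GB/s per direction:

```python
# PCIe 3.0 x16 peak throughput vs. quoted VRAM bandwidth.
lanes = 16
gt_per_s = 8.0           # giga-transfers/s per lane (1 bit per transfer)
encoding = 128 / 130     # 128b/130b encoding overhead

pcie_gbps = lanes * gt_per_s * encoding / 8  # bits -> bytes
vram_gbps = 384

print(f"PCIe 3.0 x16: {pcie_gbps:.2f} GB/s")              # 15.75
print(f"VRAM bandwidth is ~{vram_gbps / pcie_gbps:.0f}x")  # ~24x
```

So even at its theoretical peak, the PCIe link delivers roughly 1/24th of the card's local memory bandwidth, which is why spilling out of VRAM hurts so much.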
Anyway, if a game routinely uses textures and other data totaling something like 10GB, the 12GB card will probably end up 30-50 percent faster than the same GPU with 6GB of VRAM. Give or take. In extreme cases, the gap can be even larger.
A good example of what happens when you don't have enough VRAM is to look at the 4GB and 8GB RX 5500 XT. At 1080p and medium settings, the two GPUs are effectively tied (the 8GB card is less than 2% faster overall). At 1440p medium, the 8GB card is 4% faster, and 9% faster at 4K medium. Bump up to ultra settings, however, and even at 1080p the 8GB card has a 12% advantage -- again, overall, across nine games. At 1440p ultra, it's a 17% lead, and at 4K ultra it's 26% faster.
Looking at the individual results, Borderlands 3 and The Division 2 both show a 30-50% advantage for the 8GB card at 1440p/4K ultra. And of course, neither GPU is really running at high performance at 1440p/4K ultra -- 40 fps for the 8GB at 1440p, vs. 27-30 fps on the 4GB; and 19-22 fps at 4K for 8GB vs. 12-14 fps on 4GB. But even a lighter game like Forza Horizon 4 shows a benefit of around 30% at 1080p/1440p/4K ultra, and the same goes for Far Cry 5.
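As a quick sanity check on those percentages, the 30-50% figure falls straight out of the fps pairs quoted above:

```python
# Relative lead of the 8GB card over the 4GB card, from the quoted fps.
def advantage(fps_8gb, fps_4gb):
    """Percent advantage of the 8GB card."""
    return (fps_8gb / fps_4gb - 1) * 100

# 1440p ultra: 40 fps on the 8GB card vs. 27-30 fps on the 4GB card
low = advantage(40, 30)
high = advantage(40, 27)
print(f"{low:.0f}% to {high:.0f}% lead")  # 33% to 48%
```

That 33-48% spread matches the ~30-50% advantage seen in Borderlands 3 and The Division 2.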
So best-case, with double the VRAM your performance could potentially improve by 50 percent, but overall a 20-30 percent increase is more likely. That's only if a game can use nearly all of the extra VRAM, because caching and other techniques are designed to keep a GPU from choking when an app wants more VRAM than is available. Worst-case, the extra VRAM shouldn't be any slower (1-2 percent margin-of-error differences). Here's all the 5500 XT data from my testing: