This is one of those conspiracy theories that I've never seen 'proven' to any reasonable degree. I just did a bunch of retesting for the
updated GPU hierarchy. The RX 580 8GB is about 7% faster than a GTX 1060 6GB (though that's using a factory OC 580 against a reference 1060 6GB). Digging back into my old articles, I find this:
https://www.pcgamer.com/radeon-rx-580-review/
In that review, the 580 and 1060 were tied at 1080p ultra, the 580 was 1% faster at 1080p medium, they were tied at 1440p ultra, and the 580 was 2% faster at 4K ultra. Both cards were factory overclocked there, and a factory OC is usually worth about 5%, so knock that off today's 7% gap (OC 580 vs. reference 1060) and you're left with roughly a 2% advantage. In other words, three years later, the relative standing of the two GPUs still appears to be within about 2%.
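If you want to sanity-check that adjustment, here's the napkin math as a quick Python sketch. The ~5% OC uplift is an assumption (a typical figure, not something measured for these specific cards), and the "tied in 2017" baseline is the rounded result from the review above:

```python
# Back-of-the-envelope check: how much has the 580 vs. 1060 standing moved?
# The ~5% factory OC uplift is an assumed typical figure, not a measurement.

oc_uplift = 1.05   # assumed benefit of a factory OC card over a reference card
gap_today = 1.07   # OC RX 580 8GB vs. reference GTX 1060 6GB, latest retest
gap_2017 = 1.00    # both cards factory OC'd, roughly tied in the 2017 review

# Strip the OC mismatch out of today's number for an apples-to-apples ratio.
gap_today_adjusted = gap_today / oc_uplift

drift = gap_today_adjusted / gap_2017 - 1
print(f"Adjusted gap today: {gap_today_adjusted - 1:+.1%}")   # ~+1.9%
print(f"Relative standing shift since 2017: {drift:+.1%}")    # ~+1.9%
```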
I can point to things that have gone the other way as well. Take the 570 4GB vs. the 1060 3GB, from the RX 590 review: the GTX 1060 3GB was 11% faster at 1080p medium, 3% faster at 1080p ultra, 2% slower at 1440p ultra, and 15% slower at 4K ultra. In my latest data (albeit with a different set of games), the 1060 3GB is 10% faster overall, including the 4K results. Or if you want the breakdown: 9% faster at 1080p medium, 8% faster at 1080p ultra, 9% faster at 1440p ultra, and 10% faster at 4K ultra. So the AMD GPU got comparatively worse at ultra settings, despite having 33% more VRAM.
Overall, though, nearly all of the data I've accumulated suggests that the long-term change in performance for both companies' GPUs, after driver and game updates, is less than 5% -- and probably less than 3%. Individual cases vary, but if you look at a larger swath of games and settings, things are pretty consistent.
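For what it's worth, when I say "overall" I mean an average across a whole suite of games, not one cherry-picked result. One common way to aggregate that kind of data is a geometric mean of per-game ratios, so no single outlier dominates. Here's a minimal sketch of the idea -- the games and fps numbers below are made up purely for illustration:

```python
from math import prod

# Hypothetical per-game average fps for two cards at one setting.
# Games and numbers are made up purely to show the aggregation.
card_a = {"Game 1": 62.0, "Game 2": 48.5, "Game 3": 71.2, "Game 4": 55.0}
card_b = {"Game 1": 58.0, "Game 2": 51.0, "Game 3": 66.0, "Game 4": 50.5}

# Geometric mean of per-game ratios: each game contributes equally,
# regardless of whether it runs at 40 fps or 140 fps.
ratios = [card_a[g] / card_b[g] for g in card_a]
geomean = prod(ratios) ** (1 / len(ratios))

print(f"Card A is {geomean - 1:+.1%} vs. Card B overall")
```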