bloodroses :
Being first doesn't necessarily mean being better, as the software needs to be there as well to take advantage of it. AMD takes risks to try to stay competitive; Intel sticks with the standard. The 64-bit CPU is a perfect example: while AMD may have hit 64-bit x86 first, 64-bit Windows didn't become viable until Intel joined the game as well. Anyone who ran the 64-bit version of Windows XP on their new AMD chip knows this very well.
Regardless, if it weren't for AMD, innovation would stagnate, as Intel wouldn't have actual competition in the x86 space.
Except Intel didn't make it viable, AMD did. Intel tried to force their own 64-bit architecture, Itanium, a.k.a. IA-64, or did you forget that? IA-64 was an utter mess, a complete failure worse than AMD's Bulldozer. It barely lasted a full iteration before it was gone; so much so that Intel ended up licensing AMD64 and shipping it under its own branding (EM64T, later Intel 64), which is the x86-64 we have today in all modern CPUs. I also remember that Far Cry was the first game to actually take full advantage of AMD64 for a performance gain; I remember patching that game on my Athlon 64.

I'm no fanboy of either of these companies. I have a Ryzen and an i7, an AMD GPU and an Nvidia GPU. I go where the value in performance is for the task at hand: Intel for gaming, AMD for workloads; Nvidia for gaming, AMD for stability. My 1080 Ti has had many hiccups along the way; AMD has had far fewer, it just isn't as fast. Oh well, stable is more desirable in my eyes than fast.