It depends on the games used, the relative age of the driver sets, and how well the game engine was optimized (some heavily favored AMD, some heavily favored Nvidia, and most were fairly neutral).
I mean, back when most of the 780 Ti vs R9 290 reviews were relevant, AMD had some real driver issues, most of which were fixed in the later Crimson drivers. There were also games basically tailor-made for AMD cards, so much so that even a 270X topped the 780 Ti in average FPS and an R9 290 could beat out a Titan, and other games where a GTX 760 tied or beat the R9 290X.
And then there was benchmark selection. Invariably, the benchmarks revolved around the same 2-3 standard games like BF3, Crysis, and Tomb Raider, plus maybe 1-3 oddball titles that I never knew anyone to actually play but that reviewers were fascinated with for their GPU-killing code or their simplicity.
So what really needed to happen was a much broader scope: games people actually played, which would total somewhere around 50 or so, across the range of popular GPUs from the GTX 660 through the 780 Ti and the HD 7750 through the R9 290X, at 1080p low, medium, and ultra, and maybe 1440p low, medium, and ultra on top of that. That would have been a massive undertaking, probably close to a year's worth of work, and by the time it was done we'd already be into the 300 and 900 series cards, so it's kind of a moot point. But that kind of averaging is the only fair test. As I said earlier, benchmarks on just a few games don't take everything else into consideration; there are too many other variables.
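For what it's worth, the averaging itself is the easy part; the thousands of benchmark runs behind it are the real work. Here's a minimal Python sketch, with made-up FPS numbers purely for illustration (not real benchmark data), of how per-game ratios plus a geometric mean keep a single vendor-favored title from skewing the overall comparison:

```python
from statistics import geometric_mean

# Hypothetical avg-FPS results keyed as {game: {gpu: fps}}.
# The numbers below are invented just to show the aggregation.
results = {
    "Game A": {"780 Ti": 95.0, "R9 290": 88.0},   # fairly neutral title
    "Game B": {"780 Ti": 62.0, "R9 290": 71.0},   # AMD-favoring title
    "Game C": {"780 Ti": 120.0, "R9 290": 64.0},  # Nvidia-favoring title
}

def relative_performance(results, baseline_gpu, target_gpu):
    """Geometric mean of per-game FPS ratios (target / baseline).

    Taking a ratio per game first, then a geometric mean across games,
    stops one tailor-made outlier from dominating the average the way
    a plain mean of raw FPS numbers would.
    """
    ratios = [
        scores[target_gpu] / scores[baseline_gpu]
        for scores in results.values()
        if baseline_gpu in scores and target_gpu in scores
    ]
    return geometric_mean(ratios)

print(f"R9 290 vs 780 Ti: {relative_performance(results, '780 Ti', 'R9 290'):.2f}x")
```

With only three games the number barely means anything; the whole point is that you'd want something like 50 titles feeding into that average before trusting it.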