Those are two different sites coming to the same conclusion, so I wouldn't say it's pointless. It is a result, and even though we only have one result for the R9 285/380, the overall trend of GCN cards under DX12 is already evident, with the Fury cards being the exception, probably due to driver issues. The Fury X performing better with an i3 than with an i7 supports that driver-issue explanation. Until we see counter-evidence against AMD being superior under DX12, I see nothing wrong with my points, especially since the on-paper strengths of the architecture are finally being reflected in real-world performance after a long time. If the cards are equal under DX11 and one side gets an evident performance boost under DX12, is it not warranted to recommend the one that gets the (bigger) boost?
Remember that nVidia is the one with the money to put more resources into driver development and optimization. Ironically, people are currently telling us to wait for nVidia to release better DX12 drivers, claiming they will surely catch up to AMD. That implies AMD currently has better DX12 drivers than nVidia, which I find very improbable, to put it lightly. We all know AMD has struggled with DX11 drivers even to this day, despite years of optimization efforts. And both companies have had access to DX12 for roughly the same amount of time, so it's logical to conclude that driver issues are more likely on AMD's end (as with their Fury cards) than on nVidia's, not only because of nVidia's track record here, but also because of the resources each can put into driver development. So the results must come from something else. I'd say GCN's efficiency can only now be reached (for whatever reason), while nVidia's cards were already near their maximum performance thanks to their good DX11 drivers and the serial nature of their architecture.
If I personally had the choice, I would pick any GCN card over its competitor in the same price range, except maybe against the 980 Ti and the GTX 950. Ever since GCN was released, it has proven to be a long-term architecture. The HD 7970 is basically a 280X today, and it's still competitive. What has happened to, say, the GTX 780 Ti? It gets outclassed by a (comparatively) miserable GTX 960 in certain cases. nVidia itself shortens the life of its older cards with GameWorks. Their cards improve very fast, but only for the 'right now'. GCN, on the other hand, is being used in consoles that are expected to last at least four more years, despite being 'old' and having a reputation for being power hungry. If nVidia really had the superior architecture, I think Microsoft, Sony and Nintendo would have preferred nVidia. They clearly see something in GCN that has not been fully utilized yet.
If you think we should wait before jumping to conclusions, I have no problem with that. But don't expect this trend to change until nVidia ships a completely new architecture, which will probably be the successor to Pascal.