AMD more consistently resorts to price-cutting toward the end of a product's life cycle, so I don't think we've seen the full price/performance benefit of Navi that AMD is prepared to offer.
I'd wager their pricing was more dictated by 7 nm supply constraints than anything else.
I suspect that too. They probably don't want to cut into their CPU production, hence the relatively mediocre price/performance of these cards at launch. I wouldn't be surprised to see them significantly discounted by the end of the year in response to Nvidia's next generation of hardware, though. They were mainly just brought up as an example of big performance gains in hardware not necessarily translating to big gains in value.
For the 1080 and 1080 Ti, moving to Turing saw up to a 50% improvement. On average it was about half that, roughly 25%, and a bit more at 4K, maybe 30-35%.
Yep, though some of that is down to the graphics hardware being limited by CPU performance in many games, which is why 4K shows greater improvements. And some of the other parts saw somewhat larger relative performance gains compared to their similarly-named (but not similarly-priced) predecessors. The report that rumor comes from was also suggesting "up to 50% more performance", not "50% on average". So it sounds like a similar scenario, though perhaps without shifting model names around further to make it happen. Maybe performance will be around 50% better in some games, but the average will probably be lower than that.
And it's unknown how much or how little that 75% performance gain at certain compute tasks in a supercomputer might translate to gaming performance. We can't even be sure the Tesla GPUs they'll be using will have equivalents in the consumer market, and the compute gains mentioned there may come primarily from additional Tensor cores or something similar.

With Tensor cores tied to RT performance in the 20-series cards, doubling them, for example, might translate to a big improvement in raytracing performance. If the impact of enabling RT effects were cut in half, that could make them a lot more usable. It's very possible that "up to 50% more performance" refers to performance in certain games with RT enabled, in which case the RT performance gains might account for a good chunk of that uplift. If a game gets 100fps with RT disabled and only 50fps with RT enabled on a current card, but its successor can push 75fps with RT enabled, that's a 50% performance uplift right there, even if the gains in non-RT games are minimal. I do suspect RT effects will become the norm in the coming years though, so that's likely to be very relevant down the line.
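To make the arithmetic in that example concrete (all fps numbers are the hypothetical ones above, not measured figures):

```python
def uplift_pct(old_fps: float, new_fps: float) -> float:
    """Percent performance uplift going from old_fps to new_fps."""
    return (new_fps / old_fps - 1) * 100

# Current card: 100 fps with RT off, 50 fps with RT on (RT costs 50%).
# Hypothetical successor: 75 fps with RT on, same ~100 fps with RT off.
print(uplift_pct(50, 75))    # 50.0 -> "up to 50% more performance" in RT games
print(uplift_pct(100, 100))  # 0.0  -> while non-RT gains could be minimal

# The RT penalty on the successor: 100 -> 75 fps is a 25% hit,
# half the 50% hit on the current card.
print((1 - 75 / 100) * 100)  # 25.0
```

So a headline "up to 50%" figure could, in principle, be driven almost entirely by cheaper RT rather than raw rasterization gains.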
In any case, it sounds like Nvidia will be announcing their next GPU architecture soon, so more details should be available before long. I suspect marketing will cloud how the cards actually perform until they come out and can be tested, though, perhaps half a year or more from now.