photonboy :
"Here's hoping we'll soon see a single card that's capable of pushing those new Asus and Acer 4K 144Hz monitors to their limits."
Riiigghht...
Before I even mention GPUs, you first have a CPU bottleneck before a solid 4K/144 FPS happens. It's going to be YEARS before game code gets multi-threaded enough to see that on a regular basis.
Historically, gains between GPU generations are not huge.
Leaving the potential of CPU bottlenecks aside, AMD's Computex presentation (https://www.tomshardware.com/news/amd-7nm-gpu-vega-gaming,37228.html) included a claim that Vega 20 is 1.35x as fast as Vega 10 (which we know as Vega 64, in its full configuration). I'm not sure they said 1.35x as fast at what, but if we take it as an across-the-board figure, that would only put it in range of the GTX 1080 Ti - Vega 64 roughly trades blows with the GTX 1080, and the 1080 Ti is typically around 30% faster than that. So, no ground-breaking performance gains, but 35% faster also isn't trivial.
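As a back-of-envelope check (the relative-performance indices here are rough ballpark assumptions on my part, not figures from AMD or the article): normalize Vega 64 to 1.00 and take the GTX 1080 Ti as roughly 1.30x of that at 4K, then scale by AMD's claimed 1.35x.

```
// Back-of-envelope arithmetic only (plain host code, nothing GPU-specific).
// The index values are ballpark assumptions, not measured data:
// Vega 64 normalized to 1.00, GTX 1080 Ti taken as ~30% ahead of it at 4K.
#include <cstdio>

int main() {
    const float vega64      = 1.00f;  // baseline (roughly GTX 1080-class in games)
    const float gtx1080ti   = 1.30f;  // assumed ~30% ahead of that baseline
    const float vega20Claim = 1.35f;  // AMD's "1.35x as fast as Vega 10" figure

    const float vega20 = vega64 * vega20Claim;
    printf("Vega 20 estimate: %.2f vs. GTX 1080 Ti: %.2f (ratio %.2f)\n",
           vega20, gtx1080ti, vega20 / gtx1080ti);
    return 0;
}
```

With those assumptions it lands at 1.35 vs. 1.30, i.e. right in 1080 Ti territory rather than a tier above it.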
Valantar :
As for this coming to consumers: nope. AMD has all but confirmed that this silicon design is purely focused on the AI and compute crowds, with plenty of double and half/quarter precision hardware that gamers have no use for.
■ If you carefully read what AMD has said, they never ruled out the possibility of it reaching consumers. It's just not their initial focus.
■ Did they actually say it has more double-precision units than Vega 10? If so, how many more?
■ Consumers can use half-precision. Just look at the Far Cry 5 benchmarks (https://www.tomshardware.com/reviews/far-cry-5-performance-benchmark-ultra,5552-7.html). So far, both AMD and Intel have fast half-precision packed math in their consumer GPUs. We'll see if Nvidia's next gen follows this trend.
■ Nvidia's consumer Pascal GPUs have "worthless" 8-bit dot product support (dp4a). So, I guess those aren't for consumers? Both packed FP16 and dp4a are sketched in the code below this list.
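To make the packed-math point concrete, here's a minimal CUDA sketch of the two operations being discussed: a packed FP16 fused multiply-add (two half-precision ops per instruction, the "Rapid Packed Math" style feature) and the 8-bit dot product (__dp4a) that consumer Pascal exposes. The kernel name and data layout are made up for illustration; they're not from any of the linked articles.

```
// Illustrative only: packed FP16 math (half2) and the 8-bit dot product
// (__dp4a) available on sm_61+ consumer Pascal parts. Names and layout
// are invented for this sketch.
#include <cuda_fp16.h>

__global__ void packed_math_demo(const half2* a, const half2* b, half2* out_fp16,
                                 const int* qa, const int* qb, int* out_dp4a, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // One instruction performs two half-precision multiply-adds (FP16x2):
    // out = a * b + out, element-wise on the packed pair.
    out_fp16[i] = __hfma2(a[i], b[i], out_fp16[i]);

    // __dp4a: dot product of four packed signed 8-bit values per operand,
    // accumulated into a 32-bit integer. Marketed for quantized inference,
    // but it's really just "more math per clock" when the precision suffices.
    out_dp4a[i] = __dp4a(qa[i], qb[i], out_dp4a[i]);
}
```

Compile with something like nvcc -arch=sm_61. The point is simply that both are single instructions on hardware that supports them, and the FP16 path is the kind of thing the Far Cry 5 results linked above appear to exploit on Vega.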
Valantar :
A large die with near-zero performance gains and plenty of unutilized silicon area would be a terrible idea to launch as a flagship GPU. I suppose they could make a new Frontier Edition to match the Titan V?
As I mentioned, they claim it's 1.35x as fast. I'd hardly call that a rounding error. Also, they demoed it running Cinema 4D - not exactly a game, but also not deep learning or really even HPC (I doubt it uses much 64-bit arithmetic).
In case you missed it:
https://www.tomshardware.com/news/amd-7nm-gpu-vega-gaming,37228.html