QoS packet marking has been available since the IP protocols were defined 30 years ago. This is nothing new, and it still hasn't solved the major problems. First, everyone will just mark all of their traffic as the highest, fastest class. Second, all ISPs must agree on what the markings mean and which traffic gets priority.
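To be concrete about the first problem: the marking is one unauthenticated socket option away from any application. A minimal Linux sketch in Python (the destination address is just a documentation placeholder):

```python
import socket

# DSCP "Expedited Forwarding" (EF, value 46) is the class VoIP traffic
# conventionally requests. DSCP occupies the top 6 bits of the old IP
# TOS byte, hence the shift by 2.
DSCP_EF = 46

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF << 2)

# Nothing stops any application from doing this, which is the problem:
# the marking is a request, not an enforced contract, and ISPs commonly
# re-mark or ignore DSCP at their network borders for exactly that reason.
sock.sendto(b"marked packet", ("192.0.2.1", 9999))
```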
That's not how I read it. I think they're going to throttle connections at or near the edge in order to eliminate congestion in the core. That not only reduces packet loss, it keeps queues mostly empty, which is where the latency reduction comes from. It doesn't need to rely on packet tagging or traffic classification. The only real assumption built into it is that the most latency-sensitive connections won't also be the highest-bandwidth ones, because those might indeed see a slight worsening of latency.
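To put a number on why empty queues matter: a back-of-envelope M/M/1 queueing calculation (my own illustration with a made-up link rate, not anything from the article) shows delay staying negligible while utilization is kept well below 1 and blowing up as it approaches 1, which is exactly what throttling at the edge is meant to prevent in the core:

```python
# Mean time in system for an M/M/1 queue is W = 1 / (mu - lambda),
# with service rate mu and arrival rate lambda, so delay explodes as
# utilization rho = lambda / mu approaches 1. The link rate below is
# made up purely for illustration.

MU = 100_000  # core link service rate, packets/second (assumed)

for rho in (0.5, 0.8, 0.95, 0.99, 0.999):
    lam = rho * MU
    w_ms = 1.0 / (MU - lam) * 1000  # mean time in system, milliseconds
    print(f"utilization {rho:5.3f} -> mean delay {w_ms:8.3f} ms")
```

Running it, delay goes from 0.02 ms at 50% utilization to 10 ms at 99.9%, a 500x difference on the same link.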
As you point out, they can only control their own network. However, a lot of big content providers use content delivery networks (CDNs) that have co-located nodes inside the networks of big ISPs like Comcast. If most traffic stays inside Comcast's network and they can virtually eliminate congestion there, then you'll likely see an overall improvement even when connecting out over backbones they don't own. Sure, if you're going transoceanic, all bets are off.
And third, this violates the concept of net neutrality that people are all up in arms about. If some game company pays extra, its traffic gets prioritized over other game companies'.
That battle was fought and lost (in the US), from what I recall.
I agree, the whole bottom of the article is completely idiotic, written by someone who has no idea how network latency really affects games. The refresh rates of monitors and video cards are massively faster than any network connection.
Let's say you game at 165 Hz (no framegen, because it doesn't help with latency). That's still 6 ms per frame, and it potentially stacks with the rest of the latencies in the system. It's the additive nature of latency that gets some people so nuts about trying to maximize framerates and squeeze out any other bits of latency they have control over.
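A toy latency budget makes the additive point; every number below is a rough guess on my part, not a measurement:

```python
# Toy end-to-end latency budget (all numbers are illustrative guesses).
# The point is purely that the terms add, so every component you shave
# off matters, including the network.
budget_ms = {
    "input sampling":    1000 / 1000,  # 1000 Hz mouse polling
    "game/render frame": 1000 / 165,   # one 165 Hz frame, ~6.06 ms
    "display scanout":   1000 / 165,   # roughly another frame
    "network RTT":       25.0,         # assumed round trip to server
}

for name, ms in budget_ms.items():
    print(f"{name:18s} {ms:6.2f} ms")
print(f"{'total':18s} {sum(budget_ms.values()):6.2f} ms")
```

With guesses like these, the network is the single biggest line item, but the local terms together are still a third of the total, which is why people chase both.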
I think it's a smart move by Comcast, who can't really compete with fiber on raw bitrates. However, if they can achieve meaningful latency reductions, that's another way they can differentiate themselves and try to compete, maybe even offering a higher price tier for it.