Here's the funny thing: name me some monitors that you know support DSC. It has been around for a while, yes, but it actually hasn't been used much. This Samsung Odyssey Neo G8 is one of the first I've encountered with DSC support. So I guess anything that has existed as a standard but gone unused until recently is "yesterday's technology." We could also rephrase your rephrasing to this: "I really need to buy 'future proof' hardware with support for standards that barely exist, in the hope that some day over the three-year life of this product I'll actually be able to use that tech!" By the time monitors and displays that can actually benefit from DP2.1 are available at not-insane prices, we'll be on to the next generation of GPUs, and I'm positive Nvidia will finally leave DP1.4 behind with Blackwell. (Though I was almost certain that would happen with Ada, and that ended up being wrong...)
I'm not opposed to DisplayPort 2.1 support on AMD's cards, but it's truly not a big deal. Only a handful of monitors, all released in the past six months, even approach the limits of DisplayPort 1.4a with DSC (meaning 4K at 240Hz). If 8K actually takes off (doubtful in the next five years), maybe we'll see displays that truly need DP2.1 bandwidth: 8K at 120Hz. Now read that again. 8K. 120Hz.
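To put numbers on "approaching the limits": here's a back-of-the-envelope sketch. The figures are my own rough estimates, ignoring blanking overhead and protocol details, but they show why 4K240 needs DSC on DP1.4a and why it's comfortable with it.

```python
# Rough bandwidth math (my own estimate, ignoring blanking overhead):
# does 4K @ 240 Hz fit through DisplayPort 1.4a without DSC?

def raw_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed video bandwidth in Gbit/s (active pixels only)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP14A_EFFECTIVE = 32.4 * 8 / 10  # HBR3: 32.4 Gbit/s raw, 8b/10b encoding -> 25.92

uncompressed = raw_gbps(3840, 2160, 240, 24)  # 8-bit RGB
with_dsc = uncompressed / 3                   # DSC at its maximum 3:1 ratio

print(f"4K240 uncompressed: {uncompressed:.1f} Gbit/s")  # ~47.8, way over the limit
print(f"4K240 with 3:1 DSC: {with_dsc:.1f} Gbit/s")      # ~15.9, fits easily
```

Even at a milder 2:1 ratio, 4K240 lands around 23.9 Gbit/s, which still fits under the ~25.9 Gbit/s DP1.4a ceiling.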
Do you know how much high-fps 8K content there is? Practically none, and what little exists is mainly for marketing purposes. 8K at 60 fps is barely a thing, and it's already supported by DP1.4a. 8K at 120 fps is the practical limit of what we might ever really need; we could reasonably get to 120Hz video and gaming at some point, but it's a massive stretch. Gaming at 8K and 120Hz? That's 4X the pixels of 4K, and even the fastest GPUs, using Frame Generation, rarely get above 120Hz at 4K.
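Running the same rough math (again my own estimate, ignoring blanking overhead) on the supposed "true DP2.1" use case makes the point even sharper: 8K120 doesn't fit in DP2.1's fastest link rate uncompressed either.

```python
# Same back-of-the-envelope math for 8K @ 120 Hz vs. DP2.1 UHBR20.
# Numbers are my own estimates and ignore blanking overhead.

def raw_gbps(width, height, refresh_hz, bits_per_pixel):
    """Uncompressed video bandwidth in Gbit/s (active pixels only)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP21_UHBR20 = 80 * 128 / 132  # ~77.6 Gbit/s effective after 128b/132b encoding

uncompressed = raw_gbps(7680, 4320, 120, 30)  # 10-bit RGB for HDR
print(f"8K120 uncompressed: {uncompressed:.1f} Gbit/s")           # ~119.4
print(f"Fits in UHBR20 without DSC? {uncompressed <= DP21_UHBR20}")  # False
```

So even the hypothetical 8K120 display would lean on DSC anyway; DP2.1 just gives it a bigger pipe to compress into.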
8K is basically a living room standard for placebo effect. 8K for movie theaters with 80-foot screens, sure, but on a 120-inch projection viewed from 20 feet away? Your eyes literally can't see the difference. 8K on a screen that's three feet away? Sounds like great marketing material. We could call it the "Retina 8K display" to get all the Apple users to buy it, then run it at 200% scaling. "OMG, the pixels are so small I can't see them and need a magnifier!" That's already basically true of 4K on a 27-inch monitor, like the one I'm using right now (and have been using for eight years).
Calling DSC "trickery in video compression" could apply to every video codec ever released. Except DSC isn't a video codec; it's a very high-throughput algorithm that, depending on the screen content, delivers either truly lossless compression (on a lot of content) or up to 3:1 compression with minimal artifacting. And I really do mean 'minimal.' Good luck spotting it. It's nothing like 4:2:2 or 4:2:0 chroma subsampling, which is lossy and which you might actually notice (maybe, depending on the implementation: RTX GPUs and RX 5000 and later look fine; the GTX 10-series, not so much). The only real problem with DSC is that any signal corruption, even a single flipped bit, can seriously screw up the transmission. That's probably why it hasn't been used until now.