Meanwhile, texturing techniques have not advanced at a similar pace, mostly because texture compression methods have essentially remained the same since the late 1990s, which is why many objects look blurry up close.
LOL, wut? Were they even doing texture compression in the late '90s? If so, whatever they did sure wasn't sophisticated.
Since then, I think the most advanced method is
ASTC, which was only introduced about 10 years ago.
More to the point,
programmable shaders were supposed to solve the problem of blurry textures! I know they can't be used in 100% of cases, but c'mon guys!
Nvidia claims that NTC textures are decompressed using matrix-multiplication hardware such as tensor cores operating in a SIMD-cooperative manner, which means the new technology does not require any special-purpose hardware and can be used on virtually all modern Nvidia GPUs.
Uh, well, you can implement conventional texture decompression in shaders, but it's not very fast or efficient.
That's the reason to bake it into the hardware!
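For contrast, here's what "conventional" decompression amounts to: a BC1/DXT1 block decoder in plain C (the block layout is the actual S3TC format, dating from exactly that late-'90s era). The texture units do this inline on every fetch; emulating it in shader code means doing all this arithmetic per texel yourself:

```c
#include <stdint.h>

/* Expand a packed RGB565 color to 8 bits per channel. */
static void rgb565_to_rgb888(uint16_t c, uint8_t rgb[3]) {
    uint8_t r = (c >> 11) & 0x1F, g = (c >> 5) & 0x3F, b = c & 0x1F;
    rgb[0] = (uint8_t)((r << 3) | (r >> 2));
    rgb[1] = (uint8_t)((g << 2) | (g >> 4));
    rgb[2] = (uint8_t)((b << 3) | (b >> 2));
}

/* Decode one 8-byte BC1/DXT1 block into a 4x4 patch of RGB texels:
 * two RGB565 endpoints, then 2-bit palette indices per texel. */
void decode_bc1_block(const uint8_t block[8], uint8_t out[16][3]) {
    uint16_t c0 = (uint16_t)(block[0] | (block[1] << 8));
    uint16_t c1 = (uint16_t)(block[2] | (block[3] << 8));
    uint32_t idx = (uint32_t)block[4] | ((uint32_t)block[5] << 8) |
                   ((uint32_t)block[6] << 16) | ((uint32_t)block[7] << 24);

    uint8_t palette[4][3];
    rgb565_to_rgb888(c0, palette[0]);
    rgb565_to_rgb888(c1, palette[1]);

    for (int ch = 0; ch < 3; ++ch) {
        if (c0 > c1) {  /* 4-color mode: two interpolated colors */
            palette[2][ch] = (uint8_t)((2 * palette[0][ch] + palette[1][ch]) / 3);
            palette[3][ch] = (uint8_t)((palette[0][ch] + 2 * palette[1][ch]) / 3);
        } else {        /* 3-color mode: midpoint, index 3 = black/transparent */
            palette[2][ch] = (uint8_t)((palette[0][ch] + palette[1][ch]) / 2);
            palette[3][ch] = 0;
        }
    }
    for (int t = 0; t < 16; ++t) {  /* 2 index bits select a palette entry */
        const uint8_t *c = palette[(idx >> (2 * t)) & 3];
        out[t][0] = c[0]; out[t][1] = c[1]; out[t][2] = c[2];
    }
}
```

Eight bytes per 16 texels (4 bpp), and the decode is branch-light integer math, which is why it was cheap to bake into fixed-function hardware in the first place.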
In complex scenes using a fully-featured renderer, the cost of NTC can be partially offset by the simultaneous execution of other tasks.
It's still going to burn power and compete for certain resources. There's no free lunch here.
I think they're onto something, but it needs a few more iterations of development and refinement.