Even if there are room-temperature superconductors that we could use in CPUs, is this statement even credible? Doesn't a lot of the power dissipation of modern CPUs come from leakage and transistor-switching?
Leakage mostly comes from the extreme thinness and proximity of the structures in the transistors themselves (tunneling through thin gate oxides, sub-threshold conduction in short channels); I don't think superconductors will do anything about that, and leakage is becoming an increasingly significant loss factor as things get packed tighter together. CMOS logic works by charging and discharging gates, and the energy spent charging a gate capacitor is E = Cg*V^2/2 (the charge moved being Q = Cg*V); superconductors aren't going to change that either, nor the energy associated with charging and discharging the rest of the parasitic trace capacitance. The only thing superconductors might change is trace conduction losses, assuming you can shape the crystals in a way that lets you put them on wafers.
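To put rough numbers on that, here is a minimal sketch of the standard CMOS dynamic-power estimate; all the chip figures below are made-up illustrative assumptions, not measurements of any real part:

```python
def switching_energy(c_gate: float, v_dd: float) -> float:
    """Energy stored when charging one gate capacitor: E = C * V^2 / 2.
    A full charge/discharge cycle dissipates C * V^2 in total, all of it
    in the transistors' channel resistance, not in the interconnect that
    a superconductor would replace."""
    return 0.5 * c_gate * v_dd ** 2

def dynamic_power(n_gates: int, activity: float, c_gate: float,
                  v_dd: float, f_hz: float) -> float:
    """Classic CMOS dynamic-power estimate: P = alpha * N * C * V^2 * f."""
    return activity * n_gates * c_gate * v_dd ** 2 * f_hz

# Hypothetical chip: 1e9 gates, 0.1 fF per gate, 0.8 V, 3 GHz, 10% activity.
print(f"{dynamic_power(10**9, 0.1, 0.1e-15, 0.8, 3e9):.1f} W")  # ~19 W
```

Note that nothing in that expression involves trace resistance, which is the only term a superconducting interconnect would remove.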
Carbon nanotubes were hyped as the solution to copper interconnect losses in semiconductors 20 years ago. I don't remember the last time I read about any sign of progress on that. Getting tubes to grow to the necessary lengths and self-assemble on a wafer sounds like extremely tricky business.
A bit like why Graphene has taken so long to find commercially viable applications?
But, unlike Graphene, there might be other materials in this same category that are easier to manufacture at scale.
Graphene has plenty of applications when you only need micron-sized flakes... such as thermal pastes, high-performance greases and pencil leads
Likewise, if LK-99 is real but can only be made in micron-sized flakes, it may still be useful in less glamorous applications, such as a magnetic filler in ferrite-like transformer and induction-motor cores to eliminate most of the remaining eddy-current losses there (a rough estimate of those losses is sketched below), assuming it can also bear the associated current and magnetic flux densities before quenching.
No shortage of things that can go wrong besides the LK-99 experiment turning out to be a hoax.
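For a sense of the scale of the eddy-current losses in question, here is a minimal back-of-envelope sketch using the classical thin-lamination estimate P/V = pi^2 * f^2 * Bp^2 * t^2 / (6 * rho); the material values are illustrative assumptions, not measured data:

```python
import math

def eddy_loss_density(f_hz: float, b_peak_t: float,
                      thickness_m: float, resistivity_ohm_m: float) -> float:
    """Classical eddy-current loss per unit volume (W/m^3) for a thin
    lamination or particle in a sinusoidal field:
        P/V = pi^2 * f^2 * Bp^2 * t^2 / (6 * rho)
    This applies to normal conductors; a superconducting filler would
    carry its screening currents without the resistive term at all,
    provided it stays below its critical field and current (exactly the
    open question raised above)."""
    return (math.pi ** 2 * (f_hz * b_peak_t * thickness_m) ** 2) / (6 * resistivity_ohm_m)

# Hypothetical 50 Hz mains transformer: 0.3 mm steel laminations,
# 1.5 T peak flux density, resistivity ~5e-7 ohm*m (assumed values).
print(f"{eddy_loss_density(50, 1.5, 0.3e-3, 5e-7):.0f} W/m^3")  # ~1700 W/m^3
```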