I think researchers from RIKEN also demonstrated, in 2022, a first prototype of quantum error correction in silicon, which I believe was made possible by implementing a three-qubit Toffoli-type quantum gate. They showed full control of a three-qubit system. That said, it wasn't an entirely new concept.
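For anyone unfamiliar with it: a Toffoli (CCX) gate flips a target qubit only when both control qubits are |1⟩. Here's a toy state-vector sketch of that action in plain Python (my own illustration, nothing to do with RIKEN's actual silicon implementation):

```python
# Toy state-vector simulation of a three-qubit Toffoli (CCX) gate.
# Basis states are indexed 0..7 as |q2 q1 q0>, with q2, q1 the controls
# and q0 the target.

def toffoli(state):
    """Apply CCX: flip q0 iff q2 and q1 are both 1 (swap amplitudes |110> <-> |111>)."""
    out = list(state)
    out[0b110], out[0b111] = state[0b111], state[0b110]
    return out

# Start in |110>: both controls set, target 0.
psi = [0.0] * 8
psi[0b110] = 1.0

psi = toffoli(psi)
print(psi.index(1.0) == 0b111)  # True: target flipped, state is now |111>
```

On a state where either control is 0 (say |010⟩), the same function leaves the amplitudes untouched, which is exactly the controlled behavior that makes the gate useful for error-correction circuits.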
Not sure how progress on this is going now. Two qubits were already manageable, but error correction actually requires at least a three-qubit system.
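The reason three is the minimum: the simplest error-correcting scheme, the bit-flip repetition code, copies one logical bit into three physical ones and recovers it by majority vote, so any single flip is corrected. A classical sketch of that majority-vote logic (again just a toy of mine, not the quantum protocol itself, which also has to avoid measuring the data directly):

```python
import random

def encode(bit):
    """Repetition code: one logical bit -> three physical bits."""
    return [bit, bit, bit]

def decode(bits):
    """Majority vote corrects any single bit-flip error."""
    return 1 if sum(bits) >= 2 else 0

# Flip one random physical bit and check the logical bit survives.
for logical in (0, 1):
    noisy = encode(logical)
    noisy[random.randrange(3)] ^= 1  # single bit-flip error
    assert decode(noisy) == logical
print("single bit-flip errors corrected")
```

With only two copies, a disagreement tells you an error happened but not which bit to trust; three copies is the smallest number where a single error can be both detected and corrected.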
But anyway, this new hybrid quantum-classical algorithm sounds promising, and it seems to lower the computational cost of compiling time-evolution operators as well.
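I don't know which compilation scheme the paper actually uses, but the usual starting point for compiling a time-evolution operator e^{-iHt} is Trotterization: split H into simple terms and alternate short evolutions under each. A single-qubit sketch with H = X + Z, using the closed form e^{-iθP} = cos(θ)I − i·sin(θ)P for a Pauli P so no external libraries are needed:

```python
import math

I2 = [[1, 0], [0, 1]]
X  = [[0, 1], [1, 0]]
Z  = [[1, 0], [0, -1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def exp_pauli(p, theta):
    """e^{-i theta P} = cos(theta) I - i sin(theta) P, valid since P^2 = I."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c * I2[i][j] - 1j * s * p[i][j] for j in range(2)] for i in range(2)]

def exact_evolution(t):
    """e^{-i(X+Z)t}: X + Z = sqrt(2) * (n . sigma) with n = (1, 0, 1)/sqrt(2)."""
    r = math.sqrt(2)
    c, s = math.cos(r * t), math.sin(r * t)
    return [[c * I2[i][j] - 1j * s * (X[i][j] + Z[i][j]) / r for j in range(2)]
            for i in range(2)]

def trotter(t, n_steps):
    """First-order Trotter: (e^{-iX dt} e^{-iZ dt})^n with dt = t/n."""
    dt = t / n_steps
    step = matmul(exp_pauli(X, dt), exp_pauli(Z, dt))
    u = I2
    for _ in range(n_steps):
        u = matmul(u, step)
    return u

def dist(a, b):
    return max(abs(a[i][j] - b[i][j]) for i in range(2) for j in range(2))

t = 1.0
err_10 = dist(trotter(t, 10), exact_evolution(t))
err_100 = dist(trotter(t, 100), exact_evolution(t))
print(err_100 < err_10)  # True: error shrinks as the step count grows
```

The first-order error scales like t²/n, which is exactly why cheaper compilation schemes matter: naive Trotterization needs many steps (and hence deep circuits) to stay accurate.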
There are still some challenges to overcome, though: improving qubit stability, increasing computational scalability, and enhancing the error-correction methods, all of which are needed to realize the full potential of quantum computing in atomic-level simulations.
Curious to know whether this algorithm is implemented in plain Python, in a framework like Qiskit or Cirq, in Microsoft's Q#, or in something else entirely?