So far, classical computing still beats out quantum computing, including D-Wave's machines.
So what's all the hype about? Is it purely about encryption? Has quantum computing actually been shown to crack modern classical encryption?
Unless the payoff is supposed to be so vast that it's worth dumping all this R&D into it? (And I'm all for science, but at some point you've got to deliver a product.) I'm thinking quantum computing is starting to look like the "fusion energy" of data processing: always at a distance, like a mirage.
IF quantum computing ever reaches the client end, it will arrive either as a new chip or as part of existing fab technology. Just as we have integer, floating-point, and vector units, I could see qubit operations being added to the instruction set, perhaps as an enhanced form of encrypted communications. But what do I know. Just my 2 cents.
@STDRAGON my understanding is that right now all code is designed for linear compute threads and built on machine code, which every programming language sits on top of (I might be wrong saying "all", but some expert can correct me). That foundation fundamentally requires 0s and 1s, and while it's worked well for decades, it's not how we as humans think (we think abstractly, not linearly... at least supposedly past puberty ), so we end up building massive algorithms to simulate parallel and non-linear tasks and commands.
Quantum computing replaces the classical bit at the lowest level with the qubit, which isn't restricted to being strictly a 1 or a 0. That doesn't seem like a big deal at face value, but for certain problems it allows far more flexible and EFFICIENT algorithms to be written. From what I've read and been told, this new fundamental structure sort of raises the ceiling for how complex computing can become and the speed at which certain problems can be solved.
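To make the "not strictly 1s and 0s" part concrete: a qubit is usually described as a pair of complex amplitudes, and measurement collapses it to 0 or 1 with probabilities given by those amplitudes. Here's a toy sketch in plain Python (my own illustration, not anyone's real quantum stack) simulating one qubit and the Hadamard gate, which puts it into an equal superposition:

```python
import math

# A qubit is modeled as a pair of complex amplitudes (a, b)
# with |a|^2 + |b|^2 = 1 -- not a third voltage level on a wire.

def hadamard(state):
    """Apply the Hadamard gate: sends |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement yields 0 or 1 with these probabilities."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1 + 0j, 0 + 0j)            # start in the definite state |0>
qubit = hadamard(qubit)             # now in superposition
p0, p1 = probabilities(qubit)
print(round(p0, 3), round(p1, 3))   # 0.5 0.5
```

The speedups come from manipulating many such amplitudes at once with interference, not from swapping out machine code, so only algorithms designed around that structure benefit.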
TL;DR: In essence, it won't make your games or productivity apps that were designed for linear compute threads run faster, but it will open the door to different, more complex programs that we haven't even come up with yet. We keep going back to encryption because it's the clearest example of something that would benefit from this new computing model on day one.
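On why encryption keeps coming up: RSA-style encryption rests on factoring large numbers being slow for classical machines, and Shor's algorithm would factor them in polynomial time on a big enough quantum computer. A toy sketch of the classical side (my own illustration, with made-up small semiprimes standing in for real RSA keys) shows how trial division's work grows with the number being factored:

```python
# Classical trial division does on the order of sqrt(N) work, so every
# extra bit in the key multiplies the effort. Shor's algorithm (quantum)
# would instead scale polynomially in the number of bits.

def trial_division_steps(n):
    """Return (smallest factor of n, number of divisions tried)."""
    steps = 0
    d = 2
    while d * d <= n:
        steps += 1
        if n % d == 0:
            return d, steps
        d += 1
    return n, steps  # n itself is prime

for n in (15, 3233, 10403):  # tiny stand-ins for RSA semiprimes
    factor, steps = trial_division_steps(n)
    print(f"{n} = {factor} x {n // factor}  ({steps} divisions)")
```

Real keys are 2048+ bits, so the same brute-force idea becomes hopeless classically, which is exactly why factoring is the day-one quantum example.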