IBM Says Practical Quantum Computers are Close

  • Thread starter: Guest
Status: Not open for further replies.
I get so excited when I see this stuff, but then I quickly realize that most software we have today is barely capable of utilizing more than two cores.
 
"a "classical" computer system will integrate quantum computing hardware"

I assume they're suggesting that quantum processors will first exist in the system as a co-processor of sorts, and that they'll eventually be included on the same die as the main CPU. At that point it would completely negate any need for a GPU at all. It won't be a question of 'can it play Crysis'; it'll be 'how many screens can it push running Crysis IX @ 4320p'.
 
I'm totally psyched about quantum processors; then I can have a CPU that exists and doesn't exist at the same time.

Holy crap, I'd better patent that shit so I can sue everyone in a few years.
 
Did they mention that you'd probably have to keep your processor below 30 mK for it to work well? That's still a very practical temperature range for research labs and other operators with the money to run a dilution fridge, but not for the average Joe.
 

Quantum computing is not about running CS5 faster; it's a completely new paradigm for how computation is performed at the lowest level. Because qubits can be in multiple states at the same time, an operation on a qubit is effectively applied to every state the qubit simultaneously occupies. In this way you can perform vast numbers of operations with very few qubits, but only in a probabilistic sense, because of the nature of quantum mechanics: you get answers with associated probabilities rather than exact numbers. With quantum computing you can attack optimization problems that couldn't even be attempted before. Ultimately, quantum computing is not for the average consumer; it's a tool that lets researchers and industry perform computations that were previously impossible.
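To make the 'probabilistic result' part concrete, here's a minimal toy sketch in Python/NumPy (my own illustration, nothing to do with IBM's actual hardware): a Hadamard gate puts a qubit into an equal superposition, and a measurement returns 0 or 1 at random, with probabilities given by the squared amplitudes.

```python
import numpy as np

# Single-qubit basis states: |0> = [1, 0], |1> = [0, 1]
ket0 = np.array([1.0, 0.0])

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2)

state = H @ ket0            # amplitudes [1/sqrt(2), 1/sqrt(2)]
probs = np.abs(state) ** 2  # Born rule: P(outcome) = |amplitude|^2

# A measurement collapses the superposition to one definite outcome;
# only by repeating it do you see the underlying probabilities.
rng = np.random.default_rng()
samples = rng.choice([0, 1], size=1000, p=probs)
print(f"P(0) = {probs[0]:.2f}, P(1) = {probs[1]:.2f}")
print("counts over 1000 shots:", np.bincount(samples))  # roughly 500/500
```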
 
This is great news from a reliable company and will be very good for research. But unless you plan on keeping liquid helium in your living room to support the superconducting hardware, don't count on this showing up anywhere but in labs and large corporations for decades🙂
 
This sounds like a great excuse to begin the cloud takeover of computing. It's never going to be practical to have a superconducting computer in a house or apartment, but one in a server-farm environment? That would work.
 
Hard to imagine that a thing can have two different values at the same time. It's like depositing $1000 in your savings account and having the receipt print that you deposited $0. Easy come, easy go.
 
Thousands or millions of qubits as a potential for a superconducting system (read: low temperatures are a must) is going to run headfirst into the scalability problems of QC. Also, I'm going to take this with a grain of salt because, as with many past claims of "oh, we've solved the problem", practical examples have often fallen very, very short of their promises. When I see this on the arXiv, with multiple experimental verifications from other groups, I'll nod and move along with my day.
 
Well, you probably can't expect them to couple this with 'standard' computers.
It would be a disaster for the most part.

If anything, computers today are an embarrassment compared to what they could have been.
By now we could have had synthetic diamond as a material for microchips; the process for creating it cost-effectively at industrial scale seems to have been 'perfected' around 1996. Patents of course slowed the usage of synthetic diamond until 2004, and it wasn't until then (right after the patent issue was dealt with) that semiconductors were made out of diamond.

So, patents aside, we could have had insanely powerful computers today that would also draw less power than the ones we have now (and heat would be nearly a non-issue given diamond's thermal tolerance).

Now, add graphene into the mix (which by some measures is two to three times better than diamond), at least in some kind of hybrid form, and voila.

But of course, the market will first introduce a silicon/diamond hybrid, followed by a full-blown diamond computer, and then a diamond/graphene hybrid, before finally switching over to graphene entirely.

Maybe it won't take too long... but given how the market operates, coupled with planned obsolescence, I don't expect to see real leaps unless we change the economic model and force them to provide the best of the best as soon as it's available, designed with upgrades in mind, so it doesn't break down after short-term use and can be fully recycled.

 
[citation][nom]bleepboop[/nom]Thousands or millions of qubits as a potential for a superconducting system (read: low temperatures are a must) is going to run headfirst into the scalability problems of QC. Also, I'm going to take this with a grain of salt because, as with many past claims of "oh, we've solved the problem", practical examples have often fallen very, very short of their promises. When I see this on the arXiv, with multiple experimental verifications from other groups, I'll nod and move along with my day.[/citation]
IBM hasn't claimed anything more or anything less than hitting one of many milestones needed to make quantum computing possible. I don't think IBM has anything to gain by exaggerating here, and I don't think they're going to give away proprietary information to the competition just so it can be "independently confirmed" to satisfy skeptics who read the news. They'd rather live and let live, and then proceed to make billions and billions of dollars later.

As for the "scalability problems of QC" you mentioned, this actually helps that situation: more stable, and therefore more "readable", results mean they can ease up on some of the redundancy that was previously needed to ensure correct answers.
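To illustrate the trade-off with a loose classical analogy (my own toy sketch, not IBM's actual error-correction scheme, which would use proper quantum codes): encode one logical bit as n noisy copies and decode by majority vote. The more stable each copy is, the fewer copies you need to hit a given logical error rate.

```python
import numpy as np

# Toy *classical* stand-in for error-correction overhead: one logical bit is
# stored as n copies, each flipping with probability p; decode by majority vote.
def logical_error_rate(p, n, trials=100_000, seed=0):
    rng = np.random.default_rng(seed)
    flips = rng.random((trials, n)) < p          # which copies got corrupted
    return (flips.sum(axis=1) > n // 2).mean()   # majority vote came out wrong

for p in (0.10, 0.01):   # flaky vs. more stable "qubits"
    for n in (3, 7):     # amount of redundancy
        print(f"p={p:.2f}, n={n} copies -> logical error ~ "
              f"{logical_error_rate(p, n):.5f}")
# More stable copies (lower p) reach the same logical error rate with fewer
# copies, i.e. less redundancy.
```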
 
[citation][nom]thivaldi1234[/nom]Did they mention that you'd probably have to keep your processor below 30 mK for it to work well? That's still a very practical temperature range for research labs and other operators with the money to run a dilution fridge, but not for the average Joe.[/citation]

We can already run silicon and most GaAs qubits somewhat decently at room temperature. IBM is one of the few corporations seriously pursuing quantum computing, and most of the issues with QC have been known for some time. If IBM can solve the decoherence problems and develop a scalable device, they will have addressed the main limitations of QC.
 
[citation][nom]beayn[/nom]I will pre-order my 250-qubit hard drive now please.[/citation]

Remember those old days when a 1-megabyte hard drive cost several thousand dollars?
 
I've got a 2500K overclocked now, but my OS (Win 7) doesn't even use all of it to think for me. My computer doesn't think; I still have to do everything manually.
 
Quantum computing is based on quantum theory, which is, frankly, on very shaky ground mathematically, mechanically, and logically. The non-local effects on which quantum computing supposedly depends rely on the Copenhagen interpretation of quantum mechanics to work... in theory. The problem is that the Copenhagen interpretation is just that: an interpretation of the data, not a logical theory or mechanical underpinning that explains what is going on. If you think data can be computed on in logical steps while claiming something can be yes and no, both, or neither at the same time, you are believing in a paradox or contradiction... good luck with that. Douglas Adams basically made fun of this in his novels with a starship that used an "Infinite Improbability Drive" to get from point A to point B by taking every conceivable path in between and then choosing the shortest route. I think they should rename quantum computing something like "the Improbability Computer"; then, when the thing doesn't work, you can just say it was a joke... that cost billions of dollars to develop.
 