D-Wave Launches 2,000-Qubit Quantum Annealing Computer, Announces First Customer

Status
Not open for further replies.

jdlech

Reputable
May 31, 2016
In 1946, the ENIAC computer cost $400,000, which is roughly $5.3M today. So, if there's ever a personal quantum computer, even a specialized one, the first few will likely cost about $9K in 2017 dollars. I'm looking forward to playing the quantum equivalent of 'pong'.
 

bit_user

Splendid
Ambassador
Do you know that it needs to run at a fraction of the temperature of interstellar space? I think the refrigeration is probably the most expensive part, in addition to consuming most of the energy and requiring most of the maintenance.

In addition to that, it must be specially shielded from interference. That's not going to be cheap or small, either.

So, I think we can safely rule out such quantum computers ever coming into the home. It would have to be some fundamentally different technology. Until then, you'll have to be satisfied renting time on them via the cloud.
 

bit_user

Splendid
Ambassador
This isn't a CS problem - it's physics.

How are you going to keep the qubits entangled at temperatures achievable in a home setting? And what's even the use case for such computers in the home vs. the cloud?

I'm no expert on quantum computing, but I think it's naive to assume that just because conventional computers also started out big & expensive, we'll one day buy quantum annealing computers for a couple of bucks and carry them around in our pockets.
 

dstarr3

Honorable
Mar 18, 2014


I'm skeptical, too, but hey, never say never. Anything could happen.
 

bit_user

Splendid
Ambassador
Yeah, we don't know what we don't know. Maybe theoretical physicists could conclusively say that quantum annealing will never happen in a home appliance, but I certainly can't.

What I can say is that their website is worth a look, if you like reading about (and seeing pics of) engineering marvels along the lines of the CERN LHC. Some of the highlights:

  • Cooled to 0.015 kelvin, about 180× colder than interstellar space
  • Shielded to 50,000× less than Earth's magnetic field
  • Internal pressure 10 billion times lower than atmospheric pressure
  • 200 I/O and control lines running from room temperature to the chip
  • The system consumes less than 25 kW of power
Remember, they didn't build it to those specs for fun or to show off - this is what they had to do to keep large numbers of qubits entangled long enough to complete the annealing process.
 

ShadowWolf5144

Commendable
Jul 19, 2016
I don't foresee this being practical for the common computer user. However, I wonder if it could be usable in the future (several generations from now, long after I'm gone), say in smart homes (quietly dreaming of S.A.R.A.H.).
 

jdlech

Reputable
May 31, 2016
"A 25MHz 286 is more computer than any household will ever need"
-some IBM spokesman at the 1986 comdex convention.

Of course, there are two lessons in that statement. I'm reminded not just that people can always find a need for a faster processor, but also that IBM didn't bother with color graphics cards for years because they "didn't see a need for one" - which set PC color graphics development back about a dozen years. We should be careful, lest we create our own self-fulfilling prophecies.
 

bit_user

Splendid
Ambassador
Sure. I'm well aware of that history, but there's an element of this that's about cost/benefit, rather than simple capability.

With the cloud as we now have it, the benefit of a home-based QA computer is much less than it would have been in the past. In fact, with these things in the cloud, you can effectively already have a QA computer in your cell phone.

Secondly, we have a lot of experience with computing, and the kinds of problems QA is good at are thankfully not that common. But here's where I agree with your point... now that we have a way to solve these problems optimally, it's possible we'll find new uses for it.
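
For anyone curious what "the kinds of problems QA is good at" actually look like: quantum annealers like D-Wave's are built to minimize QUBO (quadratic unconstrained binary optimization) objectives. This is my own illustration, not anything from D-Wave's materials - a brute-force Python sketch that searches the same energy landscape an annealer explores physically. The brute-force approach only works for tiny problems, since the search space doubles with every variable; that exponential blow-up is exactly why hardware annealers are interesting.

```python
from itertools import product

def solve_qubo_brute_force(Q):
    """Minimize sum over (i, j) of Q[i, j] * x[i] * x[j], x binary.

    Q is a dict mapping (i, j) index pairs to coefficients. Exhaustive
    search over all 2^n binary vectors; an annealer aims to find the
    same minimum-energy state without enumerating them.
    """
    n = max(max(i, j) for i, j in Q) + 1
    best_x, best_e = None, float("inf")
    for x in product((0, 1), repeat=n):
        e = sum(c * x[i] * x[j] for (i, j), c in Q.items())
        if e < best_e:
            best_x, best_e = x, e
    return best_x, best_e

# Toy QUBO: rewards setting x0 or x1, but penalizes setting both.
Q = {(0, 0): -1, (1, 1): -1, (0, 1): 2}
x, e = solve_qubo_brute_force(Q)  # picks exactly one variable, energy -1
```

The toy objective encodes a simple constraint ("choose exactly one"), which is how real scheduling and routing problems get mapped onto annealing hardware.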
 


"...they'll never make a computer that can do a megaflop, just think of the failure rate of those vacuum tubes, and the amount of electricity thy used would power a city..."

 

bit_user

Splendid
Ambassador
I realize it's hard to shake the idea that technology can overcome any obstacle, given enough time. But you should consider that there are some limits of physics that no amount of time and intelligence can overcome.

Just because you see an analogy in the history of classical digital computers doesn't mean the same potential for improvement in scale and portability exists in quantum computers.

Perhaps you find it hard to accept the prospect that the next 50 years of computing won't be the same march of exponential progress as the last 50. It has been an amazing ride.

But consider another technological trend people once assumed would continue at the same pace: electrical power. Most 1950s- and 1960s-era predictions of the future assumed that electricity would keep getting cheaper until it was virtually free, and that generation and storage would become increasingly portable. This fueled predictions of jet packs, flying cars, and more elaborate constructions. But that trend soon slowed, and many of those predictions haven't materialized (or at least not in anything like the fashion and timescale people imagined).

The thing about technology is that it's hard to predict what people will invent or improve. But you can be sure it won't violate physics. And the physics of quantum entanglement makes it extremely sensitive to interference. That's why I'm relatively certain you'll never have a quantum annealing computer in your home, much less your pocket. And with the ease and flexibility of cloud computing, that's (usually) okay.

I hope that helps, but if you'd prefer to believe that anything is possible, then just be sure to leave out some milk and cookies for Santa Claus.
 

bit_user

Splendid
Ambassador
Rather than worry about whether you'll one day be able to build a DIY MegaQubit home QA computer, why not focus on the marvel of these feats of science and engineering? That was my (earlier) point.

IMO, some of the engineering that's gone into these things is mind-boggling and really at the cutting edge of science. I think it's pretty amazing to read about - far more so than any water-cooled multi-GPU setup. Don't you?
 
