IBM Says Practical Quantum Computers are Close


hoof_hearted

Distinguished
Mar 6, 2010
[citation][nom]jprahman[/nom]Quantum computing is not about running CS5 faster, it is about a completely new paradigm of how computation is performed at the lowest levels. Because qubits can be in multiple states at the same time, when you perform operations on qubits you actually perform the operation on each state the qubit simultaneously occupies. In this way you are able to perform vast numbers of operations with very few qubits, but only in a probabilistic sense because of the nature of quantum mechanics. i.e. you are given a probability as a result, rather than exact numbers. With quantum computing you can solve optimization problems that couldn't even be attempted before. Ultimately quantum computing is not for the average consumer to use, but rather a tool to allow researchers and industry to perform computations that were previously impossible.[/citation]

Like prime factorization; Shor's algorithm is one nice example. IBM will put the "Big Brother" fear in everyone. This will be better than those underground Cray farms at the NSA.
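
For the curious, here's a toy sketch in plain Python of the classical number-theory step Shor's algorithm hinges on. The function names are mine, and the period-finding below is brute force, which is exactly the part a real quantum computer would do exponentially faster:

[code]
from math import gcd

def find_period(a, N):
    # Brute-force the period r of a^x mod N. This is the step a quantum
    # computer performs efficiently; classically it takes exponential time.
    x, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        x += 1
    return x

def factor_from_period(N, a):
    # Shor's classical post-processing: given the period r of a^x mod N,
    # nontrivial factors of N fall out of a couple of gcds.
    g = gcd(a, N)
    if g != 1:
        return g, N // g   # lucky guess: a already shares a factor with N
    r = find_period(a, N)
    if r % 2 == 1:
        return None        # odd period: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None        # trivial square root: retry with a different a
    p, q = gcd(y - 1, N), gcd(y + 1, N)
    return (p, q) if p * q == N else None

print(factor_from_period(15, 7))  # -> (3, 5), since 7 has period 4 mod 15
[/code]

Double the key size and that brute-force loop becomes hopeless, while Shor's quantum period-finding stays polynomial. That's the whole "Big Brother" worry.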
 

spagunk

Distinguished
Jun 9, 2006
Like someone else said, it would essentially be relegated to more of a cloud-computing environment, not home use. Think more like Star Trek, where there is a central computer system that performs the majority of tasks while people just interface with it. Your own PC will do mundane tasks like running your OS, but running things like virtualized graphics systems (à la Nvidia's latest announcements) and advanced AI on the quantum computer is more likely than having one of these in your own home (well, with today's technology, that is).

I would think it would be some time before we can even think about having our own system in the home. Unless you are a rich bastard, that is.
 

punahou1

Distinguished
Dec 26, 2010
So does this mean that all code based on the binary system will need to be converted to a trinary system at some point in the future?
 

hoof_hearted

Distinguished
Mar 6, 2010
[citation][nom]punahou1[/nom]So does this mean that all code based on the binary system will need to be converted to a trinary system at some point in the future?[/citation]

Nah, look at it more as multitasking on steroids at the bit level.
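
To picture it, here's a classical NumPy toy of my own (not how real quantum hardware is programmed): n ordinary bits hold one of 2^n values at a time, while a register of n qubits carries an amplitude for all 2^n values at once.

[code]
import numpy as np

n = 3                                         # qubits in the register
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # one-qubit Hadamard gate

# Start in |000>: all amplitude on basis state 0.
state = np.zeros(2 ** n)
state[0] = 1.0

# A Hadamard on every qubit is the n-fold tensor (Kronecker) product.
H_all = H
for _ in range(n - 1):
    H_all = np.kron(H_all, H)
state = H_all @ state

# All 2^n bitstrings now carry equal amplitude 1/sqrt(2^n):
# one 3-qubit register "holding" all eight values at once.
print(np.round(state, 3))                   # eight entries of ~0.354
print(np.allclose(state ** 2, 1 / 2 ** n))  # True: all outcomes equally likely
[/code]

The catch is that measuring collapses all of that to a single bitstring, which is why quantum algorithms need clever interference tricks to make the useful answer the likely one.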
 

Phyrexiancure

Distinguished
Mar 28, 2011
[citation][nom]deksman[/nom]Well, you probably cannot expect them to couple this with 'standard' computers. It would be a disaster for the most part. If anything, computers today are an embarrassment to what they could have been. By now we could have had synthetic diamond as a material for microchips due to its cost-effective viability for industrial creation in 1996 (which is when the process seems to have been 'perfected'). Patents of course slowed the usage of synthetic diamonds until 2004, and it wasn't until then that semiconductors out of diamonds were made (right after the patent issue was dealt with, actually). So, patents aside, we could have had insanely powerful computers today that would also suck up less power than the ones we have now (coupled with the premise of non-existent temperatures). Now, add graphene into the mix (which is 2 to 3x better than diamond in every respect), at least in some kind of hybrid form, and voila. But of course, the market will first introduce a possible silicon/diamond hybrid, followed by a full-blown diamond computer, and then of course a diamond/graphene hybrid, before they finally switch over to graphene entirely. Well, maybe it won't take too long... but given how the market operates, coupled with planned obsolescence (unless we change the economic model and force them to provide the best of the best as soon as it's available, with upgrades in mind, that doesn't break down after short-term use and can be fully recycled), then I guess we can start to see some real leaps.[/citation]

I'm no expert, and I don't really support this, but doesn't planned obsolescence advance technology faster, since consumers continually have to buy newer technology?
 

ouroborous

Distinguished
Sep 13, 2008
Day 1: Quantum computing becomes a reality!
Day 2: TSP finally solved!

/snark

In reality, qubits are NOT for GP (general purpose) computing. Because of their superposition and probabilistic attributes, it would be exceedingly hard to run most traditional code on a qubit system. And this isn't just a "software sucks" rant, it's an acknowledgement of the truly different way in which a qubit system works (processing all possible states at once, in a superposition) vs. a standard silicon logic system (linear processing using logic units coupled with a numeric processor).

That being said, in systems that model the real world, qubits may be a breakthrough of truly revolutionary proportions, on the order of the invention and miniaturization of the transistor itself. For instance, given a large enough system, qubits could dramatically improve climate forecasting or pharmacological research. And the area where I'm TRULY excited about qubit technology is AI -- the probabilistic nature mimics some aspects of neuron activity. Couple that with memristance (for circuits that "learn") and some clever research, and Kurzweil's "singularity" begins to become (frighteningly?) real.
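
On the probabilistic point, here's a quick hand-rolled NumPy sketch (made-up amplitudes, purely illustrative): measurement picks one outcome at random, weighted by the squared amplitudes, so a single run gives one answer and the distribution only shows up across many repetitions.

[code]
import numpy as np

rng = np.random.default_rng(0)

# A made-up, unequal superposition over the two-qubit basis
# states |00>, |01>, |10>, |11>.
amps = np.array([0.8, 0.4, 0.4, 0.2])
amps /= np.linalg.norm(amps)   # quantum states are normalized

# Born rule: outcome k appears with probability |amplitude_k|^2.
probs = amps ** 2
samples = rng.choice(["00", "01", "10", "11"], size=10_000, p=probs)

# One run = one bitstring; the distribution (0.64, 0.16, 0.16, 0.04)
# only emerges over many repeated runs.
for outcome in ("00", "01", "10", "11"):
    print(outcome, round(float(np.mean(samples == outcome)), 3))
[/code]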
 

wiyosaya

Distinguished
Apr 12, 2006
From what is said in the article, I am not getting that IBM is saying "quantum computers are close," especially since they say "in the future." Without a hard timeline, they are nowhere near saying quantum computers are close, IMHO. And even hard timelines, such as, say, "in 20 years," are wild-ass guesses at best. Scientists working on controlled fusion have been saying "in 20 years" for more than 20 years.

Show me the practical, working quantum computer prototype, and then I'll agree that "quantum computers are close."
 
Guest

Heck, IBM Almaden Research proved and published years ago that quantum computing actually works. Interestingly enough, that research was part of what is needed to break RSA, which a quantum computer could break instantly. It's going to be a long, long time before we see this stuff commercially, since so much of today's so-called 'security' would be vulnerable. But then, you could certainly imagine governments wanting to have them badly (and likely some already exist). You could also imagine that once commercially viable, their use would be severely restricted or purposely limited. Similar to encryption some years ago, which most likely was only allowed for common people to use after governments had the ability to compromise the schemes.
 

minchu0647

Honorable
Jun 7, 2012
[citation][nom]cookoy[/nom]Hard to imagine a thing can have 2 different values at the same time. Like you deposit $1000 in your savings account and the receipt prints you deposited $0. Easy come easy go.[/citation]
lol, more like true and false existing at the same time. Very confusing.
 

ashinms

Honorable
Feb 19, 2012
[citation][nom]capt_taco[/nom]I'm totally psyched about quantum processors, then I can have a CPU that exists and doesn't exist at the same time.holy crap, I better patent that shit so I can sue everyone in a few years.[/citation]
Wow. We're on the verge of creating a computer that will help us unravel the secrets of the universe, and all you can think about is playing a half-decade-old game... -_-
 

deksman

Distinguished
Aug 29, 2011
[citation][nom]phyrexiancure[/nom]I'm no expert or don't really support this but doesn't planned obsolescence advance technology faster since consumers continually have to buy newer technology.[/citation]

No.
For one thing, planned obsolescence forces manufacturers to create marginally better revisions of mostly the same technology.
For another, it also creates enormous waste.
Instead of designing computers to be completely upgradeable with standardized interfaces, and creating 'the best that we can possibly accomplish with today's technology in the most efficient capacity' (while being sustainable and minimizing our footprint), we make computers so they break down after a specific period of time... or become slow because something 'faster' comes out.

But the 'faster'/more powerful bit is a mere revision of existing technology.
They could have incorporated those changes (and more) from the get-go, instead of putting out small revisions every 12 to 24 months.
The 'consume and discard' model works for profits... but not for technological progression.

We are using 'cheap' materials and means of production instead of the best synthetic materials and means of production we could deploy in abundance with the highest possible technical efficiency.

Cost efficiency = technical inefficiency.

If we were to build the best that we can from technology, to last, to be upgradeable, and to be highly efficient, profits would plummet.
 

freggo

Distinguished
Nov 22, 2008
[citation][nom]capt_taco[/nom]I'm totally psyched about quantum processors, then I can have a CPU that exists and doesn't exist at the same time.holy crap, I better patent that shit so I can sue everyone in a few years.[/citation]

Yeah, we will all have a quantum CPU; we just won't be able to pinpoint exactly where it is :)
 

GogogoStopSTOP

Honorable
May 20, 2013
Quantum-Swanntum... I've been searching for days for an explanation of "QC" & can't find an intelligible one anywhere.
In another life, I had an organization that shared a floor with a Josephson junction research and product group at IBM's development facility in East Fishkill, NY. My group kept expanding but had to move into trailers because the Josephson junction people couldn't be moved. They were there for tens of years with hundreds of millions of dollars to spend. Where are they now?
Josephson Junction? At least you could see & understand a Josephson Junction... but they never came close to a product. They're gone, forgotten, Ka-Putt, Nada, Zilch...
Can anyone draw me a circuit diagram of a "QC?"
 