New Materials Could Make Quantum Computers More Practical

Status
Not open for further replies.

manleysteele

Reputable
Jun 21, 2015
"Both Google and IBM believe we’ll reach “quantum supremacy”--the point when quantum computers will be faster than conventional computers at solving a certain type of complex problems--when quantum computers have around 50 qubits (from the fewer than 10 qubits they do now). The two companies expect this goal to be reached in the next few years."

Right. I'll wait. Right now, this tech looks like the scientific equivalent of "cold fusion".
 

Zincorium

Commendable
Apr 18, 2016
ManleySteele - at absolute worst, it's the scientific equivalent of *hot fusion*. As in, it works, period, and the challenge is scaling it up to accomplish useful functions. Our current quantum computers are like ITER or the tokamak reactors - interesting, but not good enough, and we don't know which approach is the better bet.

Cold fusion, on the other hand, is debunked and not being seriously worked on. There's no real theory that explains how it would work if it did work.
 

dstarr3

Honorable
Mar 18, 2014
We already have quantum computers that work. This is just about finding how to make them work better and with more reasonable resources. It's the same thing as traditional computers from 70 years ago that filled up a warehouse just to do basic arithmetic. We've just got to figure out the best method of optimizing and improving the new technology.
 

ethanolson

Distinguished
Jun 25, 2009
Quantum computers rely on a lot of conventional computing technology. The quantum processor is where all the fuss is, hence it being the focus of the article. It'll be interesting to see if the success pans out. I also wonder how much of the quantum thinking and math can be applied to conventional computing to modify and improve our daily platforms. Why do I wonder that, given the broad mathematical differences? Because I believe quantum behavior is rational behavior of most particles, just happening at a "frame rate" beyond our current ability to measure. Hence the idea of bi-positional qubits, to me, is just a single qubit vibrating or oscillating between two places and being sensed in both as if it were in two places at once. The qubit is just too fast... that's all. If we've found usefulness in that, think of the possibilities!
 

InvalidError

Titan
Moderator

There are countless promising future technologies that were proven in labs but never got commercialized, either because no economically viable way was found to manufacture products based on them, or because, by the time manufacturing did become viable, something else had come along and rendered them obsolete.

Nuclear fusion is one example of a technology that scientists originally thought would take only 20-30 years to research; 50 years later, a commercially viable implementation is still elusive. For quantum computers, the output is a statistical distribution of possible results, and the amount of uncertainty increases with every qubit added. So you have the challenges of packing more qubits together, reducing the amount of output noise, finding suitable room-temperature materials to make them from, finding ways to get data in and out of them, etc.
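To illustrate the "statistical distribution" point: a quantum computer returns one bitstring per run, and you only recover the underlying distribution by repeating the run many times ("shots"). A toy classical sketch in Python (not real quantum hardware; the probability value and shot count are made-up illustrations):

```python
import random

def measure_counts(p_one: float, shots: int, seed: int = 0) -> dict:
    """Repeatedly 'measure' a qubit that reads out |1> with probability p_one.

    Each shot yields a single definite outcome; the distribution only
    emerges from many repetitions, and finite shots give an estimate.
    """
    rng = random.Random(seed)
    counts = {"0": 0, "1": 0}
    for _ in range(shots):
        outcome = "1" if rng.random() < p_one else "0"
        counts[outcome] += 1
    return counts

# An ideal qubit in an equal superposition reads out 0 or 1 with
# probability 0.5 each; 1000 shots gives counts near, but not exactly, 500/500.
print(measure_counts(0.5, 1000))
```

With more qubits the number of possible bitstrings doubles per qubit, so the same shot budget spreads ever thinner over the distribution, which is one way to see why noise becomes harder to fight as qubit counts grow.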

I get the feeling that the more scientists research quantum computing, the more they'll discover there is still even more they don't know, and that commercially viable quantum computing is further away than they thought.
 

dstarr3

Honorable
Mar 18, 2014


Well, the thing is, what's the alternative? Our current processing technology won't last. Moore's Law gave up the ghost long ago as transistor counts got so high and gate sizes got so small. It's at the point now where gates can't get much smaller without needing so much error correction that the chip overall performs worse than its predecessors. So, clearly, we need something new. And, well, right now, this is the most promising technology we've stumbled on. So, let's research and develop it and see what it's worth. If it's not worth anything, oh well. Something will have been learned, at the very least. No reason not to, until some other technology appears and proves itself more promising, assuming such a thing happens.
 

InvalidError

Titan
Moderator

Moore's law still has ~10 more years to go - transistor counts are still going up, spectacularly so if you consider 3D-NAND. Logic chips could go 3D too, but we need far more power-efficient transistors first, as it would be effectively impossible to cool a ~1000W stack of 16 modern CPUs through a ~160 mm² HSF contact patch.
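The cooling argument is easy to check with back-of-the-envelope arithmetic (all figures are the post's own rough assumptions, not measured values):

```python
# Heat flux through the heatsink contact patch, single die vs. 16-die stack.
stack_power_w = 1000.0      # assumed: ~16 stacked CPU dies at full load
contact_area_mm2 = 160.0    # assumed: HSF contact patch shared by the stack
layers = 16

single_die_power_w = stack_power_w / layers           # ~62.5 W per layer

flux_single = single_die_power_w / contact_area_mm2   # heat flux, one die
flux_stack = stack_power_w / contact_area_mm2         # heat flux, full stack

print(f"single die: {flux_single:.2f} W/mm^2, stack: {flux_stack:.2f} W/mm^2")
```

The stack pushes 16x the heat flux of a single die through the same contact area (6.25 W/mm² vs ~0.39 W/mm²), which is why far lower per-transistor power would be needed before stacking logic becomes coolable.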
 

manleysteele

Reputable
Jun 21, 2015


With Intel saying they are going to introduce a new architecture after 2020, I'm wondering if we're not going to see Xeon Phi cores in a desktop processor. If so, how many and at what clock?
 

InvalidError

Titan
Moderator

I could imagine that happening in a sort of big.LITTLE mix like what is done on mobile with the A72/A53: strong cores (Cannonlake-like) to run software that depends heavily on a few threads' performance, and a bunch of simpler cores (Atom/Phi) to run less compute-intensive background tasks and massively threaded software.
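A toy sketch of that kind of placement policy in Python (core names and the compute-bound flag are hypothetical; real OS schedulers weigh load, energy, and migration cost far more carefully):

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    compute_bound: bool  # heavy single-thread work vs light background work

def place(tasks, big_cores, little_cores):
    """Assign compute-heavy tasks to strong cores and background tasks
    to efficient cores, round-robin within each pool."""
    assignment = {}
    for t in tasks:
        pool = big_cores if t.compute_bound else little_cores
        used_in_pool = sum(1 for c in assignment.values() if c in pool)
        assignment[t.name] = pool[used_in_pool % len(pool)]
    return assignment

tasks = [
    Task("game", True),      # latency-sensitive, few fast threads
    Task("indexer", False),  # background task
    Task("compile", True),
    Task("sync", False),
]
print(place(tasks, ["big0", "big1"], ["little0", "little1"]))
# → {'game': 'big0', 'indexer': 'little0', 'compile': 'big1', 'sync': 'little1'}
```

The design choice mirrors the comment: per-thread performance goes to the wide cores, throughput and background work to the small ones, so neither pool sits idle.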
 

jasonelmore

Distinguished
Aug 10, 2008
Time crystals show the most promise because they don't rely on sub-zero temps. It will be interesting to compare scalability once both lines of research mature.
 
