Intel Joins Quantum Computer Race With 17-Qubit Research Chip


hannibal

Distinguished
Hmmm... If you're making weather forecasts and AI programs, where there isn't one single right answer to the problem, go with qubits; if you'd like to play games or use office programs, take Coffee Lake...

It would be interesting to see an article about real use cases where qubit-based quantum computers are genuinely useful and work as intended. The whole area is somewhat mystified at the moment.
 

bit_user

Polypheme
Ambassador

Nothing interactive, for one thing. It takes a fairly long time to run a single iteration.

They're good at solving optimization problems with lots of interrelated variables.

The point about using them to optimize neural networks means the deep learning revolution might get a big boost. Far more sophisticated models than currently fit might be squeezed down to run on phones, robots, self-driving cars, and even IoT devices.

Other fields it could revolutionize include materials science and microbiology (e.g. things like protein folding - no more need for Folding@Home).
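
To make "optimization problems with lots of interrelated variables" a bit more concrete, here's a minimal Python sketch of a QUBO (quadratic unconstrained binary optimization) instance, which is the kind of problem format quantum annealers accept. The weights are made up purely for illustration, and the tiny instance is solved by brute force, which obviously stops working once the variable count grows:

```python
# Toy QUBO: minimize sum_i Q[i][i]*x_i + sum_{i<j} Q[i][j]*x_i*x_j with x_i in {0, 1}.
# The coefficients below are invented for illustration only.
from itertools import product

Q = {
    (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,   # linear terms
    (0, 1): 2.0, (1, 2): 2.0, (0, 2): -0.5,     # couplings between variables
}

def energy(x):
    # Total "cost" of one assignment of the binary variables
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

# Brute-force search over all 2^3 assignments (only feasible because it's tiny)
best = min(product((0, 1), repeat=3), key=energy)
print(best, energy(best))
```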
 


"..D-Wave, the most famous quantum annealer, and universal gate quantum computing are not competitors. While they rely on the same concepts, they are useful for different tasks and different sorts of problems, while also suffering from different challenges in design and manufacturing..." https://medium.com/quantum-bits/what-s-the-difference-between-quantum-annealing-and-universal-gate-quantum-computers-c5e5099175a1

Not the same thing.

 

bit_user

Polypheme
Ambassador

Thanks. I gather they're still quite restricted in what sorts of computation they can do. That article pointed me here:

http://math.nist.gov/quantum/zoo/

For many things, I'd imagine we'll still be using classical CPU- and GPU-type architectures, especially if quantum gate computers continue to require levels of refrigeration and EMI shielding that are impractical for the typical home user.
 

AgentLozen

Distinguished
Knowing Intel, they would develop a quantum CPU that's 100x more powerful than their existing Core CPUs. Then Intel would just focus on developing the integrated graphics and rest on their laurels until AMD finally caught up.

amirite guys?
 

bit_user

Polypheme
Ambassador

No, AMD is going to get run over by the train that is quantum computing. I don't imagine they have the money to invest in it right now. More likely, they'll get bought for their CPU and GPU IP by someone who already has the QC piece.
 

alextheblue

Distinguished

That would be a massive understatement. Look up some of the criticism leveled at D-Wave's quantum hardware. Not all approaches to quantum computing are created equal. From what I've read, IBM was on the right track a long time ago, but they chose the hard-science road and so didn't get the attention and money thrown at them the way D-Wave did. Hopefully Intel and others pick up where they left off.

https://www.wired.com/2014/06/d-wave-quantum-speedup/

Anyway, even if there is a true quantum speedup, it appears to show up only in extremely limited scenarios. For most tasks, a very high-end conventional CPU will best it at far less power - not to mention GPUs or FPGAs for seriously parallel number crunching. Heck, since the quantum annealing processor is so limited in scope (as far as quantum speedup goes), you could probably build a big ASIC to compete with it, especially if you have a massive development and power budget.
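
For what it's worth, the classical baseline for that problem class is something like simulated annealing running on a plain CPU. A rough sketch (the QUBO weights and cooling schedule below are invented for illustration only) of what a conventional chip would run against the same kind of instance:

```python
# Simulated annealing over binary variables of a small, made-up QUBO instance.
# This is the classical heuristic an annealer is usually benchmarked against.
import math
import random

Q = {(0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,
     (0, 1): 2.0, (1, 2): 2.0, (0, 2): -0.5}

def energy(x):
    return sum(w * x[i] * x[j] for (i, j), w in Q.items())

def simulated_annealing(n=3, steps=5000, t_start=2.0, t_end=0.01):
    x = [random.randint(0, 1) for _ in range(n)]
    e = energy(x)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)   # geometric cooling
        i = random.randrange(n)
        x[i] ^= 1                       # propose flipping one bit
        e_new = energy(x)
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new                   # accept the move
        else:
            x[i] ^= 1                   # reject: flip the bit back
    return x, e

print(simulated_annealing())
```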

On AMD, quantum computing isn't likely to get a lot of market penetration any time soon for a huge number of reasons. Even then it will primarily complement existing conventional chips. So I wouldn't be surprised if they were able to forge alliances to build hybrid systems... if they survive the next decade or two in the conventional chip wars. Many have written them off before and yet here they are, so I hope they continue to do well for the foreseeable future.
 

bit_user

Polypheme
Ambassador

Down-voted for this. You don't use quantum annealing (QA) machines to run general-purpose code. And for the problems to which QA machines are suited, you're not going to beat a sufficiently large one with any sort of classical computer.

This is a fast-moving field. Don't put too much stock in articles more than three years old. They were probably testing a 512-qubit model, whereas D-Wave is already up to 2,000 qubits.

BTW, it was a bit hyperbolic of me to say they would get run over by QC, but you didn't actually disagree with my main point - that they hardly seem to have the money to invest in it. If QC does take over the computing landscape, any future with AMD in it will likely involve them getting bought.
 

bit_user

Polypheme
Ambassador

Thanks. I don't take it so hard. He's good about up-voting me when he agrees, so a couple down-votes don't sting too badly.

BTW, I explained why I down-voted, so it didn't just seem like tit-for-tat.
 

bit_user

Polypheme
Ambassador

Someone correct me if I'm wrong, but I think it boils down to the fact that quantum computers rely on entanglement, which so far requires extreme conditions to maintain and only scales up to dozens (or a couple thousand, in D-Wave's case) of qubits.

Compare that to classical computers, with billions of transistors that will happily operate up to near the boiling point of water in fairly RF- and EM-dense environments. So, while you can build very sophisticated logic circuitry out of semiconductors, you're practically limited to using qubits for things that only qubits are really good at doing (and that don't require very many of them).

If entanglement could be scaled up, then simple things, like data movement, could be revolutionized. Imagine being able to send data without any wire or even line-of-sight between two endpoints. This is the bedrock of quantum communications, and people are already doing it. But, what if you could scale it up to replace most of the electrical interconnects computers currently use to move data?
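
For anyone curious what the entanglement part looks like on paper, here's a tiny, library-free Python sketch: two qubits prepared in a Bell state always yield correlated measurement outcomes, even though neither outcome is fixed in advance. Purely illustrative, and a huge simplification of real hardware, which adds noise and decoherence:

```python
# Minimal state-vector illustration of a Bell pair: measuring both qubits
# always gives (0, 0) or (1, 1), never a mixed pair.
import random
from math import sqrt

def bell_state():
    # |Phi+> = (|00> + |11>) / sqrt(2); amplitudes indexed as [00, 01, 10, 11]
    return [1 / sqrt(2), 0.0, 0.0, 1 / sqrt(2)]

def measure_both(state):
    # Sample one basis state with probability |amplitude|^2
    probs = [abs(a) ** 2 for a in state]
    r, cum = random.random(), 0.0
    for idx, p in enumerate(probs):
        cum += p
        if r < cum:
            return idx >> 1, idx & 1   # (qubit A result, qubit B result)
    return 1, 1

results = [measure_both(bell_state()) for _ in range(10)]
print(results)   # every pair comes out (0, 0) or (1, 1) -- perfectly correlated
```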
 

AgentLozen

Distinguished
I don't see why a quantum computer couldn't run x86 instructions in the future. I know there's a huge issue with stability right now, but transistor-based CPUs sucked in the 1960s too. Those Cold War-era eggheads had no idea what our modern CPUs were going to look like. They were still picturing room-sized mainframes to perform even trivial tasks.

We're in a similar spot right now with quantum computers. They require extremely controlled environments to function properly. There's no way a regular household could ever afford a quantum computer, right? Right?

In other words, it's too early to make judgments about what they can and can't do.
 
