Researchers Develop Single-Photon Gate

Status
Not open for further replies.

freggo

Distinguished
Nov 22, 2008
2,019
0
19,780
That's bad news.
Once we can control the smallest amount of mass/energy, we are at a point where we can no longer improve performance and/or storage density.

We have to start research to see if we can find something smaller than a photon, and while we are at it, whatever we find had better be faster than the speed of light.

(Yes, I know that this is asking the impossible as per our current understanding of subatomic physics.)


Einstein just started rotating in his coffin :)
 

walter87

Distinguished
Jun 28, 2011
159
0
18,680
[citation][nom]freggo[/nom]That's bad news. Once we can control the smallest amount of mass/energy, we are at a point where we can no longer improve performance and/or storage density. We have to start research to see if we can find something smaller than a photon, and while we are at it, whatever we find had better be faster than the speed of light. (Yes, I know that this is asking the impossible as per our current understanding of subatomic physics.) Einstein just started rotating in his coffin :)[/citation]

You have no clue what quantum computing is, or just how large the performance gains are compared to traditional computers...

Single-photon gates will replace transistors.
 

jhansonxi

Distinguished
May 11, 2007
1,262
0
19,280
[citation][nom]walter87[/nom]You have no clue what quantum computing is, or just how large the performance gains are compared to traditional computers... Single-photon gates will replace transistors.[/citation]I think you missed the sarcasm in freggo's post.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
[citation][nom]freggo[/nom]That's bad news. Once we can control the smallest amount of mass/energy, we are at a point where we can no longer improve performance and/or storage density. We have to start research to see if we can find something smaller than a photon, and while we are at it, whatever we find had better be faster than the speed of light. (Yes, I know that this is asking the impossible as per our current understanding of subatomic physics.) Einstein just started rotating in his coffin :)[/citation]

Stop thinking 2D and think 3D. You could pile the RAM and everything else that takes up space on one layer, and put the logic that does the work on another, bringing the two closer together. That reduces the distance the data travels and increases performance. By how much, I don't know, but it's a concept at least.
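To put rough numbers on that (the distances and effective signal speed below are illustrative assumptions, not figures from the article): wire delay scales with distance, so stacking memory directly on top of the logic cuts the round trip dramatically.

[code]
# Back-of-envelope: round-trip wire delay vs. distance.
# Assumed effective on-chip signal speed: ~0.5c (illustrative guess).
C = 3.0e8          # speed of light in vacuum, m/s
v = 0.5 * C        # assumed effective propagation speed on a chip

def round_trip_ns(distance_m):
    """Round-trip signal time over a given wire length, in nanoseconds."""
    return 2 * distance_m / v * 1e9

print(round_trip_ns(0.03))    # planar board: ~3 cm away -> ~0.4 ns
print(round_trip_ns(50e-6))   # 3D stack: ~50 um away -> ~0.0007 ns
[/code]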
 
Guest

Guest
[citation][nom]freggo[/nom]That's bad news. Once we can control the smallest amount of mass/energy, we are at a point where we can no longer improve performance and/or storage density. We have to start research to see if we can find something smaller than a photon, and while we are at it, whatever we find had better be faster than the speed of light. (Yes, I know that this is asking the impossible as per our current understanding of subatomic physics.) Einstein just started rotating in his coffin :)[/citation]
Well... there is something called the Higgs theory that beats your point, and it is being tested right now at the Large Hadron Collider on the French-Swiss border.
 

azraa

Honorable
Jul 3, 2012
323
0
10,790
freggo was being sarcastic. Hence the phrase about Einstein at the end.

And anyway, why is everyone who has read about the LHC now a freakin' quantum physicist? Most of us are scientific-type people, engineers most probably, but nearly none of us has a deep understanding of particle physics. Even I, a chemistry student years ago, can hardly say that I am knowledgeable, for my understanding of quantum physics is only basic (I solved Schrodinger's equation, along with other photon-related mathematical analysis). Everything beyond that is merely knowing what the media tells us this is all about.

@topic: my mind is blown. With my limited knowledge I find this amazing, considering how little we know about actually CONTROLLING subatomic particles. This, as someone stated before, could be the discovery behind a new era of electrical nano-devices, considering a photon as a 1 and no-photon as a 0. Imagine the possibilities.
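To make that on/off idea concrete, here is a toy sketch (the encoding is purely hypothetical, not anything from the article): treat each time slot as one bit, with a detected photon as 1 and an empty slot as 0.

[code]
# Toy model: on-off keying with single photons (hypothetical encoding).
# Each time slot carries one bit: photon present = 1, absent = 0.

def encode(bits):
    """Map a bit string to a list of photon/no-photon time slots."""
    return ["photon" if b == "1" else "empty" for b in bits]

def decode(slots):
    """Recover the bit string from detector clicks."""
    return "".join("1" if s == "photon" else "0" for s in slots)

message = "1011"
slots = encode(message)
assert decode(slots) == message
print(slots)  # ['photon', 'empty', 'photon', 'photon']
[/code]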
Great article, Tom's. I love to see this kind of news; keep digging up stuff like this!
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
[citation][nom]azraa[/nom]considering a photon as a 1 and no-photon as a 0. Imagine the possibilities.[/citation]

I recall reading an article about the development of a processor that uses more values than 0 and 1, with silicon transistors. If they can increase precision enough to fit in additional values, imagine the performance if software could take advantage of it.

The only issue would be OCing. Standard processors can tolerate some voltage variation as long as the 0s don't get reported as 1s and 1s as 0s...
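Something like this, roughly (the threshold voltages below are made up for illustration, not from any real part). With four nominal levels, one line carries two bits, but the noise margin between levels shrinks:

[code]
# Hypothetical 4-level cell: one analog voltage encodes two bits.
LEVELS = {0.0: "00", 0.6: "01", 1.2: "10", 1.8: "11"}

def decode(voltage):
    """Snap a noisy voltage to the nearest nominal level."""
    nearest = min(LEVELS, key=lambda v: abs(v - voltage))
    return LEVELS[nearest]

print(decode(0.65))  # '01' - small noise is absorbed
print(decode(0.95))  # '10' - the margin is only ~0.3 V here, so noise
                     # that binary logic shrugs off can flip decoded bits
[/code]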
 

azraa

Honorable
Jul 3, 2012
323
0
10,790
I agree. And even then, we need to get our feet back on the ground: this happens near zero kelvin.
If this process develops thermal resistance (the system is tweaked in a way that makes it work closer to ambient temperature), then OC shouldn't be much of a problem, because it's just that: increasing the clock cycles per second. And I doubt this supposedly new generation of transistors, based on tiny photons, could generate MORE heat than a system where current is dissipated (electrical resistance) within a transistor.
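For reference, the usual back-of-envelope for the resistive heat mentioned above is the CMOS dynamic-power formula, P = alpha * C * V^2 * f (every number below is an illustrative guess, not a measurement):

[code]
# Classic CMOS dynamic-power estimate: P = alpha * C * V^2 * f.
alpha = 0.1     # activity factor: fraction of gates switching per cycle
C = 1.0e-9      # total switched capacitance, farads (illustrative)
V = 1.0         # supply voltage, volts
f = 3.0e9       # clock frequency, Hz

P = alpha * C * V**2 * f
print(f"{P:.1f} W")  # ~0.3 W; the V^2 term is why raising voltage
                     # for an overclock heats a chip so quickly
[/code]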
 

bustapr

Distinguished
Jan 23, 2009
1,613
0
19,780
Dudes, if this process is ever implemented into circuits, there would really be no need for OCing anymore. This would be pretty much as fast as it can get wherever it can be implemented.
 

spectrewind

Distinguished
Mar 25, 2009
446
0
18,790
[citation][nom]A Bad Day[/nom]I recall reading an article about the development of a processor that uses more values than 0 and 1, with silicon transistors. If they can increase precision enough to fit in additional values, imagine the performance if software could take advantage of it.The only issue would be OCing. Standard processors can tolerate some voltage variation as long as the 0s don't get reported as 1s and 1s as 0s...[/citation]

Link? Usage?
With there being 10 kinds of people in the world... those who read binary and those who don't... which are you? ;o)
 
blazorthon

[citation][nom]Second Paragraph of Article[/nom]However, a second laser is uses to facilitate an effect called electromagnetically induced transparency (EIT). [/citation]

[citation][nom] Last Paragraph of Article [/nom] MIT said that’ such systems "could be immune from eavesdropping when used for communication, and could also allow much more efficient processing of certain kinds of computation tasks." [/citation]

Come on, Tom's, proofreading shouldn't be so difficult that it seems no article is free of errors.
 

bustapr

Distinguished
Jan 23, 2009
1,613
0
19,780
[citation][nom]blazorthon[/nom]Come on, Tom's, proofreading shouldn't be so difficult that it seems no article is free of errors.[/citation]
Please notice the blue text hyperlink immediately below the last sentence of the article.
 


blazorthon

I know about the edit function; my screw-up there. I would've noticed it and fixed it anyway, even if the next comment after mine wasn't referring to it. Tom's doesn't seem to fix mistakes like that all too often (especially on minor articles), even after they are mentioned.
 

fuzzion

Distinguished
Feb 4, 2012
468
0
18,780
2028 is the year we shall see quantum-based computing. Yes, I am a time traveller from the future. And yes, Obama will be re-elected. And no, there will be no major world war until 2052.
 
I agree with the "More Efficient Computing" part... the rest of the setup just seems eons away from becoming something that could save the planet... more so, the journey to reach that sort of computing is itself going to cost us the planet and all its resources... I really have been finding it more and more difficult lately to come to terms with the cost we are paying today for the technology of the future... seems like a really vicious cycle...
It's not that I am anti-progress or something of that sort... but geez... "-273.15 degrees Celsius" plus the other stuff... really... they could change the climate of the Sahara with that sort of funding...
 

bombebomb

Honorable
Aug 5, 2012
105
0
10,680
[citation][nom]azraa[/nom]I agree. And even then, we need to get our feet back on the ground: this happens near zero kelvin. If this process develops thermal resistance (the system is tweaked in a way that makes it work closer to ambient temperature), then OC shouldn't be much of a problem, because it's just that: increasing the clock cycles per second. And I doubt this supposedly new generation of transistors, based on tiny photons, could generate MORE heat than a system where current is dissipated (electrical resistance) within a transistor.[/citation]
I read it also; I think it was in the IBM news post on this website. About quantum computing at that, talking about bits being half on, or half off, etc.
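That "half on, half off" talk refers to superposition: a qubit's state is a weighted mix a|0> + b|1>, and measuring it gives 0 or 1 with probabilities |a|^2 and |b|^2. A toy sketch with arbitrarily chosen amplitudes:

[code]
import math

# Toy qubit: state = a|0> + b|1>, with |a|^2 + |b|^2 = 1.
a = 1 / math.sqrt(2)   # amplitude on |0> (arbitrary illustrative choice)
b = 1 / math.sqrt(2)   # amplitude on |1>

p0, p1 = abs(a) ** 2, abs(b) ** 2
assert math.isclose(p0 + p1, 1.0)  # probabilities must sum to 1
print(p0, p1)  # 0.5 0.5 - an equal superposition: the "half on, half off" case
[/code]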
 
photonboy

*Quantum Computing is NOT around the corner.*

It's one thing to manipulate a single atom or photon with lots of hardware. It's a completely different issue to actually build a computer comprising millions of these.

How do the parts CONNECT together?
Where are the bottlenecks?

Don't expect anything in Quantum Computing to affect real-world computers in the next ten years. In fact, it's possible that computers will simply get faster through the normal, evolutionary update cycle and never, ever benefit from Quantum Computing research.
 

rebel1280

Distinguished
May 7, 2011
391
0
18,780
[citation][nom]photonboy[/nom]*Quantum Computing is NOT around the corner.* It's one thing to manipulate a single atom or photon with lots of hardware. It's a completely different issue to actually build a computer comprising millions of these. How do the parts CONNECT together? Where are the bottlenecks? Don't expect anything in Quantum Computing to affect real-world computers in the next ten years. In fact, it's possible that computers will simply get faster through the normal, evolutionary update cycle and never, ever benefit from Quantum Computing research.[/citation]
As much as it pains me to say it, you are correct. Although I would love to see QC research grow by leaps and bounds, it is by no means "around the corner", much less around the block; at this point it's on a whole different continent. Intel is talking cognitive computing, so let's just wait and see where the evolution of computers takes us.
 