Question: Looking for alternatives

omar80747326

Honorable
Dec 16, 2017
Since new generations of hardware (CPUs, GPUs, etc.) are going to be more expensive, why aren't companies moving to new chip designs and architecture families like ARM, or designing new stuff from scratch? Why is nobody talking about the transition to photonics, or at least integrating lasers into modern chips? Where are the answers to all this?
 
Since new generations of hardware (CPUs, GPUs, etc.) are going to be more expensive, why aren't companies moving to new chip designs and architecture families like ARM, or designing new stuff from scratch? Why is nobody talking about the transition to photonics, or at least integrating lasers into modern chips? Where are the answers to all this?
Too early and too expensive. And then there's also quantum computing, DNA storage...
 

Eximo

Titan
Ambassador
I think you are under the assumption that new would be cheaper? Not at first. Establishing new standards takes capital and the expectation of losses.

Logistical needs and fulfillment take a long time to establish and optimize. There is enough momentum behind silicon-based lithography and TSMC/Samsung/Intel that you will be seeing this stuff for decades.

ARM is simply a different way to design a CPU. ARM chips are not necessarily significantly cheaper than cheap x86 offerings; they are cheap because they are mass-produced to an extreme degree. The top-end ARM processors are certainly larger and more expensive, but the older ARM cores are still being made, and as process nodes shrink, each chip gets cheaper, so they can be sold at very low prices. In effect, a combination of old design and new manufacturing is the most cost-effective.
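To make that concrete, here is a rough Python sketch of the economics. The wafer cost, edge loss, and defect density are invented numbers, picked only to show why a small, older core shrunk onto a mature process ends up dirt cheap per chip:

```python
import math

# Toy model of why older, smaller cores get cheap on newer nodes.
# All numbers are illustrative assumptions, not real foundry figures.
WAFER_COST = 10_000.0      # assumed price of one 300 mm wafer, in dollars
WAFER_DIAMETER_MM = 300

def dies_per_wafer(die_area_mm2: float) -> int:
    """Crude estimate: usable wafer area divided by die area,
    minus ~15% for edge and scribe-line loss."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    return int(wafer_area / die_area_mm2 * 0.85)

def cost_per_good_die(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Simple Poisson yield model: yield = exp(-defect density * die area)."""
    yield_rate = math.exp(-defects_per_mm2 * die_area_mm2)
    return WAFER_COST / (dies_per_wafer(die_area_mm2) * yield_rate)

# A small, older core shrunk onto a new node vs. a big modern die.
for area in (10, 50, 200):
    print(f"{area:>3} mm^2 die -> ${cost_per_good_die(area, 0.001):.2f} per good die")
```

With these toy numbers the small die costs well under two dollars while the big one costs around forty, which is the whole story of cheap ARM cores in one loop.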

Switching to something like optical computing would mean a complete change in the chemical and physical construction process: new production lines, new sources of supply, and you have to convince everyone it is worth it over existing solutions. High-speed optical computing is still a ways off; there have been promising lab demos for a while, but nothing on the scale of mass production. The most recent information shows a way to build logic into crystal structures that attenuate lasers in different ways, with sensors on the other side recording the results. So the concept is good. Nanoscale lasers were invented about a decade ago. It is all coming together, but it is still going to be a while.

Lasers are already integrated into (and essential to) computing in the form of fiber optics. More directly, opto-isolators can be found in circuits all the time. These let you keep electrical circuits separate while still passing data or commands through, or use the link for sensing.

Quantum computing is still energy-intensive and non-portable, so not a good solution. Quantum will likely make its way to major chunks of the internet before it makes it into personal devices. Still decades off. That would mostly be for security, and not actual end-user computation. There are very few tasks a quantum computer can perform well that will translate to end-user performance in software.

Carbon nanotubes / graphene are also promising, with P- and N-type materials possible just by different twists in the structure, by doping graphene sheets with specific elements, or, the coolest one, by laying two graphene sheets on top of each other at different angles. Mass production is the main problem. Right now it is all about inducing nanotubes to form, locating them with an electron microscope, analyzing their properties, harvesting them, applying them to an experiment, and testing; harvesting graphene sheets and arranging them works the same way. They can't just lay out whatever structures they want the way lithography allows, so anything functional made with nanotubes and graphene is basically a one-off. That becoming common is many decades away; they first have to build the tools that allow construction. Most nanotube use you see today exploits their physical properties rather than their electrical ones.

RISC (ARM, PowerPC, RISC-V, etc.) and CISC (x86, x86-64) are basically the only options right now. The software exists for them, so transitioning to an easier-to-build design would take a lot of effort. There are some off-the-wall ideas out there, like zero instruction set computing, which should eliminate the need for machine code, but that is more down the path of machine learning, where the CPU basically tries until it gets the correct answer, optimizes, and then locks that in. Useful for certain tasks and applications, but not so good for general computing. Most of the rest are just variations, or the answer is again RISC or CISC in a slightly different variant.
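For illustration, here is a toy Python sketch of the load/store distinction between the two camps. The "instructions" in the comments are invented for this sketch and do not correspond to any real ISA:

```python
# Toy illustration of the RISC (load/store) vs CISC (memory-operand) split.
# Instruction names in the comments are invented, not from any real ISA.
memory = {"x": 5, "y": 7, "z": 0}
regs = {"r1": 0, "r2": 0}

# CISC-style: one instruction may read memory, compute, and write back at once.
def cisc_add_mem(dst: str, src: str) -> None:
    """ADD [dst], [src]  -- memory-to-memory add in a single instruction."""
    memory[dst] += memory[src]

# RISC-style: only loads/stores touch memory; arithmetic is register-only.
def risc_program() -> None:
    regs["r1"] = memory["x"]              # LOAD  r1, [x]
    regs["r2"] = memory["y"]              # LOAD  r2, [y]
    regs["r1"] = regs["r1"] + regs["r2"]  # ADD   r1, r1, r2
    memory["z"] = regs["r1"]              # STORE [z], r1

cisc_add_mem("x", "y")   # x = 12 in one "complex" instruction
risc_program()           # z = 19 in four "simple" instructions
print(memory)
```

Same result either way; the trade-off is one complex decoder versus more, simpler instructions that are easier to pipeline.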

The way things are going, tighter integration is the path forward for now. Chiplets, universal substrates, and die stacking should make for extremely dense portable electronics.
 

Aeacus

Titan
Ambassador
why aren't companies moving to new chip designs and architecture families

Every new generation of GPUs from Nvidia is a new architecture.
E.g. Maxwell (GTX 900 series), Pascal (GTX 10 series), Turing (GTX 16 series and RTX 20 series), Ampere (RTX 30 series), Ada Lovelace (RTX 40 series).

Same with Intel and their CPUs.
E.g. Skylake (6th gen), Kaby Lake (7th gen), Coffee Lake (8th gen), Coffee Lake Refresh (9th gen), Comet Lake (10th gen), Rocket Lake (11th gen), Alder Lake (12th gen).

Each new architecture is an improvement on the previous one, while the underlying component is still the same: a semiconductor.

Why is nobody talking about the transition to photonics, or at least integrating lasers into modern chips?

Besides costing an arm and a leg, and then some, what practical use would photonics or lasers have within semiconductors?

Back in the old days, we had vacuum tubes for controlling electric current flow between electrodes in a high vacuum. Now we have semiconductors, which conduct electric current in the solid state, rather than as free electrons across a vacuum or as free electrons and ions through an ionized gas.

I don't see how air (or a laser), which is a poor conductor of electricity, could be implemented into semiconductors, or completely replace them (as semiconductors replaced vacuum tubes). Futuristic tech in science fiction is fancy, but it is still fiction and not reality.
 

Eximo

Titan
Ambassador
I don't see how air (or a laser), which is a poor conductor of electricity, could be implemented into semiconductors, or completely replace them (as semiconductors replaced vacuum tubes). Futuristic tech in science fiction is fancy, but it is still fiction and not reality.

Photonics has long been considered a long-term replacement for integrated circuits.

As they shrink the nodes they run into issues with unintended capacitance, noise, and heat generation.
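One way to see the heat problem is the standard back-of-envelope dynamic power formula, P ≈ αCV²f. The numbers in the sketch below are assumptions picked only to show the shape of the problem, not measurements of any real chip:

```python
# Back-of-envelope dynamic switching power: P ~ alpha * C * V^2 * f.
# All values are illustrative assumptions, not measurements of any real chip.
alpha = 0.2    # activity factor: fraction of the capacitance switching per cycle
C = 1e-9       # total switched capacitance in farads (assumed)
V = 1.0        # supply voltage in volts
f = 4e9        # clock frequency in hertz

power_watts = alpha * C * V**2 * f
print(f"Dynamic power: {power_watts:.1f} W")   # 0.8 W for these toy numbers
```

Every unintended bit of capacitance adds straight into that C term, and pushing f higher multiplies the damage, which is why heat gets worse as nodes shrink and clocks climb.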

Quite a simple concept, really. You take a laser, which is a single color, and you aim it at a beam splitter; now you have two beams. You can detect the difference in distance traveled to represent ones and zeros. So it could be as simple as a series of mirrors and sensors inside a vacuum. These can act as transistors: you can have laser amplification to act like a traditional transistor used in amplification, and you can put in laser dumps to kill signals off completely (which could be another logical way to do it).
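As a toy model of that idea, the Python sketch below encodes a bit in the phase of each beam (which is what a path-length difference produces) and thresholds the combined intensity at a detector. Real interferometric logic is far more involved; this encoding is just one illustrative choice:

```python
import cmath
import math

def beam(bit: int) -> complex:
    """Unit-amplitude optical field with phase 0 (bit = 0) or pi (bit = 1),
    as if one path were half a wavelength longer than the other."""
    return cmath.exp(1j * math.pi * bit)

def detect(a: int, b: int) -> int:
    """Combine two beams at a sensor and threshold the intensity.
    In-phase beams interfere constructively (intensity 4);
    out-of-phase beams cancel (intensity 0)."""
    intensity = abs(beam(a) + beam(b)) ** 2
    return 1 if intensity > 2 else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", detect(a, b))   # prints the truth table of XNOR
```

Mirrors, splitters, and a thresholded sensor are enough to get a complete logic gate out of nothing but path lengths.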

A good example of some photonic technology in action is DLP projectors. They manipulate tiny mirrors on a silicon chip, so it has always been possible to integrate the two technologies.

More recent scientific discoveries have found better ways. I mentioned nanoscale lasers earlier. These are a more efficient light source than LEDs, and they figured out how to do RGB on a single substrate to make white light, or they could just point one at a phosphor, as many white LEDs do today. That is just lighting. But the key is that they can manufacture tiny lasers, really small ones, in multiple colors, which means you could have as many logical energy states as you have spectrum differentiation in sensors.
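The payoff is easy to quantify: if a sensor can distinguish k wavelengths, each detected symbol carries log2(k) bits instead of one. The wavelength counts below are assumptions for illustration:

```python
import math

# If a sensor can tell k wavelengths apart, one detected "symbol" carries
# log2(k) bits instead of 1. The counts below are illustrative assumptions.
for k in (2, 4, 8, 16):
    print(f"{k:>2} distinguishable wavelengths -> {math.log2(k):.0f} bits per symbol")
```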

A very recent article covered the engineering of a working prototype of crystal logic. Using a laser, they burned properties into a crystal such that a weaker laser shone through it is attenuated down different paths. This has been shown to act as logic gates. And the best part: it is 3D from the start. So you can make circuits in a quartz substrate just by using a laser; ease of mass production is baked in. The interesting part is connecting all the optical logic up to your traditional electronics, so you need very precise arrays of optical sensors. We have these things called CCDs, which we all carry around in our pockets; they convert light into digital signals. All you have to do is sandwich a layer of that onto the outputs of your optical computing and you have yourself a photonic processor.
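As a sketch of how thresholded attenuation can behave like a gate, here is a toy Python model. The attenuation factors and the sensor threshold are invented for illustration, not taken from the article:

```python
# Toy model of attenuation-based optical logic: each input gates how much of
# a probe beam survives its stretch of the path, and a thresholded sensor
# reads the surviving intensity as a bit. All factors here are invented.
THRESHOLD = 0.3   # sensor reads 1 above this intensity (assumed)

def crystal_and(a: int, b: int) -> int:
    """AND gate: enough light survives only when both inputs are on."""
    probe = 1.0
    # Each region passes 70% of the light when its input is on, 10% when off.
    out = probe * (0.7 if a else 0.1) * (0.7 if b else 0.1)
    return 1 if out > THRESHOLD else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", crystal_and(a, b))   # prints the truth table of AND
```

Chain enough of those thresholded paths together in three dimensions and you have the makings of a circuit, with the CCD-style sensor layer doing the final optical-to-electrical conversion.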

I should add that IBM has a whole group of people dedicated to photonic computing research. What they are doing is even cooler than what I suggested.
 