News: Record 1.84 Petabit/s Data Transfer Achieved With Photonic Chip, Fiber Optic Cable

George³

Oct 1, 2022
Meanwhile, the Ethernet Alliance roadmap has been stuck at a miserable 1.6 Tbps for years. The latest communication tech in production today is 800 Gbps.
 
Oct 24, 2022

Ethernet wants to be able to operate over a carrier WDM network, as used in metro Ethernet with WDM and optical add/drop multiplexers (OADMs). So one core, and ideally one wavelength. 10GbE is one 10 Gb/s wavelength using cheap colored optics. 100GbE, for example, is 4x25 Gb/s lanes spaced closely enough to fit in a 100 GHz band. So in something like a Wall Street fiber ring, a brokerage trading office can easily get 10GbE to the exchange from the local carrier or a competitive provider. Plenty of other applications, of course.
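
Quick back-of-the-envelope in Python on that lane and grid arithmetic (my own sketch; the C-band width and 100 GHz grid figures are illustrative assumptions, not from the article):

    # Aggregate Ethernet rate from parallel optical lanes, and how many
    # carriers fit on a fixed WDM grid. Numbers are illustrative only.
    def aggregate_rate_gbps(lanes: int, lane_rate_gbps: float) -> float:
        return lanes * lane_rate_gbps

    def channels_on_grid(band_ghz: float, spacing_ghz: float) -> int:
        return int(band_ghz // spacing_ghz)

    print(aggregate_rate_gbps(4, 25))     # 100GbE as 4 x 25 Gb/s lanes -> 100
    print(channels_on_grid(4400, 100))    # ~4.4 THz of C-band on a 100 GHz grid -> 44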

400GbE and up is more useful in data centers, which have been bumping into scaling limits for a decade or more.

Ethernet is not used at all as a link layer on hauls longer than metro, but it is used to hand data off to the transport layer. There are also encapsulations of Ethernet, for example 100GbE carried in ODU4, but the transport equipment sees this as ODU4 and has no idea it is carrying Ethernet inside. Some transport equipment can take 400GbE and 800GbE as an external interface. I'm not sure, but I think 100GbE is still the lowest cost per bit.
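
A rough sanity check on that ODU4 mapping (the rates below are approximate nominal values from memory, and the helper is just my illustration):

    # Does a 100GbE client fit inside the ODU4 container's payload?
    # Rates are approximate nominal values in Gb/s, for illustration only.
    CLIENT_100GBE = 103.125   # 100GBASE-R serial rate (64b/66b coded)
    OPU4_PAYLOAD = 104.36     # approximate OPU4 payload capacity inside ODU4

    def fits(client_gbps: float, payload_gbps: float) -> bool:
        return client_gbps <= payload_gbps

    print(fits(CLIENT_100GBE, OPU4_PAYLOAD))   # True: 100GbE maps into ODU4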
 
Oct 24, 2022
They tested them individually; will there be no issues with all these light waves interfering with one another?

Yeah. Look up wavelength-division multiplexing (WDM). WDM has been widely used since the 1990s. The original paper behind this article also uses the term SDM (space-division multiplexing) for the fiber's multiple cores.
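
If it helps, here's a toy numerical sketch (mine, not from the paper) of why signals at well-separated frequencies can share one medium and still be pulled apart cleanly:

    # Two "channels" at different carrier frequencies share one medium;
    # a simple FFT still shows two distinct, separable peaks.
    # Pure illustration; real systems separate channels with optical filters.
    import numpy as np

    fs = 10_000                          # sample rate (arbitrary units)
    t = np.arange(0, 1, 1 / fs)
    ch1 = np.sin(2 * np.pi * 1000 * t)   # channel at 1000 Hz
    ch2 = np.sin(2 * np.pi * 3000 * t)   # channel at 3000 Hz
    fiber = ch1 + ch2                    # both on the same "fiber"

    spectrum = np.abs(np.fft.rfft(fiber))
    freqs = np.fft.rfftfreq(len(fiber), 1 / fs)
    print(freqs[spectrum > 0.5 * spectrum.max()])   # two clean peaks: ~1000 and ~3000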
 
Oct 24, 2022
The article that this article cites doesn't go into what already exists, what is new, and what the implications are, most likely because the author doesn't have the background. So this article reflects that. The original article in Nature makes no false claims, AFAIK.

Multiple wavelengths of light on a fiber (or on a core in this type of fiber) is wavelength-division multiplexing, in use since the mid 1990s. Optical amplifiers are from a similar timeframe. Multiple cores per fiber are only applicable to very short distances (relative to the size of the earth), and the original paper in Nature points this out. The longest production DWDM (dense WDM) link as of about 2010 runs from the US west coast to Australia without stopping for signal regeneration in Hawaii. That is 6,000+ km of fiber with amplifiers at about 100 km intervals (I think). I don't know the configuration, but I think it is 160 waves at 100 Gb/s each, possibly higher. Infinera gear on either end; the details are probably still buried on their web site blog. One core, because the power needed to drive 100 km and keep signal integrity through that many amplifications would melt the fiber if it were multicore.
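
Rough math on that trans-Pacific link, using the (admittedly uncertain) figures above:

    # Aggregate capacity and amplifier span count for the long-haul example.
    # Wave count, per-wave rate, and distances are the rough figures above.
    waves = 160            # DWDM wavelengths on the single core
    per_wave_gbps = 100    # per-wavelength rate
    link_km = 6000         # US west coast to Australia, roughly
    span_km = 100          # approximate amplifier spacing

    total_tbps = waves * per_wave_gbps / 1000
    spans = link_km / span_km
    print(f"{total_tbps:.0f} Tb/s total over ~{spans:.0f} amplified spans")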

They also didn't invent photonic integrated circuits, nor are photonic integrated circuits new. Infinera was the first to use them commercially, shipping its first product in 2005. The current product does 800 Gb/s per chip pair (transmitter chip and receiver chip), soon 1.6 Tb/s, with 4 Tb/s planned. But this is for ultra long haul with a claimed 10,000 km reach. Heat is a huge problem in the transmitter chip as well as for the fiber itself.

What is new, and the paper in Nature is clear about this from the title, abstract, etc., is the source laser. They don't use a single laser per wavelength or a laser array (a small number of lasers), but rather a single polychromatic laser on a chip feeding a microcomb ring resonator. The paper in Nature points out that this works for very short distances (relative to the earth, and therefore the Internet); their demo is 7.9 km. This limitation is due to the very low power produced this way. That is not to say it has no use. It might be great for data centers, where a few buildings nearby is all the distance needed.
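
For a sense of scale, here is how the headline number could break down per channel. The core and comb-line counts below are my assumptions for illustration, not quoted from the paper:

    # Split the ~1.84 Pb/s aggregate across cores x comb lines to get a
    # per-channel rate. Core and comb-line counts are assumptions.
    total_pbps = 1.84       # headline aggregate, Pb/s
    cores = 37              # assumed core count of the multicore fiber
    comb_lines = 223        # assumed usable comb wavelengths

    channels = cores * comb_lines
    per_channel_gbps = total_pbps * 1e6 / channels   # Pb/s -> Gb/s
    print(f"{channels} channels at ~{per_channel_gbps:.0f} Gb/s each")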

It is unfortunate when a mainstream publication botches the details and imagines that something is revolutionary and that every bit of technology mentioned is new and unique (because they never heard of it), when in reality the contribution is useful and very impressive but much more limited than the publication imagines it to be. It is worse when a mainstream publication that has no idea what it is talking about becomes a cited source all over the Internet. None of this flawed reporting is intentional. (And maybe I got some facts wrong too.) It reminds me a bit of when TV news anchors tried to explain what the Internet was in 1994 and 1995, or worse yet tried to give the layman's version of how the Internet works, and completely (now famously) botched it. Lower consequences and lower exposure here. Now someone please explain this to Reddit and elsewhere.
 