News Fujitsu Develops Optical Tech Unlocking 1.2 Tbps per Wavelength

... because every network engineer since the dawn of time has standardized on bits per second as the measurement to compare layer 1 speeds.
Baud was also a thing, used to separate symbol rate from bits per symbol in L1 protocols that encode more than one bit per transition. It's still used in technical documents when digging down to individual OFDM sub-carriers in modern analog protocols, but it's useless in user-facing literature, since modern OFDM-centric protocols vary both baud rate and bits per symbol depending on SNR, channel properties across a given sub-carrier's frequency band, and regulatory limits across any given sub-band.
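To spell out the relationship: bit rate is just symbol rate (baud) times bits per symbol. A quick sketch in Python; the 100 GBaud / 64-QAM numbers below are made up for illustration, not from the article:

```python
# Bit rate = symbol rate (baud) * bits per symbol.
def bit_rate(symbol_rate_baud: float, bits_per_symbol: int) -> float:
    """Return the line bit rate in bits per second."""
    return symbol_rate_baud * bits_per_symbol

# Hypothetical 64-QAM carrier: 6 bits per symbol (2**6 == 64).
rate = bit_rate(100e9, 6)  # 100 GBaud * 6 bits/symbol
print(rate / 1e9)  # -> 600.0, i.e. 600 Gbps
```

This is also why baud alone tells a user nothing: the same symbol rate gives very different bit rates as the modulation order changes.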
 
Why use Tbps to measure speed? Neither it nor Blu-rays per second has any real-world meaning. Say 153.6 GB/s and suddenly everyone can comprehend the speed. Idiots.
Do they not teach you the metric system?

Do you not comprehend how the Metric System scales units?

You should easily be able to tell the difference between <Metric Prefix>bps (bits per second) and <Metric Prefix>Bps (bytes per second).
 
Why use Tbps to measure speed? Neither it nor Blu-rays per second has any real-world meaning. Say 153.6 GB/s and suddenly everyone can comprehend the speed. Idiots.
Is your issue with using bits instead of bytes, or with using the tera prefix instead of giga? Neither complaint makes any sense, just curious. Contrary to your claim, "Tbps" very obviously has real-world meaning.
 
Say 153.6 GB/s and suddenly everyone can comprehend the speed. Idiots.

It's not 153.6 GB/s, it's 150 GB/s.

If you want to use IT gigabytes (8,589,934,592 bits, now called a gibibyte), then it would only be 139.7 GiB/s, because you are not converting from an IT terabit (1,099,511,627,776 bits, now called a tebibit) but from an actual terabit, which is 1,000,000,000,000 bits.
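To spell out the arithmetic, here is a quick sketch. The 1.2 Tbps figure is from the article; the rest is just the SI and binary unit definitions:

```python
# Unit definitions, all expressed in bits.
TERABIT = 10**12        # SI terabit (Tb)
GIGABYTE = 10**9 * 8    # SI gigabyte (GB), in bits
GIBIBYTE = 2**30 * 8    # gibibyte (GiB), in bits

link = 1.2 * TERABIT    # the article's 1.2 Tbps

print(link / GIGABYTE)  # -> 150.0 GB/s (SI terabits to SI gigabytes)
print(link / GIBIBYTE)  # ~139.7 GiB/s (SI terabits to gibibytes)

# The quoted 153.6 figure comes from mixing units: one binary
# Tb->Gb step (x1024) combined with SI bytes.
print(1.2 * 1024 / 8)   # ~153.6
```

Consistent decimal units give 150 GB/s; consistent binary byte units give about 139.7 GiB/s; 153.6 only falls out of a mixed conversion.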