Solving Equations at the Speed of Light!

Ever since the advent of the digital age, humans have been both troubled and fascinated by the question of whether computers can become more powerful than we are. Speaking of power, Moore's law famously observes that the number of transistors on a chip, and with it the processing power, doubles roughly every two years. Following this simple law, computers should win the race eventually. But, as it turns out, this trajectory has plateaued in recent years, suggesting that we may be close to the practical limit of transistor miniaturization. Moreover, conventional computers still face the 'von Neumann bottleneck,' which limits the rate at which data can move between the processor and memory.

Fig-1: Moore’s Law (credit: www.arm.com)

Researchers are searching for new technologies to overcome these restrictions, and photonic (or optical) computing is one of the most promising approaches. One example is the Photonic Accelerator, or PAXEL, developed by a group of researchers in Japan: a special type of processor designed to sidestep the limits of Moore's law and increase the speed and efficiency of calculations.

As the name suggests, optical computing uses light (photons) instead of electricity to carry out data processing and storage. Much as in optical-fiber communication, moving data with light can boost throughput dramatically: instead of thousands of electrical wires, a single optical path can carry many signals at once, separated elegantly by wavelength, a technique known as wavelength-division multiplexing (WDM).
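To get a feel for what wavelength-division multiplexing buys, here is a minimal back-of-the-envelope sketch in Python; the channel count and per-channel rate are illustrative assumptions, not the specifications of any particular system.

```python
# Back-of-the-envelope: aggregate capacity of a wavelength-multiplexed link.
# Channel count and per-channel rate are illustrative assumptions.
num_channels = 80              # distinct wavelengths carried on one fiber (assumed)
rate_per_channel_gbps = 100    # data rate per wavelength, Gbit/s (assumed)

aggregate_gbps = num_channels * rate_per_channel_gbps
print(f"One fiber, {num_channels} wavelengths -> {aggregate_gbps} Gbit/s aggregate")
# An electrical bus would need a separate physical wire (or wire pair)
# for each of those channels to match the same aggregate rate.
```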

Most research groups are not trying to replace conventional computers outright but to build optoelectronic computers: hybrids that integrate photonic elements with contemporary silicon-based components, and that are far more pragmatic than purely optical machines. Because such a hybrid still processes binary data electronically, a significant fraction of energy is lost converting electrical signals into photons and back again, which slows the overall transfer of information. Even so, this approach offers the best short-term prospects for commercial optical computing. In a pure optical computer, by contrast, data would travel at nearly the speed of light without any conversion, allowing blazing-fast computation.

Currently, optics is used mostly to link portions of computers, or in devices that already have an optical application or component. Considerable progress has been made here: optical signal processors have been used successfully for synthetic-aperture radar, optical pattern recognition, optical image processing, fingerprint enhancement, and optical spectrum analysis.

One of the many applications for optical-computing technology is board-to-board interconnects on large-scale parallel processors. In such interconnects, smart-pixel arrays are driven by electronic commands from the parallel processor but use optical switching in free space to make the connections. A single lens focuses a square array of light sources, such as LEDs or lasers, onto a smart-pixel array that is modulated with the output of one board. A second lens then images that array's output onto a detector array attached to the second board, making the link the optical equivalent of a wired connection. A toy model of this mapping is sketched below.
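As a rough mental model (not drawn from the article's sources), the free-space link can be thought of as imaging one two-dimensional bit pattern directly onto another; the array size and the ideal, loss-free optics in this NumPy sketch are assumptions made purely for illustration.

```python
import numpy as np

# Toy model of the free-space board-to-board link: an 8x8 array of sources on
# board 1 is imaged, pixel for pixel, onto an 8x8 detector array on board 2.
# Array size and ideal (loss-free, perfectly aligned) optics are assumptions.
rng = np.random.default_rng(0)
board1_bits = rng.integers(0, 2, size=(8, 8))     # data driven onto the source array
source_power = board1_bits.astype(float)          # 1 = emitter on, 0 = off

# Ideal imaging: each source lands on exactly one detector, so the
# interconnect is simply an identity mapping of the 2-D light pattern.
detector_power = source_power.copy()

board2_bits = (detector_power > 0.5).astype(int)  # threshold at the receiver
assert np.array_equal(board1_bits, board2_bits)   # behaves like a wired bus
```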

Fig-2: Schematic of an optical computing setup (credits: IEEE Spectrum)

An entirely different approach attempts to design optical computers from scratch, exploiting from the outset the ability of coherent light to address millions of pixels of data simultaneously. Some researchers are also trying to encode computations as holographic interference patterns. Instead of running many parallel trains of sequential steps and producing intermediate answers on the way to the final output, this scheme lets the interference pattern of photons passing through the holograms determine the final solution in a single pass, potentially achieving both high speed and low-energy operation.
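The "single pass" idea can be made concrete with a small NumPy sketch: the field arriving at each detector is a weighted sum (an interference) of all input pixels, which is mathematically one matrix-vector product. The sizes and the random mask below are illustrative assumptions, not a description of any actual holographic processor.

```python
import numpy as np

# Light passing through a hologram-like mask interferes so that each output
# spot receives a weighted sum of every input pixel. Electronically that is a
# matrix-vector product; optically it happens in one pass to the detectors.
# Problem sizes and the random mask are assumptions for illustration only.
rng = np.random.default_rng(1)
n_inputs, n_outputs = 65_536, 16                   # many pixels in, a few answers out
input_field = rng.standard_normal(n_inputs)        # input light field (real-valued here)

hologram = rng.standard_normal((n_outputs, n_inputs)) / np.sqrt(n_inputs)

# n_outputs * n_inputs multiply-adds on a CPU; one propagation step optically.
output_intensity = (hologram @ input_field) ** 2   # detectors measure intensity
print(output_intensity)
```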

While optical computing's theoretical advantages are numerous, such promises have been looming on the horizon for decades. One of the biggest obstacles is that an optical network cannot store information: it is inherently volatile, so it can only be used for data transfer or real-time computation. Moreover, optical computers lag behind their electrical counterparts in cost-to-performance ratio. Overcoming this would probably require a significant stagnation in electronic transistor development, which would in turn justify extensive research into scalable optical transistor designs and data-storage solutions. Such a standstill may lie ahead for electronic computing, but we aren't quite there yet.

Since most systems work on electrical signals, their output would have to be converted before it could be transmitted over an all-optical network. This pre-processing costs both energy and ancillary circuitry, which can compromise performance. Switching from photons to electrons also raises the power drain substantially: electrons move quickly through a wire, but the signal dissipates energy along the way to its destination. Converting signals back and forth between photons and electrons is therefore inefficient in both time and energy. Optical transistors could offer much higher bandwidth than electronic ones, but they still require further research breakthroughs.
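A rough, order-of-magnitude sketch makes the conversion penalty visible; every energy figure below is a placeholder chosen only for illustration, not a measured value for any real device.

```python
# Rough illustration of why repeated electro-optical conversion hurts.
# All per-bit energy figures are placeholder assumptions, not device data.
bits = 1e9                       # number of bits moved
e_optical_per_bit = 0.1e-12      # J/bit to keep the signal optical (assumed)
e_conversion_per_bit = 1.0e-12   # J/bit for each O/E or E/O conversion (assumed)
conversions = 4                  # e.g. E->O at the sender, O->E->O at a hop, O->E at the receiver

all_optical = bits * e_optical_per_bit
with_conversions = all_optical + bits * conversions * e_conversion_per_bit
print(f"all-optical path:    {all_optical:.2e} J")
print(f"with {conversions} conversions:   {with_conversions:.2e} J")
```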

While optical fibers are effective for long-distance transmission, on the scale of a processor optical waveguides are large compared with electrical traces. After all, transistors have shrunk well into the nanometer range, while visible light has a wavelength of hundreds of nanometers. Transistors are likewise much smaller than the optical crystals used in many optical transistor designs.
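The length-scale mismatch fits in a couple of lines of arithmetic; the transistor feature size and wavelength below are representative round numbers rather than figures for any specific process.

```python
# Length-scale comparison; both numbers are representative round figures.
transistor_nm = 5                            # modern transistor feature scale (assumed)
wavelength_nm = 500                          # green visible light
diffraction_limit_nm = wavelength_nm / 2     # rough lower bound for confining light

ratio = diffraction_limit_nm / transistor_nm
print(f"Optical structures are roughly {ratio:.0f}x larger than a transistor feature.")
```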

Controlling network traffic requires limiting the bandwidth available to some applications, guaranteeing minimum bandwidth to others, and marking traffic with high or low priorities. A fiber's transport capacity is greatly increased by the wavelength-division multiplexing mentioned earlier, but the resulting network is a multi-layer architecture that requires different equipment for each layer, with each layer maintained and managed independently of the others. This brings additional drawbacks, such as heavy overall overhead and partial overlap of functions like protection and management.

Spectral overlap becomes a problem when the spacing between wavelengths is reduced, that is, when channels are packed closer together to carry more signals. As bit rates increase, optical pulses get shorter, and a shorter pulse necessarily spreads over a broader range of wavelengths, causing interference between closely spaced channels. In addition, once a message is assigned a wavelength at its source node, the assignment cannot be changed at subsequent nodes, so on top of capacity blocking, such networks also experience wavelength blocking.
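A transform-limited pulse obeys a time-bandwidth product of roughly Δν·Δt ≈ 0.44 (for Gaussian pulses), so the spectral footprint can be estimated directly from the bit rate; the bit rates and the 50 GHz channel spacing in this sketch are illustrative, commonly used values rather than figures from the article's sources.

```python
# Shorter pulses spread over more spectrum: delta_nu * delta_t ~ 0.44 for
# transform-limited Gaussian pulses. Bit rates and channel spacing below are
# illustrative, commonly used values.
TBP = 0.44                    # Gaussian time-bandwidth product
channel_spacing_ghz = 50      # a common dense-WDM grid spacing (assumed)

for bit_rate_gbps in (10, 40, 100):
    pulse_width_s = 1 / (bit_rate_gbps * 1e9)        # one pulse per bit slot (simplified)
    spectral_width_ghz = TBP / pulse_width_s / 1e9
    print(f"{bit_rate_gbps:>3} Gbit/s: pulse ~{pulse_width_s * 1e12:.0f} ps, "
          f"spectrum ~{spectral_width_ghz:.0f} GHz vs {channel_spacing_ghz} GHz spacing")
```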

A high degree of channel multiplexing is an absolute necessity, largely because the bandwidth per optical interface needs to be maximized in order to amortize the cost over the minimum number of optical connections.
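The amortization argument is easy to quantify with placeholder numbers; the interface cost and per-channel rate below are arbitrary assumptions used only to show the trend.

```python
# Amortizing the fixed cost of an optical interface over more multiplexed
# channels. Cost and per-channel rate are arbitrary placeholder assumptions.
cost_per_interface = 1000.0      # fixed cost of one optical connection (arbitrary units)
rate_per_channel_gbps = 100      # per-wavelength data rate (assumed)

for channels in (1, 8, 40, 80):
    cost_per_gbps = cost_per_interface / (channels * rate_per_channel_gbps)
    print(f"{channels:>2} channels per interface -> {cost_per_gbps:.3f} units per Gbit/s")
```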

In short, there are reasons for optimism about optical computing, but one should not overestimate its near-term potential. Most of these technologies are still in the early stages of development, yet the current rate of progress suggests we may be heading toward a computing revolution. Photonics-based computing has already demonstrated potential for in-memory computing, with volatile and non-volatile capabilities in a single device. With further research and development these hurdles may be overcome and the promise of optical computing fully realized. As transistors approach their physical size limit, such innovations may soon become necessary, and the paradigm of computing as we know it could be uprooted.

 

References:

  1. https://spectrum.ieee.org/semiconductors/optoelectronics/silicon-photonics-stumbles-at-the-last-meter
  2. https://advances.sciencemag.org/content/6/5/eaay5853
  3. https://www.osapublishing.org/oe/abstract.cfm?uri=oe-21-6-7008
  4. https://scitechdaily.com/researchers-make-progress-on-a-quantum-computing-proposal/
  5. https://www.lightreading.com/wavelength-division-multiplexing-(wdm)/d/d-id/575175
  6. https://www.ciena.com/insights/what-is/What-Is-WDM.html 
  7. https://www.fiberlabs.com/glossary/about-wdm/
