Optical computer

An optical or photonic computer uses photons, produced by lasers or light-emitting diodes, for computation instead of electrons.

It needs:
 * optical processor(s) using optical transistors that can switch at 20 GHz and beyond (at least four times faster than traditional processors)
 * optical data transfer using nanoscale fiber-optic cables that carry signals between the electronics and the photonics
 * optical storage, which can be 3D (for example, CDs at 650 MB, DVDs at 4.7 GB, and Blu-ray discs at 25 GB) or 5D (nanostructured glass that permanently records digital data using a femtosecond laser writing process, creating a "memory crystal" initially capable of storing up to 360 TB)

Optical computers address the problem of energy usage: they use far less energy than traditional electron-based von Neumann computers, which draw more power and therefore need more cooling. Like GPUs, optical computers are suited to graphics processing, ray tracing, machine learning, neural networks, and AI. They resemble quantum computers in that they excel at highly parallel linear-algebra problems, rather than at serial logic such as if-then branching or workloads such as operating systems.
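The parallel linear algebra mentioned above can be made concrete: the core operation a photonic accelerator performs in a single optical pass is a matrix-vector multiply, the workhorse of neural-network inference. The sketch below is illustrative, not vendor code; it models an ideal optical mesh as an ordinary matrix product plus a small amount of analog read-out noise (the noise scale is an assumption for illustration).

```python
import numpy as np

# Illustrative sketch: a photonic mesh is programmed with a weight
# matrix W; light amplitudes encode the input vector x, and the
# product W @ x emerges in a single pass through the optics.

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))   # weights programmed into the optical mesh
x = rng.standard_normal(4)        # input encoded as light amplitudes

# Digital reference result
digital = W @ x

# Analog optics is noisy: model the optical result as the same
# product plus small Gaussian read-out noise (assumed scale).
optical = W @ x + rng.normal(scale=1e-3, size=4)

print(np.max(np.abs(optical - digital)))  # small analog error
```

The key point is that the optics computes the whole product at once, whereas a serial digital processor would loop over rows and columns.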

A company called Lightmatter claims products up to 10 times faster than Nvidia GPUs that use 90% less energy. Some of their products are:

 * a processor called Envise, which they describe as an "AI accelerator", that uses a different color of light per core to transmit and parallelize optical signals, greatly improving speed and efficiency. It uses multiplexing and demultiplexing (mux/demux) to pack light entering the processor and unpack it as it comes out, like prisms functioning as the processor's input and output. Each color of light represents a core of the processor; Lightmatter envisions 64 of these in the near future and says they are scalable within the space of one chip.
 * Passage, a programmable photonic interconnect that lets arrays of heterogeneous chips communicate with very high bandwidth and energy efficiency
 * a software stack that interfaces with standard deep learning tools
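The per-color cores described above are an instance of wavelength-division multiplexing (WDM): each wavelength carries an independent computation through the same optical path, so N colors give N-way parallelism without extra hardware. The sketch below is a hedged toy model of that idea; the channel count and shapes are assumptions for illustration, not Envise specifications.

```python
import numpy as np

# Toy model of WDM parallelism: one independent input vector per
# "color" of light, all passing through the same programmed weight
# matrix at the same time.

rng = np.random.default_rng(1)
n_wavelengths = 8                                  # one "core" per color
W = rng.standard_normal((8, 8))                    # shared optical mesh weights
inputs = rng.standard_normal((n_wavelengths, 8))   # one vector per color

# Mux: all colors travel together and each sees the same matrix.
# Demux: a prism-like splitter separates the per-color results.
outputs = inputs @ W.T

print(outputs.shape)  # one result row per wavelength channel
```

Batching the inputs as rows of one array mirrors how the multiplexed channels share the optics: a single "pass" produces every channel's result at once.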