Good news! Possibly a breakthrough!
"Processors that use light instead of electricity show promise as a faster and more energy-efficient way to implement AI. So far they’ve only been used to run models that have already been trained, but new research has demonstrated the ability to train AI on an optical chip for the first time. ...
They are particularly promising for running AI because they are very efficient at carrying out matrix multiplications—a key calculation at the heart of all deep-learning models. ...
What sets the new chip apart, though, is that it also has light sources and light detectors at both ends, allowing signals to pass forward and backward through the network. It also features small “taps” at each node in the network that siphon off a small amount of the light signal, redirecting it to an infrared camera that measures light intensities. Together, these changes make it possible to implement the optical backpropagation algorithm. The researchers showed that they could train a simple neural network to label points on a graph based on their position with an accuracy of up to 98 percent, which is comparable to conventional approaches. ..."
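To make the "matrix multiplications" point concrete: every layer of a deep-learning model multiplies its input by a weight matrix and then applies a nonlinearity, and that multiply is exactly the step a photonic mesh performs in optics. A minimal NumPy sketch of the digital version (illustrative names only, not the chip's actual operation):

```python
import numpy as np

# One dense layer: a matrix-vector product followed by a nonlinearity.
# On a photonic chip, the W @ x step is the part carried out in optics.
def dense_layer(W, x):
    z = W @ x           # the matrix multiplication the article refers to
    return np.tanh(z)   # nonlinearity applied between layers

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))   # 4x4 weights, matching a four-port chip
x = rng.normal(size=4)        # input vector ("optically encoded" on the chip)
print(dense_layer(W, x))
```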
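The "taps" matter because backpropagation needs the intermediate value at every node: the backward pass combines an error signal travelling in reverse with the cached forward values to form weight gradients. Below is a toy digital version of that bookkeeping, standard two-layer backpropagation in NumPy; the chip instead recovers the equivalent gradient information optically from the tapped light intensities, which this sketch does not attempt to model:

```python
import numpy as np

def forward(params, x):
    """Forward pass, caching the per-node values that backprop will need
    (the role the optical 'taps' play on the chip)."""
    W1, W2 = params
    z1 = W1 @ x
    a1 = np.tanh(z1)
    z2 = W2 @ a1
    return z2, (x, z1, a1)

def backward(params, cache, grad_out):
    """Backward pass: propagate the error signal back through the network
    and build weight gradients from it plus the cached forward values."""
    W1, W2 = params
    x, z1, a1 = cache
    dW2 = np.outer(grad_out, a1)
    da1 = W2.T @ grad_out                   # error signal travelling backward
    dz1 = da1 * (1.0 - np.tanh(z1) ** 2)    # back through the nonlinearity
    dW1 = np.outer(dz1, x)
    return dW1, dW2

# One tiny gradient-descent step on a made-up regression target (illustration only).
rng = np.random.default_rng(1)
params = [rng.normal(size=(4, 4)) * 0.5, rng.normal(size=(2, 4)) * 0.5]
x, target = rng.normal(size=4), rng.normal(size=2)

y, cache = forward(params, x)
grad_out = y - target                       # gradient of 0.5 * ||y - target||^2
grads = backward(params, cache, grad_out)
params = [W - 0.1 * dW for W, dW in zip(params, grads)]
```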
From the editorial summary and abstract:
"Editor’s summary
Commercial applications of machine learning (ML) are associated with exponentially increasing energy costs, requiring the development of energy-efficient analog alternatives. Many conventional ML methods use digital backpropagation for neural network training, which is a computationally expensive task. Pai et al. designed a photonic neural network chip to allow efficient and feasible in situ backpropagation training by monitoring optical power passing either forward or backward through each waveguide segment of the chip ... The presented proof-of-principle experimental realization of on-chip backpropagation training demonstrates one of the ways that ML could fundamentally change in the future, with most of the computation taking place optically. ...
Abstract
Integrated photonic neural networks provide a promising platform for energy-efficient, high-throughput machine learning with extensive scientific and commercial applications. Photonic neural networks efficiently transform optically encoded inputs using Mach-Zehnder interferometer mesh networks interleaved with nonlinearities. We experimentally trained a three-layer, four-port silicon photonic neural network with programmable phase shifters and optical power monitoring to solve classification tasks using “in situ backpropagation,” a photonic analog of the most popular method to train conventional neural networks. We measured backpropagated gradients for phase-shifter voltages by interfering forward- and backward-propagating light and simulated in situ backpropagation for 64-port photonic neural networks trained on MNIST image recognition given errors. All experiments performed comparably to digital simulations (>94% test accuracy), and energy scaling analysis indicated a route to scalable machine learning."
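The "Mach-Zehnder interferometer mesh" in the abstract is how a photonic chip realizes a programmable matrix: each MZI is a 2×2 unitary set by two phase shifters, and a mesh of them composes into a larger unitary acting on the optical inputs. The sketch below uses one common MZI convention (two ideal 50:50 couplers around the programmable phases; exact conventions differ between papers), and it takes the gradient of a loss with respect to a phase by finite differences purely as a numerical stand-in for the interference-based in situ measurement the authors perform on-chip:

```python
import numpy as np

COUPLER = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # ideal 50:50 directional coupler

def mzi(theta, phi):
    """2x2 Mach-Zehnder interferometer: an internal phase between two couplers
    and an external phase at the output (one common convention)."""
    internal = np.diag([np.exp(1j * theta), 1.0])
    external = np.diag([np.exp(1j * phi), 1.0])
    return external @ COUPLER @ internal @ COUPLER

def mesh(thetas, phis, n=4):
    """Compose MZIs acting on neighbouring port pairs into an n-port unitary."""
    U = np.eye(n, dtype=complex)
    k = 0
    for layer in range(n):
        for i in range(layer % 2, n - 1, 2):
            T = np.eye(n, dtype=complex)
            T[i:i + 2, i:i + 2] = mzi(thetas[k], phis[k])
            U = T @ U
            k += 1
    return U

rng = np.random.default_rng(2)
n_mzi = 6   # the loop above places n*(n-1)/2 = 6 MZIs for n = 4
thetas = rng.uniform(0, 2 * np.pi, n_mzi)
phis = rng.uniform(0, 2 * np.pi, n_mzi)

U = mesh(thetas, phis)
print(np.allclose(U.conj().T @ U, np.eye(4)))   # the mesh is unitary: True

# Loss of the transformed field against a target, and a finite-difference gradient
# for one phase shifter (stand-in for the optically measured gradient).
x = rng.normal(size=4) + 1j * rng.normal(size=4)
target = rng.normal(size=4) + 1j * rng.normal(size=4)

def loss(th):
    return np.sum(np.abs(mesh(th, phis) @ x - target) ** 2)

eps = 1e-6
bumped = thetas.copy()
bumped[0] += eps
print((loss(bumped) - loss(thetas)) / eps)      # dL/d(theta_0)
```

In the paper, that gradient is instead obtained physically, by interfering forward- and backward-propagating light and reading the monitored optical power at each phase shifter, so the expensive part of training happens in the optics rather than in a digital processor.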