Cascading photonic reservoirs with deep neural networks increases computational performance

Research output: Unpublished contribution to conference - Poster


Deep neural networks (DNNs) have been successfully applied to complex problems such as pattern recognition in big-data analysis. To achieve good computational performance, these networks are often designed with many trainable parameters, which makes them energy-intensive and time-consuming to train. In this work, we propose to preprocess the input data with a photonic reservoir instead of injecting it directly into the DNN. A photonic reservoir is a network of many randomly connected nodes that do not need to be trained. It forms an additional layer in front of the deep neural network and transforms the input data into a state in a higher-dimensional state space. This allows us to reduce the size of the DNN and, in turn, the amount of training it requires, since less backpropagation is performed for a smaller DNN.
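The cascading idea can be illustrated numerically. The sketch below is a hypothetical software analogue, not the authors' photonic hardware: an echo-state-network-style reservoir with fixed random weights maps an input sequence into a higher-dimensional state space, and only a small linear readout (standing in for the downstream trainable network) is fitted, here by ridge regression. All sizes and constants (`N_RES`, the 0.9 spectral radius, the toy sine task) are illustrative choices, not values from the poster.

```python
import numpy as np

rng = np.random.default_rng(42)

N_IN, N_RES = 1, 100
# Fixed random weights: the "reservoir" is never trained.
W_in = rng.uniform(-0.5, 0.5, size=(N_RES, N_IN))
W_res = rng.uniform(-0.5, 0.5, size=(N_RES, N_RES))
# Rescale so the spectral radius is below 1 (fading memory).
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))

def reservoir_states(u):
    """Map a 1-D input sequence to high-dimensional reservoir states."""
    x = np.zeros(N_RES)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ np.atleast_1d(u_t) + W_res @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u, y = np.sin(t[:-1]), np.sin(t[1:])

S = reservoir_states(u)
warm = 100  # discard the initial transient
S_tr, y_tr = S[warm:1500], y[warm:1500]

# Only this small readout is trained (ridge regression),
# so no backpropagation through the reservoir is needed.
ridge = 1e-6
W_out = np.linalg.solve(S_tr.T @ S_tr + ridge * np.eye(N_RES), S_tr.T @ y_tr)

pred = S[1500:] @ W_out
mse = np.mean((pred - y[1500:]) ** 2)
```

Because the reservoir weights stay fixed, training cost scales only with the small readout, mirroring the poster's claim that preprocessing with a reservoir shrinks the trainable part of the pipeline.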
Original language: English
Publication status: Published - 10 Apr 2024
Event: SPIE Photonics Europe 2024 - Strasbourg, France
Duration: 7 Apr 2024 - 11 Apr 2024


Conference: SPIE Photonics Europe 2024


