Cascading photonic reservoirs with deep neural networks increases computational performance

Ian Bauwens, Guy Van Der Sande, Peter Bienstman, Guy Verschaffelt

Research output: Chapter in Book/Report/Conference proceeding › Conference paper

Abstract

Deep neural networks (DNNs) have been successfully applied to complex problems such as pattern recognition in big data. To achieve good computational performance, these networks are often designed with many trainable parameters, which makes them energy-intensive and time-consuming to train. In this work, we propose to preprocess the input data with a photonic reservoir instead of injecting it directly into the DNN. A photonic reservoir consists of a network of many randomly connected nodes that do not need to be trained. It acts as an additional layer in front of the deep neural network and transforms the input data into a state in a higher-dimensional state space. This allows us to reduce the size of the DNN and, in turn, the amount of training it requires, since less backpropagation is needed for a smaller network.
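The cascading scheme can be sketched in ordinary software. In the Python sketch below, a fixed random reservoir expands each input into a higher-dimensional state, and only a small readout is trained on those states. This is an illustrative echo-state-style analogue of the idea, not the photonic implementation from the paper; all names, dimensions, and parameters (e.g. reservoir_transform, N_RES) are hypothetical.

import numpy as np

# Fixed, random, untrained reservoir: a software stand-in for the photonic
# reservoir described in the abstract. N_RES >> N_IN gives the expansion
# into a higher-dimensional state space. All values are illustrative.
rng = np.random.default_rng(0)
N_IN, N_RES = 16, 256
W_in = rng.normal(scale=0.5, size=(N_RES, N_IN))         # fixed input coupling
W_res = rng.normal(size=(N_RES, N_RES))                  # fixed internal coupling
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # keep dynamics stable

def reservoir_transform(inputs):
    # Run a sequence through the reservoir; nothing here is ever trained.
    state = np.zeros(N_RES)
    states = []
    for u in inputs:
        state = np.tanh(W_in @ u + W_res @ state)  # nonlinear high-dim mapping
        states.append(state.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    # A single linear readout fitted by ridge regression stands in for the
    # reduced DNN: only this stage is trained, so no backpropagation is
    # performed through the reservoir.
    return np.linalg.solve(states.T @ states + ridge * np.eye(N_RES),
                           states.T @ targets)

# Usage on toy data: preprocess with the reservoir, then fit the small readout.
inputs = rng.normal(size=(100, N_IN))
targets = rng.normal(size=(100, 1))
states = reservoir_transform(inputs)
W_out = train_readout(states, targets)
predictions = states @ W_out

In practice the trainable stage could be a small multilayer network rather than a linear readout; the point of the sketch is only that the reservoir's random expansion stays fixed, so the trained part can be kept small.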
Original language: English
Title of host publication: Proceedings of SPIE
Editors: Francesco Ferranti, Mehdi Keshavarz Hedayati, Andrea Fratalocchi
Number of pages: 5
Publication status: Published - 10 Apr 2024
Event: SPIE Photonics Europe 2024 - Strasbourg, France
Duration: 7 Apr 2024 - 11 Apr 2024

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 13017
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X

Conference

Conference: SPIE Photonics Europe 2024
Country/Territory: France
City: Strasbourg
Period: 7/04/24 - 11/04/24

Bibliographical note

Publisher Copyright:
© 2024 SPIE.
