Discrete all-positive multilayer perceptrons for optical implementation

Perry D. Moerland, Emile Fiesler, Indu Saxena

Research output: Contribution to journal › Article › Academic › peer-review

3 Citations (Scopus)

Abstract

All-optical multilayer perceptrons differ in various ways from the ideal neural network model. Examples are the use of nonideal activation functions, which are truncated, asymmetric, and have a nonstandard gain; the restriction of the network parameters to non-negative values; and the limited accuracy of the weights. A backpropagation-based learning rule is presented that compensates for these nonidealities and enables the implementation of all-optical multilayer perceptrons where learning occurs under computer control. The good performance of this learning rule, even when using a small number of weight levels, is illustrated by a series of computer simulations incorporating the nonidealities.
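The sketch below illustrates, in broad strokes, the kind of training loop the abstract describes: backpropagation-style updates applied to a layer whose activation is a truncated, asymmetric sigmoid with a nonstandard gain, whose weights are kept non-negative, and whose implemented weights are quantized to a small number of levels. It is a minimal, hypothetical illustration, not the authors' exact learning rule; the names and parameter values (GAIN, N_LEVELS, W_MAX, the truncation range) are assumptions chosen for the example.

```python
import numpy as np

# Illustrative constants (assumptions, not values from the paper)
GAIN = 0.4          # nonstandard activation gain
N_LEVELS = 8        # small number of allowed weight levels
W_MAX = 1.0         # upper bound of the non-negative weight range

def activation(x):
    """Asymmetric sigmoid with nonstandard gain, truncated to [0.1, 0.9]."""
    y = 1.0 / (1.0 + np.exp(-GAIN * x))
    return np.clip(y, 0.1, 0.9)

def discretize(w):
    """Map continuous non-negative weights onto N_LEVELS equidistant levels."""
    step = W_MAX / (N_LEVELS - 1)
    return np.round(np.clip(w, 0.0, W_MAX) / step) * step

def train_step(w, x, target, lr=0.05):
    """One backpropagation-style update: continuous shadow weights are kept
    for learning, while the discretized non-negative weights are the ones
    'implemented' in the forward pass."""
    w_impl = discretize(w)                 # weights as realized by the hardware
    y = activation(w_impl @ x)             # forward pass with nonideal activation
    err = target - y
    grad = err * GAIN * y * (1.0 - y)      # derivative of the (untruncated) sigmoid
    w = np.clip(w + lr * np.outer(grad, x), 0.0, W_MAX)
    return w, err

# Toy usage: learn a 2-input, 1-output mapping with 8 weight levels.
rng = np.random.default_rng(0)
w = rng.uniform(0.0, W_MAX, size=(1, 2))
for _ in range(200):
    w, err = train_step(w, np.array([1.0, 0.2]), np.array([0.7]))
print(discretize(w), err)
```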

Original language: English
Pages (from-to): 1305-1315
Number of pages: 11
Journal: Optical Engineering
Volume: 37
Issue number: 4
DOIs
Publication status: Published - Apr 1998

Keywords

  • Activation function
  • Backpropagation
  • Liquid crystal light valve
  • Neural network
  • Non-negative neural networks
  • Optical multilayer perceptron
  • Weight discretization
