Analogue Imprecision in MLP Training (Progress in Neural Processing, Vol. 4)

by Peter Edwards and Alan F Murray



Hardware inaccuracy and imprecision are important considerations when implementing neural algorithms. This book presents a study of synaptic weight noise as a typical fault model for analogue VLSI realisations of MLP neural networks and examines the implications for learning and network performance. Its aim is to show how incorporating an imprecision model into a learning scheme, as a "fault tolerance hint", can aid understanding of the accuracy and precision requirements of a particular implementation. The study also shows how such a scheme can yield significant performance enhancement.
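The idea described above can be illustrated with a minimal sketch: multiplicative Gaussian noise is injected into the synaptic weights on every forward pass during training, so the network learns weights that tolerate analogue imprecision. All names, network sizes, and the noise level below are illustrative assumptions, not taken from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR task; a tiny 2-8-1 MLP is enough to show the mechanism.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

noise_sigma = 0.05  # assumed relative weight-noise level ("fault model")
lr = 1.0
losses = []

for epoch in range(4000):
    # Synaptic weight noise: each forward pass sees W_eff = W * (1 + noise),
    # mimicking analogue imprecision in the stored weights.
    W1n = W1 * (1.0 + rng.normal(0.0, noise_sigma, W1.shape))
    W2n = W2 * (1.0 + rng.normal(0.0, noise_sigma, W2.shape))

    h = sigmoid(X @ W1n + b1)
    out = sigmoid(h @ W2n + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backpropagate through the noisy weights; updates apply to the clean weights.
    d_out = (out - y) * out * (1.0 - out)
    d_h = (d_out @ W2n.T) * h * (1.0 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

# Noise-free evaluation: the trained weights should now be robust to
# the imprecision they were trained under.
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print("final loss:", losses[-1])
```

Training with the noise present, then evaluating without it, is what makes the injected imprecision act as a "hint": the solution found is one that degrades gracefully when the weights are perturbed.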
  • ISBN13 9789810227395
  • Publish Date 1 August 1996
  • Publish Status Active
  • Publish Country SG
  • Imprint World Scientific Publishing Co Pte Ltd