Convergence Analysis of Recurrent Neural Networks (Network Theory and Applications, #13)

by Zhang Yi


Since Hopfield's outstanding and pioneering research work on recurrent neural networks (RNNs) in the early 1980s, neural networks have rekindled strong interest among scientists and researchers. Recent years have recorded remarkable advances in research and development work on RNNs, both in theoretical research and in actual applications. The field of RNNs is now transforming into a complete and independent subject. From theory to application, from software to hardware, new and exciting results are emerging day after day, reflecting the keen interest RNNs have instilled in everyone, from researchers to practitioners. RNNs contain feedback connections among their neurons, which leads rather naturally to RNNs being regarded as dynamical systems. RNNs can be described by continuous-time differential systems, discrete-time systems, or functional differential systems, and more generally in terms of nonlinear systems. Thus, RNNs have at their disposal a huge set of mathematical tools from dynamical system theory, which has turned out to be very useful in enabling a rigorous analysis of RNNs.
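As a minimal illustrative sketch of the continuous-time dynamical-systems view mentioned above (the notation here is assumed for illustration and is not taken from the book), a Hopfield-type RNN with neuron states x_i, feedback weights w_{ij}, activation function f, and constant external inputs I_i evolves as

% Generic continuous-time recurrent network of Hopfield type (illustrative only):
%   x_i(t)  -- state of neuron i
%   w_{ij}  -- feedback (connection) weight from neuron j to neuron i
%   f       -- a bounded, monotone activation function (e.g. tanh)
%   I_i     -- constant external input
\[
  \frac{dx_i(t)}{dt} = -x_i(t) + \sum_{j=1}^{n} w_{ij}\, f\bigl(x_j(t)\bigr) + I_i,
  \qquad i = 1, \dots, n .
\]

Convergence analysis of such a system asks whether every trajectory x(t) approaches an equilibrium (or a set of equilibria) as t tends to infinity.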
  • ISBN13 9781402076947
  • Publish Date 30 November 2003
  • Publish Status Active
  • Publish Country US
  • Imprint Springer-Verlag New York Inc.
  • Edition 2004 ed.
  • Format Hardcover
  • Pages 233
  • Language English