Neural Networks: A Comprehensive Foundation – Simon Haykin – 2nd Edition

Description

For graduate-level neural network courses offered in the departments of Computer Engineering, Electrical Engineering, and Computer Science.
Renowned for its thoroughness and readability, this well-organized and completely up-to-date text remains the most comprehensive treatment of neural networks from an engineering perspective, thoroughly revised for this second edition.

Considers recurrent networks, such as Hopfield networks, Boltzmann machines, and mean-field theory machines, as well as modular networks, temporal processing, and neurodynamics.
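
As a quick illustration of the recurrent models listed above, the sketch below shows a minimal Hopfield network in Python: Hebbian (outer-product) weight learning and asynchronous sign-threshold recall. It is not code from the book; the function names train_hopfield and recall, and the toy 4-bit pattern, are illustrative assumptions.

    import numpy as np

    def train_hopfield(patterns):
        # Hebbian (outer-product) rule over bipolar (+1/-1) patterns; no self-connections.
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)
        np.fill_diagonal(W, 0.0)
        return W / len(patterns)

    def recall(W, state, sweeps=10):
        # Asynchronous updates: each unit takes the sign of its local field.
        s = state.copy()
        for _ in range(sweeps):
            for i in np.random.permutation(len(s)):
                s[i] = 1 if W[i] @ s >= 0 else -1
        return s

    # Store one 4-bit pattern and recover it from a corrupted probe.
    stored = np.array([[1, -1, 1, -1]])
    W = train_hopfield(stored)
    noisy = np.array([1, 1, 1, -1])
    print(recall(W, noisy))   # converges back to [ 1 -1  1 -1]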

Table of Contents


1. Introduction.
2. Learning Processes.
3. Single-Layer Perceptrons.
4. Multilayer Perceptrons.
5. Radial-Basis Function Networks.
6. Support Vector Machines.
7. Committee Machines.
8. Principal Components Analysis.
9. Self-Organizing Maps.
10. Information-Theoretic Models.
11. Stochastic Machines and Their Approximates Rooted in Statistical Mechanics.
12. Neurodynamic Programming.
13. Temporal Processing Using Feedforward Networks.
14. Neurodynamics.
15. Dynamically Driven Recurrent Networks.
Epilogue.
Bibliography.
Index.
