E-book
Author Graupe, Daniel

Title Principles of artificial neural networks / Daniel Graupe
Edition 3rd edition
Published [Hackensack] New Jersey : World Scientific, [2013]
©2013

Description 1 online resource (xviii, 363 pages) : illustrations
Series Advanced series in circuits and systems ; vol. 7
Contents Ch. 1. Introduction and role of artificial neural networks --
ch. 2. Fundamentals of biological neural networks --
ch. 3. Basic principles of ANNs and their early structures. 3.1. Basic principles of ANN design. 3.2. Basic network structures. 3.3. The Perceptron's input-output principles. 3.4. The Adaline (ALC) --
ch. 4. The Perceptron. 4.1. The basic structure. 4.2. The single-layer representation problem. 4.3. The limitations of the single-layer Perceptron. 4.4. Many-layer Perceptrons. 4.A. Perceptron case study: identifying autoregressive parameters of a signal (AR time series identification) --
ch. 5. The Madaline. 5.1. Madaline training. 5.A. Madaline case study: character recognition --
ch. 6. Back propagation. 6.1. The back propagation learning procedure. 6.2. Derivation of the BP algorithm. 6.3. Modified BP algorithms. 6.A. Back propagation case study: character recognition. 6.B. Back propagation case study: the exclusive-OR (XOR) problem (2-layer BP). 6.C. Back propagation case study: the XOR problem -- 3-layer BP network. 6.D. Average monthly high and low temperature prediction using backpropagation neural networks --
ch. 7. Hopfield networks. 7.1. Introduction. 7.2. Binary Hopfield networks. 7.3. Setting of weights in Hopfield nets -- bidirectional associative memory (BAM) principle. 7.4. Walsh functions. 7.5. Network stability. 7.6. Summary of the procedure for implementing the Hopfield network. 7.7. Continuous Hopfield models. 7.8. The continuous energy (Lyapunov) function. 7.A. Hopfield network case study: character recognition. 7.B. Hopfield network case study: traveling salesman problem. 7.C. Cell shape detection using neural networks --
ch. 8. Counter propagation. 8.1. Introduction. 8.2. Kohonen self-organizing map (SOM) layer. 8.3. Grossberg layer. 8.4. Training of the Kohonen layer. 8.5. Training of Grossberg layers. 8.6. The combined counter propagation network. 8.A. Counter propagation network case study: character recognition --
ch. 9. Large scale memory storage and retrieval (LAMSTAR) network. 9.1. Motivation. 9.2. Basic principles of the LAMSTAR neural network. 9.3. Detailed outline of the LAMSTAR network. 9.4. Forgetting feature. 9.5. Training vs. operational runs. 9.6. Operation in face of missing data. 9.7. Advanced data analysis capabilities. 9.8. Modified version: normalized weights. 9.9. Concluding comments and discussion of applicability. 9.A. LAMSTAR network case study: character recognition. 9.B. Application to medical diagnosis problems. 9.C. Predicting price movement in market microstructure via LAMSTAR. 9.D. Constellation recognition --
ch. 10. Adaptive resonance theory. 10.1. Motivation. 10.2. The ART network structure. 10.3. Setting-up of the ART network. 10.4. Network operation. 10.5. Properties of ART. 10.6. Discussion and general comments on ART-I and ART-II. 10.A. ART-I network case study: character recognition. 10.B. ART-I case study: speech recognition --
ch. 11. The cognitron and the neocognitron. 11.1. Background of the cognitron. 11.2. The basic principles of the cognitron. 11.3. Network operation. 11.4. Cognitron's network training. 11.5. The neocognitron --
ch. 12. Statistical training. 12.1. Fundamental philosophy. 12.2. Annealing methods. 12.3. Simulated annealing by Boltzmann training of weights. 12.4. Stochastic determination of magnitude of weight change. 12.5. Temperature-equivalent setting. 12.6. Cauchy training of neural network. 12.A. Statistical training case study: a stochastic Hopfield network for character recognition. 12.B. Statistical training case study: identifying AR signal parameters with a stochastic Perceptron model --
ch. 13. Recurrent (time cycling) back propagation networks. 13.1. Recurrent/discrete time networks. 13.2. Fully recurrent networks. 13.3. Continuously recurrent back propagation networks. 13.A. Recurrent back propagation case study: character recognition
Summary Artificial neural networks are best suited to problems that are complex, ill-defined, highly nonlinear, involve many different variables, and/or are stochastic. Such problems are abundant in medicine, finance, security, and beyond. This volume covers the basic theory and architecture of the major artificial neural networks. Uniquely, it presents 18 complete case studies of neural network applications in various fields, ranging from cell-shape classification to micro-trading in finance and to constellation recognition, all with their respective source code. These case studies demonstrate in detail how such applications are designed and executed and how their specific results are obtained. The book is written for a one-semester graduate or senior-level undergraduate course on artificial neural networks. It is also intended as a self-study and reference text for scientists, engineers, and researchers in medicine, finance, and data mining
Bibliography Includes bibliographical references (pages 349-356) and indexes
Notes Print version record
Subject Neural networks (Computer science)
COMPUTERS -- General.
Form Electronic book
LC no. 2013372134
ISBN 9789814522748
9814522740