Neural Networks for Optimization and Signal Processing

A topical introduction to the ability of artificial neural networks not only to solve a wide range of optimization problems on-line, but also to inspire new techniques and architectures. Provides in-depth coverage of mathematical modeling along with illustrative computer-simulation results.
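The book's central theme is treating a neural network as a dynamical system whose state evolves to minimize an energy (cost) function. A minimal sketch of that idea, not taken from the book itself: the gradient flow dx/dt = -∇f(x) for the quadratic energy f(x) = ½xᵀAx − bᵀx, discretized with Euler steps (the matrix A, vector b, and step size below are illustrative assumptions).

```python
import numpy as np

# Illustrative sketch: a gradient-flow "network" dx/dt = -grad f(x),
# discretized with Euler steps, minimizing the quadratic energy
# f(x) = 0.5 x^T A x - b^T x.  A and b are arbitrary example data.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])      # symmetric positive definite
b = np.array([1.0, 1.0])

x = np.zeros(2)                 # initial network state
eta = 0.1                       # learning rate / Euler step size
for _ in range(200):
    x = x - eta * (A @ x - b)   # state update follows -grad f(x)

# At equilibrium grad f(x) = 0, i.e. A x = b, so the network state
# converges to the solution of the linear system.
print(x)                        # close to np.linalg.solve(A, b)
```

Because A is positive definite, the energy decreases monotonically along the trajectory and the state settles at the unique minimum; this is the same mechanism behind Hopfield-style optimization networks covered in the text.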
Contents

- Mathematical Preliminaries of Neurocomputing

(12 other sections not shown in this preview)