Neural Networks / Edition 1
by Hervé Abdi

Paperback
ISBN-10: 0761914404
ISBN-13: 9780761914402
Publisher: SAGE Publications
Pub. Date: 12/09/1998
Original price: $22.00


Overview

This book provides the first accessible introduction to neural network analysis as a methodological strategy for social scientists. The author details numerous studies and examples that illustrate the advantages of neural network analysis over other quantitative and modeling methods in widespread use. Methods are presented in an accessible style for readers who do not have a background in computer science. The book provides a history of neural network methods, a substantial review of the literature, detailed applications, coverage of the most common alternative models, and examples of two leading software packages for neural network analysis.


Product Details

ISBN-13: 9780761914402
Publisher: SAGE Publications
Publication date: 12/09/1998
Series: Quantitative Applications in the Social Sciences, #124
Edition description: New Edition
Pages: 96
Sales rank: 833,673
Product dimensions: 5.50(w) x 8.50(h) x 0.24(d)

About the Author

Hervé Abdi was born in France, where he grew up. He received an M.S. in Psychology from the University of Franche-Comté (France) in 1975, an M.S. (D.E.A.) in Economics from the University of Clermont-Ferrand (France) in 1976, an M.S. (D.E.A.) in Neurology from the University Louis Pasteur in Strasbourg (France) in 1977, and a Ph.D. in Mathematical Psychology from the University of Aix-en-Provence (France) in 1980. He was an assistant professor at the University of Franche-Comté in 1979, an associate professor at the University of Bourgogne at Dijon (France) in 1983, and a full professor at the University of Bourgogne at Dijon in 1988. He is currently a full professor in the School of Behavioral and Brain Sciences at the University of Texas at Dallas and an adjunct professor of radiology at the University of Texas Southwestern Medical Center at Dallas. He was twice a Fulbright scholar, and he has also been a visiting scientist or professor at the Rotman Research Institute (University of Toronto), at Brown University, and at the Universities of Chuo (Japan), Dijon (France), Geneva (Switzerland), Nice Sophia Antipolis (France), and Paris 13 (France).

His recent work is concerned with face and person perception, odor perception, and computational modeling of these processes. He is also developing statistical techniques to analyze the structure of large data sets such as those found in brain imaging and sensory evaluation (e.g., principal component analysis, correspondence analysis, PLS regression, STATIS, DISTATIS, discriminant correspondence analysis, multiple factor analysis, multi-table analysis, and additive tree representations). In the past decade, he has published over 80 papers (plus 5 books and 3 edited volumes) on these topics. He teaches or has taught classes in cognition, computational modeling, experimental design, multivariate statistics, and the analysis of brain imaging data.

Table of Contents

Introduction
The Perceptron
Linear Autoassociative Memories
Linear Heteroassociative Memories
Error Backpropagation
Useful References

Interviews

From One of the Authors
Here is a brief description of the book: Neural networks are adaptive statistical models based on an analogy with the structure of the brain. They can be used to estimate the parameters of some population using only one (or a few) exemplars at a time. Neural Networks introduces readers to the basic models of neural networks and compares and contrasts these models with other standard statistical approaches. Through examples that can be computed by hand or with a simple calculator, the authors describe and explain: the perceptron, including the Widrow-Hoff learning rule, state space, and how the perceptron is akin to discriminant analysis; the linear associative memory, including Hebbian learning and how autoassociative memory is closely related to principal component analysis; the linear heteroassociative memory as a generalization of the perceptron and how it corresponds to multiple linear regression; and backpropagation networks and their use in estimating parameter values via a gradient descent algorithm in problems equivalent to nonlinear regression. Audience: Researchers in psychology, economics, sociology, and statistics who have been looking for a brief introduction to neural networks will find this book very useful.
— Hervé Abdi (herve@utdallas.edu), Co-Author
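The Widrow-Hoff (delta) rule mentioned above can be sketched in a few lines, in the spirit of the book's hand-computable examples. This is an illustrative sketch, not code from the book; the training data, learning rate, and function names are assumptions chosen so the arithmetic can be checked by hand.

```python
def widrow_hoff(samples, eta=0.1, epochs=50):
    """Train a linear unit so that w . x approximates t for each (x, t).

    Widrow-Hoff (delta) rule: after each example, nudge the weights
    in proportion to the prediction error, w <- w + eta * (t - y) * x.
    """
    n = len(samples[0][0])
    w = [0.0] * n
    for _ in range(epochs):
        for x, t in samples:
            y = sum(wi * xi for wi, xi in zip(w, x))   # linear output
            err = t - y                                # prediction error
            w = [wi + eta * err * xi for wi, xi in zip(w, x)]
    return w

# Tiny hand-checkable problem: the target is t = x1 + 2*x2,
# so the weights should converge toward [1.0, 2.0].
data = [([1.0, 0.0], 1.0), ([0.0, 1.0], 2.0), ([1.0, 1.0], 3.0)]
w = widrow_hoff(data)
```

Because this linear system has an exact solution, the iterative rule recovers the same weights that multiple linear regression would give, which is the correspondence the book draws.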
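The link between linear autoassociative memory and principal component analysis mentioned above can also be illustrated: storing patterns with Hebbian (outer-product) learning and then repeatedly recalling a vector through the weight matrix is power iteration, which converges to the first principal component direction of the stored patterns. The patterns and names below are illustrative assumptions, not the book's examples.

```python
def hebbian_weights(patterns):
    """Hebbian learning: W is the sum of outer products x x^T."""
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                W[i][j] += x[i] * x[j]
    return W

def recall(W, x):
    """One pass through the memory: the matrix-vector product W x."""
    return [sum(W[i][j] * x[j] for j in range(len(x))) for i in range(len(W))]

def normalize(v):
    norm = sum(vi * vi for vi in v) ** 0.5
    return [vi / norm for vi in v]

# Patterns lying roughly along the direction [2, 1].
patterns = [[2.0, 1.0], [1.9, 1.1], [2.1, 0.9]]
W = hebbian_weights(patterns)

# Iterated recall (power iteration) converges to the dominant
# eigenvector of W, i.e., the first principal component direction.
v = normalize([1.0, 0.0])
for _ in range(20):
    v = normalize(recall(W, v))
```

After a few recall cycles, v settles on a unit vector proportional to [2, 1], the axis along which the stored patterns vary most.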
