Handbook of Brain Theory and Neural Networks


Language: English

Pages: 1136

ISBN: 0262011484

Format: PDF / Kindle (mobi) / ePub

Choice Outstanding Academic Title, 1996.

In hundreds of articles by experts from around the world, and in overviews and "road maps" prepared by the editor, The Handbook of Brain Theory and Neural Networks charts the immense progress made in recent years in many specific areas related to two great questions: How does the brain work? and How can we build intelligent machines?

While many books have appeared on limited aspects of one subfield or another of brain theory and neural networks, the Handbook covers the entire sweep of topics—from detailed models of single neurons, analyses of a wide variety of biological neural networks, and connectionist studies of psychology and language, to mathematical analyses of a variety of abstract neural networks, and technological applications of adaptive, artificial neural networks.

The excitement, and the frustration, of these topics is that they span such a broad range of disciplines including mathematics, statistical physics and chemistry, neurology and neurobiology, and computer science and electrical engineering as well as cognitive psychology, artificial intelligence, and philosophy. Thus, much effort has gone into making the Handbook accessible to readers with varied backgrounds while still providing a clear view of much of the recent, specialized research in specific topics.

The heart of the book, Part III, comprises 267 original articles by leaders in the various fields, arranged alphabetically by title. Parts I and II, written by the editor, are designed to help readers orient themselves to this vast range of material. Part I, Background, introduces several basic neural models, explains how the present study of brain theory and neural networks integrates brain theory, artificial intelligence, and cognitive psychology, and provides a tutorial on the concepts essential for understanding neural networks as dynamic, adaptive systems. Part II, Road Maps, provides entry into the many articles of Part III through an introductory "Meta-Map" and twenty-three road maps, each of which tours all the Part III articles on the chosen theme.

Filtering mechanisms that can reject stimuli that otherwise might mask critical functions. This use of stored sensory expectations for the cancellation or perhaps the identification of specific input patterns may yield insights into diverse neural circuits, including the cochlear nuclei and the cerebellum, in other species. Two articles introduce data and models for the olfactory system (see also the road map Mammalian Brain Regions). OLFACTORY BULB describes the special circuitry involved in.

Neurons in the cockroach femoral tactile spine have been shown to display power-law adaptation. From intracellular measurements, Basarsky and French (1991) found that the spike rate adaptation is due to cumulative slowing of the recovery of the membrane potential between spikes. Previous work had demonstrated that calcium channel blockers or blockers of Ca2+-activated K+ channels did not reduce adaptation, while modifying sodium channel inactivation did. These mechanisms might be seen as.
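As an illustration of the excerpt above (a generic modeling device, not the specific biophysical model of Basarsky and French, 1991): power-law spike-rate adaptation r(t) ~ t^(-a) is often approximated as a weighted sum of exponential adaptation processes with logarithmically spaced time constants. The exponent and ranges below are arbitrary choices:

```python
import numpy as np

# Weighted sum of exponentials with log-spaced time constants tau_i and
# weights w_i ~ tau_i^(-a) approximates a power law t^(-a) over the range
# spanned by the time constants. All parameter values here are illustrative.
a = 0.5
taus = np.logspace(-2, 3, 50)            # time constants over five decades
weights = taus ** (-a)                   # w_i ~ tau_i^(-a)

t = np.logspace(-1, 1, 200)              # evaluate well inside the tau range
r = (weights * np.exp(-t[:, None] / taus)).sum(axis=1)

# On log-log axes the decay is approximately linear with slope close to -a:
slope = np.polyfit(np.log(t), np.log(r), 1)[0]
```

Truncating the range of time constants bends the curve slightly away from a pure power law at the edges, which is why the fitted slope only approximates -a.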

= w̄ᵀU(xᵢ) and covariance matrix with elements cᵢⱼ = c(xᵢ, xⱼ) = U(xᵢ)ᵀRU(xⱼ) (see GAUSSIAN PROCESSES). The idea behind a GP is that we can free ourselves from the restriction of choosing a covariance function c(xᵢ, xⱼ) of the form provided by the GLM prior; any valid covariance function can be used instead. Similarly, we are free to choose the mean function fᵢ = m(xᵢ). A common choice for the covariance function is c(xᵢ, xⱼ) = exp(−|xᵢ − xⱼ|²). The motivation is that the.


Splitting of the population into a few subgroups (clustering), and the more complex collective behavior called slow switching. Collections of oscillators that send signals to one another can phase lock, with many patterns of phase differences. CHAINS OF OSCILLATORS IN MOTOR AND SENSORY SYSTEMS discusses a set of examples that illustrate how those phases emerge from the oscillator interactions. Much of the work was motivated by spatiotemporal patterns in networks of neurons that govern undulatory.
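The phase locking described in the excerpt above can be sketched with a chain of Kuramoto-type phase oscillators with nearest-neighbour coupling; the model form and all parameters here are illustrative, not taken from any specific article:

```python
import numpy as np

n = 20
omega = 2 * np.pi + np.linspace(0.5, -0.5, n)   # small frequency gradient
K = 5.0                                          # coupling strength
dt = 1e-3

def step(theta):
    """One Euler step of a chain of phase oscillators with nearest-neighbour
    sinusoidal coupling (a Kuramoto-type chain)."""
    coupling = np.zeros(n)
    coupling[:-1] += K * np.sin(theta[1:] - theta[:-1])   # pull from right
    coupling[1:] += K * np.sin(theta[:-1] - theta[1:])    # pull from left
    return theta + dt * (omega + coupling)

theta = np.zeros(n)
for _ in range(50_000):          # let transients decay
    theta = step(theta)

theta0 = theta.copy()
for _ in range(1_000):           # measure over one further time unit
    theta = step(theta)
freqs = theta - theta0           # effective frequency of each oscillator
```

In the locked state every oscillator runs at the same common frequency while maintaining a fixed pattern of phase differences along the chain, the kind of pattern the road map's articles relate to undulatory locomotion.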
