
BOOK REPORTS

iconicity of existential graphs. 3.3.1. Lines of identity. 3.3.2. Existential versus universal quantifiers. 3.3.3. Scope of quantifiers. 4. The alpha system reconsidered. 4.1. “Endoporeutic” reading. 4.2. “Negation normal form” reading. 4.2.1. Conjunctive and disjunctive juxtapositions. 4.2.2. Semantics. 4.3. Multiple readings. 4.3.1. Scroll as conditional. 4.3.2. The multiple carving principle. 4.4. Transformation rules. 4.4.1. A natural deductive system versus the alpha system. 4.4.2. The rules reformulated. 4.4.3. The rules reinterpreted. 4.4.4. Efficacious graphical systems. 4.5. Sentences versus graphs. 4.5.1. Translation of sentences into graphs. 4.5.2. Applications: Logical equivalence and NNF. 4.5.3. Visual efficiency. 5. The beta system reconsidered. 5.1. Preliminaries. 5.1.1. Zeman’s reading. 5.1.2. Roberts’ reading. 5.2. A new reading. 5.3. Transformation rules. 5.4. Appendix: Direct semantics. 6. Logical system versus calculus. 7. Conclusion. Notes. Bibliography. Index.

Statistical Computing: An Introduction to Data Analysis Using S-Plus. By Michael J. Crawley. John Wiley & Sons, New York. (2002). 761 pages. $85.

Contents: Preface. 1. Statistical methods. 2. Introduction to S-Plus. 3. Experimental design. 4. Central tendency. 5. Probability. 6. Variance. 7. The normal distribution. 8. Power calculations. 9. Understanding data: Graphical analysis. 10. Understanding data: Tabular analysis. 11. Classical tests. 12. Bootstrap and jackknife. 13. Statistical models in S-Plus. 14. Regression. 15. Analysis of variance. 16. Analysis of covariance. 17. Model criticism. 18. Contrasts. 19. Split-plot Anova. 20. Nested designs and variance components analysis. 21. Graphs, functions and transformations. 22. Curve fitting and piecewise regression. 23. Non-linear regression. 24. Multiple regression. 25. Model simplification. 26. Probability distributions. 27. Generalised linear models. 28. Proportion data: Binomial errors. 29. Count data: Poisson errors. 30. Binary response variables. 31. Tree models. 32. Nonparametric smoothing. 33. Survival analysis. 34. Time series analysis. 35. Mixed effects models. 36. Spatial statistics. Bibliography. Index.

Perceptual Learning. Edited by Manfred Fahle and Tommaso Poggio. The MIT Press, Cambridge, MA. (2002). 455 pages. $65. Contents:

Introduction. I. Anatomy and physiology. 1. Experience-dependent plasticity of intracortical connections (S. Löwel and W. Singer). 2. Adaptation of inputs in the somatosensory system (H.R. Dinse and M.M. Merzenich). 3. Plasticity of receptive fields in early stages of the adult visual system (U.T. Eysel). 4. Neuronal representation of object images and effects of learning (K. Tanaka). 5. Electrophysiological correlates of perceptual learning (A. Schoups). 6. Perceptual learning and the development of complex visual representations in temporal cortical neurons (D.L. Sheinberg and N.K. Logothetis). 7. Functional reorganization of human cerebral cortex and its perceptual concomitants (A. Sterr, T. Elbert and B. Rockstroh). II. Low-level psychophysics. 8. Learning to understand speech with the cochlear implant (G.M. Clark). 9. Adaptation and learning in the visual perception of gratings (A. Fiorentini and N. Berardi). 10. Plasticity of low-level visual networks (B. Zenger and D. Sagi). 11. Learning to perceive features below the foveal photoreceptor spacing (M. Fahle). 12. Specificity versus invariance of perceptual learning: The example of position (M. Dill). III. Higher-level psychophysics. 13. The role of insight in perceptual learning: Evidence from illusory contour perception (N. Rubin, K. Nakayama and R. Shapley). 14. The role of attention in learning simple visual tasks (M. Ahissar and S. Hochstein). 15. High-level learning of early visual tasks (P. Sinha and T. Poggio). 16. Learning to recognize objects (G. Wallis and H. Bülthoff). 17. Learning new faces (V. Bruce and M. Burton). IV. Modeling. 18. Models of perceptual learning (S. Edelman and N. Intrator). 19. Learning to find independent components in natural scenes (A.J. Bell and T.J. Sejnowski). 20. Top-down information and models of perceptual learning (M.H. Herzog and M. Fahle). Glossary. References. Contributors. Index.

Fundamentals of Matrix Computations. Second Edition. By David S. Watkins. Wiley-Interscience, New York. (2002). 618 pages. $89.95. Contents:

Preface. Acknowledgments. 1. Gaussian elimination and its variants. 1.1. Matrix multiplication. 1.2. Systems of linear equations. 1.3. Triangular systems. 1.4. Positive definite systems; Cholesky decomposition. 1.5. Banded positive definite systems. 1.6. Sparse positive definite systems. 1.7. Gaussian elimination and the LU decomposition. 1.8. Gaussian elimination with pivoting. 2. Sensitivity of linear systems. 2.1. Vector and matrix norms. 2.2. Condition numbers. 2.3. Perturbing the coefficient matrix. 2.4. A posteriori error analysis using the residual. 2.5. Roundoff errors; Backward stability. 2.6. Propagation of roundoff errors. 2.7. Backward error analysis of Gaussian elimination. 2.8. Scaling. 2.9. Componentwise sensitivity analysis. 3. The least squares problem. 3.1. The discrete least squares problem. 3.2. Orthogonal matrices, rotators, and reflectors. 3.3. Solution of the least squares problem. 3.4. The Gram-Schmidt process. 3.5. Geometric approach. 3.6. Updating the QR decomposition. 4. The singular value decomposition. 4.1. Introduction. 4.2. Some basic applications of singular values. 4.3. The SVD and the least squares problem. 4.4. Sensitivity of the least squares problem. 5. Eigenvalues and eigenvectors I. 5.1. Systems of differential equations. 5.2. Basic facts. 5.3. The power method and some simple extensions. 5.4. Similarity transforms. 5.5. Reduction to Hessenberg and tridiagonal forms. 5.6. The QR algorithm. 5.7. Implementation of the QR algorithm. 5.8. Use of the QR algorithm to calculate eigenvectors. 5.9. The SVD revisited. 6. Eigenvalues and eigenvectors II. 6.1. Eigenspaces and invariant subspaces. 6.2. Subspace iteration,