Short Book Reviews - International Statistical Institute



Short Book Reviews

Vol. 19. No. 2—August 1999 Editor Dr. A.M. Herzberg

REVIEWS

FITTING LINEAR RELATIONSHIPS: A History of the Calculus of Observations, 1750–1900. R.W. Farebrother. New York: Springer-Verlag, 1999, pp. xii + 271, US$59.95/DM129.00/£49.50.

Contents: 1. Introduction 2. The methods of Boscovich and Mayer 3. Laplace’s work on the methods of Boscovich and Mayer 4. Laplace’s minimax procedure 5. The method of least squares 6. Statistical foundations of the method of least squares 7. Adrain’s work on the normal law 8. Gauss’s most probable values 9. Laplace’s most advantageous method 10. Gauss’s most plausible values 11. Gauss’s method of adjustment by correlates 12. Mechanical analogies for the method of least squares 13. Orthogonalization procedures 14. Thiele’s derivation of the method of least squares 15. Later work on the method of least situation 16. Concluding remarks

Readership: Statisticians, numerical analysts

Before there were modern computers, when a reference to the word ‘computer’ meant a human working with pencil and paper, the difficulties of computation were a serious barrier to the advancement of statistical technology. Farebrother’s history of the calculus of observations (as that portion of statistics encompassing linear models was once called) covers more than computation, but it particularly shines in explaining how some of the greats of the past (such as Gauss, Laplace, Cauchy) and many lesser figures (such as Boscovich, Donkin, Thiele) confronted major difficulties in statistical analysis, and how almost in passing they developed much of modern matrix algebra and even precursors to linear programming. Farebrother’s story is told in lucid prose and with detailed algebraic development, and it reflects well the decades of painstaking scholarship that lie behind it. The work requires an investment of effort by the reader, in coping with the variety of notation employed and carefully following long algebraic arguments. Some portions, such as those on Donkin and Thiele, are heavy going indeed. But the reader who makes the investment will reap the rich reward of a much fuller appreciation of 18th and 19th century statistical work, and consequently of modern statistics. We learn about the development of orthogonalization methods, of both graphical and analytical methods for the minimization of total absolute deviations, and about the important problems that gave rise to these developments. In the process the author cannot resist numerous and interesting short side trips, for example where we learn what the explorer David Livingstone’s Problem of the Nile had to do with statistics.
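The minimization of total absolute deviations mentioned above is easy to appreciate with a modern illustration. The sketch below is an addition to the review, with invented data; it exploits a classical property of the least-absolute-deviations criterion, namely that some optimal line passes through two of the data points, so a small data set can be fitted by simple enumeration:

```python
from itertools import combinations

def total_abs_dev(points, a, b):
    """Sum of absolute vertical deviations from the line y = a + b*x."""
    return sum(abs(y - (a + b * x)) for x, y in points)

def lad_line(points):
    """Least-absolute-deviations fit: some optimal line passes through
    two of the data points, so enumerate lines through every pair."""
    best = None
    for (x1, y1), (x2, y2) in combinations(points, 2):
        if x1 == x2:
            continue  # skip vertical lines
        b = (y2 - y1) / (x2 - x1)
        a = y1 - b * x1
        d = total_abs_dev(points, a, b)
        if best is None or d < best[0]:
            best = (d, a, b)
    return best[1], best[2]

# Four points on y = x, plus one gross outlier.
pts = [(0, 0), (1, 1), (2, 2), (3, 3), (4, 40)]
a, b = lad_line(pts)
print(a, b)  # the fitted line ignores the outlier: a = 0, b = 1
```

Unlike the least-squares line, the fit is untouched by the gross outlier, a robustness property often cited in connection with the absolute-deviation criterion of Boscovich and Laplace.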

University of Chicago Chicago, U.S.A. S.M. Stigler

ECONOMICS: THE CULTURE OF A CONTROVERSIAL SCIENCE. M.W. Reder. University of Chicago Press, 1999, pp. xi + 384, US$35.00/£27.95.

Contents: 1. Overview 2. Economics and other sciences 3. The dominant paradigm: RAP 4. The Keynesian paradigm: KP 5. Of debt and taxes: KP versus RAP 6. Some other paradigms 7. The criteria of validity in economics 8. “Successes” of positive economics: Two examples 9. Welfare economics 10. RAP and the ideology of laissez-faire 11. What is economics good for? 12. Prizes, establishment, and heroes 13. The boundaries of economics

Readership: Academics of most disciplines, government bureaucrats, journalists, all others interested in public affairs, and economists

This book presents the reflections of an old-timer in the discipline on the nature of economics and the way it has evolved over the past fifty years or so. The author is a prominent economist who has made contributions over that whole period. His are mature, sophisticated and well-considered reflections. His intention is to reveal what economics is all about and he succeeds quite well. The book says a lot about methodology but it is not a treatise on methodology. Instead it probes the inner nature of economics; it reviews how economics is practised by the most reputable practitioners. No use of advanced mathematics is made, but the discussion is at a quite high level of sophistication. The points the author makes are insightful, the writing is lucid, and the style carries the reader’s interest well.

For the non-economist, this is probably not an easy read but well worth the effort. It would reward such a reader with a close, inside look at the subject, at the issues that are disputed by its professionals, and at the way economists go about their arguments. For the economist it offers thought-provoking, interesting, and quite comprehensive collective self-examination. For both sets of readers, this is a worthwhile book which I recommend strongly. I would especially commend it to serious graduate students and young practitioners just starting their careers as professional economists. Nothing at all comparable can be found. It is a unique effort and one to be appreciated.

Queen’s University


Kingston, Canada R.M. McInnis

STATISTICS FOR THE ENVIRONMENT 4: Statistical Aspects of Health and the Environment. V. Barnett, A. Stein and K. Turkman (Eds.). Chichester, U.K.: Wiley, pp. xviii + 404, £95.00.

Contents: PART I : Small Area Studies and Disease Mapping PART II : Atmospheric Pollution Studies PART III : Disease Risks and Social Effects PART IV : Effects of Radiation PART V : Agriculture and the Food Chain

Readership: Environmental research scientists and statisticians

This is the fourth volume in the sequence on Statistics for the Environment [Short Book Reviews, Vol. 14, p. 2; Vol. 15, p. 3; Vol. 18, p. 3]. It contains a summary of four introductory talks by scientists from Dutch national research institutes, plus twenty other papers, all presented at the SPRUCE Conference held at Enschede, The Netherlands, in September 1997. The third talk refers to the now widely recognized anomaly that despite the increasing income, health and welfare of the world population, we see an ever-increasing concern for the future. Many of these anxieties have been addressed by the SPRUCE sequence of conferences and the associated books. As with the other three volumes, this is a well-edited package, with an integrated and readable set of papers presented in a consistent style. It should be of interest to all statisticians studying spatial and spatio-temporal modelling, and also to environmentalists looking for assistance from statistics.

University of Manchester Institute of Science and Technology

Manchester, U.K. P.J. Laycock

NUMERICAL LINEAR ALGEBRA FOR APPLICATIONS IN STATISTICS. J.E. Gentle. New York: Springer-Verlag, 1998, pp. xiii + 221, US$54.95.

Contents: 1. Computer storage and manipulation of data 2. Basic vector/matrix computations 3. Solutions of linear systems 4. Computation of eigenvectors and eigenvalues and the singular value decomposition 5. Software for numerical linear algebra 6. Applications in statistics

Readership: Undergraduate or graduate students interested in statistical computing

The solution of linear systems of equations and the calculation of eigenvalues and eigenvectors is of immense importance not only in statistics but also in a wide variety of other areas. This account of the subject starts with a discussion of the manner in which data are stored—essential for an understanding of the accuracy with which data are represented, to appreciate the errors that can arise in numerical calculations and to understand various techniques for minimizing such errors. There follows a standard account of vector and matrix results germane to the subsequent chapters. The chapter on solutions of linear systems discusses the commonly used methods and in some cases relevant algorithms are provided. There is a useful account of some of the software available for performing numerical linear algebra. The material throughout is clearly presented with plenty of exercises and motivating discussion. It would make an excellent text to accompany a course on statistical computing.
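The point about data storage can be made concrete with a few lines of Python (an illustration added here, not drawn from Gentle's text): finite binary representation means that familiar decimal fractions are stored inexactly, and subtracting nearly equal numbers magnifies the resulting error.

```python
# Decimal 0.1 has no exact binary floating-point representation,
# so rounding error surfaces in the simplest arithmetic.
print(0.1 + 0.2 == 0.3)        # False
print(abs((0.1 + 0.2) - 0.3))  # a discrepancy of order 1e-16

# Cancellation: subtracting nearly equal numbers, as Gaussian
# elimination may do with an ill-conditioned system, loses digits.
a, b = 1.0, 1.0 + 1e-15
print((b - a) * 1e15)  # ideally 1.0; cancellation distorts the result
```

Techniques such as pivoting and orthogonalization, covered in the book, exist largely to keep such errors under control.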

Macquarie University Sydney, Australia J.R. Leslie

STATISTICAL DATA ANALYSIS. G. Cowan. Oxford: Clarendon Press, 1998, pp. xiv + 197.

Contents: 1. Fundamental concepts 2. Examples of probability functions 3. The Monte Carlo method 4. Statistical tests 5. General concepts of parameter estimation 6. The method of maximum likelihood 7. The method of least squares 8. The method of moments 9. Statistical errors, confidence intervals and limits 10. Characteristic functions and related examples 11. Unfolding

Readership: Graduate and advanced undergraduate students in the physical sciences who need to draw quantitative information from experimental data

I am always intrigued by the differences in flavour and emphasis that different disciplines bring to even the same statistical tools, not to mention the different choices of statistical tools which play the major roles in different disciplines. Thus this overview of statistical methods in the physical sciences does not refer to analysis of variance, but it does discuss regularization methods. “Unfolding” in this book refers to deconvolution to remove measurement error, a rather different use of the term from the quantitative social and behavioural sciences.

To me, at least, the book presents a rather unusual mix in the way the material is presented. Thus, for example, we have discussions (or, at least, definitions) of conditional probability, Bayes’ theorem, frequentist and subjective probability, and likelihood before the notion of a histogram is introduced. The book seems to sit halfway between an elementary primer on how to use basic statistical methods and an introduction to the more mathematical aspects of statistics. The background knowledge assumed includes linear algebra, multivariable calculus, and some knowledge of complex analysis, but no prior knowledge of probability or statistics.

The material presented in this book is dense. In less than two hundred pages, it takes the reader from the basic notions of probability, through neural networks, Monte Carlo methods, and regularization techniques. I would imagine that readers new to the area would find it hard going, and would benefit from some supplementary reading material: the author’s description of the book as a “guide” to the practical application of statistics in the area is astute.

Imperial College of Science, Technology and Medicine

London, U.K. D.J. Hand


MODELS FOR DISCRETE DATA. D. Zelterman. Oxford: Clarendon Press, 1999, pp. x + 233, £35.00.

Contents: 1. Introduction 2. Sampling distributions 3. Logistic regression 4. Log-linear models 5. Coordinate-free models 6. Additional topics

APPENDIX A : Power for the Chi-Squared Tests APPENDIX B : Program for Exact Tests APPENDIX C : The Hypergeometric Distribution

Readership: Statisticians, graduate students of statistics, numerate biomedical or sociological research workers

There is only one word to describe this book: excellent! The author has achieved an admirable blend of pertinent statistical theory and practical examples. Chapter 2 covers theoretical aspects of the Poisson, binomial, hypergeometric and multinomial distributions; Fisher’s exact test; and estimation of sample sizes.

Chapters 3 and 4 cover logistic regression and log-linear models for cross-classified count data. Acknowledging that there are now several good books on logistic regression, the author places rather more emphasis on log-linear models. Complete SAS programs using the GENMOD or LOGISTIC procedures are given and results of these analyses are interpreted in detail. It is good to see Simpson’s paradox discussed a number of times.

Chapter 5, adopting a generalized-linear-models approach, covers various incomplete cross-classifications. Conditions for the existence of maximum likelihood estimates are discussed, giving insight into problems that can arise in practice with awkward configurations of data.

The final chapter gives brief introductions to the analysis of longitudinal data, case control studies, sparse data and goodness-of-fit statistics.

The extensive exercises at the end of each chapter are divided into applied and theoretical sections, each of which amplifies the topics discussed. Some forty examples are discussed in the text.

Students completing a course based on this book should be able to analyze and to interpret most of the count or proportional data they are likely to encounter in practice; they would have insight into the mathematical basis of the subject, and after reading Chapters 5 and 6, may even have their appetites whetted for research.

University of Cape Town Rondebosch, South Africa J.M. Juritz

INTRODUCTORY STATISTICS AND RANDOM PHENOMENA. Uncertainty, Complexity, and Chaotic Behaviour in Engineering and Science. M. Denker and W.A. Woyczynski. Boston: Birkhäuser, 1998, pp. xxiv + 509.

Contents: PART I : Descriptive Statistics – Compressing Data 1. Why one needs to analyze data 2. Data representation and compression 3. Analytic representation of random experimental data PART II : Modeling Uncertainty 1. Algorithmic complexity and random strings 2. Statistical independence and Kolmogorov’s probability theory 3. Chaos in dynamical systems: how uncertainty arises in scientific and engineering phenomena PART III : Model Specification – Design of Experiments 1. General principles of statistical analysis 2. Statistical inference for normal populations 3. Analysis of variance

Readership: Engineering and science students

This book has the subtitle “Uncertainty, Complexity and Chaotic Behaviour in Engineering and Science.” It is intended for a course in introductory probability and statistics, aimed chiefly at engineering and science students, and taking a unique and particularly contemporary approach to introductory data analysis. (It might require a somewhat adventurous statistics instructor to use this with statistics majors, because some of the material will seem unfamiliar and possibly unnecessary to those accustomed to traditional introductory statistics offerings.) The prerequisites comprise introductory calculus, some differential equations and linear algebra, and a basic computer programming course.

The book is highly data-oriented, with an unusually large collection of real-life examples taken from industry and various scientific disciplines; this includes the natural, life, and social sciences. Indeed, the approach could be described as almost data-driven, and the learning method emphasizes hands-on computer experiments and numerical techniques. Many experiments and exercises programmed in Mathematica are provided, and, as the authors themselves say, using this book without doing the accompanying Mathematica experiments would be like playing Chopin on the accordion.

The book does cover the usual standard introductory probability theory, including axioms and properties, expectation, independence, discrete and continuous distributions, multivariate distributions, moment generating functions, variance and covariance, and the Poisson and Gaussian approximations. (Some of this material is done at a higher level than might sometimes be desired for an introductory course.) Computer experiments are provided to illustrate the Law of Large Numbers and the Central Limit Theorem. There is considerable material on pseudo-random number generation and Monte Carlo methods.
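To give a flavour of the kind of hands-on experiment the book favours (the authors work in Mathematica; the sketch below is an added Python analogue, not taken from the text), both limit theorems can be demonstrated in a dozen lines:

```python
import random
import statistics

random.seed(19)  # make the experiment reproducible

# Law of Large Numbers: the mean of n fair-die throws approaches 3.5.
for n in (10, 1000, 100_000):
    mean = sum(random.randint(1, 6) for _ in range(n)) / n
    print(n, round(mean, 3))

# Central Limit Theorem: means of many size-30 samples look Gaussian,
# with spread shrinking like sigma / sqrt(30).
sample_means = [statistics.mean(random.randint(1, 6) for _ in range(30))
                for _ in range(2000)]
print(round(statistics.stdev(sample_means), 3))  # near sqrt(35/12)/sqrt(30) ≈ 0.31
```

A histogram of `sample_means` would show the bell shape emerging from a flat (uniform) parent distribution.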

On the statistical side, the text covers types of data, descriptive statistics, data compression, design of experiments, model selection, maximum likelihood and least-squares estimators, regression and correlations, confidence intervals and hypothesis tests for normal populations, and one-way and two-way analyses of variance. Again, some of these topics might be presented at a higher level than desired for some introductory statistics courses. The material is largely interwoven with the material on probability theory and other topics.

The book departs from the above standard fare, however, by including detailed coverage of such contemporary topics as chaotic dynamical systems, the nature of randomness, computability and Kolmogorov complexity, encryption, ergodicity, entropy, and even fractals. Some of these topics might be more important to students in certain areas of the physical sciences, engineering, and computing. Statisticians might wish to learn more about them. While it should be possible to work around some of this material if it is not desired, one would be missing the whole point of the authors’ approach were one to leave too much out. It is an interesting book.

University of Waterloo Waterloo, Canada C. Cutler


NONPARAMETRIC STATISTICAL METHODS, 2nd edition. M. Hollander and D.A. Wolfe. New York: Wiley, 1999, pp. xiv + 787, £58.50. [Original 1973].

Contents: 1. Introduction 2. The dichotomous data problem 3. The one-sample location problem 4. The two-sample location problem 5. The two-sample dispersion problem and other two-sample problems 6. The one-way layout 7. The two-way layout 8. The independence problem 9. Regression problems 10. Comparing two success probabilities 11. Life distributions and survival analysis

Readership: Upper-level undergraduate or first-year graduate students

This book provides a comprehensive guide to nonparametric statistics through extensive use of ‘real-world’ examples. It provides the reader with the opportunity to understand the theoretical aspects of nonparametric techniques as well as to explore the ways in which the most common techniques are applied to real data. It covers areas of nonparametric statistics, such as estimation, regression and bootstrapping, that many other texts have failed to cover, and in that sense is a useful reference for ‘up-to-date’ methods. Particularly appealing is the way in which each technique is presented in a standard format: a description of the procedure, details of large-sample approximations that can be made, how to deal with ties, an example, some comments from the authors, some details of the theoretical aspects, some exercises.

Despite its applied nature, the book is fairly technical in places and in that sense may be best suited to those who have a reasonably strong quantitative background rather than other professionals wishing to apply the techniques within their discipline.

London School of Hygiene and Tropical Medicine

London, U.K. C.D. Higgins

LONGITUDINAL DATA ANALYSIS: DESIGNS, MODELS AND METHODS. C.J. Bijleveld and L.J. van der Kamp with C.C.J. Bijleveld, W.A. van der Kloot, R. van der Leeden and F. van der Berg. London: Sage, 1998, pp. xxii + 425, £57.00 Cloth; £18.99 Paper.

Contents: 1. Methodological issues in longitudinal research 2. Analysis of longitudinal categorical data using optimal scaling techniques 3. Univariate and multivariate analysis of variance of longitudinal data 4. Structural equation models for longitudinal data 5. Multilevel analysis of longitudinal data 6. Log-linear and Markov modelling of categorical longitudinal data 7. Epilogue

Readership: Statisticians, graduate students of statistics, researchers in the behavioural sciences, education, sociology, medicine and biometry

Measurements made on the same individuals on a number of occasions occur in many fields of research. Methods for the statistical analysis of such data have given rise to a vast literature, but, as the authors point out, books on the subject tend to deal with only one method of analysis. There are texts on ANOVA and MANOVA methods for continuous, normal responses; on structural equation models for investigating causal relationships; on multilevel modelling for hierarchical data; on Markov or log-linear models for discrete data; and on forms of multiple correspondence analysis for the graphical display of the relationship between variables, subjects and times. These, often very good, texts are informative but leave the reader a little uncertain as to how the whole area of longitudinal data analysis fits together. Until now, that is!

Here is a single book which describes all these methods, their areas of applicability, their strengths, their limitations, and how they relate to each other.

The first chapter deals in general terms with the design of longitudinal studies and their subsequent validity and interpretation. It should be required reading for everyone planning such a study, no matter what method of analysis they intend to use.

In Chapters 3 to 6, the methods are described. The purpose of each is clearly stated and is motivated by a simple set of data, which illustrates its main features. Then more complex sets are analyzed and interpreted. Careful attention is given to the exact statement of the null hypothesis of any test described. Complex matrix formulations are avoided by skilful use of diagrams, but sufficient formulae are used to make the text precise. At every point copious references are given to the literature and comparisons with other approaches are described in the book. Computer programs that will perform the analysis are named. Emphasis is always on the practicalities of the analysis.

In the Epilogue, the relationships between all the techniques are further reviewed and recommendations on how to select a method are given. Finally, topics not discussed in the book, such as item-response theory and event-history analysis, are mentioned with references.

This is an outstanding book. Its authors have performed a valuable service to the statistical community and to those who need to elucidate data collected on a number of occasions. Read it! Enjoy it! You will learn a lot!

University of Cape Town Rondebosch, South Africa J.M. Juritz

APPLIED SURVIVAL ANALYSIS. REGRESSION MODELING OF TIME TO EVENT DATA. D.W. Hosmer Jr. and S. Lemeshow. New York: Wiley, 1999, pp. xiii + 386, £51.95.

Contents: 1. Introduction to regression modeling of survival data 2. Descriptive methods for survival data 3. Regression models for survival data 4. Interpretation of a fitted proportional hazards regression model 5. Model development 6. Assessment of model adequacy 7. Extensions of the proportional hazards model 8. Parametric regression models 9. Other models and topics APPENDIX 1 : The Delta Method APPENDIX 2 : An Introduction to the Counting Process Approach APPENDIX 3 : Percentiles for Computation of the Ball and Wellner Confidence Band

Readership: Statisticians, researchers in medicine and biometry, epidemiologists


The previous book by these authors, Applied Logistic Regression [Short Book Reviews, Vol. 10, p. 27], has won a firm place in the literature as one of the best introductory texts on the analysis of proportional data. It has helped to make logistic regression accessible beyond the statistical community, and also offered theoretical statisticians an insight into the issues that must be considered when applying statistical theory to real data.

Their new book is sure to occupy the same niche in the survival analysis literature. The authors have a gift for putting mathematical concepts into words and for interpreting the results of a complex statistical analysis in terms of the background to the data.

The major emphasis is on the proportional hazards model and the discussion follows the usual regression modeling paradigm: preliminary data description, model selection, examination of fit, interpretation of the estimates. Mathematics at the level of an introductory regression course gives precision, and the surrounding discussion a wealth of insight. Statistical packages that offer procedures for the analyses are referenced. Three sets of data, available on the “Web”, are used throughout as examples. Exercises follow each chapter.

The material covered in the text is right up to date. Chapter 9 discusses recurrent event models, frailty models and nested case-control studies. A discussion of these important topics at this level is not available elsewhere. References to the counting processes approach to survival analysis are made throughout the text and a brief introduction to this topic is given in an appendix.

This book makes a major contribution to the understanding of “time-to-event” data. It is highly recommended.

University of Cape Town Rondebosch, South Africa J.M. Juritz

THE USES AND MISUSES OF DATA AND MODELS. THE MATHEMATIZATION OF THE HUMAN SCIENCES. W.J. Bradley and K.C. Schaefer. Thousand Oaks, California: Sage, 1998, pp. xii + 211, £12.99.

Contents: PART I : Foundations 1. Oracles, norms, and science 2. Modeling 3. Dreams and disappointments PART II : The Information Cycle 4. A priori influences on the information cycle 5. Measurement of human information 6. Limitations of measurement in the social sciences 7. Information for inferences: What are social science data? 8. Causality 9. Models and policy making

Readership: Researchers and practitioners in the social and decision sciences, decision-makers, and students preparing for these fields

This book examines the nature and role of formal models in the ‘human’ sciences, seeking to explore the principles which (ought to) guide the appropriate use of data and models. It begins with a discussion of the problems associated with social science data, including measurement error, problems of precise definition, the spurious sense of precision associated with the measurement of social phenomena, and consequences of emphasis on processes, and then goes on to describe principles which should guide social research. The book will be at best of peripheral interest to most practising statisticians. The authors comment, for example, that ‘We have examined a sample of several graduate programs in statistics at major American universities. The programs are very strong in the technical aspects of data analysis. Nevertheless, we were not able to find even one example of a course offered by a graduate-level statistics program that addressed underlying philosophical issues, the history of statistics, or policy implications of statistical analysis’ (p. 13). On the other hand, new PhD students concerned with developing formal models for social processes would benefit from an awareness of its content.

Imperial College of Science, Technology and Medicine

London, U.K. D.J. Hand

LA REGRESSION PLS. M. Tenenhaus. Paris: Editions Technip, 1998, pp. x + 254, FFr320.00.

Contents: 1. Introduction 2. Canonical analysis 3. Inter-battery factor analysis 4. Redundancy analysis 5. The SIMPLS approach 6. The NIPALS algorithm 7. Univariate PLS regression (PLS1) 8. Mathematical properties of PLS1 regression 9. Multivariate PLS regression (PLS2) 10. Applications of PLS regression 11. PLS canonical analysis 12. Treatment of qualitative data 13. The PLS approach

Readership: Teachers and users of regression

This book gives a theoretical and practical presentation of the method of PLS (Partial Least Squares) regression. It is shown that this method, which is known principally in chemistry, also applies in many other fields. It is a regression method for describing the relationships between the response and a large number of input variables in the absence of a theoretical model. The PLS regression algorithm is described together with its mathematical properties. There are numerous examples and illustrations using the software SIMCA-P for Windows.

Limburgs Universitair Centrum Diepenbeek, Belgium N.D.C. Veraverbeke


REGRESSION GRAPHICS. Ideas for Studying Regressions through Graphics. R.D. Cook. New York: Wiley, 1998, pp. xviii + 349, £65.00.

Contents: 1. Introduction 2. Introduction to 2D scatterplots 3. Constructing 3D scatterplots 4. Interpreting 3D scatterplots 5. Binary response variables 6. Dimension-reduction subspaces 7. Graphical regression 8. Getting numerical help 9. Graphical regression studies 10. Inverse regression graphics 11. Sliced inverse regression 12. Principal Hessian directions 13. Studying predictor effects 14. Predictor transformations 15. Graphics for model assessment

Readership: Advanced regression practitioners with a good theoretical background

This is an intriguing and imaginative book. It discusses how graphical analysis can aid the investigation of regression data and perhaps help to reduce the data to a structure of smaller dimensionality. Although a number of data sets are analyzed extensively, the basic ideas are difficult to pick up on an initial reading due to the high technical density of the material. Natural questions are whether, when a reduction in dimension has been achieved, it has practical consequences that can be fully understood and whether the simplification relates to canonical dimensions (in the case of response surface data, for example). Such questions provide ample incentive to study this important book, perhaps via a seminar course. It is an essential library purchase.

University of Wisconsin Madison, U.S.A. N.R. Draper

PROCESS CAPABILITY INDICES IN THEORY AND PRACTICE. S. Kotz and C.R. Lovelace. London: Arnold, 1998, pp. viii + 279, £40.00.

Contents: 1. What is it all about? 2. The two basic, time-honored process capability indices: Cp and Cpk 3. First-generation modifications: Cpm and its close relatives 4. The avalanche 5. The benefit (or curse) of non-normality and asymmetry and how to get rid of them 6. A superstructure and unified approach to process capability indices 7. The dangerous but unavoidable area: Multivariate process capability indices 8. Practical issues in capability analysis 9. Just say yes!

APPENDIX: List of Univariate Process Capability Indices

Readership: Statistically sophisticated practitioners who employ process capability indices to characterize manufacturing processes, and statisticians and advanced students who seek a summary of the past fifteen years’ research in the field

This book is the best overview yet of the broad field of process capability indices. A process capability index (PCI) is a measure that seeks to represent, in a single number, a manufacturing process’s ability to deliver a product within specifications—typically as a ratio of the allowable variation (the customer’s specifications) to the variation actually achieved by the process. Since the mid-1980s, industrial use of PCI’s has polarized practitioners and theoreticians, with wide gulfs developing both within and between these populations. There have been accusations of statistical terrorism, manipulation, and abuse, and a consequent (healthy) explosion of debate and research. The authors aim to bridge the gaps among all parties, but I would bet that practitioners and their managers will be startled on reading this book to discover just how wide the gap has become as research sheds new light on the field.
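As background (standard textbook definitions, not quoted from the book), the two time-honoured indices of Chapter 2 can be written, for lower and upper specification limits LSL and USL and a process with mean μ and standard deviation σ, as

```latex
C_p = \frac{\mathrm{USL} - \mathrm{LSL}}{6\sigma},
\qquad
C_{pk} = \min\!\left(\frac{\mathrm{USL} - \mu}{3\sigma},\ \frac{\mu - \mathrm{LSL}}{3\sigma}\right),
```

so that Cp measures potential capability from spread alone, while Cpk additionally penalizes a process mean that is off-centre within the specification interval.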

The authors describe essentially all of the process capability indices in current use, summarizing their assumptions, statistical properties, intended use, and potential weaknesses. PCI’s are random variables, after all, and an understanding of their variability and biases is a prerequisite to their respective applications. The book brings some theoretical foundation to these mostly ad hoc measures. The field is maturing but not yet ripe. The authors draw broadly from the research of the originators and other investigators of the various PCI’s, and the compilation of references at the end of each chapter is a valuable contribution. Chapter 6 offers a “superstructure” of which several of the existing indices are special cases, and from which other indices can be developed. The authors introduce this approach, tongue-in-cheek: “The time-honored device of generalization utilized in statistics is usually a clever and delicate introduction of additional parameters.”

The authors’ enthusiasm is evident in their writing. The book includes more references to “current events” than is customary in statistics texts, primarily motivating examples from various manufacturing companies. This will date the book, but for a field that is changing so rapidly, these references give an interesting historical context.

Some bits of statistical theory are given in various appendices, but the reader will also need some statistical sophistication to follow the derivations and to appreciate the conclusions in the various chapters. As to whether process capability indices are useful, the authors are positive. Their parting words: “Remember, it is much easier to lie without statistics than with them.”

Brookfield, Wisconsin, U.S.A. C. A. Fung

MODEL SELECTION AND INFERENCE: A PRACTICAL INFORMATION-THEORETIC APPROACH. K.P. Burnham and D.R. Anderson. New York: Springer-Verlag, 1998, pp. xix + 353, US$69.95.

Contents: 1. Introduction 2. Information theory and likelihood models: A basis for model selection and inference 3. Practical use of the information-theoretic approach 4. Model selection uncertainty with examples 5. Monte Carlo and example based insight 6. Statistical theory 7. Summary

Readership: Statisticians, other researchers interested in model selection

Obviously, the authors are enthusiastic disciples of Kullback–Leibler information/distance. The first four chapters are written with inspiration. Although the book cannot be considered a rigorous mathematical text, it is interesting reading, together with occasional historical essays. The other chapters contain a number of examples and derivations of some “practical” formulae and are much more technical, but mathematical rigour is still not their strongest feature. Remarks like “straightforward mathematics ...” and “... is mostly a straightforward exercise” help neither to understand nor to illuminate the reported results. The use of “the best (approximating) model” instead of “the true model” is an interesting idea, which is frequently addressed in the book. I think the authors could have taken this a step further: the observation that “the best (approximating) model” changes when the number of observations increases (p. 69) is presented as mainly an empirical one. Actually, this fact has a theoretical background and can be formulated, in terms of nested models, as the dependence of the complexity of the best model on the number of available observations. The book would gain a wider audience if the informational vocabulary were more frequently compared with the standard statistical language.
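For orientation (a standard definition, not quoted from the book), the Akaike information criterion that underlies the authors’ approach estimates the relative Kullback–Leibler distance of a fitted model from truth by

```latex
\mathrm{AIC} = -2\log \mathcal{L}(\hat{\theta} \mid y) + 2K,
```

where \(\mathcal{L}(\hat{\theta}\mid y)\) is the maximized likelihood and \(K\) the number of estimated parameters; among the candidate models, the one with the smallest AIC is selected as the best approximating model.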

SmithKline Beecham Pharmaceuticals Collegeville, U.S.A. V.V. Fedorov

STATISTICAL DESIGN AND ANALYSIS FOR INTER-CROPPING EXPERIMENTS. Volume II: Three or More Crops. W.T. Federer. New York: Springer-Verlag, 1999, pp. xxiii + 262, US$89.95.

Contents: 11. Introduction to Volume II 12. Main crop with supplementary crops 13. Three or more main crops—Density constant 14. Varying densities for some or all crops in a mixture 15. Mixing ability effects when individual responses are available 16. Mixtures when responses not available 17. Spatial and density arrangements 18. Some analytical variations 19. Intercropping procedures 20. An intercropping bibliography

Readership: Agricultural experimenters

In an intercropping experiment, two or more crops and/or mixtures of these are grown simultaneously. Responses will depend, amongst other things, on the precise spatial arrangement of the crops and the density level of each crop. There may be multiple goals, for example, both the nutritional value of the food produced and its monetary worth may be of interest.

This volume is a sequel to the first, which dealt with mixtures of two crops and sole crops. Chapters in the two books go in parallel, so that the topics covered here are similar to those in Volume I [Short Book Reviews, Vol. 14, p. 5]. Although the author points out the dangers of making too simple a generalization from two crops to three or more, he gives us an authoritative and extremely thorough account, biased more towards analysis than design, of intercropping experiments. There is a wealth of worked numerical examples, making this book an invaluable reference tool for those working in this area.

Imperial College of Science, Technology and Medicine

London, U.K. L.V. White

THEORY AND METHODS OF SURVEY SAMPLING. P. Mukhopadhyay. New Delhi: Prentice-Hall, 1998, pp. xi + 483.

Contents: 1. The basic concepts 2. Simple random sampling 3. Stratified random sampling 4. Ratio estimator 5. Difference estimator and regression estimator 6. Systematic sampling 7. Cluster sampling 8. Probability proportional to size with replacement sampling 9. Varying probability without replacement sampling I 10. Varying probability without replacement sampling II 11. Multistage sampling 12. Multiphase sampling 13. Sampling on successive occasions 14. Some problems of inference under a fixed population set-up 15. Inference from a finite population using the prediction-theoretic approach 16. Errors in surveys 17. Randomized response techniques 18. Small area estimation

Readership: Survey statisticians, survey samplers, students taking an introductory course in sampling

The first twelve chapters of this book could be used as a very thorough and comprehensive first course in survey sampling methods; later chapters are rather more specialized. Each chapter has a set of exercises, many theoretical but some numerical, to support the theory covered, and solutions to some of these exercises are included at the end of the book. Each chapter also has its own extensive list of references, most of them from the early development of the subject and relatively few published in the last twenty years, the exceptions being some of the more specialized material covered in the later chapters and some of the author’s own recent contributions to these topics. Admissibility of estimators and optimality of sampling strategies when sampling from a superpopulation are discussed in Chapter 14, which is followed in the next chapter by an investigation of model-dependent optimal strategies under alternative models in a superpopulation framework. The final chapters cover randomized response methods for dichotomous and polychotomous populations and some synthetic and composite estimators suitable for small geographical areas.

University of Southampton Southampton, U.K. P. Prescott

ELEMENTS OF LARGE-SAMPLE THEORY. E.L. Lehmann. New York: Springer-Verlag, 1999, pp. xii + 631, US$79.95.

Contents: 1. Mathematical background 2. Convergence in probability and in law 3. Performance of statistical tests 4. Estimation 5. Multivariate extensions 6. Nonparametric estimation 7. Efficient estimators and tests

Readership: Graduate students in statistics and applied fields

The book gives a comprehensive account of first-order large-sample theory for students who have taken two courses of calculus and some linear algebra. It does this by stating the more difficult results without proof, by using convergence in probability rather than almost sure convergence as the mode of probabilistic convergence, and by teaching the mathematics needed beyond the second calculus course. The book took several years to write, and its content has been well tested in courses. Notable features are the concise and thoughtful summaries at the ends of sections, and the extensive sets of problems, which occupy nearly one hundred pages.

There is a good deal more in the book than would be covered in a first graduate course in mathematical statistics, but students are likely to find it a handy and accessible reference. For example, Chapter 5 on multivariate extensions collects together the results they will need for vector-valued statistics. Chapter 6 summarizes the theory of U-statistics, gives an introduction to density estimation and explores briefly some uses of the bootstrap in point estimation of biases and variances. Chapter 7 tackles maximum likelihood estimation and associated tests, illustrating the results with continuous models and with categorical data models. That this chapter comes last underlines the fact, implicit in the title, that the book is more about the mathematics of large-sample theory than about its uses. However, there is no extraneous mathematics: the treatment stays admirably close to the fundamentals of the frequentist statistical theory it supports.

University of Waterloo Waterloo, Canada M.E. Thompson

ASYMPTOTIC STATISTICS. A.W. van der Vaart. Cambridge University Press, 1998, pp. xv + 443, £44.00/US$64.95.

Contents: 1. Introduction 2. Stochastic convergence 3. Delta method 4. Moment estimators 5. M- and Z-estimators 6. Contiguity 7. Local asymptotic normality 8. Efficiency of estimators 9. Limits of experiments 10. Bayes procedures 11. Projections 12. U-statistics 13. Rank, sign, and permutation statistics 14. Relative efficiency of tests 15. Efficiency of tests 16. Likelihood ratio tests 17. Chi-square tests 18. Stochastic convergence in metric spaces 19. Empirical processes 20. Functional delta method 21. Quantiles and order statistics 22. L-statistics 23. Bootstrap 24. Nonparametric density estimation 25. Semiparametric models

Readership: Graduate and postgraduate teachers, researchers in mathematical statistics

This well-written book covers limit theorems for the standard special classes of statistics, such as M-, L-, R-, and U-statistics. Likelihood inference and the asymptotic efficiency of estimators and tests are also treated. Important concepts in the book are Hájek’s projection method, weak convergence and the unifying mathematical concept of approximation by limit experiments. There is a thirty-page summary of the theory of empirical processes, and via the functional delta method the asymptotic distribution of quantile estimators is derived from that of the distribution function estimators. The chapters on the bootstrap and on nonparametric density estimation are short and necessarily very incomplete. The last chapter is a seventy-page treatment of semiparametric models, an area which is still in full development. The notes on the history and the bibliography are a bit too selective. Each chapter has a section with a number of nice problems, some of which are hard. This makes the book interesting for teaching projects.

Limburgs Universitair Centrum Diepenbeek, Belgium N.D.C. Veraverbeke

DECOUPLING: FROM DEPENDENCE TO INDEPENDENCE. Randomly Stopped Processes, U-Statistics and Processes, Martingales and Beyond. V.H. de la Peña and E. Giné. New York: Springer-Verlag, 1999, pp. xv + 392.

Contents: 1. Sums of independent random variables 2. Randomly stopped processes with independent increments 3. Decoupling of U-statistics and U-processes 4. Limit theorems for U-statistics 5. Limit theorems for U-processes 6. General decoupling inequalities for tangent sequences 7. Conditionally independent sequences 8. Further applications of decoupling

Readership: Researchers in probability and statistics

The method of decoupling aims at handling problems involving dependent variables by reducing them to problems involving related (conditionally) independent variables. This approach grew out of situations where the traditional method of martingales was not applicable. Very important tools for the method are the so-called decoupling inequalities, which compare functionals of the dependent variables to functionals of conditionally independent (decoupled) variables.

This book gives a well-written and very detailed description of the general theory and specific applications. More than half of the book is devoted to the decoupling of U-statistics and U-processes. It is shown how the decoupling inequalities play a crucial role in their asymptotic theory: law of large numbers, central limit theorem and law of the iterated logarithm. Also results for randomly stopped U-statistics are dealt with (extending results of Wald and Anscombe). Applications are given for the empirical median, M-estimators and hazard and distribution function estimators for left truncated data.

Limburgs Universitair Centrum Diepenbeek, Belgium N.D.C. Veraverbeke


EPIDEMIC MODELLING: AN INTRODUCTION. D.J. Daley and J. Gani. Cambridge University Press, 1999, pp. xii + 213, £30.00/US$49.75.

Contents: 1. Some history 2. Deterministic models 3. Stochastic models in continuous time 4. Stochastic models in discrete time 5. Rumours: Modelling spread and its cessation 6. Fitting epidemic data 7. The control of epidemics

Readership: Anyone with an interest in the mathematical aspects of epidemic models

This monograph provides an account of the development of mathematical epidemic models, from their foundations more than three hundred years ago in the work of John Graunt, systematically covering the advances made this century up to about the mid-1970s. The coverage of the work of the last twenty years is much less complete (though the spread of HIV gets a substantial treatment) and many major contributions to the literature over that period are not mentioned. However, the authors do not claim to provide a comprehensive account of recent developments, but rather aim to give a suitable background to the current literature, and in this they are wholly successful.

The book will be accessible, at least in part, and its study highly rewarding, to anyone with an interest in epidemic models and a good undergraduate degree in mathematics (including some basic applied probability). As is to be expected from these authors, standard models are discussed with great insight and the book is as much about the mathematical tools that can be brought to bear on the models as about the epidemic models themselves. Thus the organization of the book owes as much to the mathematical methods as to the models themselves. In these days, when too many modellers use numerical tools as a substitute for thought, the careful exposition of techniques is refreshing. This monograph will be much appreciated by the expert, as well as by the novice modeller.

The epidemics discussed are mostly of infections spread by direct contact. There is a fairly brief but useful discussion of model fitting and of the evaluation of control strategies. The spread of HIV is discussed extensively and other infections treated include measles and influenza. Exercises and Complements are provided at the end of each Chapter, providing a welcome opportunity for the reader to develop intuition and additional insights into the models discussed.

University College London London, U.K. V.S. Isham

SAMPLE-PATH ANALYSIS OF QUEUEING SYSTEMS.M. El-Taha and S. Stidham Jr. Boston: Kluwer, 1999, pp. ix + 295, Dfl.260.00/US$115.00/£78.25.

Contents: 1. Introduction and overview 2. Background and fundamental results 3. Processes with general state space 4. Processes with countable state space 5. Sample path stability 6. Little’s formula and extensions 7. Insensitivity of queueing networks 8. Sample-path approach to Palm calculus

Readership: All those interested in theory and applications of stochastic processes

The standard approach to the study of stochastic processes is to formulate a particular class, say the M/M/k queue, and then to deduce its properties. However, it is well known that many of these properties do not require the full strength of the assumptions made, and it is especially beneficial, in terms of developing intuition and understanding, to consider what conditions are actually necessary for a particular property to hold. A classic example is Little’s formula, which relates the average queue length to the arrival rate and average waiting time for very broad classes of queues.
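For reference (a standard textbook statement, not drawn from the book under review), Little’s formula reads

```latex
L = \lambda W,
```

where \(L\) is the long-run average number of customers in the system, \(\lambda\) the arrival rate and \(W\) the average time a customer spends in the system; the sample-path approach establishes such relations under remarkably weak assumptions.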

Sample-path analysis concentrates on looking at the properties of single realizations of a stochastic process, thus requiring few stochastic assumptions (for example, the existence of long-term averages along sample paths), and then explores the additional assumptions needed for the results to extend to general classes of processes. Considerable insight can be gained from this approach.

This book, which focuses mainly on queueing systems and other input-output processes, will be a valuable resource to all those interested in applied probability and stochastic processes. While it is not a textbook, it contains a wealth of useful material for those giving courses in stochastic processes at all levels, and its extremely readable style makes it suitable for anyone with previous exposure to basic probability and stochastic processes.

University College London London, U.K. V.S. Isham

LARGE DEVIATIONS TECHNIQUES AND APPLICATIONS, 2nd edition. A. Dembo and O. Zeitouni. New York: Springer-Verlag, 1998, pp. xvi + 396. [Original 1993].

Contents: 1. Introduction 2. LDP for finite dimensional spaces 3. Applications - The finite dimensional case 4. General principles 5. Sample path large deviations 6. The LDP for abstract empirical measures 7. Applications of empirical measures LDP

Readership: Mathematicians, physicists, engineers interested in a deeper understanding of the large deviation principle (LDP)

The first edition of this book was widely received as offering a very sound overview of large deviation techniques. The second edition is a substantially updated version which, besides retaining the quality of the first edition, contains much new and revised material. Moreover, new exercises have been added and more than one hundred new references included. I strongly encourage those who already have the first edition to make a bit of extra shelf-space for this one. For those who are contemplating a text on the large deviation principle, this is an excellent text to have.

ETH-Zürich Zürich, Switzerland P.A.L. Embrechts


FORECASTING ECONOMETRIC TIME SERIES. M.P. Clements and D.F. Hendry. Cambridge University Press, 1998, pp. xxi + 368, £45.00/US$69.95 Cloth; £15.95/US$24.95 Paper.

Contents: 1. An introduction to forecasting 2. First principles 3. Evaluating forecast accuracy 4. Forecasting univariate processes 5. Monte Carlo techniques 6. Forecasting in cointegrated systems 7. Forecasting with large-scale macroeconometric models 8. A theory of intercept corrections: beyond mechanistic forecasts 9. Forecasting using leading indicators 10. Combining forecasts 11. Multi-step estimation 12. Parsimony 13. Testing forecast accuracy 14. Postscript

Readership: Econometricians, time series experts

The impetus for this book came in 1991, when the second author acted as adviser to a House of Commons Select Committee on the Enquiry into Official Economic Forecasting in the U.K. Encountering little theory that could explain the systematic misforecasting of the 1980s, the authors took up the challenge of developing one. The result is a deep and thoughtful investigation of the nature of econometric forecasting.

A central observation is the inadequacy of constant-parameter stationary models to provide useful forecasts in the face of a reality which is non-constant, continually evolving and subject to structural breaks. The authors identify six critical elements of a forecasting framework: the nature of the data-generating process; the knowledge level concerning this process; the dimensionality of the system; whether the analysis is to be based on asymptotic or finite sample theory; the forecast horizon; the linearity or non-linearity of the system. In this framework they provide an exhaustive analysis of forecasting practice and forecasting error. The book presupposes experience in econometric modelling or time series forecasting and is written in an erudite style that will reward the advanced reader with new insights in these areas.

Swiss Federal Institute of Technology Zürich, Switzerland A.J. McNeil

TIME SERIES MODELS FOR BUSINESS AND ECONOMIC FORECASTING. P.H. Franses. Cambridge University Press, pp. x + 280, £42.50/US$69.95 Cloth; £15.25/US$24.95 Paper.

Contents: 1. Introduction and overview 2. Key features of economic time series 3. Useful concepts in univariate time series analysis 4. Trends 5. Seasonality 6. Aberrant observations 7. Conditional heteroskedasticity 8. Non-linearity 9. Multivariate time series 10. Common features

Readership: Statisticians, econometricians and management scientists

This book is intended as an introduction to time series modelling and forecasting, with particular emphasis on economics and business. It covers the standard time series models (ARMA, ARCH and GARCH) together with a discussion of tests for unit roots, a technique widely used in the econometric literature. A chapter on multivariate (linear) time series is included, and statistical tests for common features in two or more series are covered in the last chapter. The techniques are well illustrated with several real time series, and the series analyzed are included in the book.

The book is well written, and professional time series forecasters will no doubt find it useful.

University of Manchester Institute of Science and Technology

Manchester, U.K. T. Subba Rao

STOCHASTIC PROCESSES FOR INSURANCE AND FINANCE. T. Rolski, H. Schmidli, V. Schmidt and J.F. Teugels. Chichester, UK: Wiley, 1999, pp. xviii + 654, £60.00.

Contents: 1. Concepts from insurance and finance 2. Probability distributions 3. Premiums and ordering of risks 4. Distributions of aggregate claim amount 5. Risk processes 6. Renewal processes and random walks 7. Markov chains 8. Continuous-time Markov models 9. Martingale techniques I 10. Martingale techniques II 11. Piecewise deterministic Markov processes 12. Point processes 13. Diffusion models

Readership: Graduate students in stochastic modelling, probability theory, statistics, actuarial science and financial mathematics

This book aims to make the basic concepts of stochastic modelling and insurance accessible in a comprehensive (and comprehensible) manner. It does this well. The opening chapter, which is worth reading on its own, provides a clear and concise overview of the book, and also motivates the necessity for the mathematical depth in the rest of the book. The last point is indicative of the fact that, even apart from its length, the book is not light reading.

In the preface the authors caution that “This book is not covering [sic] the statistical aspects of stochastic models in insurance and finance”, but it does include some real examples and some numerical and algorithmic details. Proofs are given for all except well-known results. There are helpful detailed bibliographical notes at the ends of most sections. There are no exercises, but the authors stress that the book “has been conceived as a course text”, and note that a Teacher’s Manual is forthcoming. In my view the clarity and accessibility of the book will make it successful as a course text for advanced students.

I thought I detected some differences in style between the chapters (presumably written by different authors) but, apart from some idiosyncrasies in places, such as that illustrated above, the book is fluently written and a pleasure to read. I recommend it.


Imperial College of Science, Technology and Medicine

London, U.K. D.J. Hand

MATHEMATICS OF FINANCIAL MARKETS. R.J. Elliott and P.K. Kopp. New York: Springer-Verlag, 1999, pp. ix + 292, US$64.95/DM129.00/£49.50.

Contents: 1. Pricing by arbitrage 2. Martingale measures 3. The fundamental theorem of asset pricing 4. Complete markets and martingale representation 5. Stopping times and American options 6. A review of continuous-time stochastic calculus 7. European options in continuous time 8. The American option 9. Bonds and term structure 10. Consumption. Investment strategies

Readership: Students and research workers in financial mathematics

In the preface to the book, the authors note that the field of mathematical finance is rapidly expanding. Under that circumstance it is not surprising that, besides the appearance of many new journals and societies in this field, actively working researchers and practitioners in many countries are trying to give an exposition of their vision of the mathematics of finance. A presentation of the corresponding material is necessary in order to gain command of the subjects connected with analysis and decisions on financial markets.

The authors strictly divide their text into two parts: a discrete-time framework (the first five chapters) and a continuous-time framework (the last five chapters).

Elliott and Kopp are known as experts in stochastic calculus through their books Stochastic Calculus and Applications by Elliott [Short Book Reviews, Vol. 3, p. 7] and Martingales and Stochastic Integrals by Kopp [Short Book Reviews, Vol. 4, p. 41]. Readers of their present joint book are therefore guaranteed a qualified presentation of the mathematics of stochastic financial markets.

Steklov Mathematical Institute Moscow, Russia A.N. Shiryaev

APPLIED STOCHASTIC MODELS AND CONTROL FOR FINANCE AND INSURANCE. C.S. Tapiero. Boston: Kluwer, 1998, pp. 341, Dfl.295.00/US$130.00/£88.50.

Contents: 1. Dynamics, stochastic models and uncertainty 2. Modelling: Markov chains and Markov processes 3. Random walk and stochastic differential equations 4. Jump processes and special problems 5. Memory, volatility models and the range process 6. Dynamic optimization 7. Numerical and optimization techniques

Readership: Students and research workers in financial mathematics

The long title of the book points to the wide circle of topics and problems presented. The main aim of the author, C.S. Tapiero, who is well known from his papers and books as a specialist in stochastic optimization, was ‘not providing one more book on stochastic control’ (as A. Bensoussan writes in his Foreword to the book), but writing a book about ‘applications’ of stochastic control theory to the booming field of finance and insurance engineering.

The author does not spend much space on the basic concepts and theories of finance and insurance. His goal is different: to demonstrate how various concrete stochastic models and optimization ideas ‘work’ in an economic, financial and business environment, and in insurance, where dynamics and uncertainty play an essential role.

I found the book interesting for its many explanations, examples of different models, and concrete recommendations for control and management. It will be useful to students and also practitioners.

Steklov Mathematical Institute Moscow, Russia A.N. Shiryaev

ARBITRAGE THEORY IN CONTINUOUS TIME. T. Björk. Oxford University Press, 1998, pp. xii + 312.

Contents: 1. Introduction 2. The binomial model 3. Stochastic integrals 4. Differential equations 5. Portfolio dynamics 6. Arbitrage pricing 7. Completeness and hedging 8. Parity relations and delta hedging 9. Several underlying assets 10. Incomplete markets 11. Dividends 12. Currency derivatives 13. Barrier options 14. Stochastic optimal control 15. Bonds and interest rates 16. Short rate models 17. Martingale models for the short rate 18. Forward rate models 19. Change of numéraire 20. Forwards and futures

Readership: Senior undergraduates and graduates interested in financial mathematics, quantitative analysts, financial engineers

This book is one of the best of a large number of new books on mathematical and probabilistic models in finance, positioned on a mathematical scale between the books by Hull and Duffie. It is largely self-contained, including a pre-measure-theory treatment of Brownian motion and the stochastic integral. The text then proceeds to continuous-time no-arbitrage pricing for stock and exchange-rate derivatives, bonds, forwards and futures. Although essentially mathematical, the book is generous with intuitive explanations and review; see for example the nice chapter on incomplete markets. This is a highly readable book that strikes a fine balance between mathematical development and intuitive explanation.

D.L. McLeish, University of Waterloo, Waterloo, Canada


STOCHASTIC DYNAMIC PROGRAMMING AND THE CONTROL OF QUEUEING SYSTEMS. L.I. Sennott. New York: Wiley, 1999, pp. xiv + 328, £65.00.

Contents: 1. Introduction 2. Optimization criteria 3. Finite horizon optimization 4. Infinite horizon discounted cost optimization 5. An inventory model 6. Average cost optimization for finite state spaces 7. Average cost optimization for countable state spaces 8. Computation of average cost optimal policies for infinite state spaces 9. Optimization under actions at selected epochs 10. Average cost optimization of continuous time processes

APPENDIX A: Results from Analysis APPENDIX B: Sequences of Stationary Policies APPENDIX C: Markov Chain Techniques

Readership: Applied statisticians, operations research engineers, control engineers, communication engineers, manufacturing engineers

This book combines a theoretical treatment of the optimal stochastic control of queueing systems with computational support. Source code for computer programmes is available from a website. The book treats finite, countable and infinite state spaces with average and discounted cost functions. The methods and programmes will be of interest to a wide range of readers, including those involved in manufacturing, communications and computing systems. The book is self-contained and includes all necessary background theory and illustrative examples. An excellent book for a wide range of readers.

G.C. Goodwin, University of Newcastle, Newcastle, Australia

NOTES

ISAAC NEWTON. Eighteenth-Century Perspectives. A.R. Hall. Oxford University Press, 1999, pp. 215.

From the book jacket: “This new work, a comparative study of Newtonian biography—by one of this century’s most eminent Newtonian scholars, Rupert Hall—brings together for the first time, in a single volume, the early eighteenth-century biographies of Sir Isaac Newton. The centrepiece of the book is a completely new translation of Paolo Frisi’s biography of Newton published in 1778. Also included are the biographies by Fontenelle (1727), Thomas Birch (1738), Charles Hutton (1795), and John Conduitt—making two of the most important eighteenth-century biographies accessible in a contemporary English translation. Each translation is accompanied by illuminating introductions and commentary by Professor Hall and, to help orient the reader, a brief biography of Newton and a bibliography have been included.”

EDISON. A Life of Invention. P. Israel. New York: Wiley, 1998, pp. viii + 552, £24.95.

This is a biography of Thomas Alva Edison (1847–1931). He had only three months of formal education, but he became one of the greatest inventors. His inventions included the phonograph and the first successful electric light bulb, and his work laid the groundwork for movies, telephones and the sound recording industry.

MY BRAIN IS OPEN. The Mathematical Journeys of Paul Erdös. B. Schechter. Oxford University Press, 1998, pp. 224.

Excerpts from the book jacket: “[Paul] Erdös was sustained by the generosity of colleagues and by his own belief in the beauty of mathematics.

“Born in Hungary, Erdös was a member of a generation of remarkable Hungarian scientists (among them, John von Neumann, Edward Teller, and Leo Szilard) who shaped the twentieth century. By age seventeen he had gained international recognition as a prodigy.

“Erdös believed that the meaning of life was to prove and conjecture. He was fascinated by numbers and became one of the century’s leading number theorists. He worked in fields of mathematics that would prove pivotal to the development of computer science, even though he had never touched a computer. He was the most prolific mathematician who ever lived, writing or collaborating on more than 1,500 papers with over 450 different collaborators.”

N IS A NUMBER. A Portrait of Paul Erdös. G.P. Csicsery. Wellesley, Massachusetts: A K Peters, 1993, 57 minutes.

This is a video of the famous mathematician Paul Erdös (1913–1996). Erdös “eschewed the traditional trappings of success” and dedicated his life to mathematics.

EMPIRE OF LIGHT. A History of Discovery in Science and Art. S. Perkowitz. Washington, D.C.: Joseph Henry Press, 1998, pp. ix + 229, £12.95.

Albert Einstein wrote: “For the rest of my life I will reflect on what light is.” In this volume, the author discusses the nature of light, how the eye sees, and how an understanding of these phenomena has emerged over the years.

FROM WHITE DWARFS TO BLACK HOLES. The Legacy of S. Chandrasekhar. G. Srinivasan (Ed.). University of Chicago Press, 1999, pp. xiii + 240, US$40.00.

From the book jacket: “From White Dwarfs to Black Holes chronicles the extraordinarily productive scientific career of Subrahmanyan Chandrasekhar, one of the twentieth century’s most distinguished astrophysicists. Over the course of more than six decades of active research Chandrasekhar investigated a dizzying array of subjects, including stellar structure and dynamics, the theory of radiative transfer, hydrodynamic and hydromagnetic stability, the equilibrium and stability of ellipsoidal figures of equilibrium, the general theory of relativity and relativistic astrophysics, the mathematical theory of black holes, and the theory of colliding gravitational waves and non-radial perturbations of relativistic stars.

“G. Srinivasan notes in the preface to this book that “the range of Chandra’s contributions is so vast that no one person in the physics or astronomy community can undertake the task of commenting on his achievements.” Thus, in this collection, ten eminent scientists evaluate Chandrasekhar’s contributions to their own fields of specialization. The volume closes with a historical discussion of Chandrasekhar’s interactions with graduate students during his more than quarter century at Yerkes Observatory.


“The contributors are: James Binney, John L. Friedman, Norman R. Lebovitz, Donald E. Osterbrock, E.N. Parker, Roger Penrose, A.R.P. Rau, George B. Rybecki, E.E. Salpeter, Bernard F. Schutz, and G. Srinivasan.

“The essays in this book were originally published by the Indian Academy of Sciences in 1996 in a special issue of the Journal of Astrophysics and Astronomy.”

SCIENCE WITHOUT LAWS. R.N. Giere. University of Chicago Press, 1999, pp. x + 285, US$25.00.

From the book jacket: “R.N. Giere argues that it is better to understand what scientists actually do as developing more or less abstract models of specific aspects of the world. Giere’s approach resolves the issues underlying the science wars: the critics of science are correct in rejecting the Enlightenment ideal of science, and its defenders are correct in insisting that science does produce genuine knowledge of the natural world. Giere is utterly persuasive in arguing that to criticize the Enlightenment ideal is not to criticize science itself, and that to defend science one need not defend the Enlightenment ideal.

“Science without Laws thus stakes out a middle ground in these debates by demonstrating a more powerful way of seeing science.”

PHYSICISTS IN CONFLICT. From Antiquity to the New Millennium. N.A. Porter. Bristol: Institute of Physics, 1998, pp. xv + 275, £25.00/US$39.50.

From the book jacket: “Dialogue in science is essential for progress. But when dialogue becomes conflict, or further intensifies to persecution, the situation is harmful not only to science, but also to the wider society in which science exists. This is true whether the conflict is internal, as in the case of Boltzmann, or external, as with Galileo or Oppenheimer against their respective authorities.

“This book examines the nature of conflict in science through examples chosen from the history of physics. These cases fall into three broad themes: physicists in conflict with religion; conflict between physicists on significant scientific issues; physicists in conflict with each other and politicians on matters of public policy with scientific content. Conflict is singled out as a common element in otherwise disparate areas precisely because it has characteristics which are common to the different cases, and sometimes the similarities are remarkable. The cases of Galileo and Oppenheimer, in particular, are examples of this.”

CULTURAL BOUNDARIES OF SCIENCE. Credibility on the Line. T.F. Gieryn. University of Chicago Press, 1999, pp. xiv + 398.

This volume shows how the definition of what counts as science can shift. The author also discusses how science and its interpretation are shaped by society and vice versa.

HOLDING ON TO REALITY. The Nature of Information at the Turn of the Millennium. A. Borgmann. University of Chicago Press, 1999, pp. 274, US$22.00.

From the book jacket: “Drawing on the history of ideas, the details of information technology, and the boundaries of the human condition, Borgmann explains the relationship between things and signs, between reality and information. His history ranges from Plato to Boeing and from the alphabet to virtual reality, all the while being conscious of the enthusiasm, apprehension, and uncertainty that have greeted every stage of the development of information.”

DRAWBRIDGE UP. Mathematics — A Cultural Anathema. (ZUGBRÜCKE AUSSER BETRIEB. Die Mathematik im Jenseits der Kultur — Eine Aussenansicht.) H.M. Enzensberger. Natick, Massachusetts: A K Peters, 1999, pp. 47, US$5.00.

This is a very small book written side-by-side in two languages, English and German. The author, a poet and essayist, discusses the problem of the mathematician as an isolationist, divorced from reality, and the mathematician who feels that mathematics is the underpinning of technical developments.

STATISTICS IN SOCIETY. The Arithmetic of Politics. D. Dorling and S. Simpson (Eds.). London: Arnold, 1999, pp. xxvi + 484, £16.99.

The papers in this volume are divided into eight parts: 1. Collecting statistics; 2. Models and theory; 3. Classifying people; 4. Counting poverty; 5. Valuing health; 6. Assessing education; 7. Measuring employment; 8. Economics and politics.

STATISTICS, SCIENCE AND PUBLIC POLICY. II. HAZARDS AND RISKS. Proceedings of the Conference on Statistics, Science and Public Policy held at Queen’s University, Kingston, Ontario, Canada, April 23–25, 1997. A.M. Herzberg and I. Krupka (Eds.). Kingston, Ontario: Queen’s University, 1998, pp. xiii + 163, Can.$30.00.

Approximately forty leading scientists, politicians, senior public servants and journalists from several countries met at Queen’s University to consider how to promote better understanding between scientists and policy-makers by focusing on the issue of hazards and risks. This volume consists of the edited proceedings of the conference.

ON SCIENCE, INFERENCE, INFORMATION AND DECISION-MAKING. Selected Essays in the Philosophy of Science. K. Szaniawski, A. Chmielewski and J. Wolénski (Eds.). Dordrecht: Kluwer Academic Publishers, 1998, pp. xiv + 242, Dfl.240.00/US$130.00/£82.00.

From the book jacket: “There are two competing pictures of science. One considers science as a system of inferences, whereas another looks at science as a system of actions. The essays included in this collection offer a view which intends to combine both pictures. This compromise is well illustrated by Szaniawski’s analysis of statistical inferences. It is shown that traditional approaches to the foundations of statistics do not need to be regarded as conflicting with each other. Thus, statistical rules can be treated as rules of behaviour as well as rules of inference. Szaniawski’s uniform approach relies on the concept of rationality, analyzed from the point of view of decision theory. Applications of formal tools to the problem of justice and division of goods show that the concept of rationality has a wider significance.”


BRAVE NEW WORLDS. Staying Human in the Genetic Future. B. Appleyard. London: HarperCollins, 1999, pp. 188, £16.99.

The author discusses the promise and dangers of genetic manipulation and their moral and ethical implications. “In the end, Brave New Worlds is a public appeal, a plan to realign technological advances with human values.”

THE TROUBLED HELIX. Social and Psychological Implications of the New Human Genetics. T. Marteau and M. Richards (Eds.). Cambridge University Press, 1999, pp. xx + 359, £40.00/US$69.95 Cloth; £18.95/US$29.95 Paper.

This volume surveys the problems which arise from the new human genetics, such as the knowledge that because of one’s genetic make-up one will suffer from an incurable disease.

ECONOMIC THEORY AND SUSTAINABILITY. G. Heal. New York: Columbia University Press, 1998, pp. xiii + 226, US$75.00/£52.00.

From the book jacket: “With issues like global warming and the loss of biodiversity becoming increasingly important to policy-makers and scientists world-wide, sustainability is a growing concern.

“The sustainable management of the biosphere has recently been the subject of much attention among ecologists, environmental engineers, and other members of the scientific community. Yet, although these issues are clearly rooted in economic behaviour and organization, economists have not directly addressed the question of sustainability.”

The author presents a model for understanding the future of the earth from an economic perspective.

“Heal’s model begins by reconciling the time horizons of the economist and the environmentalist: in economics, discussions of the ‘long run’ generally refer to a much shorter period than do those of the earth sciences. This book shows the benefits of viewing the environment as an economic asset that should be understood as part of national income, and explains how this approach can lead to more conservative use of resources.”

SOCIAL SECURITY AND RETIREMENT AROUND THE WORLD. J. Gruber and D.A. Wise (Eds.). University of Chicago Press, 1999, pp. ix + 486, US$62.00/£49.50.

From the book jacket: “In nearly every industrialized country, the population is ageing rapidly and individuals are living longer, demographic trends that have strained the financial viability of these countries’ social security systems. The financial strain has been compounded by another trend: workers are leaving the labour force at increasingly younger ages.

“What accounts for this striking decline in labour force participation? One explanation is that social security programs actually provide incentives for early retirement. Social Security and Retirement around the World examines this explanation. This volume houses a set of remarkable papers that present information on the social security systems, and labour force participation patterns, in Belgium, Canada, France, Germany, Italy, Japan, the Netherlands, Spain, Sweden, the United Kingdom, and the United States.”

ENSURING SAFE FOOD. From Production to Consumption. Committee to Ensure Safe Food from Production to Consumption, Institute of Medicine/National Research Council. Washington, D.C.: National Academy Press, 1998, pp. xi + 194.

The effectiveness of the food safety system is discussed and evaluated.

THE FINANCING OF CATASTROPHE RISK. K.A. Froot (Ed.). University of Chicago Press, 1999, pp. x + 477, US$68.00.

From the book jacket: “Is it possible that the insurance and reinsurance industries cannot handle a major catastrophe? Ten years ago, the notion that the overall cost of a single catastrophic event might exceed $10 billion was unthinkable. Yet, in today’s dollars, the combined costs of just two recent events, the Northridge Earthquake and Hurricane Andrew, exceed $45 billion. With ever increasing property-casualty risks and unabated growth in hazard-prone areas, insurers and reinsurers now envision the possibility of disaster losses of $50 to $100 billion in the United States. Against this backdrop, the capitalization of the insurance and reinsurance industries has become a crucial concern. While it remains unlikely that a single event might entirely bankrupt these industries, a big catastrophe could place firms, policy holders, and investors under stress.

“The crux of the problem lies in the financial system, whose job it is to redistribute risk. Insurance companies retain the vast majority of their catastrophic risk, rather than spreading potential costs evenly across investors. Furthermore, only a small fraction of large-event exposures is covered by reinsurance, and the percentage of reinsurance coverage declines markedly with the size of the event. While reinsurer capital has expanded, there remains concern that it may be more beneficial for catastrophe risks to be shared directly with investors.”

EVERYTHING FOR SALE. The Virtues and Limits of Markets. R. Kuttner. University of Chicago Press, 1999, pp. xx + 410, US$15.00.

From previous reviews of the book: “The best survey of the limits of free markets that we have.... A much needed plea for pragmatism: Take from free markets what is good and do not hesitate to recognize what is bad.” (J. Madrick, Los Angeles Times).

“The most readable and important book about the economy I have read in a long time.... I have never seen the market system better described, more intelligently appreciated, and more trenchantly criticized than in Everything for Sale.” (R. Heilbroner).

“Demonstrating an impressive mastery of a vast range of material, Mr. Kuttner lays out the case for the market’s insufficiency in field after field: employment, medicine, banking, securities, telecommunications, electric power.” (N. Lemann, New York Times Book Review).


BOGGS: A Comedy of Values. L. Weschler. University of Chicago Press, 1999, pp. xiv + 160, US$22.00.

From the book jacket: “[The author] chronicles the antics of J.S.G. Boggs, a young artist with a certain panache, a certain flair, a certain je ne payes pas—an artist, that is, whose consuming passion is money, or perhaps, more precisely, value. What Boggs likes to do is to draw money—actual paper notes in the denominations of standard currencies from all over the world—and then to go out and try to spend those drawings. Instead of selling his money drawings outright to interested collectors, Boggs looks for merchants who will accept his drawings in lieu of cash payment for their wares or services as part of elaborately choreographed transactions, complete with receipts and even proper change—an artistic practice that regularly lands him in trouble with treasury police around the globe.

“Boggs: A Comedy of Values teases out these transactions and their sometimes dramatic legal conse-quences, following Boggs on a larkish, though at the same time disconcertingly profound, econo-philosophic chase. For in a madcap Socratic fashion, Boggs is raising all sorts of truly fundamental questions—what is it that we value in art, or, for that matter, in money? Indeed, how do we place a value on anything at all? And in particular, why do we, why should we, how can we place such trust in anything as confoundingly insubstantial as paper money?

“In passing, Weschler frames a concise, highly entertaining history of money itself—from cowrie shells through hedge funds—such that Boggs will delight and fascinate both general readers and seasoned professionals, especially amidst the chaos currently roiling financial and art markets throughout the world.”

IDENTIFYING THE POOR. Papers on Measuring Poverty to Celebrate the Bicentenary of the Publication in 1797 of “The State of the Poor” by Sir Frederick Morton Eden. F.G. Pyatt and M. Ward (Eds.). Amsterdam: IOS Press, 1999, pp. iv + 233.

The ISI Cutting Edge Conference in November 1997 provided the background to this volume, which includes papers from the conference and some others. The late Dr. Z. Kenessey, former director of the International Statistical Institute, convened the conference. The abstract of his paper “Sir Frederick Morton Eden’s The State of the Poor (1797–1997)” is:

“The three magisterial volumes of Eden’s classic work were published 200 years ago in 1797. Sir Frederick Morton Eden was not only the author of this inestimable work, which reviewed poverty in England from the Norman Conquest until the end of the 18th century; he was also instrumental in establishing the Globe insurance company, which was the first insurance company in Britain. During his short life (1766–1809) he put his education from Oxford University to good use and authored works on subjects other than poverty (which was the theme of his magnum opus). In his childhood Eden spent time in Maryland, where his father Sir Robert Eden was the last colonial governor.”

THE IMPORTANCE OF BEING FUZZY. And Other Insights from the Border between Math and Computers. A. Sangalli. Princeton University Press, 1998, pp. xvi + 173, £18.95.

From the book jacket: “How has computer science changed mathematical thinking? In this first ever comprehensive survey of the subject for popular science readers, Arturo Sangalli explains how computers have brought a new practicality to mathematics and mathematical applications. By using fuzzy logic and related concepts, programmers have been able to side-step the traditional and often cumbersome search for perfect mathematical solutions to embrace instead solutions that are ‘good enough.’ If mathematicians want their work to be relevant to the problems of the modern world, Sangalli shows, they must increasingly recognize ‘the importance of being fuzzy’.”

MODERN MATHEMATICS IN THE LIGHT OF THE FIELDS MEDALS. M. Monastyrsky. Wellesley, Massachusetts: A K Peters, 1999, pp. xv + 160, US$19.95.

From the book cover: “This short book examines the evolution of certain areas of modern mathematics by recounting the past winners of the international Fields Medal, the ‘Nobel Prize’ of mathematics. Subjects like topology, complex analysis, number theory, and mathematical logic are brought to life through the personalities of those who fundamentally contributed to their development.”

THE MATHEMATICS OF MEASUREMENT. A Critical History. J.J. Roche. London: Springer-Verlag, 1998, pp. x + 330, US$79.95.

This is the first history of the branches of mathematics which have been developed for the handling of measurements, including the calculus of error analysis.

AN IMAGINARY TALE. The Story of √–1. P.J. Nahin. Princeton University Press, 1998, pp. xvi + 257, £18.95.

From the book jacket: “Today complex numbers have such widespread practical use—from electrical engineering to aeronautics—that few people would expect the story behind their derivation to be filled with adventure and enigma. In An Imaginary Tale, Paul Nahin tells the 2000-year-old history of one of mathematics’ most elusive numbers, the square root of minus one, also known as i, re-creating the baffling mathematical problems that conjured it up and the colourful characters who tried to solve them.

“In 1878, when two brothers stole a mathematical papyrus from the ancient Egyptian burial site in the Valley of Kings, they led scholars to the earliest known occurrence of the square root of a negative number. The papyrus offered a specific numerical example of how to calculate the volume of a truncated square pyramid, which implied the need for i. In the first century, the mathematician-engineer Heron of Alexandria encountered i in a separate project, but fudged the arithmetic; medieval mathematicians stumbled upon the concept while grappling with the meaning of negative numbers, but dismissed their square roots as nonsense. By the time of Descartes, a theoretical use for these elusive square roots—now called ‘imaginary numbers’—was suspected, but efforts to solve them led to intense, bitter debates. The notorious i finally won acceptance and was put to use in complex analysis and theoretical physics in Napoleonic times.”


THE COLLECTED WORKS OF F.A. HAYEK. Volume V. Good Money, Part I: The New World. Volume VI. Good Money, Part II: The Standard. S. Kresge (Ed.). University of Chicago Press, 1999, pp. xi + 259; pp. x + 259, US$45.00 each.

F.A. Hayek (1899–1992) was the co-recipient of the Nobel Memorial Prize in Economics in 1974 and was awarded the Medal of Freedom in 1991. These two volumes bring together Hayek’s major essays on money and monetary theory.

OPEN PROBLEMS IN MATHEMATICAL SYSTEMS AND CONTROL THEORY. V.D. Blondel, E.D. Sontag, M. Vidyasagar and J.C. Willems (Eds.). London: Springer-Verlag, 1999, pp. xii + 288, US$99.00.

More than fifty open problems have been collected together. They include topics such as chaotic observers, non-linear local controllability and neural network learning.

NEW EDITIONS, PAPER EDITIONS OR REPRINTS

A BASIC COURSE IN STATISTICS, 4th edition. G.M. Clarke and D. Cooke. London: Arnold, 1998, pp. xxiv + 672, £19.99. [Original 1978].

BOUNDARY VALUE PROBLEMS, 4th edition. D.L. Powers. San Diego, California: Harcourt, Academic Press, 1999, pp. xi + 528, US$69.95. [3rd edition 1987].

CONTROLLING THE CONTROLLABLE. The Management of Safety, 4th revised edition. J. Groeneweg. The Netherlands: DSWO Press, 1998, pp. v + 528.

EXPLORING STATISTICS. A Modern Introduction to Data Analysis and Inference, 2nd edition. L.J. Kitchens. Pacific Grove, California: Duxbury Press, 1998, pp. xv + 940 + disk. [Original 1987].

INTRODUCTORY STATISTICS WITH APPLICATIONS IN GENERAL INSURANCE, 2nd edition. I.B. Hossack, J.H. Pollard and B. Zehnwirth. Cambridge University Press, 1999, pp. x + 282, £60.00 (Cloth); £22.95 (Paperback). [Original 1983, Short Book Reviews, Vol. 3, p. 31].

JUST THE ESSENTIALS OF ELEMENTARY STATISTICS, 2nd edition. R. Johnson and P. Kuby. Pacific Grove, California: Duxbury Press, 1999, pp. xv + 576 + CD.

MATHEMATICAL MODELING, 2nd edition. M.M. Meerschaert. San Diego, California: Academic Press, 1999, pp. xvi + 351, US$59.95. [Original 1993].

MATRIX DIFFERENTIAL CALCULUS. With Applications in Statistics and Econometrics, revised edition. J.R. Magnus and H. Neudecker. Chichester, U.K.: Wiley, pp. xiii + 395, £34.95. [Original 1988].

SEEING THROUGH STATISTICS, 2nd edition. J.M. Utts. Pacific Grove, California: Duxbury Press, 1999, pp. xix + 465. [Original 1995, Short Book Reviews, Vol. 16, p. 23].

STATISTICS. A First Course, 7th edition. J.E. Freund and B.M. Perles. Upper Saddle River, New Jersey: Prentice Hall, 1999, pp. xi + 532. [Original 1970].

SURVIVAL MODELS AND DATA ANALYSIS. R.C. Elandt-Johnson and N.L. Johnson. New York: Wiley, 1999, pp. xvi + 457, £32.50. [Original 1980, Short Book Reviews, Vol. 1, p. 2].

THE WORLD ACCORDING TO WAVELETS. The Story of a Mathematical Technique in the Making, 2nd edition. Natick, Massachusetts: A K Peters, 1998, pp. xx + 330, US$40.00. [Original 1996].

GOVERNMENT PUBLICATIONS

ANNUAL REPORT 1998. Oslo, Research Department, Statistics Norway, 1999, pp. 27.

COMMERCE EXTERIEUR DU LUXEMBOURG. Bulletin du STATEC, Volume XXXXV, No. 7. Luxembourg: Service Central de la Statistique et des Etudes Economiques, 1998, pp. 51, Annual subscription: LUF900.00 (8 issues)/Price per issue: LUF150.00.

LE COÛT DE LA MAIN-D’OEUVRE AU LUXEMBOURG. Bulletin du STATEC, Vol. XXXXV, No. 8. Luxembourg: Service Central de la Statistique et des Études Économiques, 1999, pp. 33, LUF150.00 (per issue)/LUF900.00 (annual subscription: 8 issues).

NETHERLANDS OFFICIAL STATISTICS. Volume 14, Winter 1998. Voorburg/Heerlen: Statistics Netherlands, 1998, pp. 30, Dfl.20.00 (per issue)/Dfl.42.00 (annual subscription).

WORKSHOP FOR THE BALTIC COUNTRIES ON SURVEY SAMPLING THEORY AND METHODOLOGY. May 24–27, 1998, Jurmala, Latvia. Riga: Central Statistical Bureau of Latvia, 1998, pp. 91.

UNITED NATIONS STATISTICAL OFFICE PUBLICATIONS RECENTLY ISSUED

DEMOGRAPHIC YEARBOOK, 1997. Forty-ninth issue. ST/ESA/STAT/SER.R/28. U.N. Sales No. E/F.99.XIII.1, 1999, pp. ix + 582.

HANDBOOK ON CIVIL REGISTRATION AND VITAL STATISTICS SYSTEMS. Computerization. Handbooks on Civil Registration and Vital Statistics Systems. Studies in Methods, Series F, No. 73. ST/ESA/STAT/SER.F/73, 1998, U.N. Sales No. E.98.XVII.10, pp. ix + 68.

1996 INDUSTRIAL COMMODITY STATISTICS YEARBOOK. Production Statistics (1987–1996). ST/ESA/STAT/SER.P/36, 1998, U.N. Sales No. E/F.98.XVII.17, pp. xvii + 908.

NATIONAL ACCOUNTS STATISTICS: MAIN AGGREGATES AND DETAILED TABLES, 1994. Parts I and II. ST/ESA/STAT/SER.X/23, PART I and PART II. U.N. Sales No. E.97.XVII.6, 1997, pp. xxii + 8 + 1995.

POPULATION AND VITAL STATISTICS REPORT. Statistical Papers Series A, Vol. LI, No. 1. Data available as of 1 January 1999. ST/ESA/STAT/SER.A/208, pp. 19.


COLLECTED PAPERS, TABLES AND PROCEEDINGS

A FIELD GUIDE FOR SCIENCE WRITERS. D. Blum and M. Knudson (Eds.). New York: Oxford University Press, 1997, pp. xi + 287, US$9.99.

ALGORITHMS AND THEORY OF COMPUTATION HANDBOOK. M.J. Atallah (Ed.). Boca Raton, Florida: CRC Press, 1999, pp. 1288.

ANIMAL LEARNING AND COGNITION. N.J. MacKintosh (Ed.). San Diego, California: Academic Press, 1994, pp. xviii + 379.

ASYMPTOTIC METHODS IN PROBABILITY AND STATISTICS. A Volume in Honour of Miklos Csörgö. B. Szyszkowicz (Ed.). Amsterdam: Elsevier, 1999, pp. xxxiii + 889.

ASYMPTOTICS, NONPARAMETRICS, AND TIME SERIES. S. Ghosh (Ed.). New York: Dekker, 1999, pp. xviii + 833.

FEATURE EXTRACTION, CONSTRUCTION AND SELECTION: A Data Mining Perspective. H. Liu and H. Motoda (Eds.). Boston: Kluwer Academic Publishers, 1998, pp. xxiv + 410, Dfl.320.00/US$140.00/£95.25.

FRONTIERS OF RESEARCH IN ECONOMIC THEORY. The Nancy L. Schwartz Memorial Lectures, 1983–1997. D.P. Jacobs, E. Kalai and M.I. Kamien (Eds.). Cambridge University Press, 1998, pp. xxv + 274, £40.00/US$57.95 Cloth; £14.95/US$21.95 Paper.

LEARNING IN GRAPHICAL MODELS. M.I. Jordan (Ed.). Dordrecht, Netherlands: Kluwer Academic Publishers, 1998, pp. x + 630, Dfl.520.00/US$280.00/£177.00.

MAXIMUM ENTROPY AND BAYESIAN METHODS. Boise, Idaho, U.S.A., 1997. Proceedings of the 17th International Workshop on Maximum Entropy and Bayesian Methods of Statistical Analysis. G.J. Erickson, J.T. Rychert and C.R. Smith (Eds.). Dordrecht: Kluwer Academic Publishers, 1998, pp. ix + 297, Dfl.245.00/US$133.00/£84.00.

MSI–2000: Multivariate Statistical Analysis in honor of Professor Minoru Siotani on his 70th Birthday. Multivariate Statistical Inference. American Journal of Mathematical and Management Sciences, Vol. 18, Nos. 1 and 2. V.S. Taneja (Guest Ed.). Columbus, Ohio: American Sciences Press, 1998, pp. 238, US$148.00.

QUANTUM PROBABILITY COMMUNICATIONS. R.L. Hudson and J.M. Lindsay (Eds.). Singapore: World Scientific, 1998, pp. vii + 363, US$78.00.

STATISTICAL AND PROBABILISTIC MODELS IN RELIABILITY. D.C. Ionescu and N. Limnios (Eds.). Foreword by M. Iosifescu. Boston: Birkhäuser, 1999, pp. xxxvi + 352, SwFr.148.00/DM178.00/AusS1300.00.

THE MATHEMAGICIAN AND PIED PUZZLER. A Collection in Tribute to Martin Gardner. E. Berlekamp and T. Rodgers (Eds.). Natick, Massachusetts: A K Peters, 1999, pp. x + 266, US$34.00.

VISUAL EXPLORATIONS IN FINANCE. With Self-Organizing Maps. G. Deboeck and T. Kohonen (Eds.). London: Springer-Verlag, 1998, pp. xlv + 258, US$96.00.

BOOKS RECEIVED

A COURSE IN MATHEMATICAL MODELING. D.D. Mooney and R.J. Swift. Washington, D.C.: The Mathematical Association of America, 1999, pp. xx + 431, US$41.95.

A COURSE IN REAL ANALYSIS. J.N. McDonald and N.A. Weiss. San Diego, California: Academic Press, 1999, pp. xvii + 745.

APPLIED STATISTICS. P. Mukhopadhyay. Calcutta: Books and Allied (P) Ltd., 1999, pp. xii + 648 + 37, Rs.395.00/US$25.00.

A–Z OF MEDICAL STATISTICS. A Companion for Critical Appraisal. F. Pereira-Maxwell. London: Arnold, New York: Oxford University Press, 1998, pp. ix + 118, £15.99.

CALCULUS AND ALGEBRA WITH MATHCAD 8. B. Birkeland. Lund, Sweden: Studentlitteratur, 1999, pp. 190, SEK326.00.

CONSTRAINED MARKOV DECISION PROCESSES. Stochastic Modeling. E. Altman. Boca Raton, Florida: Chapman & Hall/CRC, 1999, pp. 242.

DATA ANALYSIS USING SPSS FOR WINDOWS. A Beginner’s Guide. J.J. Foster. London: Sage, 1998, pp. xiii + 224, £45.00 Cloth; £14.99 Paper.

DESIGN ANALYSIS. Mathematical Modeling of Nonlinear Systems. D.E. Thompson. Cambridge University Press, 1999, pp. xvi + 275, £35.00/US$59.95.

EULER. The Master of Us All. W. Dunham. Washington, D.C.: The Mathematical Association of America, 1999, pp. xxviii + 185, US$29.95.

EVALUATION AND OPTIMIZATION OF ELECTORAL SYSTEMS. P. Grilli di Cortona, C. Manzi, A. Pennisi, F. Ricca and B. Simeone. Philadelphia: Society for Industrial and Applied Mathematics, 1999, pp. xvi + 230, US$53.00.

GEOMETRY AND ITS APPLICATIONS. W. Meyer. San Diego, California: Harcourt/Academic Press, 1999, pp. xviii + 531 + CD, US$59.95.

GRAPH CLASSES. A Survey. A. Brandstadt, V.B. Le and J.P. Spinrad. Philadelphia: Society for Industrial and Applied Mathematics, 1999, pp. xi + 304, US$68.00.

HOW TO WIN MORE. Strategies for Increasing a Lottery Win. N. Henze and H. Riedwyl. Natick, Massachusetts: A K Peters, 1998, pp. x + 149, US$15.95.

INFLUENCE FUNCTIONS AND MATRICES. Y.A. Melnikov. New York: Dekker, 1999, pp. x + 469, US$185.00.

INTEGRABLE SYSTEMS. Twistors, Loop Groups, and Riemann Surfaces. Based on lectures given at a conference on integrable systems organized by N.M.J. Woodhouse and held at the Mathematical Institute, University of Oxford, in September 1997. N.J. Hitchin, G.B. Segal and R.S. Ward. Oxford: Clarendon Press, 1999, pp. viii + 136, £25.00.

INTERACTIVE STATISTICS. M. Aliaga and B. Gunderson. Upper Saddle River, New Jersey: Prentice Hall, 1999, pp. xii + 676.

INTRODUCTION TO MATRIX ANALYTIC METHODS IN STOCHASTIC MODELING. G. Latouche and V. Ramaswami. Philadelphia: Society for Industrial and Applied Mathematics; Alexandria, Virginia: American Statistical Association, 1999, pp. xiv + 334, US$49.50.

ITEM NONRESPONSE: OCCURRENCE, CAUSES, AND IMPUTATION OF MISSING ANSWERS TO TEST ITEMS. M. Huisman. Netherlands: DSWO Press, 1999, pp. 185.

ITERATIVE METHODS FOR OPTIMIZATION. C.T. Kelley. Philadelphia: Society for Industrial and Applied Mathematics, 1999, pp. xv + 180.

LATENT CLASS SCALING ANALYSIS. C.M. Dayton. Thousand Oaks, California: Sage Publications, 1998, pp. vii + 95, £8.99 Paper.

MATHEMATICA® NAVIGATOR. Graphics and Methods of Applied Mathematics. H. Ruskeepaa. San Diego, California: Academic Press, 1999, pp. xx + 848 + CD, US$44.95.

MATHEMATICAL MODELING IN THE ENVIRONMENT. C.R. Hadlock. Washington, D.C.: The Mathematical Association of America, 1998, pp. xiv + 302 + disk, US$52.95.

METRIC NUMBER THEORY. G. Harman. Oxford: Clarendon Press, 1998, pp. xviii + 297, US$125.00.

MOBILE ROBOTS. Inspiration to Implementation, 2nd edition. J.L. Jones, B.A. Seiger and A.M. Flynn. Natick, Massachusetts: A K Peters, 1999, pp. xxii + 457, US$32.00.

MONTE CARLO METHODS IN STATISTICAL PHYSICS. M.E.J. Newman and G.T. Barkema. Oxford: Clarendon Press, 1999, pp. xiv + 475.

MULTIPLE REGRESSION: A Primer. P.D. Allison. Thousand Oaks, California: Pine Forge Press, 1999, pp. xviii + 202, £11.99.

NEURAL NETWORKS. H. Abdi, D. Valentin and B. Edelman. Thousand Oaks, California: Sage Publications, 1999, pp. iv + 89, £8.99.

NURBS. From Projective Geometry to Practical Use, 2nd edition. G.E. Farin. Natick, Massachusetts: A K Peters, 1999, pp. xv + 267, US$44.00/£30.00.

PRACTICAL STATISTICS BY EXAMPLE USING MICROSOFT® EXCEL. T. Sincich, D.M. Levine and D. Stephan. Upper Saddle River, New Jersey: Prentice-Hall, 1999, pp. xvii + 846 + CD.

RANDOM FIELDS AND STOCHASTIC PARTIAL DIFFERENTIAL EQUATIONS. Y.A. Rozanov. Dordrecht, Netherlands: Kluwer, 1998, pp. vii + 229, Dfl.195.00/US$105.00/£67.00.

RELATING STATISTICS AND EXPERIMENTAL DESIGN. An Introduction. I.P. Lewin. Thousand Oaks, California: Sage Publications, 1999, pp. vi + 90, £8.99.

RUSSIAN PAPERS ON THE HISTORY OF PROBABILITY AND STATISTICS. O. Sheynin. Egelsbach, Germany: Verlag Der Deutschen Hochschulschriften, 1999, 3 Fiches, DM98.00.

SEEDS. Ecology, Biogeography, and Evolution of Dormancy and Germination. C.C. Baskin and J.M. Baskin. San Diego, California: Academic Press, 1998, pp. xiv + 666, US$99.95.

STATISTICAL DISTRIBUTIONS IN ENGINEERING. K. Bury. Cambridge University Press, 1999, pp. x + 362, £55.00/US$80.00 Cloth; £20.95/US$34.95 Paper.

STATISTICS. An Introduction, 5th edition. R.D. Mason, D.A. Lind and W.G. Marchal. Pacific Grove, California: Duxbury Press, 1998, pp. xxii + 689 + disk.

SYMMETRY AS A DEVELOPMENT PRINCIPLE IN NATURE AND ART. W. Hahn. Singapore: World Scientific, 1998, pp. xxi + 510, US$95.00.

THE MATHEMATICS OF CIPHERS. Number Theory and RSA Cryptography. S.C. Coutinho. Natick, Massachusetts: A K Peters, 1999, pp. xv + 196, US$30.00.

THE PRACTICE OF MACHINE DESIGN. Y. Hatamura. Translated by Y. Yamamoto. Oxford: Clarendon Press, 1999, pp. xiv + 336, £70.00.

THE SHAPE OF THE RIVER. Long-Term Consequences of Considering Race in College and University Admissions. W.G. Bowen and D. Bok. Princeton University Press, 1998, pp. xxxvi + 472, £19.95.

TOPICS IN INTERSECTION GRAPH THEORY. T.A. McKee and F.R. McMorris. Philadelphia: Society for Industrial and Applied Mathematics, 1999, pp. viii + 205, US$55.00.

USING SPSS FOR WINDOWS. Data Analysis and Graphics. K.E. Voelkl and S.B. Gerber. New York: Springer-Verlag, 1999, pp. xvi + 228, US$34.95/£26.50/DM69.00.

WAVELETS. J. Bergh, F. Ekstedt and M. Lindberg. Lund, Sweden: Studentlitteratur, 1999, pp. vii + 210, SEK355.00.

WAVES BY FINITE ANALYSIS. G. Backstrom. Lund, Sweden: Studentlitteratur, 1999, pp. 186.

YOUR STATISTICAL CONSULTANT. Answers To Your Data Analysis Questions. R.R. Newton and K.E. Rudestam. Thousand Oaks, California: Sage Publications, 1999, pp. xxii + 313, £33.00 (Cloth); £17.99 (Paper).