Bayesian data analysis

Bibliographic Details
Author / Creator: Gelman, Andrew, author.
Edition: Third edition.
Imprint: Boca Raton : CRC Press, 2013.
Description: xiv, 667 pages : illustrations ; 27 cm.
Language: English
Series: Chapman & Hall/CRC texts in statistical science
Format: Print Book
URL for this record: http://pi.lib.uchicago.edu/1001/cat/bib/10388148
Varying Form of Title: BDA3
ISBN: 9781439840955 (hardback); 1439840954 (hardback); 9781439840962; 1439840962
Notes: Includes bibliographical references (pages 607-639) and indexes.
Summary:"Preface This book is intended to have three roles and to serve three associated audiences: an introductory text on Bayesian inference starting from first principles, a graduate text on effective current approaches to Bayesian modeling and computation in statistics and related fields, and a handbook of Bayesian methods in applied statistics for general users of and researchers in applied statistics. Although introductory in its early sections, the book is definitely not elementary in the sense of a first text in statistics. The mathematics used in our book is basic probability and statistics, elementary calculus, and linear algebra. A review of probability notation is given in Chapter 1 along with a more detailed list of topics assumed to have been studied. The practical orientation of the book means that the reader's previous experience in probability, statistics, and linear algebra should ideally have included strong computational components. To write an introductory text alone would leave many readers with only a taste of the conceptual elements but no guidance for venturing into genuine practical applications, beyond those where Bayesian methods agree essentially with standard non-Bayesian analyses. On the other hand, we feel it would be a mistake to present the advanced methods without first introducing the basic concepts from our data-analytic perspective. Furthermore, due to the nature of applied statistics, a text on current Bayesian methodology would be incomplete without a variety of worked examples drawn from real applications. To avoid cluttering the main narrative, there are bibliographic notes at the end of each chapter and references at the end of the book"--
Table of Contents:
  • Preface
  • Part I. Fundamentals of Bayesian Inference
  • 1. Probability and inference
  • 1.1. The three steps of Bayesian data analysis
  • 1.2. General notation for statistical inference
  • 1.3. Bayesian inference
  • 1.4. Discrete probability examples: genetics and spell checking
  • 1.5. Probability as a measure of uncertainty
  • 1.6. Example of probability assignment: football point spreads
  • 1.7. Example: estimating the accuracy of record linkage
  • 1.8. Some useful results from probability theory
  • 1.9. Computation and software
  • 1.10. Bayesian inference in applied statistics
  • 1.11. Bibliographic note
  • 1.12. Exercises
  • 2. Single-parameter models
  • 2.1. Estimating a probability from binomial data
  • 2.2. Posterior as compromise between data and prior information
  • 2.3. Summarizing posterior inference
  • 2.4. Informative prior distributions
  • 2.5. Estimating a normal mean with known variance
  • 2.6. Other standard single-parameter models
  • 2.7. Example: informative prior distribution for cancer rates
  • 2.8. Noninformative prior distributions
  • 2.9. Weakly informative prior distributions
  • 2.10. Bibliographic note
  • 2.11. Exercises
  • 3. Introduction to multiparameter models
  • 3.1. Averaging over 'nuisance parameters'
  • 3.2. Normal data with a noninformative prior distribution
  • 3.3. Normal data with a conjugate prior distribution
  • 3.4. Multinomial model for categorical data
  • 3.5. Multivariate normal model with known variance
  • 3.6. Multivariate normal with unknown mean and variance
  • 3.7. Example: analysis of a bioassay experiment
  • 3.8. Summary of elementary modeling and computation
  • 3.9. Bibliographic note
  • 3.10. Exercises
  • 4. Asymptotics and connections to non-Bayesian approaches
  • 4.1. Normal approximations to the posterior distribution
  • 4.2. Large-sample theory
  • 4.3. Counterexamples to the theorems
  • 4.4. Frequency evaluations of Bayesian inferences
  • 4.5. Bayesian interpretations of other statistical methods
  • 4.6. Bibliographic note
  • 4.7. Exercises
  • 5. Hierarchical models
  • 5.1. Constructing a parameterized prior distribution
  • 5.2. Exchangeability and setting up hierarchical models
  • 5.3. Fully Bayesian analysis of conjugate hierarchical models
  • 5.4. Estimating exchangeable parameters from a normal model
  • 5.5. Example: parallel experiments in eight schools
  • 5.6. Hierarchical modeling applied to a meta-analysis
  • 5.7. Weakly informative priors for hierarchical variance parameters
  • 5.8. Bibliographic note
  • 5.9. Exercises
  • Part II. Fundamentals of Bayesian Data Analysis
  • 6. Model checking
  • 6.1. The place of model checking in applied Bayesian statistics
  • 6.2. Do the inferences from the model make sense?
  • 6.3. Posterior predictive checking
  • 6.4. Graphical posterior predictive checks
  • 6.5. Model checking for the educational testing example
  • 6.6. Bibliographic note
  • 6.7. Exercises
  • 7. Evaluating, comparing, and expanding models
  • 7.1. Measures of predictive accuracy
  • 7.2. Information criteria and cross-validation
  • 7.3. Model comparison based on predictive performance
  • 7.4. Model comparison using Bayes factors
  • 7.5. Continuous model expansion
  • 7.6. Implicit assumptions and model expansion: an example
  • 7.7. Bibliographic note
  • 7.8. Exercises
  • 8. Modeling accounting for data collection
  • 8.1. Bayesian inference requires a model for data collection
  • 8.2. Data-collection models and ignorability
  • 8.3. Sample surveys
  • 8.4. Designed experiments
  • 8.5. Sensitivity and the role of randomization
  • 8.6. Observational studies
  • 8.7. Censoring and truncation
  • 8.8. Discussion
  • 8.9. Bibliographic note
  • 8.10. Exercises
  • 9. Decision analysis
  • 9.1. Bayesian decision theory in different contexts
  • 9.2. Using regression predictions: incentives for telephone surveys
  • 9.3. Multistage decision making: medical screening
  • 9.4. Hierarchical decision analysis for radon measurement
  • 9.5. Personal vs. institutional decision analysis
  • 9.6. Bibliographic note
  • 9.7. Exercises
  • Part III. Advanced Computation
  • 10. Introduction to Bayesian computation
  • 10.1. Numerical integration
  • 10.2. Distributional approximations
  • 10.3. Direct simulation and rejection sampling
  • 10.4. Importance sampling
  • 10.5. How many simulation draws are needed?
  • 10.6. Computing environments
  • 10.7. Debugging Bayesian computing
  • 10.8. Bibliographic note
  • 10.9. Exercises
  • 11. Basics of Markov chain simulation
  • 11.1. Gibbs sampler
  • 11.2. Metropolis and Metropolis-Hastings algorithms
  • 11.3. Using Gibbs and Metropolis as building blocks
  • 11.4. Inference and assessing convergence
  • 11.5. Effective number of simulation draws
  • 11.6. Example: hierarchical normal model
  • 11.7. Bibliographic note
  • 11.8. Exercises
  • 12. Computationally efficient Markov chain simulation
  • 12.1. Efficient Gibbs samplers
  • 12.2. Efficient Metropolis jumping rules
  • 12.3. Further extensions to Gibbs and Metropolis
  • 12.4. Hamiltonian Monte Carlo
  • 12.5. Hamiltonian dynamics for a simple hierarchical model
  • 12.6. Stan: developing a computing environment
  • 12.7. Bibliographic note
  • 12.8. Exercises
  • 13. Modal and distributional approximations
  • 13.1. Finding posterior modes
  • 13.2. Boundary-avoiding priors for modal summaries
  • 13.3. Normal and related mixture approximations
  • 13.4. Finding marginal posterior modes using EM
  • 13.5. Approximating conditional and marginal posterior densities
  • 13.6. Example: hierarchical normal model (continued)
  • 13.7. Variational inference
  • 13.8. Expectation propagation
  • 13.9. Other approximations
  • 13.10. Unknown normalizing factors
  • 13.11. Bibliographic note
  • 13.12. Exercises
  • Part IV. Regression Models
  • 14. Introduction to regression models
  • 14.1. Conditional modeling
  • 14.2. Bayesian analysis of the classical regression model
  • 14.3. Regression for causal inference: incumbency in congressional elections
  • 14.4. Goals of regression analysis
  • 14.5. Assembling the matrix of explanatory variables
  • 14.6. Regularization and dimension reduction for multiple predictors
  • 14.7. Unequal variances and correlations
  • 14.8. Including numerical prior information
  • 14.9. Bibliographic note
  • 14.10. Exercises
  • 15. Hierarchical linear models
  • 15.1. Regression coefficients exchangeable in batches
  • 15.2. Example: forecasting U.S. presidential elections
  • 15.3. Interpreting a normal prior distribution as additional data
  • 15.4. Varying intercepts and slopes
  • 15.5. Computation: batching and transformation
  • 15.6. Analysis of variance and the batching of coefficients
  • 15.7. Hierarchical models for batches of variance components
  • 15.8. Bibliographic note
  • 15.9. Exercises
  • 16. Generalized linear models
  • 16.1. Standard generalized linear model likelihoods
  • 16.2. Working with generalized linear models
  • 16.3. Weakly informative priors for logistic regression
  • 16.4. Example: hierarchical Poisson regression for police stops
  • 16.5. Example: hierarchical logistic regression for political opinions
  • 16.6. Models for multivariate and multinomial responses
  • 16.7. Loglinear models for multivariate discrete data
  • 16.8. Bibliographic note
  • 16.9. Exercises
  • 17. Models for robust inference
  • 17.1. Aspects of robustness
  • 17.2. Overdispersed versions of standard probability models
  • 17.3. Posterior inference and computation
  • 17.4. Robust inference and sensitivity analysis for the eight schools
  • 17.5. Robust regression using t-distributed errors
  • 17.6. Bibliographic note
  • 17.7. Exercises
  • 18. Models for missing data
  • 18.1. Notation
  • 18.2. Multiple imputation
  • 18.3. Missing data in the multivariate normal and t models
  • 18.4. Example: multiple imputation for a series of polls
  • 18.5. Missing values with counted data
  • 18.6. Example: an opinion poll in Slovenia
  • 18.7. Bibliographic note
  • 18.8. Exercises
  • Part V. Nonlinear and Nonparametric Models
  • 19. Parametric nonlinear models
  • 19.1. Example: serial dilution assay
  • 19.2. Example: population toxicokinetics
  • 19.3. Bibliographic note
  • 19.4. Exercises
  • 20. Basis function models
  • 20.1. Splines and weighted sums of basis functions
  • 20.2. Basis selection and shrinkage of coefficients
  • 20.3. Non-normal models and multivariate regression surfaces
  • 20.4. Bibliographic note
  • 20.5. Exercises
  • 21. Gaussian process models
  • 21.1. Gaussian process regression
  • 21.2. Example: birthdays and birthdates
  • 21.3. Latent Gaussian process models
  • 21.4. Functional data analysis
  • 21.5. Density estimation and regression
  • 21.6. Bibliographic note
  • 21.7. Exercises
  • 22. Finite mixture models
  • 22.1. Setting up and interpreting mixture models
  • 22.2. Example: reaction times and schizophrenia
  • 22.3. Label switching and posterior computation
  • 22.4. Unspecified number of mixture components
  • 22.5. Mixture models for classification and regression
  • 22.6. Bibliographic note
  • 22.7. Exercises
  • 23. Dirichlet process models
  • 23.1. Bayesian histograms
  • 23.2. Dirichlet process prior distributions
  • 23.3. Dirichlet process mixtures
  • 23.4. Beyond density estimation
  • 23.5. Hierarchical dependence
  • 23.6. Density regression
  • 23.7. Bibliographic note
  • 23.8. Exercises
  • A. Standard probability distributions
  • A.1. Continuous distributions
  • A.2. Discrete distributions
  • A.3. Bibliographic note
  • B. Outline of proofs of limit theorems
  • B.1. Bibliographic note
  • C. Computation in R and Stan
  • C.1. Getting started with R and Stan
  • C.2. Fitting a hierarchical model in Stan
  • C.3. Direct simulation, Gibbs, and Metropolis in R
  • C.4. Programming Hamiltonian Monte Carlo in R
  • C.5. Further comments on computation
  • C.6. Bibliographic note
  • References
  • Author Index
  • Subject Index
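
A note for readers of this record: Appendix C.2 ("Fitting a hierarchical model in Stan") works through the eight-schools example of Section 5.5 in R and Stan, the computing environment used throughout the book. What follows is a minimal illustrative sketch of that kind of fit, not the book's own code; it assumes the rstan R package is installed and uses the eight-schools estimates and standard errors reported in Section 5.5.

library(rstan)

# Stan program for the eight-schools hierarchical normal model,
# written in a non-centered parameterization (theta = mu + tau * eta).
schools_code <- "
data {
  int<lower=0> J;            // number of schools
  vector[J] y;               // estimated treatment effects
  vector<lower=0>[J] sigma;  // standard errors of the effects
}
parameters {
  real mu;                   // population mean effect
  real<lower=0> tau;         // between-school standard deviation
  vector[J] eta;             // standardized school-level effects
}
transformed parameters {
  vector[J] theta = mu + tau * eta;  // school effects
}
model {
  // mu and tau are given implicit uniform priors, matching the
  // flat hyperprior used in the book's analysis of this example
  eta ~ normal(0, 1);
  y ~ normal(theta, sigma);
}
"

# Eight-schools data (effects and standard errors from Section 5.5)
schools_data <- list(
  J = 8,
  y = c(28, 8, -3, 7, -1, 1, 18, 12),
  sigma = c(15, 10, 16, 11, 9, 11, 10, 4)
)

fit <- stan(model_code = schools_code, data = schools_data,
            chains = 4, iter = 2000)
print(fit)  # posterior summaries for mu, tau, and each theta[j]

The non-centered parameterization is a standard device for sampling hierarchical models of this kind: drawing the standardized effects eta and scaling by tau avoids the funnel-shaped posterior geometry that arises when theta is parameterized directly.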