Bayesian data analysis

Bibliographic Details
Imprint: London ; New York : Chapman & Hall, 1995.
Description: xix, 526 p. : ill. ; 24 cm.
Language: English
Series: Chapman & Hall texts in statistical science series (Texts in statistical science)
Format: Print Book
URL for this record: http://pi.lib.uchicago.edu/1001/cat/bib/2327440
Other authors / contributors: Gelman, Andrew.
ISBN: 0412039915
Notes: Includes bibliographical references and index.
Table of Contents:
  • List of models
  • List of examples
  • Preface
  • Part I. Fundamentals of Bayesian Inference
  • 1. Background
  • 1.1. Overview
  • 1.2. General notation for statistical inference
  • 1.3. Bayesian inference
  • 1.4. Example: inference about a genetic probability
  • 1.5. Probability as a measure of uncertainty
  • 1.6. Example of probability assignment: football point spreads
  • 1.7. Example of probability assignment: estimating the accuracy of record linkage
  • 1.8. Some useful results from probability theory
  • 1.9. Summarizing inferences by simulation
  • 1.10. Computation and software
  • 1.11. Bibliographic note
  • 1.12. Exercises
  • 2. Single-parameter models
  • 2.1. Estimating a probability from binomial data
  • 2.2. Posterior distribution as compromise between data and prior information
  • 2.3. Summarizing posterior inference
  • 2.4. Informative prior distributions
  • 2.5. Example: estimating the probability of a female birth given placenta previa
  • 2.6. Estimating the mean of a normal distribution with known variance
  • 2.7. Other standard single-parameter models
  • 2.8. Example: informative prior distribution and multilevel structure for estimating cancer rates
  • 2.9. Noninformative prior distributions
  • 2.10. Bibliographic note
  • 2.11. Exercises
  • 3. Introduction to multiparameter models
  • 3.1. Averaging over 'nuisance parameters'
  • 3.2. Normal data with a noninformative prior distribution
  • 3.3. Normal data with a conjugate prior distribution
  • 3.4. Normal data with a semi-conjugate prior distribution
  • 3.5. The multinomial model
  • 3.6. The multivariate normal model
  • 3.7. Example: analysis of a bioassay experiment
  • 3.8. Summary of elementary modeling and computation
  • 3.9. Bibliographic note
  • 3.10. Exercises
  • 4. Large-sample inference and frequency properties of Bayesian inference
  • 4.1. Normal approximations to the posterior distribution
  • 4.2. Large-sample theory
  • 4.3. Counterexamples to the theorems
  • 4.4. Frequency evaluations of Bayesian inferences
  • 4.5. Bibliographic note
  • 4.6. Exercises
  • Part II. Fundamentals of Bayesian Data Analysis
  • 5. Hierarchical models
  • 5.1. Constructing a parameterized prior distribution
  • 5.2. Exchangeability and setting up hierarchical models
  • 5.3. Computation with hierarchical models
  • 5.4. Estimating an exchangeable set of parameters from a normal model
  • 5.5. Example: combining information from educational testing experiments in eight schools
  • 5.6. Hierarchical modeling applied to a meta-analysis
  • 5.7. Bibliographic note
  • 5.8. Exercises
  • 6. Model checking and improvement
  • 6.1. The place of model checking in applied Bayesian statistics
  • 6.2. Do the inferences from the model make sense?
  • 6.3. Is the model consistent with data? Posterior predictive checking
  • 6.4. Graphical posterior predictive checks
  • 6.5. Numerical posterior predictive checks
  • 6.6. Model expansion
  • 6.7. Model comparison
  • 6.8. Model checking for the educational testing example
  • 6.9. Bibliographic note
  • 6.10. Exercises
  • 7. Modeling accounting for data collection
  • 7.1. Introduction
  • 7.2. Formal models for data collection
  • 7.3. Ignorability
  • 7.4. Sample surveys
  • 7.5. Designed experiments
  • 7.6. Sensitivity and the role of randomization
  • 7.7. Observational studies
  • 7.8. Censoring and truncation
  • 7.9. Discussion
  • 7.10. Bibliographic note
  • 7.11. Exercises
  • 8. Connections and challenges
  • 8.1. Bayesian interpretations of other statistical methods
  • 8.2. Challenges in Bayesian data analysis
  • 8.3. Bibliographic note
  • 8.4. Exercises
  • 9. General advice
  • 9.1. Setting up probability models
  • 9.2. Posterior inference
  • 9.3. Model evaluation
  • 9.4. Summary
  • 9.5. Bibliographic note
  • Part III. Advanced Computation
  • 10. Overview of computation
  • 10.1. Crude estimation by ignoring some information
  • 10.2. Use of posterior simulations in Bayesian data analysis
  • 10.3. Practical issues
  • 10.4. Exercises
  • 11. Posterior simulation
  • 11.1. Direct simulation
  • 11.2. Markov chain simulation
  • 11.3. The Gibbs sampler
  • 11.4. The Metropolis and Metropolis-Hastings algorithms
  • 11.5. Building Markov chain algorithms using the Gibbs sampler and Metropolis algorithm
  • 11.6. Inference and assessing convergence
  • 11.7. Example: the hierarchical normal model
  • 11.8. Efficient Gibbs samplers
  • 11.9. Efficient Metropolis jumping rules
  • 11.10. Recommended strategy for posterior simulation
  • 11.11. Bibliographic note
  • 11.12. Exercises
  • 12. Approximations based on posterior modes
  • 12.1. Finding posterior modes
  • 12.2. The normal and related mixture approximations
  • 12.3. Finding marginal posterior modes using EM and related algorithms
  • 12.4. Approximating conditional and marginal posterior densities
  • 12.5. Example: the hierarchical normal model (continued)
  • 12.6. Bibliographic note
  • 12.7. Exercises
  • 13. Special topics in computation
  • 13.1. Advanced techniques for Markov chain simulation
  • 13.2. Numerical integration
  • 13.3. Importance sampling
  • 13.4. Computing normalizing factors
  • 13.5. Bibliographic note
  • 13.6. Exercises
  • Part IV. Regression Models
  • 14. Introduction to regression models
  • 14.1. Introduction and notation
  • 14.2. Bayesian analysis of the classical regression model
  • 14.3. Example: estimating the advantage of incumbency in U.S. Congressional elections
  • 14.4. Goals of regression analysis
  • 14.5. Assembling the matrix of explanatory variables
  • 14.6. Unequal variances and correlations
  • 14.7. Models for unequal variances
  • 14.8. Including prior information
  • 14.9. Bibliographic note
  • 14.10. Exercises
  • 15. Hierarchical linear models
  • 15.1. Regression coefficients exchangeable in batches
  • 15.2. Example: forecasting U.S. Presidential elections
  • 15.3. General notation for hierarchical linear models
  • 15.4. Computation
  • 15.5. Hierarchical modeling as an alternative to selecting predictors
  • 15.6. Analysis of variance
  • 15.7. Bibliographic note
  • 15.8. Exercises
  • 16. Generalized linear models
  • 16.1. Introduction
  • 16.2. Standard generalized linear model likelihoods
  • 16.3. Setting up and interpreting generalized linear models
  • 16.4. Computation
  • 16.5. Example: hierarchical Poisson regression for police stops
  • 16.6. Example: hierarchical logistic regression for political opinions
  • 16.7. Models for multinomial responses
  • 16.8. Loglinear models for multivariate discrete data
  • 16.9. Bibliographic note
  • 16.10. Exercises
  • 17. Models for robust inference
  • 17.1. Introduction
  • 17.2. Overdispersed versions of standard probability models
  • 17.3. Posterior inference and computation
  • 17.4. Robust inference and sensitivity analysis for the educational testing example
  • 17.5. Robust regression using Student-t errors
  • 17.6. Bibliographic note
  • 17.7. Exercises
  • 18. Mixture models
  • 18.1. Introduction
  • 18.2. Setting up mixture models
  • 18.3. Computation
  • 18.4. Example: reaction times and schizophrenia
  • 18.5. Bibliographic note
  • 19. Multivariate models
  • 19.1. Linear regression with multiple outcomes
  • 19.2. Prior distributions for covariance matrices
  • 19.3. Hierarchical multivariate models
  • 19.4. Multivariate models for nonnormal data
  • 19.5. Time series and spatial models
  • 19.6. Bibliographic note
  • 19.7. Exercises
  • 20. Nonlinear models
  • 20.1. Introduction
  • 20.2. Example: serial dilution assay
  • 20.3. Example: population toxicokinetics
  • 20.4. Bibliographic note
  • 20.5. Exercises
  • 21. Models for missing data
  • 21.1. Notation
  • 21.2. Multiple imputation
  • 21.3. Missing data in the multivariate normal and t models
  • 21.4. Example: multiple imputation for a series of polls
  • 21.5. Missing values with counted data
  • 21.6. Example: an opinion poll in Slovenia
  • 21.7. Bibliographic note
  • 21.8. Exercises
  • 22. Decision analysis
  • 22.1. Bayesian decision theory in different contexts
  • 22.2. Using regression predictions: incentives for telephone surveys
  • 22.3. Multistage decision making: medical screening
  • 22.4. Decision analysis using a hierarchical model: home radon measurement and remediation
  • 22.5. Personal vs. institutional decision analysis
  • 22.6. Bibliographic note
  • 22.7. Exercises
  • Appendixes
  • A. Standard probability distributions
  • A.1. Introduction
  • A.2. Continuous distributions
  • A.3. Discrete distributions
  • A.4. Bibliographic note
  • B. Outline of proofs of asymptotic theorems
  • B.1. Bibliographic note
  • C. Example of computation in R and Bugs
  • C.1. Getting started with R and Bugs
  • C.2. Fitting a hierarchical model in Bugs
  • C.3. Options in the Bugs implementation
  • C.4. Fitting a hierarchical model in R
  • C.5. Further comments on computation
  • C.6. Bibliographic note
  • References
  • Author index
  • Subject index