Time series analysis by state space methods
| Field | Value |
| --- | --- |
| Author / Creator | Durbin, James. |
| Imprint | Oxford ; New York : Oxford University Press, 2001. |
| Description | xvii, 253 p. : ill. ; 25 cm. |
| Language | English |
| Series | Oxford statistical science series ; 24 |
| Format | Print Book |
| URL for this record | http://pi.lib.uchicago.edu/1001/cat/bib/4505883 |
Table of Contents:
- 1. Introduction
- 1.1. Basic ideas of state space analysis
- 1.2. Linear Gaussian model
- 1.3. Non-Gaussian and nonlinear models
- 1.4. Prior knowledge
- 1.5. Notation
- 1.6. Other books on state space methods
- 1.7. Website for the book
- I. The Linear Gaussian State Space Model
- 2. Local level model
- 2.1. Introduction
- 2.2. Filtering
- 2.2.1. The Kalman filter (a minimal sketch follows the contents)
- 2.2.2. Illustration
- 2.3. Forecast errors
- 2.3.1. Cholesky decomposition
- 2.3.2. Error recursions
- 2.4. State smoothing (see the smoother sketch after the contents)
- 2.4.1. Smoothed state
- 2.4.2. Smoothed state variance
- 2.4.3. Illustration
- 2.5. Disturbance smoothing
- 2.5.1. Smoothed observation disturbances
- 2.5.2. Smoothed state disturbances
- 2.5.3. Illustration
- 2.5.4. Cholesky decomposition and smoothing
- 2.6. Simulation
- 2.6.1. Illustration
- 2.7. Missing observations
- 2.7.1. Illustration
- 2.8. Forecasting
- 2.8.1. Illustration
- 2.9. Initialisation
- 2.10. Parameter estimation (see the likelihood sketch after the contents)
- 2.10.1. Loglikelihood evaluation
- 2.10.2. Concentration of loglikelihood
- 2.10.3. Illustration
- 2.11. Steady state
- 2.12. Diagnostic checking
- 2.12.1. Diagnostic tests for forecast errors
- 2.12.2. Detection of outliers and structural breaks
- 2.12.3. Illustration
- 2.13. Appendix: Lemma in multivariate normal regression
- 3. Linear Gaussian state space models
- 3.1. Introduction
- 3.2. Structural time series models
- 3.2.1. Univariate models
- 3.2.2. Multivariate models
- 3.2.3. STAMP
- 3.3. ARMA models and ARIMA models
- 3.4. Exponential smoothing
- 3.5. State space versus Box-Jenkins approaches
- 3.6. Regression with time-varying coefficients
- 3.7. Regression with ARMA errors
- 3.8. Benchmarking
- 3.9. Simultaneous modelling of series from different sources
- 3.10. State space models in continuous time
- 3.10.1. Local level model
- 3.10.2. Local linear trend model
- 3.11. Spline smoothing
- 3.11.1. Spline smoothing in discrete time
- 3.11.2. Spline smoothing in continuous time
- 4. Filtering, smoothing and forecasting
- 4.1. Introduction
- 4.2. Filtering
- 4.2.1. Derivation of Kalman filter
- 4.2.2. Kalman filter recursion
- 4.2.3. Steady state
- 4.2.4. State estimation errors and forecast errors
- 4.3. State smoothing
- 4.3.1. Smoothed state vector
- 4.3.2. Smoothed state variance matrix
- 4.3.3. State smoothing recursion
- 4.4. Disturbance smoothing
- 4.4.1. Smoothed disturbances
- 4.4.2. Fast state smoothing
- 4.4.3. Smoothed disturbance variance matrices
- 4.4.4. Disturbance smoothing recursion
- 4.5. Covariance matrices of smoothed estimators
- 4.6. Weight functions
- 4.6.1. Introduction
- 4.6.2. Filtering weights
- 4.6.3. Smoothing weights
- 4.7. Simulation smoothing
- 4.7.1. Simulating observation disturbances
- 4.7.2. Derivation of simulation smoother for observation disturbances
- 4.7.3. Simulation smoothing recursion
- 4.7.4. Simulating state disturbances
- 4.7.5. Simulating state vectors
- 4.7.6. Simulating multiple samples
- 4.8. Missing observations
- 4.9. Forecasting
- 4.10. Dimensionality of observational vector
- 4.11. General matrix form for filtering and smoothing
- 5. Initialisation of filter and smoother
- 5.1. Introduction
- 5.2. The exact initial Kalman filter
- 5.2.1. The basic recursions
- 5.2.2. Transition to the usual Kalman filter
- 5.2.3. A convenient representation
- 5.3. Exact initial state smoothing
- 5.3.1. Smoothed mean of state vector
- 5.3.2. Smoothed variance of state vector
- 5.4. Exact initial disturbance smoothing
- 5.5. Exact initial simulation smoothing
- 5.6. Examples of initial conditions for some models
- 5.6.1. Structural time series models
- 5.6.2. Stationary ARMA models
- 5.6.3. Nonstationary ARIMA models
- 5.6.4. Regression model with ARMA errors
- 5.6.5. Spline smoothing
- 5.7. Augmented Kalman filter and smoother
- 5.7.1. Introduction
- 5.7.2. Augmented Kalman filter
- 5.7.3. Filtering based on the augmented Kalman filter
- 5.7.4. Illustration: the local linear trend model
- 5.7.5. Comparisons of computational efficiency
- 5.7.6. Smoothing based on the augmented Kalman filter
- 6. Further computational aspects
- 6.1. Introduction
- 6.2. Regression estimation
- 6.2.1. Introduction
- 6.2.2. Inclusion of coefficient vector in state vector
- 6.2.3. Regression estimation by augmentation
- 6.2.4. Least squares and recursive residuals
- 6.3. Square root filter and smoother
- 6.3.1. Introduction
- 6.3.2. Square root form of variance updating
- 6.3.3. Givens rotations
- 6.3.4. Square root smoothing
- 6.3.5. Square root filtering and initialisation
- 6.3.6. Illustration: local linear trend model
- 6.4. Univariate treatment of multivariate series
- 6.4.1. Introduction
- 6.4.2. Details of univariate treatment
- 6.4.3. Correlation between observation equations
- 6.4.4. Computational efficiency
- 6.4.5. Illustration: vector splines
- 6.5. Filtering and smoothing under linear restrictions
- 6.6. The algorithms of SsfPack
- 6.6.1. Introduction
- 6.6.2. The SsfPack function
- 6.6.3. Illustration: spline smoothing
- 7. Maximum likelihood estimation
- 7.1. Introduction
- 7.2. Likelihood evaluation
- 7.2.1. Loglikelihood when initial conditions are known
- 7.2.2. Diffuse loglikelihood
- 7.2.3. Diffuse loglikelihood evaluated via augmented Kalman filter
- 7.2.4. Likelihood when elements of initial state vector are fixed but unknown
- 7.3. Parameter estimation
- 7.3.1. Introduction
- 7.3.2. Numerical maximisation algorithms
- 7.3.3. The score vector
- 7.3.4. The EM algorithm
- 7.3.5. Parameter estimation when dealing with diffuse initial conditions
- 7.3.6. Large sample distribution of maximum likelihood estimates
- 7.3.7. Effect of errors in parameter estimation
- 7.4. Goodness of fit
- 7.5. Diagnostic checking
- 8. Bayesian analysis
- 8.1. Introduction
- 8.2. Posterior analysis of state vector
- 8.2.1. Posterior analysis conditional on parameter vector
- 8.2.2. Posterior analysis when parameter vector is unknown
- 8.2.3. Non-informative priors
- 8.3. Markov chain Monte Carlo methods
- 9. Illustrations of the use of the linear Gaussian model
- 9.1. Introduction
- 9.2. Structural time series models
- 9.3. Bivariate structural time series analysis
- 9.4. Box-Jenkins analysis
- 9.5. Spline smoothing
- 9.6. Approximate methods for modelling volatility
- II. Non-Gaussian and Nonlinear State Space Models
- 10. Non-Gaussian and nonlinear state space models
- 10.1. Introduction
- 10.2. The general non-Gaussian model
- 10.3. Exponential family models
- 10.3.1. Poisson density
- 10.3.2. Binary density
- 10.3.3. Binomial density
- 10.3.4. Negative binomial density
- 10.3.5. Multinomial density
- 10.4. Heavy-tailed distributions
- 10.4.1. t-distribution
- 10.4.2. Mixture of normals
- 10.4.3. General error distribution
- 10.5. Nonlinear models
- 10.6. Financial models
- 10.6.1. Stochastic volatility models
- 10.6.2. General autoregressive conditional heteroscedasticity
- 10.6.3. Durations: exponential distribution
- 10.6.4. Trade frequencies: Poisson distribution
- 11. Importance sampling
- 11.1. Introduction
- 11.2. Basic ideas of importance sampling
- 11.3. Linear Gaussian approximating models
- 11.4. Linearisation based on first two derivatives
- 11.4.1. Exponential family models
- 11.4.2. Stochastic volatility model
- 11.5. Linearisation based on the first derivative
- 11.5.1. t-distribution
- 11.5.2. Mixture of normals
- 11.5.3. General error distribution
- 11.6. Linearisation for non-Gaussian state components
- 11.6.1. t-distribution for state errors
- 11.7. Linearisation for nonlinear models
- 11.7.1. Multiplicative models
- 11.8. Estimating the conditional mode
- 11.9. Computational aspects of importance sampling
- 11.9.1. Introduction
- 11.9.2. Practical implementation of importance sampling
- 11.9.3. Antithetic variables
- 11.9.4. Diffuse initialisation
- 11.9.5. Treatment of t-distribution without importance sampling
- 11.9.6. Treatment of Gaussian mixture distributions without importance sampling
- 12. Analysis from a classical standpoint
- 12.1. Introduction
- 12.2. Estimating conditional means and variances
- 12.3. Estimating conditional densities and distribution functions
- 12.4. Forecasting and estimating with missing observations
- 12.5. Parameter estimation
- 12.5.1. Introduction
- 12.5.2. Estimation of likelihood
- 12.5.3. Maximisation of loglikelihood
- 12.5.4. Variance matrix of maximum likelihood estimate
- 12.5.5. Effect of errors in parameter estimation
- 12.5.6. Mean square error matrix due to simulation
- 12.5.7. Estimation when the state disturbances are Gaussian
- 12.5.8. Control variables
- 13. Analysis from a Bayesian standpoint
- 13.1. Introduction
- 13.2. Posterior analysis of functions of the state vector
- 13.3. Computational aspects of Bayesian analysis
- 13.4. Posterior analysis of parameter vector
- 13.5. Markov chain Monte Carlo methods
- 14. Non-Gaussian and nonlinear illustrations
- 14.1. Introduction
- 14.2. Poisson density: van drivers killed in Great Britain
- 14.3. Heavy-tailed density: outlier in gas consumption in UK
- 14.4. Volatility: pound/dollar daily exchange rates
- 14.5. Binary density: Oxford-Cambridge boat race
- 14.6. Non-Gaussian and nonlinear analysis using SsfPack
- References
- Author index
- Subject index
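
The record itself contains no code, so the sketches below are editorial illustrations, not material from the book. First, a minimal NumPy implementation of the Kalman filter for the local level model of Chapter 2 (Sections 2.2 and 2.9). The function name `local_level_filter` and the use of a large `P1` to stand in for a diffuse initial state are assumptions of this sketch, not the book's notation.

```python
import numpy as np

def local_level_filter(y, sigma2_eps, sigma2_eta, a1=0.0, P1=1e7):
    """Kalman filter for the local level model:
        y_t = alpha_t + eps_t,          eps_t ~ N(0, sigma2_eps)
        alpha_{t+1} = alpha_t + eta_t,  eta_t ~ N(0, sigma2_eta)
    Returns predicted state means a_t, their variances P_t,
    forecast errors v_t, and forecast error variances F_t.
    """
    n = len(y)
    a = np.empty(n + 1)   # predicted state means
    P = np.empty(n + 1)   # predicted state variances
    v = np.empty(n)       # one-step-ahead forecast errors
    F = np.empty(n)       # forecast error variances
    a[0], P[0] = a1, P1   # large P1 approximates a diffuse prior (Section 2.9)
    for t in range(n):
        v[t] = y[t] - a[t]
        F[t] = P[t] + sigma2_eps
        K = P[t] / F[t]                        # Kalman gain
        a[t + 1] = a[t] + K * v[t]
        P[t + 1] = P[t] * (1.0 - K) + sigma2_eta
    return a[:n], P[:n], v, F
```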
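Second, the backward state smoothing recursion of Section 2.4, consuming the filter output above. This is again a sketch under the same assumptions; `r` and `N` are the backward cumulants that the smoothing recursions run from the terminal values r_n = N_n = 0.

```python
import numpy as np

def local_level_smoother(a, P, v, F):
    """Smoothed state means and variances for the local level model,
    from the output (a, P, v, F) of local_level_filter above."""
    n = len(v)
    r, N = 0.0, 0.0           # backward recursions start at r_n = N_n = 0
    alpha_hat = np.empty(n)   # smoothed means E(alpha_t | y_1, ..., y_n)
    V = np.empty(n)           # smoothed state variances
    for t in range(n - 1, -1, -1):
        L = 1.0 - P[t] / F[t]           # L_t = 1 - K_t
        r = v[t] / F[t] + L * r         # r_{t-1}
        N = 1.0 / F[t] + L * L * N      # N_{t-1}
        alpha_hat[t] = a[t] + P[t] * r
        V[t] = P[t] - P[t] ** 2 * N
    return alpha_hat, V
```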
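Finally, a sketch of maximum likelihood estimation via the prediction error decomposition (Section 2.10.1), reusing `local_level_filter`. Optimising on the log-variance scale and dropping the first, diffuse-dominated term are illustrative shortcuts; the book instead treats initialisation exactly and concentrates one variance out of the loglikelihood (Section 2.10.2).

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(log_params, y):
    """Negative Gaussian loglikelihood of the local level model."""
    sigma2_eps, sigma2_eta = np.exp(log_params)  # log scale keeps variances positive
    _, _, v, F = local_level_filter(y, sigma2_eps, sigma2_eta)
    # drop t = 1, whose term is dominated by the approximate diffuse prior
    return 0.5 * np.sum(np.log(2.0 * np.pi * F[1:]) + v[1:] ** 2 / F[1:])

# simulated example: random walk state observed with noise
rng = np.random.default_rng(0)
alpha = np.cumsum(rng.normal(scale=0.5, size=200))
y = alpha + rng.normal(scale=1.0, size=200)
res = minimize(neg_loglik, x0=np.log([1.0, 0.1]), args=(y,), method="Nelder-Mead")
print("estimated (sigma2_eps, sigma2_eta):", np.exp(res.x))
```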