SAS for Forecasting Time Series

Bibliographic Details
Author / Creator: Brocklebank, John Clare.
Edition: 2nd ed.
Imprint: Cary, N.C. : SAS Institute Inc. ; [S.l.] : John Wiley, c2003.
Description: x, 398 p. : ill. ; 28 cm.
Language: English
Format: Print Book
URL for this record: http://pi.lib.uchicago.edu/1001/cat/bib/4915541
Other authors / contributors: Dickey, David A.
ISBN: 1590471822 (SAS Institute Inc.)
0471395668 (paper : alk. paper)
Notes: Includes bibliographical references and index.
Table of Contents:
  • Preface
  • Acknowledgments
  • Chapter 1. Overview of Time Series
  • 1.1. Introduction
  • 1.2. Analysis Methods and SAS/ETS Software
  • 1.2.1. Options
  • 1.2.2. How SAS/ETS Software Procedures Interrelate
  • 1.3. Simple Models: Regression
  • 1.3.1. Linear Regression
  • 1.3.2. Highly Regular Seasonality
  • 1.3.3. Regression with Transformed Data
  • Chapter 2. Simple Models: Autoregression
  • 2.1. Introduction
  • 2.1.1. Terminology and Notation
  • 2.1.2. Statistical Background
  • 2.2. Forecasting
  • 2.2.1. Forecasting with PROC ARIMA
  • 2.2.2. Backshift Notation B for Time Series
  • 2.2.3. Yule-Walker Equations for Covariances
  • 2.3. Fitting an AR Model in PROC REG
  • Chapter 3. The General ARIMA Model
  • 3.1. Introduction
  • 3.1.1. Statistical Background
  • 3.1.2. Terminology and Notation
  • 3.2. Prediction
  • 3.2.1. One-Step-Ahead Predictions
  • 3.2.2. Future Predictions
  • 3.3. Model Identification
  • 3.3.1. Stationarity and Invertibility
  • 3.3.2. Time Series Identification
  • 3.3.3. Chi-Square Check of Residuals
  • 3.3.4. Summary of Model Identification
  • 3.4. Examples and Instructions
  • 3.4.1. IDENTIFY Statement for Series 1-8
  • 3.4.2. Example: Iron and Steel Export Analysis
  • 3.4.3. Estimation Methods Used in PROC ARIMA
  • 3.4.4. ESTIMATE Statement for Series 8
  • 3.4.5. Nonstationary Series
  • 3.4.6. Effect of Differencing on Forecasts
  • 3.4.7. Examples: Forecasting IBM Series and Silver Series
  • 3.4.8. Models for Nonstationary Data
  • 3.4.9. Differencing to Remove a Linear Trend
  • 3.4.10. Other Identification Techniques
  • 3.5. Summary
  • Chapter 4. The ARIMA Model: Introductory Applications
  • 4.1. Seasonal Time Series
  • 4.1.1. Introduction to Seasonal Modeling
  • 4.1.2. Model Identification
  • 4.2. Models with Explanatory Variables
  • 4.2.1. Case 1: Regression with Time Series Errors
  • 4.2.2. Case 1A: Intervention
  • 4.2.3. Case 2: Simple Transfer Function
  • 4.2.4. Case 3: General Transfer Function
  • 4.2.5. Case 3A: Leading Indicators
  • 4.2.6. Case 3B: Intervention
  • 4.3. Methodology and Example
  • 4.3.1. Case 1: Regression with Time Series Errors
  • 4.3.2. Case 2: Simple Transfer Functions
  • 4.3.3. Case 3: General Transfer Functions
  • 4.3.4. Case 3B: Intervention
  • 4.4. Further Examples
  • 4.4.1. North Carolina Retail Sales
  • 4.4.2. Construction Series Revisited
  • 4.4.3. Milk Scare (Intervention)
  • 4.4.4. Terrorist Attack
  • Chapter 5. The ARIMA Model: Special Applications
  • 5.1. Regression with Time Series Errors and Unequal Variances
  • 5.1.1. Autoregressive Errors
  • 5.1.2. Example: Energy Demand at a University
  • 5.1.3. Unequal Variances
  • 5.1.4. ARCH, GARCH, and IGARCH for Unequal Variances
  • 5.2. Cointegration
  • 5.2.1. Introduction
  • 5.2.2. Cointegration and Eigenvalues
  • 5.2.3. Impulse Response Function
  • 5.2.4. Roots in Higher-Order Models
  • 5.2.5. Cointegration and Unit Roots
  • 5.2.6. An Illustrative Example
  • 5.2.7. Estimating the Cointegrating Vector
  • 5.2.8. Intercepts and More Lags
  • 5.2.9. PROC VARMAX
  • 5.2.10. Interpreting the Estimates
  • 5.2.11. Diagnostics and Forecasts
  • Chapter 6. State Space Modeling
  • 6.1. Introduction
  • 6.1.1. Some Simple Univariate Examples
  • 6.1.2. A Simple Multivariate Example
  • 6.1.3. Equivalence of State Space and Vector ARMA Models
  • 6.2. More Examples
  • 6.2.1. Some Univariate Examples
  • 6.2.2. ARMA(1,1) of Dimension 2
  • 6.3. PROC STATESPACE
  • 6.3.1. State Vectors Determined from Covariances
  • 6.3.2. Canonical Correlations
  • 6.3.3. Simulated Example
  • Chapter 7. Spectral Analysis
  • 7.1. Periodic Data: Introduction
  • 7.2. Example: Plant Enzyme Activity
  • 7.3. PROC SPECTRA Introduced
  • 7.4. Testing for White Noise
  • 7.5. Harmonic Frequencies
  • 7.6. Extremely Fast Fluctuations and Aliasing
  • 7.7. The Spectral Density
  • 7.8. Some Mathematical Detail (Optional Reading)
  • 7.9. Estimating the Spectrum: The Smoothed Periodogram
  • 7.10. Cross-Spectral Analysis
  • 7.10.1. Interpreting Cross-Spectral Quantities
  • 7.10.2. Interpreting Cross-Amplitude and Phase Spectra
  • 7.10.3. PROC SPECTRA Statements
  • 7.10.4. Cross-Spectral Analysis of the Neuse River Data
  • 7.10.5. Details on Gain, Phase, and Pure Delay
  • Chapter 8. Data Mining and Forecasting
  • 8.1. Introduction
  • 8.2. Forecasting Data Model
  • 8.3. The Time Series Forecasting System
  • 8.4. HPF Procedure
  • 8.5. Scorecard Development
  • 8.6. Business Goal Performance Metrics
  • 8.7. Graphical Displays
  • 8.8. Goal-Seeking Model Development
  • 8.9. Summary
  • References
  • Index