Bayesian signal processing : classical, modern, and particle filtering methods /

Bibliographic Details
Author / Creator:Candy, James V.
Imprint:Hoboken, N.J. : Wiley : IEEE, [2009]
©2009
Description:1 online resource (xxiii, 445 pages) : illustrations, map
Language:English
Series:Adaptive and learning systems for signal processing, communications, and control
Format: E-Resource Book
URL for this record:http://pi.lib.uchicago.edu/1001/cat/bib/13595719
ISBN:9780470430583
0470430583
9781118210543
1118210549
0470180943
9780470180945
9780470430576
0470430575
1282316680
9781282316683
Digital file characteristics:data file
Notes:Includes bibliographical references and index.
Print version record.
Summary:New Bayesian approach helps you solve tough problems in signal processing with ease. Signal processing is based on this fundamental concept: the extraction of critical information from noisy, uncertain data. Most techniques rely on underlying Gaussian assumptions for a solution, but what happens when these assumptions are erroneous? Bayesian techniques circumvent this limitation by offering a completely different approach that can easily incorporate non-Gaussian and nonlinear processes along with all of the usual methods currently available. This text enables readers to fully exploit the many advantages of the Bayesian approach to model-based signal processing.
Other form:Print version: Candy, J.V. Bayesian signal processing. Hoboken, N.J. : Wiley : IEEE, ©2009 9780470180945 0470180943
Standard no.:10.1002/9780470430583
Table of Contents:
  • Preface
  • References
  • Acknowledgments
  • 1. Introduction
  • 1.1. Introduction
  • 1.2. Bayesian Signal Processing
  • 1.3. Simulation-Based Approach to Bayesian Processing
  • 1.4. Bayesian Model-Based Signal Processing
  • 1.5. Notation and Terminology
  • References
  • Problems
  • 2. Bayesian Estimation
  • 2.1. Introduction
  • 2.2. Batch Bayesian Estimation
  • 2.3. Batch Maximum Likelihood Estimation
  • 2.4. Batch Minimum Variance Estimation
  • 2.5. Sequential Bayesian Estimation
  • 2.6. Summary
  • References
  • Problems
  • 3. Simulation-Based Bayesian Methods
  • 3.1. Introduction
  • 3.2. Probability Density Function Estimation
  • 3.3. Sampling Theory
  • 3.4. Monte Carlo Approach
  • 3.5. Importance Sampling
  • 3.6. Sequential Importance Sampling
  • 3.7. Summary
  • References
  • Problems
  • 4. State-Space Models for Bayesian Processing
  • 4.1. Introduction
  • 4.2. Continuous-Time State-Space Models
  • 4.3. Sampled-Data State-Space Models
  • 4.4. Discrete-Time State-Space Models
  • 4.5. Gauss-Markov State-Space Models
  • 4.6. Innovations Model
  • 4.7. State-Space Model Structures
  • 4.8. Nonlinear (Approximate) Gauss-Markov State-Space Models
  • 4.9. Summary
  • References
  • Problems
  • 5. Classical Bayesian State-Space Processors
  • 5.1. Introduction
  • 5.2. Bayesian Approach to the State-Space
  • 5.3. Linear Bayesian Processor (Linear Kalman Filter)
  • 5.4. Linearized Bayesian Processor (Linearized Kalman Filter)
  • 5.5. Extended Bayesian Processor (Extended Kalman Filter)
  • 5.6. Iterated-Extended Bayesian Processor (Iterated-Extended Kalman Filter)
  • 5.7. Practical Aspects of Classical Bayesian Processors
  • 5.8. Case Study: RLC Circuit Problem
  • 5.9. Summary
  • References
  • Problems
  • 6. Modern Bayesian State-Space Processors
  • 6.1. Introduction
  • 6.2. Sigma-Point (Unscented) Transformations
  • 6.3. Sigma-Point Bayesian Processor (Unscented Kalman Filter)
  • 6.4. Quadrature Bayesian Processors
  • 6.5. Gaussian Sum (Mixture) Bayesian Processors
  • 6.6. Case Study: 2D-Tracking Problem
  • 6.7. Summary
  • References
  • Problems
  • 7. Particle-Based Bayesian State-Space Processors
  • 7.1. Introduction
  • 7.2. Bayesian State-Space Particle Filters
  • 7.3. Importance Proposal Distributions
  • 7.4. Resampling
  • 7.5. State-Space Particle Filtering Techniques
  • 7.6. Practical Aspects of Particle Filter Design
  • 7.7. Case Study: Population Growth Problem
  • 7.8. Summary
  • References
  • Problems
  • 8. Joint Bayesian State/Parametric Processors
  • 8.1. Introduction
  • 8.2. Bayesian Approach to Joint State/Parameter Estimation
  • 8.3. Classical/Modern Joint Bayesian State/Parametric Processors
  • 8.3.1. Classical Joint Bayesian Processor
  • 8.3.2. Modern Joint Bayesian Processor
  • 8.4. Particle-Based Joint Bayesian State/Parametric Processors
  • 8.5. Case Study: Random Target Tracking using a Synthetic Aperture Towed Array
  • 8.6. Summary
  • References
  • Problems
  • 9. Discrete Hidden Markov Model Bayesian Processors
  • 9.1. Introduction
  • 9.2. Hidden Markov Models
  • 9.3. Properties of the Hidden Markov Model
  • 9.4. HMM Observation Probability: Evaluation Problem
  • 9.5. State Estimation in HMM: The Viterbi Technique
  • 9.6. Parameter Estimation in HMM: The EM/Baum-Welch Technique
  • 9.7. Case Study: Time-Reversal Decoding
  • 9.8. Summary
  • References
  • Problems
  • 10. Bayesian Processors for Physics-Based Applications
  • 10.1. Optimal Position Estimation for the Automatic Alignment
  • 10.2. Broadband Ocean Acoustic Processing
  • 10.3. Bayesian Processing for Biothreats
  • 10.4. Bayesian Processing for the Detection of Radioactive Sources
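
As a rough illustration of the simulation-based, particle filtering approach the book builds toward (Chapters 3 and 7), the sketch below runs a bootstrap (sampling-importance-resampling) particle filter on a scalar nonlinear, non-Gaussian benchmark in the spirit of the population-growth case study of Section 7.7. The model coefficients, noise variances, particle count, and helper names (f, h) are illustrative assumptions, not values or code taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative scalar nonlinear/non-Gaussian state-space model (assumed values):
#   x[k] = 0.5*x[k-1] + 25*x[k-1]/(1 + x[k-1]**2) + 8*cos(1.2*k) + w[k],  w ~ N(0, q)
#   y[k] = x[k]**2 / 20 + v[k],                                           v ~ N(0, r)
q, r = 10.0, 1.0

def f(x, k):
    """Nonlinear state transition."""
    return 0.5 * x + 25.0 * x / (1.0 + x**2) + 8.0 * np.cos(1.2 * k)

def h(x):
    """Quadratic measurement model, so the likelihood is non-Gaussian in x."""
    return x**2 / 20.0

# Simulate a short synthetic data record.
T = 50
x_true, y = np.zeros(T), np.zeros(T)
x = 0.1
for k in range(T):
    x = f(x, k) + np.sqrt(q) * rng.standard_normal()
    x_true[k] = x
    y[k] = h(x) + np.sqrt(r) * rng.standard_normal()

# Bootstrap (sampling-importance-resampling) particle filter.
N = 1000
particles = rng.normal(0.0, 2.0, N)          # initial particle ensemble
x_hat = np.zeros(T)
for k in range(T):
    # 1. Propagate each particle through the state transition (prior as proposal).
    particles = f(particles, k) + np.sqrt(q) * rng.standard_normal(N)
    # 2. Weight particles by the measurement likelihood (log-domain for stability).
    logw = -0.5 * (y[k] - h(particles))**2 / r
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # 3. Posterior-mean state estimate.
    x_hat[k] = np.sum(w * particles)
    # 4. Multinomial resampling to combat weight degeneracy.
    particles = rng.choice(particles, size=N, p=w)

print("RMS estimation error:", np.sqrt(np.mean((x_hat - x_true)**2)))
```

Using the state-transition prior as the importance proposal keeps the weight update to a single likelihood evaluation per particle; it is the simplest of the proposal choices of the kind surveyed in Section 7.3, at the cost of ignoring the current measurement when particles are drawn.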