Bayesian Signal Processing: Classical, Modern, and Particle Filtering Methods, Second Edition

Presents the Bayesian approach to statistical signal processing for a variety of useful model sets 

This book aims to give readers a unified Bayesian treatment, starting from the basics (Bayes’ rule), moving to the more advanced (Monte Carlo sampling), and evolving to the next-generation model-based techniques (sequential Monte Carlo sampling). This second edition incorporates a new chapter on “Sequential Bayesian Detection,” a new section on “Ensemble Kalman Filters,” and an expanded set of Case Studies that detail Bayesian solutions for a variety of applications. These studies illustrate Bayesian approaches to real-world problems, incorporating detailed particle filter designs, adaptive particle filters, and sequential Bayesian detectors. In addition to these major developments, a variety of sections have been expanded to fill in gaps left by the first edition. Here, metrics for particle filter (PF) designs, with emphasis on classical “sanity testing,” lead to ensemble techniques as a basic requirement for performance analysis. Information-theoretic metrics and their application to PF designs are fully developed and applied. These expansions provide a more cohesive discussion of Bayesian processing, with examples and applications that enable readers to grasp alternative approaches to solving estimation/detection problems.
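For readers new to the terminology, the “sequential Monte Carlo sampling” referred to above is exemplified by the bootstrap particle filter treated in the book. The short Python sketch below is not drawn from the text (which supplies MATLAB notes instead); it simply illustrates the predict/weight/resample cycle for a hypothetical scalar random-walk model, with the model, noise levels, and particle count chosen purely for illustration.

    # Minimal bootstrap particle filter sketch (illustrative only, not from the book).
    # Hypothetical scalar random-walk state observed in Gaussian noise.
    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 500, 50               # particles, time steps (assumed values)
    q, r = 0.1, 0.5              # assumed process / measurement noise std. deviations

    # Simulate a hypothetical trajectory and its noisy measurements
    x_true = np.cumsum(q * rng.standard_normal(T))
    y = x_true + r * rng.standard_normal(T)

    particles = rng.standard_normal(N)   # initial particle set
    estimates = np.zeros(T)

    for t in range(T):
        # Predict: propagate particles through the transition prior
        particles = particles + q * rng.standard_normal(N)
        # Update: weight particles by the Gaussian measurement likelihood
        w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)
        w /= w.sum()
        # Estimate: posterior mean as the weighted particle average
        estimates[t] = np.sum(w * particles)
        # Resample (multinomial) to combat weight degeneracy
        particles = particles[rng.choice(N, size=N, p=w)]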

The second edition of Bayesian Signal Processing features: 

  • “Classical” Kalman filtering for linear, linearized, and nonlinear systems; “modern” unscented and ensemble Kalman filters; and the “next-generation” Bayesian particle filters
  • Sequential Bayesian detection techniques incorporating model-based schemes for a variety of real-world problems
  • Practical Bayesian processor designs including comprehensive methods of performance analysis ranging from simple sanity testing and ensemble techniques to sophisticated information metrics
  • New case studies on adaptive particle filtering and sequential Bayesian detection, detailing further Bayesian approaches to applied problem solving
  • MATLAB® notes at the end of each chapter help readers solve complex problems using readily available software commands and point out other available software packages
  • Problem sets included to test readers’ knowledge and help them put their new skills into practice

Bayesian Signal Processing, Second Edition is written for all students, scientists, and engineers who investigate and apply signal processing to their everyday problems.


JAMES V. CANDY, PhD, is Chief Scientist for Engineering, a Distinguished Member of the Technical Staff, founder, and former director of the Center for Advanced Signal & Image Sciences at the Lawrence Livermore National Laboratory. He is also an Adjunct Full Professor at the University of California, Santa Barbara, a Fellow of the IEEE, and a Fellow of the Acoustical Society of America. Dr. Candy has published more than 225 journal articles, book chapters, and technical reports. He is also the author of Signal Processing: The Model-Based Approach, Signal Processing: The Modern Approach, and Model-Based Signal Processing (Wiley 2006). Dr. Candy was awarded the IEEE Distinguished Technical Achievement Award for his development of model-based signal processing and the Acoustical Society of America Helmholtz-Rayleigh Interdisciplinary Silver Medal for his contributions to acoustical signal processing and underwater acoustics.


Preface to Second Edition xiii

References xiv

Preface to First Edition xvii

References xxiii

Acknowledgments xxvii

List of Abbreviations xxix

1 Introduction 1

1.1 Introduction 1

1.2 Bayesian Signal Processing 1

1.3 Simulation-Based Approach to Bayesian Processing 4

1.3.1 Bayesian Particle Filter 8

1.4 Bayesian Model-Based Signal Processing 9

1.5 Notation and Terminology 13

References 15

Problems 16

2 Bayesian Estimation 20

2.1 Introduction 20

2.2 Batch Bayesian Estimation 20

2.3 Batch Maximum Likelihood Estimation 23

2.3.1 Expectation–Maximization Approach to Maximum Likelihood 27

2.3.2 EM for Exponential Family of Distributions 30

2.4 Batch Minimum Variance Estimation 34

2.5 Sequential Bayesian Estimation 37

2.5.1 Joint Posterior Estimation 41

2.5.2 Filtering Posterior Estimation 42

2.5.3 Likelihood Estimation 45

2.6 Summary 45

References 46

Problems 47

3 Simulation-Based Bayesian Methods 52

3.1 Introduction 52

3.2 Probability Density Function Estimation 54

3.3 Sampling Theory 58

3.3.1 Uniform Sampling Method 60

3.3.2 Rejection Sampling Method 64

3.4 Monte Carlo Approach 66

3.4.1 Markov Chains 71

3.4.2 Metropolis–Hastings Sampling 74

3.4.3 Random Walk Metropolis–Hastings Sampling 75

3.4.4 Gibbs Sampling 79

3.4.5 Slice Sampling 81

3.5 Importance Sampling 83

3.6 Sequential Importance Sampling 87

3.7 Summary 90

References 91

Problems 94

4 State–Space Models for Bayesian Processing 98

4.1 Introduction 98

4.2 Continuous-Time State–Space Models 99

4.3 Sampled-Data State–Space Models 103

4.4 Discrete-Time State–Space Models 107

4.4.1 Discrete Systems Theory 109

4.5 Gauss–Markov State–Space Models 115

4.5.1 Continuous-Time/Sampled-Data Gauss–Markov Models 115

4.5.2 Discrete-Time Gauss–Markov Models 117

4.6 Innovations Model 123

4.7 State–Space Model Structures 124

4.7.1 Time Series Models 124

4.7.2 State–Space and Time Series Equivalence Models 131

4.8 Nonlinear (Approximate) Gauss–Markov State–Space Models 137

4.9 Summary 142

References 142

Problems 143

5 Classical Bayesian State–Space Processors 150

5.1 Introduction 150

5.2 Bayesian Approach to the State–Space 151

5.3 Linear Bayesian Processor (Linear Kalman Filter) 153

5.4 Linearized Bayesian Processor (Linearized Kalman Filter) 162

5.5 Extended Bayesian Processor (Extended Kalman Filter) 170

5.6 Iterated-Extended Bayesian Processor (Iterated-Extended Kalman Filter) 179

5.7 Practical Aspects of Classical Bayesian Processors 185

5.8 Case Study: RLC Circuit Problem 190

5.9 Summary 194

References 195

Problems 196

6 Modern Bayesian State–Space Processors 201

6.1 Introduction 201

6.2 Sigma-Point (Unscented) Transformations 202

6.2.1 Statistical Linearization 202

6.2.2 Sigma-Point Approach 205

6.2.3 SPT for Gaussian Prior Distributions 210

6.3 Sigma-Point Bayesian Processor (Unscented Kalman Filter) 213

6.3.1 Extensions of the Sigma-Point Processor 222

6.4 Quadrature Bayesian Processors 223

6.5 Gaussian Sum (Mixture) Bayesian Processors 224

6.6 Case Study: 2D-Tracking Problem 228

6.7 Ensemble Bayesian Processors 234

6.8 Summary 245

References 247

Problems 249

7 Particle-Based Bayesian State–Space Processors 253

7.1 Introduction 253

7.2 Bayesian State–Space Particle Filters 253

7.3 Importance Proposal Distributions 258

7.3.1 Minimum Variance Importance Distribution 258

7.3.2 Transition Prior Importance Distribution 261

7.4 Resampling 262

7.4.1 Multinomial Resampling 267

7.4.2 Systematic Resampling 268

7.4.3 Residual Resampling 269

7.5 State–Space Particle Filtering Techniques 270

7.5.1 Bootstrap Particle Filter 271

7.5.2 Auxiliary Particle Filter 274

7.5.3 Regularized Particle Filter 281

7.5.4 MCMC Particle Filter 283

7.5.5 Linearized Particle Filter 286

7.6 Practical Aspects of Particle Filter Design 290

7.6.1 Sanity Testing 290

7.6.2 Ensemble Estimation 291

7.6.3 Posterior Probability Validation 293

7.6.4 Model Validation Testing 304

7.7 Case Study: Population Growth Problem 311

7.8 Summary 317

References 318

Problems 321

8 Joint Bayesian State/Parametric Processors 327

8.1 Introduction 327

8.2 Bayesian Approach to Joint State/Parameter Estimation 328

8.3 Classical/Modern Joint Bayesian State/Parametric Processors 330

8.3.1 Classical Joint Bayesian Processor 331

8.3.2 Modern Joint Bayesian Processor 338

8.4 Particle-Based Joint Bayesian State/Parametric Processors 341

8.4.1 Parametric Models 342

8.4.2 Joint Bayesian State/Parameter Estimation 344

8.5 Case Study: Random Target Tracking Using a Synthetic Aperture Towed Array 349

8.6 Summary 359

References 360

Problems 362

9 Discrete Hidden Markov Model Bayesian Processors 367

9.1 Introduction 367

9.2 Hidden Markov Models 367

9.2.1 Discrete-Time Markov Chains 368

9.2.2 Hidden Markov Chains 369

9.3 Properties of the Hidden Markov Model 372

9.4 HMM Observation Probability: Evaluation Problem 373

9.5 State Estimation in HMM: The Viterbi Technique 376

9.5.1 Individual Hidden State Estimation 377

9.5.2 Entire Hidden State Sequence Estimation 380

9.6 Parameter Estimation in HMM: The EM/Baum–Welch Technique 384

9.6.1 Parameter Estimation with State Sequence Known 385

9.6.2 Parameter Estimation with State Sequence Unknown 387

9.7 Case Study: Time-Reversal Decoding 390

9.8 Summary 395

References 396

Problems 398

10 Sequential Bayesian Detection 401

10.1 Introduction 401

10.2 Binary Detection Problem 402

10.2.1 Classical Detection 403

10.2.2 Bayesian Detection 407

10.2.3 Composite Binary Detection 408

10.3 Decision Criteria 411

10.3.1 Probability-of-Error Criterion 411

10.3.2 Bayes Risk Criterion 412

10.3.3 Neyman–Pearson Criterion 414

10.3.4 Multiple (Batch) Measurements 416

10.3.5 Multichannel Measurements 418

10.3.6 Multiple Hypotheses 420

10.4 Performance Metrics 423

10.4.1 Receiver Operating Characteristic (ROC) Curves 424

10.5 Sequential Detection 440

10.5.1 Sequential Decision Theory 442

10.6 Model-Based Sequential Detection 447

10.6.1 Linear Gaussian Model-Based Processor 447

10.6.2 Nonlinear Gaussian Model-Based Processor 451

10.6.3 Non-Gaussian Model-Based Processor 454

10.7 Model-Based Change (Anomaly) Detection 459

10.7.1 Model-Based Detection 460

10.7.2 Optimal Innovations Detection 461

10.7.3 Practical Model-Based Change Detection 463

10.8 Case Study: Reentry Vehicle Change Detection 468

10.8.1 Simulation Results 471

10.9 Summary 472

References 475

Problems 477

11 Bayesian Processors for Physics-Based Applications 484

11.1 Optimal Position Estimation for the Automatic Alignment 484

11.1.1 Background 485

11.1.2 Stochastic Modeling of Position Measurements 487

11.1.3 Bayesian Position Estimation and Detection 489

11.1.4 Application: Beam Line Data 490

11.1.5 Results: Beam Line (KDP Deviation) Data 492

11.1.6 Results: Anomaly Detection 494

11.2 Sequential Detection of Broadband Ocean Acoustic Sources 497

11.2.1 Background 498

11.2.2 Broadband State–Space Ocean Acoustic Propagators 500

11.2.3 Discrete Normal-Mode State–Space Representation 504

11.2.4 Broadband Bayesian Processor 504

11.2.5 Broadband Particle Filters 505

11.2.6 Broadband Bootstrap Particle Filter 507

11.2.7 Bayesian Performance Metrics 509

11.2.8 Sequential Detection 509

11.2.9 Broadband BSP Design 512

11.2.10 Summary 520

11.3 Bayesian Processing for Biothreats 520

11.3.1 Background 521

11.3.2 Parameter Estimation 524

11.3.3 Bayesian Processor Design 525

11.3.4 Results 526

11.4 Bayesian Processing for the Detection of Radioactive Sources 528

11.4.1 Physics-Based Processing Model 528

11.4.2 Radionuclide Detection 531

11.4.3 Implementation 535

11.4.4 Detection 539

11.4.5 Data 540

11.4.6 Radionuclide Detection 540

11.4.7 Summary 541

11.5 Sequential Threat Detection: An X-ray Physics-Based Approach 541

11.5.1 Physics-Based Models 543

11.5.2 X-ray State–Space Simulation 547

11.5.3 Sequential Threat Detection 549

11.5.4 Summary 554

11.6 Adaptive Processing for Shallow Ocean Applications 554

11.6.1 State–Space Propagator 555

11.6.2 Processors 562

11.6.3 Model-Based Ocean Acoustic Processing 565

11.6.4 Summary 572

References 572

Appendix: Probability and Statistics Overview 576

A.1 Probability Theory 576

A.2 Gaussian Random Vectors 582

A.3 Uncorrelated Transformation: Gaussian Random Vectors 583

References 584

Index 585
