Bayesian State Estimation of Nonlinear Systems Using Approximate Aggregate Markov Chains
Document Type
Article
Publication Date
2006
Publication Title
Industrial and Engineering Chemistry Research
Abstract
The conditional probability density function (pdf) is the most complete statistical representation of the state, from which optimal inferences may be drawn. The transient pdf is usually infinite-dimensional and available in closed form only for linear Gaussian systems. In this paper, a novel density-based filter is proposed for nonlinear Bayesian estimation. The approach is fundamentally different from optimization- and linearization-based methods. Unlike typical density-based methods such as probability grid filters (PGF) and sequential Monte Carlo (SMC), the cell filter separates the problem into an off-line probabilistic modeling task and an on-line estimation task. The probabilistic behavior of the system is described by the Foias or Frobenius–Perron operators. Monte Carlo simulations are used to compute these transition operators, which represent approximate aggregate Markov chains. The approach places no restrictions on the system model or noise processes. The cell filter is shown to achieve the performance of PGF and SMC filters at a fraction of the computational cost for recursive state estimation.
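The two-stage structure described in the abstract (an off-line construction of an aggregate Markov chain over cells, followed by an on-line recursive Bayesian update of cell probabilities) can be illustrated with a minimal sketch. The scalar models f and h, the cell grid, the noise levels, and the sample counts below are illustrative assumptions for a generic nonlinear benchmark, not quantities taken from the paper.

```python
import numpy as np

# Sketch of a cell-filter-style recursion for a scalar system
# x_{k+1} = f(x_k) + w_k,  y_k = h(x_k) + v_k  (models assumed, not from the paper).

rng = np.random.default_rng(0)

def f(x):            # example process model (assumption)
    return 0.5 * x + 25.0 * x / (1.0 + x**2)

def h(x):            # example measurement model (assumption)
    return x**2 / 20.0

q_std, r_std = 1.0, 1.0                       # assumed process / measurement noise std

# --- Off-line task: build the aggregate Markov chain (cell-to-cell transition matrix)
edges = np.linspace(-25.0, 25.0, 201)         # cell boundaries (200 cells, assumed grid)
centers = 0.5 * (edges[:-1] + edges[1:])
n_cells = centers.size
n_mc = 1000                                   # Monte Carlo samples per cell (assumed)

P = np.zeros((n_cells, n_cells))
for i in range(n_cells):
    # sample starting points inside cell i and propagate through the noisy model
    x0 = rng.uniform(edges[i], edges[i + 1], n_mc)
    x1 = f(x0) + q_std * rng.standard_normal(n_mc)
    idx = np.clip(np.searchsorted(edges, x1) - 1, 0, n_cells - 1)
    P[i] = np.bincount(idx, minlength=n_cells) / n_mc   # row-stochastic transition row

# --- On-line task: recursive Bayesian estimation over cell probabilities
p = np.full(n_cells, 1.0 / n_cells)           # initial (uniform) cell probabilities

def cell_filter_step(p, y):
    p_pred = P.T @ p                                        # prediction via the Markov chain
    lik = np.exp(-0.5 * ((y - h(centers)) / r_std) ** 2)    # Gaussian measurement likelihood
    p_post = p_pred * lik
    p_post /= p_post.sum()                                  # Bayes update and normalization
    return p_post, float(p_post @ centers)                  # posterior and mean estimate

# Usage: filter a short simulated trajectory
x = 0.1
for k in range(10):
    x = f(x) + q_std * rng.standard_normal()
    y = h(x) + r_std * rng.standard_normal()
    p, x_hat = cell_filter_step(p, y)
    print(f"k={k:2d}  true x={x:7.3f}  estimate={x_hat:7.3f}")
```

Note that in this sketch the transition matrix is computed once off-line and only a matrix-vector product and a pointwise likelihood weighting are performed per measurement, which is the source of the computational savings the abstract attributes to the cell filter relative to PGF and SMC approaches.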
Repository Citation
Ungarala, Sridhar; Chen, Zhongzhou; and Li, Keyu, "Bayesian State Estimation of Nonlinear Systems Using Approximate Aggregate Markov Chains" (2006). Chemical & Biomedical Engineering Faculty Publications. 32.
https://engagedscholarship.csuohio.edu/encbe_facpub/32
Original Citation
Ungarala, S.; Chen, Z.; Li, K. Bayesian State Estimation of Nonlinear Systems Using Approximate Aggregate Markov Chains. Ind. Eng. Chem. Res. 2006, 45, 4208–4221.
Volume
45
Issue
12
DOI
10.1021/ie050362l
Comments
This material is based upon work supported by the National Science Foundation under Grant Nos. CTS-0433527 and CTS-0522864.