Course:CPSC522/Particle Filtering

Title

Particle Filtering: Using Current Knowledge to Predict Future State
Principal Author: Arthur Sun
Collaborators: Ke Dai, JunYuan Zheng, Samprity

Abstract

Particle filtering is a well-established statistical method for inferring the state of a system that evolves over time from noisy measurements taken at discrete time steps. It therefore belongs to the family of discrete-time state-space models [1] and is often used to track the state of a system as it changes. This page gives a brief introduction to particle filtering: its origins, principles, applications, and implementation.

Content

Wide Application

Particle filtering is widely applied across many industries: tracking aircraft position from radar, recovering communication signals from noisy transmission environments, financial forecasting, robot localization, and surveillance tracking of outdoor moving vehicles [2]. This wiki covers the basic concepts, the general principles, and the main applications.

Builds on

Particle filtering is a mathematically involved model built on several well-established probabilistic foundations and their derivations, summarized below.

Mathematical Foundation

Particle filtering was first proposed by Pierre Del Moral in 1996 in his analysis of particle algorithms [3], where he proved the unbiasedness of the particle likelihood approximation and of the unnormalized conditional probability measures.

Subsequently, branching-type particle methodologies with varying population sizes were developed by Dan Crisan, Pierre Del Moral and Terry Lyons [4].

Around 2000, Pierre Del Moral and Alice Guionnet proved central limit theorems [5] and uniform convergence results [6], which form the theoretical foundation of current particle filtering algorithms.

Most current particle filtering methodologies are based on Feynman-Kac particle filter algorithms [7], backward Markov particle models [8], genealogical tree based models [9], adaptive mean-field particle models [10], and particle Markov chain Monte Carlo [11].

What is Particle Filter

A particle represents one possible state of the system, and what counts as a state depends entirely on the user's context. In a one-dimensional setting the state can be a single value; more generally it can be a vector, and each such hypothesized state is called a 'particle'.

A particle filter starts from a set of initial particles, each of which is eventually assigned a weight. The initial values of each attribute in a particle are generated in a pseudo-random manner, and the number of particles is a design choice made at the start.

Mathematical Derivation

Prerequisite I: Recursive Estimation

Before discussing particle filtering, we first review recursive estimation and Bayesian estimation, since particle filtering is theoretically based on both. Recursive estimation refers to the broad range of applications that require online estimation and prediction of an evolving set of parameters from highly uncertain data, such as object tracking, object detection in computer vision, and financial forecasting. Such applications need an algorithm that recursively iterates on the previous step's result to produce the next estimate. We can formulate this as a dynamic model describing how the system evolves, together with a measurement model describing how the current data relates to the system; a Bayesian approach is commonly used to describe the problem probabilistically.

Prerequisite II: Bayesian Estimation

The main purpose of Bayesian estimation is to construct the posterior probability density function (pdf) of the state vector given all currently available information. This pdf encodes the current knowledge of the state, and a recursive Bayesian filter provides a way to update the posterior distribution as new information arrives. With this in place, a particle filter can be seen as a sequential recursive Bayesian filter [12] that represents the posterior density by random samples. To improve accuracy we use a large set of samples, in the limit approaching infinity. Particle filtering also extends the problem scope beyond restrictions such as linear-Gaussian assumptions; the price is that its computational complexity is high compared with filters that exploit those assumptions.
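
The recursive Bayesian update described above can be sketched on a small discrete state space (a histogram filter). The transition matrix and likelihood values below are illustrative assumptions, not taken from any particular system:

```python
# A minimal sketch of one recursive Bayesian update on a discrete state grid.
# All numbers here are toy assumptions chosen only for illustration.
import numpy as np

def bayes_update(prior, transition, likelihood):
    """One recursive step: predict with the transition model,
    then weight by the measurement likelihood and renormalize."""
    predicted = transition @ prior          # p(x_k | Z_{k-1})
    posterior = likelihood * predicted      # p(z_k | x_k) p(x_k | Z_{k-1})
    return posterior / posterior.sum()      # normalize to a valid pdf

# Three discrete states; the system mostly stays in its current state.
transition = np.array([[0.8, 0.1, 0.1],
                       [0.1, 0.8, 0.1],
                       [0.1, 0.1, 0.8]])
prior = np.array([1/3, 1/3, 1/3])           # uniform initial belief
likelihood = np.array([0.9, 0.05, 0.05])    # measurement favors state 0

posterior = bayes_update(prior, transition, likelihood)
```

After one update the belief concentrates on the state the measurement favors, which is exactly the behavior a particle filter approximates with samples instead of a grid.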

Prerequisite III: Dynamic Estimation: Fundamental Model of Particle Filtering

Dynamic estimation is the fundamental mathematical model underlying the particle filter. It describes how a state vector changes over time, in the following form: ${\displaystyle x_{k}=f_{k-1}(x_{k-1},v_{k-1})}$ Here ${\displaystyle x_{k}}$ is the estimated state vector, k is the time step, ${\displaystyle f_{k-1}}$ is a known (possibly non-linear) function, and ${\displaystyle v_{k-1}}$ is the process noise. The measurement equation relates the received measurements to the state vector: ${\displaystyle z_{k}=h_{k}(x_{k},w_{k})}$, for ${\displaystyle k>0}$, where ${\displaystyle w_{k}}$ is the measurement noise.
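
As a concrete illustration, the state and measurement equations above can be written as plain functions. The specific nonlinear dynamics and unit-variance Gaussian noise below are assumptions for this sketch, loosely following a standard univariate nonlinear benchmark from the filtering literature:

```python
# Illustrative state-space model for the two equations above.
import numpy as np

rng = np.random.default_rng(0)

def f(x_prev, v):
    """State equation x_k = f_{k-1}(x_{k-1}, v_{k-1}); toy nonlinear dynamics."""
    return 0.5 * x_prev + 25 * x_prev / (1 + x_prev ** 2) + v

def h(x, w):
    """Measurement equation z_k = h_k(x_k, w_k); toy quadratic sensor."""
    return x ** 2 / 20 + w

# Simulate one time step: draw process and measurement noise, then propagate.
x_prev = 0.1
x_k = f(x_prev, rng.normal(0.0, 1.0))       # hidden state at time k
z_k = h(x_k, rng.normal(0.0, 1.0))          # noisy observation of x_k
```

The filter never sees `x_k` directly; it only receives `z_k` and must infer the state distribution from it.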

A Fundamental Model for Particle Filtering Algorithm

Resampling Technique

The particle filtering algorithm generates all the samples for one variable before moving on to the next variable. It makes a single sweep through the variables, handling all of the samples for each variable at a time, which has the advantage that variables can be generated dynamically and there can be an unbounded number of them. It also allows for a new operation, resampling: given a set of weighted samples on some of the variables, resampling takes the n samples, each with its own weight, and generates a new set of n samples that all share the same weight. Resampling can therefore be implemented in the same way as generating random samples for a single random variable.

A particle is, in practice, an assignment of values to a set of variables together with an associated weight; the probability of a proposition is estimated as the weighted proportion of the particles in which the proposition holds.

Sampling is central to particle filtering, which implements importance sampling as follows. The algorithm starts with a population of particles, each assigning values to no variables and carrying a weight of 1. At each step it selects a variable that has not yet been sampled or observed, and for each particle samples that variable from some proposal distribution, updating the particle's weight as in importance sampling. The algorithm then selects a piece of evidence that has not yet been absorbed, and multiplies each particle's weight by the probability of that evidence given the particle's values. Finally, the algorithm resamples the population, constructing a new population of particles that all share the same weight.
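
The resampling operation described above can be sketched as follows. This uses systematic resampling, one common implementation choice; the function name and toy values are illustrative assumptions:

```python
# A sketch of resampling: n weighted particles in, n equally weighted
# particles out, drawn in proportion to the input weights.
import numpy as np

def resample(particles, weights, rng):
    """Systematic resampling with one uniform draw per sweep."""
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n   # evenly spaced probes
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                            # guard against rounding
    indices = np.searchsorted(cumulative, positions)
    return particles[indices], np.full(n, 1.0 / n)  # uniform new weights

rng = np.random.default_rng(42)
particles = np.array([0.0, 1.0, 2.0, 3.0])
weights = np.array([0.7, 0.1, 0.1, 0.1])            # particle 0.0 dominates
new_particles, new_weights = resample(particles, weights, rng)
```

The heavily weighted particle is duplicated several times in the output while low-weight particles tend to disappear, which is what keeps the population focused on probable states.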

Mathematical Model

Generally, a particle filter algorithm starts with N independent random samples and can be viewed as a direct sample-based implementation of the formal Bayesian filter. Suppose we have N random samples from the posterior probability density function ${\displaystyle p(x_{k-1}|Z_{k-1})}$; these samples are the 'particles'.

The prediction phase consists of passing each of these samples from time step k−1 through the system model to generate a set of prior samples at time step k: ${\displaystyle x_{k}=f_{k-1}(x_{k-1},v_{k-1})}$ Here ${\displaystyle v_{k-1}}$ is an independent sample drawn from the probability density function of the system noise, so we obtain a set of samples from the prior probability density function ${\displaystyle p(x_{k}|Z_{k-1})}$.

To update the prior samples with the new measurement ${\displaystyle z_{k}}$, we define a weight ${\displaystyle w_{k}}$ for each particle as the measurement likelihood evaluated at that prior sample, ${\displaystyle w_{k}^{(i)}=p(z_{k}|x_{k}^{(i)})}$, normalize the weights to sum to unity, ${\displaystyle w_{k}^{(i)}=w_{k}^{(i)}/\sum _{j=1}^{N}w_{k}^{(j)}}$, and resample the previous particles according to the normalized weights to produce a new set of particles:

That is, each new particle is drawn from the set of prior samples with probability equal to its normalized weight, and this draw is repeated N times to construct the new set. The resulting particles are samples from the required posterior probability density function, which completes one full cycle of the algorithm.

Here is the general Pseudo-code for particle filtering

   FOR i = 1 : N
       Draw Xk^i ~ q(Xk | Xk-1^i, Zk)
       Update weight Wk^i ∝ P(Zk | Xk^i) P(Xk^i | Xk-1^i) / q(Xk^i | Xk-1^i, Zk)
   END FOR
   Normalize weights so that Sum_i(Wk^i) = 1
   Compute effective sample size Neff = 1 / Sum_i((Wk^i)^2)
   IF Neff < Nt
       Resample
   END IF
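
The pseudocode above can be made runnable as follows. This sketch uses the transition prior p(Xk | Xk-1) as the proposal q, a common simplification under which the weight update reduces to the likelihood alone; the scalar linear model, unit noise levels, and threshold Nt = N/2 are all illustrative assumptions:

```python
# A runnable sketch of one SIR (sequential importance resampling) filter,
# following the pseudocode above with q = p(Xk | Xk-1). Toy model throughout.
import numpy as np

rng = np.random.default_rng(1)
N = 500                                     # number of particles
sigma_v, sigma_w = 1.0, 1.0                 # process / measurement noise std

f = lambda x: 0.5 * x                       # toy linear dynamics
h = lambda x: x                             # direct (noisy) observation

def sir_step(particles, weights, z):
    # Draw Xk ~ q = p(Xk | Xk-1): propagate each particle through the dynamics
    particles = f(particles) + rng.normal(0, sigma_v, size=particles.shape)
    # Update weights by the Gaussian likelihood p(Zk | Xk), then normalize
    weights = weights * np.exp(-0.5 * ((z - h(particles)) / sigma_w) ** 2)
    weights /= weights.sum()
    # Resample when the effective sample size 1 / Sum(Wk^2) drops below Nt
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    return particles, weights

particles = rng.normal(0, 1, N)             # initial particle population
weights = np.full(N, 1.0 / N)
true_x = 0.0
for _ in range(20):                         # simulate and filter 20 steps
    true_x = 0.5 * true_x + rng.normal(0, sigma_v)
    z = true_x + rng.normal(0, sigma_w)
    particles, weights = sir_step(particles, weights, z)

estimate = np.sum(weights * particles)      # posterior-mean state estimate
```

Here the posterior mean tracks the hidden state to within roughly the measurement noise, and resampling keeps the weights from degenerating onto a single particle.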


A Practical Example

A Robot Localization Experiment by UW

This experiment was conducted by the University of Washington Robotics and State Estimation Lab. At the beginning of a robot localization task, the blue circle marks the best guess as to where the robot is now, and the little red dots are different hypotheses for where the robot might be.

At the very beginning of the task, we do not know where the robot is. Each hypothesis is called a particle, and the lines extending from the robot are sensor measurements taken by a laser.

As time passes, the best-guess location of the robot emerges where the most likely changes occur. As the robot moves and measures its surroundings, most of the hypotheses it started with turn out to have low probability and are discarded; after a certain amount of time, the hypotheses are reduced to a few clouds in the hallway. When the robot travels through a hallway containing many symmetries, it cannot determine its exact location and narrows it down to two hypotheses. When the robot finally enters a room and senses its surroundings, it becomes clear that the current best hypothesis was in fact correct.

Extended Particle Filtering Algorithm

Besides the basic particle filtering algorithm, which centers on resampling, there are many more advanced, task-specific particle filtering algorithms, including the following:

1. Exponential natural particle filter: addresses the loss of particle diversity, the need for a large number of particles, and the costly selection of importance density functions
2. Auxiliary particle filter: improves on deficiencies of the sequential importance resampling (SIR) algorithm when dealing with tailed observation densities
3. Regularized auxiliary particle filter: improves the auxiliary particle filter by regularizing the approximating empirical density and redrawing samples from a continuous distribution
4. Gaussian particle filter: designed for non-linear hybrid systems
5. Unscented particle filter: designed for dynamic systems where the variance of the observation noise is small

Related Wiki Pages

Recursive Bayesian Estimation: [13]

Bayes Estimator: [14]

Recursive Estimation: [15]

Annotated Bibliography

[1] Del Moral, Pierre (1996). "Non Linear Filtering: Interacting Particle Solution." (PDF). Markov Processes and Related Fields 2 (4): 555–580.

[2] Del Moral, Pierre (2004). Feynman-Kac formulae. Genealogical and interacting particle approximations. Springer. p. 575. Series: Probability and Applications

[3] Del Moral, Pierre; Miclo, Laurent (2000). "A Moran particle system approximation of Feynman-Kac formulae.". Stochastic Processes and their Applications 86 (2): 193–216. doi:10.1016/S0304-4149(99)00094-0

[4] "Particle methods: An introduction with applications". ESAIM: Proc. 44: 1–46. doi:10.1051/proc/201444001

[5] Del Moral, P.; Guionnet, A. (1999). "Central limit theorem for nonlinear filtering and interacting particle systems". The Annals of Applied Probability 9 (2): 275–297. doi:10.1214/aoap/1029962742. ISSN 1050-5164

[6] Del Moral, Pierre; Guionnet, Alice (2001). "On the stability of interacting processes with applications to filtering and genetic algorithms". Annales de l'Institut Henri Poincaré 37 (2): 155–194

[7] Del Moral, Pierre; Guionnet, Alice (2001). "On the stability of interacting processes with applications to filtering and genetic algorithms". Annales de l'Institut Henri Poincaré 37 (2): 155–194

[8] Del Moral, Pierre (2013). Mean field simulation for Monte Carlo integration. Chapman & Hall/CRC Press. p. 626. Monographs on Statistics & Applied Probability

[9] Del Moral, Pierre; Miclo, Laurent (2000). Branching and Interacting Particle Systems Approximations of Feynman-Kac Formulae with Applications to Non-Linear Filtering. (PDF). Lecture Notes in Mathematics 1729. pp. 1–145. doi:10.1007/bfb0103798.

[10] Del Moral, Pierre; Doucet, Arnaud; Jasra, Ajay (2012). "On Adaptive Resampling Procedures for Sequential Monte Carlo Methods" (PDF). Bernoulli 18 (1): 252–278. doi:10.3150/10-bej335.

[11] Andrieu, Christophe; Doucet, Arnaud; Holenstein, Roman (2010). "Particle Markov chain Monte Carlo methods". Journal Royal Statistical Society B 72 (3): 269–342. doi:10.1111/j.1467-9868.2009.00736.x.

[12] Haug, A.J. (2005). "A Tutorial on Bayesian Estimation and Tracking Techniques Applicable to Nonlinear and Non-Gaussian Processes" (PDF). The MITRE Corporation, USA, Tech. Rep., Feb. Retrieved 2008-05-06.