
MapReduce particle filtering with exact resampling and deterministic runtime

Bibliographic Details
Main Authors: Thiyagalingam, Jeyarajan, Kekempanos, Lykourgos, Maskell, Simon
Format: Online Article Text
Language: English
Published: Springer International Publishing 2017
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6959401/
https://www.ncbi.nlm.nih.gov/pubmed/32010202
http://dx.doi.org/10.1186/s13634-017-0505-9
Description
Summary: Particle filtering is a numerical Bayesian technique that has great potential for solving sequential estimation problems involving non-linear and non-Gaussian models. Since the estimation accuracy achieved by particle filters improves as the number of particles increases, it is natural to consider as many particles as possible. MapReduce is a generic programming model that makes it possible to scale a wide variety of algorithms to Big data. However, despite the application of particle filters across many domains, little attention has been devoted to implementing particle filters using MapReduce. In this paper, we describe an implementation of a particle filter using MapReduce. We focus on the component that would otherwise be a bottleneck to parallel execution: the resampling component. We devise a new implementation of this component, which requires no approximations, has O(N) spatial complexity and deterministic O((log N)^2) time complexity. Results demonstrate the utility of this new component and culminate in consideration of a particle filter with 2^24 particles being distributed across 512 processor cores.
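
For orientation, the sketch below shows what the resampling step computes in its standard serial form (systematic resampling). This is not the paper's MapReduce formulation, whose contribution is to realise an equivalent, approximation-free selection over distributed particles with O(N) space and deterministic O((log N)^2) time; the function and variable names here are illustrative and NumPy is assumed.

import numpy as np

def systematic_resample(weights, rng=None):
    # Serial systematic resampling: returns the indices of surviving particles.
    # The cumulative sum of the normalised weights acts as an inverse CDF;
    # one uniform offset plus N evenly spaced points give an exact,
    # low-variance selection in O(N) time and space on a single machine.
    rng = np.random.default_rng() if rng is None else rng
    n = len(weights)
    cumulative = np.cumsum(np.asarray(weights, dtype=float))
    cumulative /= cumulative[-1]          # normalise so the last entry is 1
    positions = (rng.random() + np.arange(n)) / n
    return np.searchsorted(cumulative, positions)

# Example: resample 4 particles according to their weights.
weights = np.array([0.1, 0.4, 0.3, 0.2])
indices = systematic_resample(weights)    # e.g. array([1, 1, 2, 3])

Systematic resampling is unbiased and needs only a single random draw, which is why it is a common baseline; the difficulty the paper addresses is performing the equivalent selection when the particles are spread across many processor cores.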