TRUST: A Novel Framework for Vehicle Trajectory Recovery from Urban-Scale Videos
Main Authors:
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects:
Online Access:
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9788550/
- https://www.ncbi.nlm.nih.gov/pubmed/36560317
- http://dx.doi.org/10.3390/s22249948
Summary: We study a new type of path inference query against urban-scale video databases. Given a vehicle image query, our goal is to recover its historical trajectory from the footprints captured by surveillance cameras deployed across the road network. The problem is challenging because visual matching inherently suffers from object occlusion, low camera resolution, varying illumination conditions, and viewing angles. Furthermore, with limited computation resources, only a fraction of video frames can be ingested and indexed, causing severe data sparsity issues for visual matching. To support efficient and accurate trajectory recovery, we develop a select-and-refine framework in a heterogeneous hardware environment with both CPUs and GPUs. We construct a proximity graph from the top-k visually similar frames and propose holistic scoring functions based on visual and spatio-temporal coherence. To avoid enumerating all paths, we also propose a coarse-grained scoring function with a monotonicity property that reduces the search space. Finally, the derived path is refined by examining raw video frames to fill in the missing cameras. For performance evaluation, we construct two large-scale video databases generated from cameras deployed on real road networks. Experimental results validate the efficiency and accuracy of our proposed trajectory recovery framework.
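The core idea sketched in the summary (a proximity graph over top-k matched frames, a holistic path score, and a monotone coarse-grained bound that prunes enumeration) can be illustrated with a toy example. Everything below is an assumption for illustration only: the camera IDs, the plausible-speed edge test, and the sum-of-similarities score are stand-ins, not the paper's actual formulation.

```python
# Hedged sketch of the select-and-refine idea from the abstract. All names,
# thresholds, and scoring forms here are illustrative assumptions.

# Detections returned by visual matching: (camera_id, timestamp_s, similarity).
detections = [
    ("A", 0, 0.9),
    ("B", 60, 0.8),
    ("C", 60, 0.3),   # same instant as B: at most one can lie on the path
    ("D", 120, 0.7),
]

# Assumed road-network distances (metres) between cameras, as ordered pairs.
road_dist = {
    ("A", "B"): 1000, ("A", "C"): 1000, ("A", "D"): 2000,
    ("B", "D"): 1000, ("C", "D"): 1000,
}

MAX_SPEED = 30.0  # m/s; faster hops are spatio-temporally implausible


def build_proximity_graph(dets, dist):
    """Edge i -> j iff detection j is strictly later and reachable at a
    plausible driving speed (the spatio-temporal coherence test)."""
    edges = {i: [] for i in range(len(dets))}
    for i, (ci, ti, _) in enumerate(dets):
        for j, (cj, tj, _) in enumerate(dets):
            if tj > ti and dist[(ci, cj)] / (tj - ti) <= MAX_SPEED:
                edges[i].append(j)
    return edges


def best_path(dets, edges):
    """DFS over the proximity graph with a monotone coarse-grained bound:
    the current score plus the similarity mass of all strictly later frames
    only shrinks as the path grows, so branches that cannot beat the
    incumbent are cut without enumerating every path."""
    best_nodes, best_score = [], 0.0

    def dfs(i, path, score):
        nonlocal best_nodes, best_score
        if score > best_score:
            best_nodes, best_score = path[:], score
        bound = score + sum(s for _, t, s in dets if t > dets[i][1])
        if bound <= best_score:
            return  # coarse bound: no extension can beat the incumbent
        for j in edges[i]:
            dfs(j, path + [j], score + dets[j][2])

    for i in range(len(dets)):
        dfs(i, [i], dets[i][2])
    return [dets[i][0] for i in best_nodes], best_score
```

On this toy input the recovered path is A, B, D with score 2.4: camera C is rejected because its weak visual match cannot compete with B at the same instant, mirroring how the holistic score trades off visual and spatio-temporal coherence.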