
Multi-Model Running Latency Optimization in an Edge Computing Paradigm

Recent advances in both lightweight deep learning algorithms and edge computing increasingly enable multiple model inference tasks to be conducted concurrently on resource-constrained edge devices, allowing us to achieve one goal collaboratively rather than getting high quality in each standalone ta...


Bibliographic Details

Main Authors: Li, Peisong, Wang, Xinheng, Huang, Kaizhu, Huang, Yi, Li, Shancang, Iqbal, Muddesar
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9415810/
https://www.ncbi.nlm.nih.gov/pubmed/36015856
http://dx.doi.org/10.3390/s22166097

Similar Items