MobilePrune: Neural Network Compression via ℓ(0) Sparse Group Lasso on the Mobile System
It is hard to directly deploy deep learning models on today's smartphones due to the substantial computational costs introduced by millions of parameters. To compress the model, we develop an ℓ0-based sparse group lasso model called MobilePrune which can generate extremely compact n...
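The abstract names an ℓ0-based sparse group lasso penalty, which combines a count of nonzero parameters (ℓ0) with a sum of group ℓ2 norms so that pruning removes whole groups (e.g., channels or neurons) as well as individual weights. The paper's exact formulation is not reproduced in this record; below is a minimal illustrative sketch of such a combined penalty, with the function name, group layout, and λ values chosen here for demonstration only.

```python
import numpy as np

def l0_sparse_group_penalty(weights, groups, lam_l0, lam_group):
    """Generic l0 + group-lasso penalty (illustrative, not the paper's exact model).

    weights:   1-D array of model parameters
    groups:    list of index arrays partitioning the weights (e.g., per channel)
    lam_l0:    weight on the l0 term (number of nonzero parameters)
    lam_group: weight on the group term (sum of group l2 norms)
    """
    l0_term = lam_l0 * np.count_nonzero(weights)
    group_term = lam_group * sum(np.linalg.norm(weights[g]) for g in groups)
    return l0_term + group_term

# Toy example: two groups of two weights each.
w = np.array([0.0, 0.5, 0.0, 2.0])
groups = [np.array([0, 1]), np.array([2, 3])]
# l0 term: 2 nonzeros; group term: ||[0, 0.5]|| + ||[0, 2]|| = 0.5 + 2.0
print(l0_sparse_group_penalty(w, groups, lam_l0=1.0, lam_group=1.0))  # → 4.5
```

Minimizing a loss plus this penalty drives individual weights exactly to zero (via the ℓ0 term) while the group term encourages entire groups to vanish together, which is what yields hardware-friendly structured sparsity on mobile devices.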
Main authors: Shao, Yubo; Zhao, Kaikai; Cao, Zhiwen; Peng, Zhehao; Peng, Xingang; Li, Pan; Wang, Yijie; Ma, Jianzhu
Format: Online Article Text
Language: English
Published: MDPI, 2022
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9185446/
https://www.ncbi.nlm.nih.gov/pubmed/35684708
http://dx.doi.org/10.3390/s22114081
Similar items
- Pruning-Based Sparse Recovery for Electrocardiogram Reconstruction from Compressed Measurements
  by: Lee, Jaeseok, et al.
  Published: (2017)
- Groupyr: Sparse Group Lasso in Python
  by: Richie-Halford, Adam, et al.
  Published: (2021)
- Sparse Damage Detection with Complex Group Lasso and Adaptive Complex Group Lasso
  by: Dimopoulos, Vasileios, et al.
  Published: (2022)
- Coarse-Grained Pruning of Neural Network Models Based on Blocky Sparse Structure
  by: Huang, Lan, et al.
  Published: (2021)
- Seagull: lasso, group lasso and sparse-group lasso regularization for linear regression models via proximal gradient descent
  by: Klosa, Jan, et al.
  Published: (2020)