How Many Bits Does it Take to Quantize Your Neural Network?
Quantization converts neural networks into low-bit fixed-point computations which can be carried out by efficient integer-only hardware, and is standard practice for the deployment of neural networks on real-time embedded devices. However, like their real-numbered counterpart, quantized networks are...
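The abstract's notion of converting real-valued weights into low-bit fixed-point values that integer-only hardware can process can be illustrated with uniform affine quantization. This is a generic sketch, not the paper's method; the function names and the 4-bit setting are illustrative.

```python
import numpy as np

def quantize(w, bits=8):
    """Map a float array to integers in [0, 2**bits - 1] with a scale and zero point."""
    lo, hi = w.min(), w.max()
    scale = (hi - lo) / (2**bits - 1) if hi > lo else 1.0
    zero_point = int(np.round(-lo / scale))
    # Round to the nearest grid point and clip into the representable range.
    q = np.clip(np.round(w / scale) + zero_point, 0, 2**bits - 1).astype(np.int64)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate real values from the integer representation."""
    return (q.astype(np.float64) - zero_point) * scale

w = np.array([-1.0, -0.25, 0.0, 0.5, 1.0])
q, scale, zp = quantize(w, bits=4)
w_hat = dequantize(q, scale, zp)
# The round-trip error is bounded by half the quantization step,
# which shrinks as the bit width grows.
```

Fewer bits mean coarser grids and larger rounding error, which is why the question of how many bits suffice to preserve a network's behavior is nontrivial.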
Main Authors: Giacobbe, Mirco; Henzinger, Thomas A.; Lechner, Mathias
Format: Online Article Text
Language: English
Published: 2020
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7480702/ http://dx.doi.org/10.1007/978-3-030-45237-7_5
Similar Items
- Design of a 2-Bit Neural Network Quantizer for Laplacian Source
  by: Perić, Zoran, et al.
  Published: (2021)
- A Novel Low-Bit Quantization Strategy for Compressing Deep Neural Networks
  by: Long, Xin, et al.
  Published: (2020)
- Optimization of the Sampling Periods and the Quantization Bit Lengths for Networked Estimation
  by: Suh, Young Soo, et al.
  Published: (2010)
- Single Abrikosov vortices as quantized information bits
  by: Golod, T., et al.
  Published: (2015)
- How Many Parameters Does It Take to Describe Disease Tolerance?
  by: Louie, Alexander, et al.
  Published: (2016)