
Binary Neural Networks in FPGAs: Architectures, Tool Flows and Hardware Comparisons

Binary neural networks (BNNs) are variations of artificial/deep neural network (ANN/DNN) architectures that constrain the real values of weights to the binary set of numbers {−1,1}. By using binary values, BNNs can convert matrix multiplications into bitwise operations, which accelerates both training and inference and reduces hardware complexity and model sizes for implementation. Compared to traditional deep learning architectures, BNNs are a good choice for implementation in resource-constrained devices like FPGAs and ASICs. However, BNNs have the disadvantage of reduced performance and accuracy because of the tradeoff due to binarization. Over the years, this has attracted the attention of the research community to overcome the performance gap of BNNs, and several architectures have been proposed. In this paper, we provide a comprehensive review of BNNs for implementation in FPGA hardware. The survey covers different aspects, such as BNN architectures and variants, design and tool flows for FPGAs, and various applications for BNNs. The final part of the paper gives some benchmark works and design tools for implementing BNNs in FPGAs based on established datasets used by the research community.
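As a rough illustration of the bitwise arithmetic the abstract refers to (a minimal sketch, not taken from the paper): when weights and activations are restricted to {−1, 1} and packed into bit masks, a dot product can be computed with an XNOR followed by a popcount instead of multiply–accumulate operations. The helper names pack_bits and binary_dot below are hypothetical, chosen only for this example.

# Sketch: binary dot product via XNOR + popcount.
# Convention assumed here: bit 1 encodes +1, bit 0 encodes -1.

def pack_bits(values):
    """Pack a list of {-1, +1} values into an integer bit mask."""
    word = 0
    for i, v in enumerate(values):
        if v == +1:
            word |= 1 << i
    return word

def binary_dot(a_bits, b_bits, n):
    """Dot product of two packed {-1, +1} vectors of length n.

    XNOR marks positions where the signs agree; each agreement
    contributes +1 and each disagreement -1, so dot = 2 * matches - n.
    """
    mask = (1 << n) - 1
    matches = bin(~(a_bits ^ b_bits) & mask).count("1")
    return 2 * matches - n

if __name__ == "__main__":
    a = [+1, -1, +1, +1, -1, -1, +1, -1]
    b = [+1, +1, -1, +1, -1, +1, +1, -1]
    # The bitwise result matches the ordinary multiply-accumulate dot product.
    assert binary_dot(pack_bits(a), pack_bits(b), len(a)) == sum(x * y for x, y in zip(a, b))
    print(binary_dot(pack_bits(a), pack_bits(b), len(a)))  # prints 2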


Bibliographic Details
Main Authors: Su, Yuanxin; Seng, Kah Phooi; Ang, Li Minn; Smith, Jeremy
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10675041/
https://www.ncbi.nlm.nih.gov/pubmed/38005640
http://dx.doi.org/10.3390/s23229254
Collection: PubMed (National Center for Biotechnology Information)
Record ID: pubmed-10675041
Record Format: MEDLINE/PubMed
Journal: Sensors (Basel)
Article Type: Review
Published Online: 17 November 2023
License: © 2023 by the authors. Open access under the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Licensee MDPI, Basel, Switzerland.