TbsNet: the importance of thin-branch structures in CNNs
The performance of a convolutional neural network (CNN) model is influenced by several factors, such as depth, width, network structure, size of the receptive field, and feature map scaling. Finding the best combination of these factors is the main difficulty in designing a viable...
Main Authors: | Hu, Xiujian; Sheng, Guanglei; Shi, Piao; Ding, Yuanyuan |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | PeerJ Inc. 2023 |
Subjects: | Algorithms and Analysis of Algorithms |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10280644/ https://www.ncbi.nlm.nih.gov/pubmed/37346637 http://dx.doi.org/10.7717/peerj-cs.1429 |
_version_ | 1785060843007770624 |
---|---|
author | Hu, Xiujian; Sheng, Guanglei; Shi, Piao; Ding, Yuanyuan |
author_facet | Hu, Xiujian; Sheng, Guanglei; Shi, Piao; Ding, Yuanyuan |
author_sort | Hu, Xiujian |
collection | PubMed |
description | The performance of a convolutional neural network (CNN) model is influenced by several factors, such as depth, width, network structure, size of the receptive field, and feature map scaling. Finding the best combination of these factors is the main difficulty in designing a viable architecture. This article presents an analysis of key factors influencing network performance, offers several strategies for constructing an efficient convolutional network, and introduces a novel architecture named TbsNet (thin-branch structure network). To minimize computation costs and feature redundancy, lightweight operators such as asymmetric convolution, pointwise convolution, depthwise convolution, and group convolution are implemented to further reduce the network’s weight. Unlike previous studies, the TbsNet architecture design rejects the reparameterization method and adopts a plain, simplified structure that eliminates extraneous branches. We conduct extensive experiments on factors such as network depth and width. TbsNet performs well on benchmark datasets: Top-1 accuracy is 97.02% on CIFAR-10, 83.56% on CIFAR-100, and 86.17% on ImageNet-1K. Tbs-UNet’s DSC on the Synapse dataset is 78.39%, 0.91% higher than TransUNet’s. TbsNet is competent for downstream tasks in computer vision, such as medical image segmentation, and is thus competitive with prior state-of-the-art deep networks such as ResNet, ResNeXt, RepVGG, ParNet, ConvNeXt, and MobileNet. |
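The abstract names four lightweight operators (asymmetric, pointwise, depthwise, and group convolution) as the means of reducing the network's weight. Their savings follow from standard convolution parameter-count arithmetic; below is a minimal sketch with hypothetical channel and kernel sizes (not the authors' code, biases ignored):

```python
def conv_params(c_in, c_out, kh, kw, groups=1):
    # Weight count of a 2D conv layer: (c_in / groups) * c_out * kh * kw
    # (biases ignored; groups=c_in gives a depthwise conv)
    return (c_in // groups) * c_out * kh * kw

c = 64  # hypothetical channel width
k = 3   # hypothetical kernel size

standard = conv_params(c, c, k, k)                        # plain 3x3 conv
depthwise_separable = (conv_params(c, c, k, k, groups=c)  # depthwise 3x3
                       + conv_params(c, c, 1, 1))         # + pointwise 1x1
grouped = conv_params(c, c, k, k, groups=4)               # group conv, 4 groups
asymmetric = conv_params(c, c, k, 1) + conv_params(c, c, 1, k)  # 3x1 then 1x3

print(standard, depthwise_separable, grouped, asymmetric)
# → 36864 4672 9216 24576
```

At these (illustrative) sizes, the depthwise-plus-pointwise pair uses roughly an eighth of the standard conv's weights, group convolution scales down by the group count, and the asymmetric pair trades one 3×3 for two cheaper factorized kernels.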
format | Online Article Text |
id | pubmed-10280644 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | PeerJ Inc. |
record_format | MEDLINE/PubMed |
spelling | pubmed-102806442023-06-21 TbsNet: the importance of thin-branch structures in CNNs Hu, Xiujian Sheng, Guanglei Shi, Piao Ding, Yuanyuan PeerJ Comput Sci Algorithms and Analysis of Algorithms The performance of a convolutional neural network (CNN) model is influenced by several factors, such as depth, width, network structure, size of the receptive field, and feature map scaling. Finding the best combination of these factors is the main difficulty in designing a viable architecture. This article presents an analysis of key factors influencing network performance, offers several strategies for constructing an efficient convolutional network, and introduces a novel architecture named TbsNet (thin-branch structure network). To minimize computation costs and feature redundancy, lightweight operators such as asymmetric convolution, pointwise convolution, depthwise convolution, and group convolution are implemented to further reduce the network’s weight. Unlike previous studies, the TbsNet architecture design rejects the reparameterization method and adopts a plain, simplified structure that eliminates extraneous branches. We conduct extensive experiments on factors such as network depth and width. TbsNet performs well on benchmark datasets: Top-1 accuracy is 97.02% on CIFAR-10, 83.56% on CIFAR-100, and 86.17% on ImageNet-1K. Tbs-UNet’s DSC on the Synapse dataset is 78.39%, 0.91% higher than TransUNet’s. TbsNet is competent for downstream tasks in computer vision, such as medical image segmentation, and is thus competitive with prior state-of-the-art deep networks such as ResNet, ResNeXt, RepVGG, ParNet, ConvNeXt, and MobileNet. PeerJ Inc. 2023-06-16 /pmc/articles/PMC10280644/ /pubmed/37346637 http://dx.doi.org/10.7717/peerj-cs.1429 Text en © 2023 Hu et al.
https://creativecommons.org/licenses/by/4.0/This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/) , which permits unrestricted use, distribution, reproduction and adaptation in any medium and for any purpose provided that it is properly attributed. For attribution, the original author(s), title, publication source (PeerJ Computer Science) and either DOI or URL of the article must be cited. |
spellingShingle | Algorithms and Analysis of Algorithms Hu, Xiujian Sheng, Guanglei Shi, Piao Ding, Yuanyuan TbsNet: the importance of thin-branch structures in CNNs |
title | TbsNet: the importance of thin-branch structures in CNNs |
title_full | TbsNet: the importance of thin-branch structures in CNNs |
title_fullStr | TbsNet: the importance of thin-branch structures in CNNs |
title_full_unstemmed | TbsNet: the importance of thin-branch structures in CNNs |
title_short | TbsNet: the importance of thin-branch structures in CNNs |
title_sort | tbsnet: the importance of thin-branch structures in cnns |
topic | Algorithms and Analysis of Algorithms |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10280644/ https://www.ncbi.nlm.nih.gov/pubmed/37346637 http://dx.doi.org/10.7717/peerj-cs.1429 |
work_keys_str_mv | AT huxiujian tbsnettheimportanceofthinbranchstructuresincnns AT shengguanglei tbsnettheimportanceofthinbranchstructuresincnns AT shipiao tbsnettheimportanceofthinbranchstructuresincnns AT dingyuanyuan tbsnettheimportanceofthinbranchstructuresincnns |