Ghostformer: A GhostNet-Based Two-Stage Transformer for Small Object Detection
In this paper, we propose a novel two-stage transformer with GhostNet, which improves the performance of the small object detection task. Specifically, based on the original Deformable Transformers for End-to-End Object Detection (deformable DETR), we chose GhostNet as the backbone to extract features, since it is better suited for efficient feature extraction. Furthermore, at the target detection stage, we selected the 300 best bounding box results as region proposals, which were subsequently set as the primary object queries of the decoder layer. Finally, in the decoder layer, we optimized and modified the queries to increase the detection accuracy. To validate the performance of the proposed model, we adopted the widely used COCO 2017 dataset. Extensive experiments demonstrated that the proposed scheme yielded a higher average precision (AP) score in detecting small objects than the existing deformable DETR model.
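The core architectural idea in the abstract, keeping the 300 best encoder proposals and handing them to the decoder as its initial object queries, can be summarized in a few lines of code. The sketch below is a simplified illustration in PyTorch based only on that description; the tensor names (`enc_class_logits`, `enc_box_proposals`, `enc_memory`) and the helper function are hypothetical and do not come from the authors' implementation.

```python
# Minimal sketch (assumed PyTorch, illustrative names): turn the encoder's
# 300 highest-scoring box proposals into the decoder's initial object queries.
import torch

def select_top_proposals(enc_class_logits, enc_box_proposals, enc_memory, num_queries=300):
    """Pick the top-scoring encoder proposals and use them as decoder queries.

    enc_class_logits:  (batch, num_tokens, num_classes) per-token class scores
    enc_box_proposals: (batch, num_tokens, 4) normalized (cx, cy, w, h) boxes
    enc_memory:        (batch, num_tokens, d_model) encoder output features
    """
    # Score each token by its most confident class.
    scores = enc_class_logits.max(dim=-1).values              # (batch, num_tokens)

    # Indices of the 300 best proposals per image.
    topk = torch.topk(scores, num_queries, dim=1).indices     # (batch, 300)

    # Gather the corresponding boxes: these serve as the reference boxes
    # that the decoder layers then refine.
    reference_boxes = torch.gather(
        enc_box_proposals, 1, topk.unsqueeze(-1).expand(-1, -1, 4))

    # Gather the matching encoder features: these initialize the content
    # part of the decoder's object queries.
    d_model = enc_memory.size(-1)
    query_init = torch.gather(
        enc_memory, 1, topk.unsqueeze(-1).expand(-1, -1, d_model))

    return query_init, reference_boxes
```

In the two-stage variant of deformable DETR this is the step that makes the decoder queries data-dependent rather than purely learned embeddings, which is why the decoder can start from reasonably localized boxes before refining them.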
Main Authors: | Li, Sijia; Sultonov, Furkat; Tursunboev, Jamshid; Park, Jun-Hyun; Yun, Sangseok; Kang, Jae-Mo |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2022 |
Subjects: | Communication |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9503248/ https://www.ncbi.nlm.nih.gov/pubmed/36146288 http://dx.doi.org/10.3390/s22186939 |
_version_ | 1784795915407589376 |
---|---|
author | Li, Sijia; Sultonov, Furkat; Tursunboev, Jamshid; Park, Jun-Hyun; Yun, Sangseok; Kang, Jae-Mo
author_facet | Li, Sijia; Sultonov, Furkat; Tursunboev, Jamshid; Park, Jun-Hyun; Yun, Sangseok; Kang, Jae-Mo
author_sort | Li, Sijia |
collection | PubMed |
description | In this paper, we propose a novel two-stage transformer with GhostNet, which improves the performance of the small object detection task. Specifically, based on the original Deformable Transformers for End-to-End Object Detection (deformable DETR), we chose GhostNet as the backbone to extract features, since it is better suited for efficient feature extraction. Furthermore, at the target detection stage, we selected the 300 best bounding box results as region proposals, which were subsequently set as the primary object queries of the decoder layer. Finally, in the decoder layer, we optimized and modified the queries to increase the detection accuracy. To validate the performance of the proposed model, we adopted the widely used COCO 2017 dataset. Extensive experiments demonstrated that the proposed scheme yielded a higher average precision (AP) score in detecting small objects than the existing deformable DETR model. |
format | Online Article Text |
id | pubmed-9503248 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9503248 2022-09-24 Ghostformer: A GhostNet-Based Two-Stage Transformer for Small Object Detection Li, Sijia; Sultonov, Furkat; Tursunboev, Jamshid; Park, Jun-Hyun; Yun, Sangseok; Kang, Jae-Mo Sensors (Basel) Communication In this paper, we propose a novel two-stage transformer with GhostNet, which improves the performance of the small object detection task. Specifically, based on the original Deformable Transformers for End-to-End Object Detection (deformable DETR), we chose GhostNet as the backbone to extract features, since it is better suited for efficient feature extraction. Furthermore, at the target detection stage, we selected the 300 best bounding box results as region proposals, which were subsequently set as the primary object queries of the decoder layer. Finally, in the decoder layer, we optimized and modified the queries to increase the detection accuracy. To validate the performance of the proposed model, we adopted the widely used COCO 2017 dataset. Extensive experiments demonstrated that the proposed scheme yielded a higher average precision (AP) score in detecting small objects than the existing deformable DETR model. MDPI 2022-09-14 /pmc/articles/PMC9503248/ /pubmed/36146288 http://dx.doi.org/10.3390/s22186939 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Communication; Li, Sijia; Sultonov, Furkat; Tursunboev, Jamshid; Park, Jun-Hyun; Yun, Sangseok; Kang, Jae-Mo; Ghostformer: A GhostNet-Based Two-Stage Transformer for Small Object Detection
title | Ghostformer: A GhostNet-Based Two-Stage Transformer for Small Object Detection |
title_full | Ghostformer: A GhostNet-Based Two-Stage Transformer for Small Object Detection |
title_fullStr | Ghostformer: A GhostNet-Based Two-Stage Transformer for Small Object Detection |
title_full_unstemmed | Ghostformer: A GhostNet-Based Two-Stage Transformer for Small Object Detection |
title_short | Ghostformer: A GhostNet-Based Two-Stage Transformer for Small Object Detection |
title_sort | ghostformer: a ghostnet-based two-stage transformer for small object detection |
topic | Communication |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9503248/ https://www.ncbi.nlm.nih.gov/pubmed/36146288 http://dx.doi.org/10.3390/s22186939 |
work_keys_str_mv | AT lisijia ghostformeraghostnetbasedtwostagetransformerforsmallobjectdetection AT sultonovfurkat ghostformeraghostnetbasedtwostagetransformerforsmallobjectdetection AT tursunboevjamshid ghostformeraghostnetbasedtwostagetransformerforsmallobjectdetection AT parkjunhyun ghostformeraghostnetbasedtwostagetransformerforsmallobjectdetection AT yunsangseok ghostformeraghostnetbasedtwostagetransformerforsmallobjectdetection AT kangjaemo ghostformeraghostnetbasedtwostagetransformerforsmallobjectdetection |
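For readers unfamiliar with the backbone named in the description field, the sketch below shows the ghost module, the building block that makes GhostNet lightweight: an ordinary convolution produces a small set of intrinsic feature maps, and a cheap depthwise convolution generates additional "ghost" maps from them. This is a minimal PyTorch illustration under that general idea, not the configuration used in the paper; the class name, channel counts, and kernel sizes are illustrative.

```python
# Minimal sketch of a GhostNet-style ghost module (assumed PyTorch, illustrative
# hyperparameters): part of the output comes from a normal convolution, the rest
# from a cheap depthwise convolution applied to those intrinsic features.
import torch
import torch.nn as nn

class GhostModule(nn.Module):
    def __init__(self, in_channels, out_channels, ratio=2, kernel_size=1, dw_size=3):
        super().__init__()
        primary_channels = out_channels // ratio              # "intrinsic" features
        ghost_channels = out_channels - primary_channels      # cheap "ghost" features

        # Ordinary convolution producing the intrinsic feature maps.
        self.primary = nn.Sequential(
            nn.Conv2d(in_channels, primary_channels, kernel_size,
                      padding=kernel_size // 2, bias=False),
            nn.BatchNorm2d(primary_channels),
            nn.ReLU(inplace=True),
        )
        # Cheap depthwise convolution generating ghost maps from the intrinsic ones.
        self.cheap = nn.Sequential(
            nn.Conv2d(primary_channels, ghost_channels, dw_size,
                      padding=dw_size // 2, groups=primary_channels, bias=False),
            nn.BatchNorm2d(ghost_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        intrinsic = self.primary(x)
        ghost = self.cheap(intrinsic)
        return torch.cat([intrinsic, ghost], dim=1)           # (N, out_channels, H, W)

if __name__ == "__main__":
    # Example: a single RGB image through one ghost module with 16 output channels.
    x = torch.randn(1, 3, 224, 224)
    print(GhostModule(3, 16)(x).shape)  # -> torch.Size([1, 16, 224, 224])
```

Because the depthwise branch is far cheaper than a full convolution producing the same number of channels, stacking such modules yields the efficient feature extractor that the description cites as the reason for choosing GhostNet over a heavier backbone.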