
The Compact Support Neural Network

Neural networks are popular and useful in many fields, but they have the problem of giving high-confidence responses for examples that are far from the training data. This makes neural networks very confident in predictions that are grossly wrong, limiting their reliability in safety-critical applications such as autonomous driving and space exploration. This paper introduces a novel neuron generalization that has the standard dot-product-based neuron and the radial basis function (RBF) neuron as two extreme cases of a shape parameter. Using a rectified linear unit (ReLU) as the activation function results in a novel neuron that has compact support, which means its output is zero outside a bounded domain. To address the difficulties in training the proposed neural network, the paper introduces a novel training method that takes a pretrained standard neural network and fine-tunes it while gradually increasing the shape parameter to the desired value. The theoretical findings of the paper are a bound on the gradient of the proposed neuron and a proof that a neural network with such neurons has the universal approximation property, meaning that the network can approximate any continuous and integrable function with an arbitrary degree of accuracy. The experimental findings on standard benchmark datasets show that the proposed approach has smaller test errors than state-of-the-art competing methods and outperforms them in detecting out-of-distribution samples on two out of three datasets.

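The exact form of the shape-parameter neuron is given in the paper itself; the abstract only states that it interpolates between the standard dot-product neuron and the RBF neuron, and that a ReLU activation then yields compact support. As a rough, hypothetical illustration of that idea (the function shape_neuron and the parameters alpha and r below are illustrative names, not the paper's notation), here is a minimal NumPy sketch:

import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def shape_neuron(x, w, b, alpha, r=1.0):
    # Illustrative interpolation (an assumed form, not necessarily the paper's):
    # alpha = 0 gives the standard ReLU neuron relu(w.x + b);
    # alpha = 1 gives an RBF-style neuron relu(r^2 - ||x - w||^2), whose output
    # is zero outside the ball of radius r around w, i.e. it has compact support.
    linear = x @ w + b
    radial = r ** 2 - np.sum((x - w) ** 2)
    return relu((1.0 - alpha) * linear + alpha * radial)

rng = np.random.default_rng(0)
w = rng.normal(size=5)
x_near = w + 0.1 * rng.normal(size=5)   # input close to the neuron's center
x_far = w + 100.0 * rng.normal(size=5)  # far-away, out-of-distribution input

for alpha in (0.0, 0.5, 1.0):
    print(alpha, shape_neuron(x_near, w, 0.1, alpha), shape_neuron(x_far, w, 0.1, alpha))

# As alpha grows toward 1, the far-away input is driven to an output of exactly
# zero, which is the out-of-distribution behavior the abstract describes. The
# training method described in the abstract (fine-tuning a pretrained standard
# network while gradually increasing the shape parameter) would correspond here
# to annealing alpha from 0 toward 1 during training.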

Bibliographic Details
Main Authors: Barbu, Adrian; Mou, Hongyu
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8709146/
https://www.ncbi.nlm.nih.gov/pubmed/34960583
http://dx.doi.org/10.3390/s21248494
_version_ 1784622862844297216
author Barbu, Adrian
Mou, Hongyu
author_facet Barbu, Adrian
Mou, Hongyu
author_sort Barbu, Adrian
collection PubMed
description Neural networks are popular and useful in many fields, but they have the problem of giving high-confidence responses for examples that are far from the training data. This makes neural networks very confident in predictions that are grossly wrong, limiting their reliability in safety-critical applications such as autonomous driving and space exploration. This paper introduces a novel neuron generalization that has the standard dot-product-based neuron and the radial basis function (RBF) neuron as two extreme cases of a shape parameter. Using a rectified linear unit (ReLU) as the activation function results in a novel neuron that has compact support, which means its output is zero outside a bounded domain. To address the difficulties in training the proposed neural network, the paper introduces a novel training method that takes a pretrained standard neural network and fine-tunes it while gradually increasing the shape parameter to the desired value. The theoretical findings of the paper are a bound on the gradient of the proposed neuron and a proof that a neural network with such neurons has the universal approximation property, meaning that the network can approximate any continuous and integrable function with an arbitrary degree of accuracy. The experimental findings on standard benchmark datasets show that the proposed approach has smaller test errors than state-of-the-art competing methods and outperforms them in detecting out-of-distribution samples on two out of three datasets.
format Online
Article
Text
id pubmed-8709146
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-8709146 2021-12-25 The Compact Support Neural Network Barbu, Adrian Mou, Hongyu Sensors (Basel) Article MDPI 2021-12-20 /pmc/articles/PMC8709146/ /pubmed/34960583 http://dx.doi.org/10.3390/s21248494 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Barbu, Adrian
Mou, Hongyu
The Compact Support Neural Network
title The Compact Support Neural Network
title_full The Compact Support Neural Network
title_fullStr The Compact Support Neural Network
title_full_unstemmed The Compact Support Neural Network
title_short The Compact Support Neural Network
title_sort compact support neural network
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8709146/
https://www.ncbi.nlm.nih.gov/pubmed/34960583
http://dx.doi.org/10.3390/s21248494
work_keys_str_mv AT barbuadrian thecompactsupportneuralnetwork
AT mouhongyu thecompactsupportneuralnetwork
AT barbuadrian compactsupportneuralnetwork
AT mouhongyu compactsupportneuralnetwork