
MAMA Net: Multi-Scale Attention Memory Autoencoder Network for Anomaly Detection


Bibliographic Details
Format: Online Article Text
Language: English
Published: IEEE 2020
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8544938/
https://www.ncbi.nlm.nih.gov/pubmed/33326377
http://dx.doi.org/10.1109/TMI.2020.3045295
_version_ 1784589921622687744
collection PubMed
description Anomaly detection refers to the identification of cases that do not conform to the expected pattern, and it plays a key role in diverse research areas and application domains. Most existing methods can be categorized as anomaly-object-detection-based or reconstruction-error-based techniques. However, because it is difficult to cover the high diversity of real-world outliers and because the inference process is inaccessible, neither family alone has achieved groundbreaking progress. To address these shortcomings, and motivated by memory-based decision-making and by the visual attention mechanism that filters environmental information in the human visual perceptual system, this paper proposes a Multi-scale Attention Memory with hash addressing Autoencoder network (MAMA Net) for anomaly detection. First, to overcome the problems resulting from the restricted, stationary receptive field of the convolution operator, we introduce a multi-scale global spatial attention block that can be plugged into any network as a sampling, upsampling, or downsampling function. Owing to its efficient feature representation, networks can achieve competitive results with only a few such blocks. Second, a traditional autoencoder often learns an ambiguous model that also reconstructs anomalies “well” because of the lack of constraints during training and inference. To mitigate this, we design a hash-addressing memory module that forces abnormalities to produce higher reconstruction errors for classification. In addition, we couple the mean square error (MSE) with a Wasserstein loss to improve the encoded data distribution. Experiments on various datasets, including two different COVID-19 datasets and one brain MRI (RIDER) dataset, demonstrate the robustness and excellent generalization of the proposed MAMA Net.
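The core idea described above — replacing an autoencoder's latent code with items retrieved from a learned memory of "normal" prototypes, so anomalies reconstruct poorly and score high — can be sketched in a few lines. This is an illustrative toy with made-up names and random weights, not the paper's implementation: it uses soft attention addressing in place of the paper's hash addressing, and linear maps in place of the multi-scale attention encoder/decoder. The `w1` helper shows the 1-D Wasserstein-1 distance (mean absolute difference of sorted samples), the kind of term the paper couples with MSE during training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dimensions (arbitrary for illustration): input, latent, memory slots.
D, K, M = 8, 3, 10
W_enc = rng.normal(size=(K, D)) * 0.1   # stand-in encoder
W_dec = rng.normal(size=(D, K)) * 0.1   # stand-in decoder
memory = rng.normal(size=(M, K))        # learned prototypes of normal latents

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def reconstruct(x):
    z = W_enc @ x               # encode input to a latent code
    w = softmax(memory @ z)     # soft addressing weights over memory slots
    z_hat = w @ memory          # retrieve: convex combination of memory items
    return W_dec @ z_hat        # decode from the retrieved (not raw) latent

def anomaly_score(x):
    # Reconstruction MSE: high when x cannot be composed from normal prototypes.
    return float(np.mean((x - reconstruct(x)) ** 2))

def w1(a, b):
    # 1-D Wasserstein-1 distance between equal-size empirical samples.
    return float(np.mean(np.abs(np.sort(a) - np.sort(b))))
```

In a trained model the memory slots are fit on normal data only, so the retrieval step acts as a bottleneck: an anomalous input has no nearby prototype combination, its reconstruction degrades, and thresholding `anomaly_score` yields the classification.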
format Online
Article
Text
id pubmed-8544938
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher IEEE
record_format MEDLINE/PubMed
spelling pubmed-8544938 2023-02-10 MAMA Net: Multi-Scale Attention Memory Autoencoder Network for Anomaly Detection IEEE Trans Med Imaging Article IEEE 2020-12-16 /pmc/articles/PMC8544938/ /pubmed/33326377 http://dx.doi.org/10.1109/TMI.2020.3045295 Text en © IEEE 2020. This article is free to access and download, along with rights for full text and data mining, re-use and analysis.
spellingShingle Article
MAMA Net: Multi-Scale Attention Memory Autoencoder Network for Anomaly Detection
title MAMA Net: Multi-Scale Attention Memory Autoencoder Network for Anomaly Detection
title_sort mama net: multi-scale attention memory autoencoder network for anomaly detection
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8544938/
https://www.ncbi.nlm.nih.gov/pubmed/33326377
http://dx.doi.org/10.1109/TMI.2020.3045295
work_keys_str_mv AT mamanetmultiscaleattentionmemoryautoencodernetworkforanomalydetection