
Perception Understanding Action: Adding Understanding to the Perception Action Cycle With Spiking Segmentation

Bibliographic Details
Main Authors: Kirkland, Paul; Di Caterina, Gaetano; Soraghan, John; Matich, George
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2020
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7604290/
https://www.ncbi.nlm.nih.gov/pubmed/33192434
http://dx.doi.org/10.3389/fnbot.2020.568319
_version_ 1783604113126195200
author Kirkland, Paul
Di Caterina, Gaetano
Soraghan, John
Matich, George
author_facet Kirkland, Paul
Di Caterina, Gaetano
Soraghan, John
Matich, George
author_sort Kirkland, Paul
collection PubMed
description Traditionally, the Perception Action cycle is the first stage of building an autonomous robotic system and a practical way to implement a low-latency reactive system within a low Size, Weight and Power (SWaP) package. However, within complex scenarios, this method can lack contextual understanding of the scene, such as object recognition-based tracking or system attention. Object detection, identification and tracking, along with semantic segmentation and attention, are all modern computer vision tasks in which Convolutional Neural Networks (CNNs) have shown significant success, although such networks often have a large computational overhead and power requirements that are not ideal for smaller robotics tasks. Furthermore, cloud computing and massively parallel processing, such as in Graphics Processing Units (GPUs), are outside the specification of many tasks due to their respective latency and SWaP constraints. In response to this, Spiking Convolutional Neural Networks (SCNNs) look to provide the feature extraction benefits of CNNs while maintaining a low latency and power overhead thanks to their asynchronous, spiking, event-based processing. A novel Neuromorphic Perception Understanding Action (PUA) system is presented that aims to combine the feature extraction benefits of CNNs with the low-latency processing of SCNNs. The PUA utilizes a Neuromorphic Vision Sensor for Perception that facilitates asynchronous processing within a Spiking fully Convolutional Neural Network (SpikeCNN) to provide semantic segmentation and Understanding of the scene. The output is fed to a spiking control system providing Actions. With this approach, the aim is to bring features of deep learning into the lower levels of autonomous robotics while maintaining a biologically plausible STDP rule throughout the learned encoding part of the network. The network is shown to provide more robust and predictable management of spiking activity, with an improved thresholding response. The reported experiments show that this system can deliver robust results of over 96% and 81% for accuracy and Intersection over Union, respectively, ensuring such a system can be successfully used within object recognition, classification and tracking problems. This demonstrates that the attention of the system can be tracked accurately, while the asynchronous processing means the controller can give precise track updates with minimal latency.
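The Intersection over Union figure quoted in the description is the standard segmentation overlap metric; a minimal sketch of how it is typically computed on binary class masks is given below (illustrative NumPy code, not taken from the paper; the function name iou is chosen here for the example). The STDP rule mentioned is assumed, for reference only, to be the common pair-based exponential form, Δw = A_plus · exp(−Δt/τ_plus) for Δt = t_post − t_pre > 0 and Δw = −A_minus · exp(Δt/τ_minus) otherwise; the paper's exact variant may differ.

import numpy as np

def iou(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    # Both inputs are boolean arrays of the same shape (H x W),
    # True where a pixel is assigned to the class of interest.
    intersection = np.logical_and(pred_mask, true_mask).sum()
    union = np.logical_or(pred_mask, true_mask).sum()
    # An empty union (class absent from both masks) is scored as a perfect match.
    return float(intersection) / float(union) if union > 0 else 1.0

For example, a predicted mask that overlaps the ground truth on 81 of every 100 pixels in their union scores 0.81, matching the order of the figure reported above.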
format Online
Article
Text
id pubmed-7604290
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-7604290 2020-11-13 Perception Understanding Action: Adding Understanding to the Perception Action Cycle With Spiking Segmentation Kirkland, Paul Di Caterina, Gaetano Soraghan, John Matich, George Front Neurorobot Neuroscience Frontiers Media S.A. 2020-10-19 /pmc/articles/PMC7604290/ /pubmed/33192434 http://dx.doi.org/10.3389/fnbot.2020.568319 Text en Copyright © 2020 Kirkland, Di Caterina, Soraghan and Matich. http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Neuroscience
Kirkland, Paul
Di Caterina, Gaetano
Soraghan, John
Matich, George
Perception Understanding Action: Adding Understanding to the Perception Action Cycle With Spiking Segmentation
title Perception Understanding Action: Adding Understanding to the Perception Action Cycle With Spiking Segmentation
title_full Perception Understanding Action: Adding Understanding to the Perception Action Cycle With Spiking Segmentation
title_fullStr Perception Understanding Action: Adding Understanding to the Perception Action Cycle With Spiking Segmentation
title_full_unstemmed Perception Understanding Action: Adding Understanding to the Perception Action Cycle With Spiking Segmentation
title_short Perception Understanding Action: Adding Understanding to the Perception Action Cycle With Spiking Segmentation
title_sort perception understanding action: adding understanding to the perception action cycle with spiking segmentation
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7604290/
https://www.ncbi.nlm.nih.gov/pubmed/33192434
http://dx.doi.org/10.3389/fnbot.2020.568319
work_keys_str_mv AT kirklandpaul perceptionunderstandingactionaddingunderstandingtotheperceptionactioncyclewithspikingsegmentation
AT dicaterinagaetano perceptionunderstandingactionaddingunderstandingtotheperceptionactioncyclewithspikingsegmentation
AT soraghanjohn perceptionunderstandingactionaddingunderstandingtotheperceptionactioncyclewithspikingsegmentation
AT matichgeorge perceptionunderstandingactionaddingunderstandingtotheperceptionactioncyclewithspikingsegmentation