
Signal Perceptron: On the Identifiability of Boolean Function Spaces and Beyond


Bibliographic Details
Main Authors: Mendez Lucero, Miguel-Angel; Karampatsis, Rafael-Michael; Bojorquez Gallardo, Enrique; Belle, Vaishak
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9203047/
https://www.ncbi.nlm.nih.gov/pubmed/35719687
http://dx.doi.org/10.3389/frai.2022.770254
Description
Summary: In a seminal book, Minsky and Papert defined the perceptron as a limited implementation of what they called "parallel machines." They showed that some binary Boolean functions, including XOR, cannot be represented by a single-layer perceptron, because of its limited capacity to learn only linearly separable functions. In this work, we propose a new, more powerful implementation of such parallel machines. This new mathematical tool is defined using analytic sinusoids, instead of linear combinations, to form an analytic signal representation of the function that we want to learn. We show that this reformulated parallel mechanism can learn, with a single layer, any non-linear k-ary Boolean function. Finally, to provide an example of its practical applications, we show that it outperforms the single-hidden-layer multilayer perceptron on both Boolean function learning and image classification tasks, while also being faster and requiring fewer parameters.
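The paper's exact parameterization is not given in this record, but the core idea described in the summary can be sketched: for Boolean inputs, the sinusoidal features cos(π⟨w, x⟩) over all frequencies w ∈ {0,1}^k span the parity (Fourier) basis of the Boolean cube, so a single linear layer over these features can represent any k-ary Boolean function, including XOR. The function and variable names below are illustrative, not taken from the paper.

```python
import itertools
import numpy as np

def sinusoid_features(X):
    """One sinusoid per frequency w in {0,1}^k.

    For Boolean x, cos(pi * <w, x>) = (-1)^<w, x>, i.e. the parity
    (Fourier) basis of the Boolean cube, which is a complete basis.
    """
    k = X.shape[1]
    W = np.array(list(itertools.product([0, 1], repeat=k)))
    return np.cos(np.pi * X @ W.T)

# All four inputs of the 2-ary XOR function and their labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

# A single "layer": one least-squares solve for the sinusoid amplitudes
# (no hidden layer is needed, unlike a classic perceptron on raw inputs).
alpha, *_ = np.linalg.lstsq(sinusoid_features(X), y, rcond=None)
pred = (sinusoid_features(X) @ alpha).round().astype(int)
print(pred.tolist())  # XOR is represented exactly: [0, 1, 1, 0]
```

Since the 2^k features form a complete basis, the linear system has an exact solution for every truth table; the same construction extends to any k-ary Boolean function by enlarging the frequency set.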