TechRxiv

Non-linear Neurons with Human-like Apical Dendrite Activations

Preprint posted on 2020-02-18, 04:16, authored by Mariana-Iuliana Georgescu, Radu Tudor Ionescu, Nicolae-Catalin Ristea and Nicu Sebe
In order to classify linearly non-separable data, neurons are typically organized into multi-layer neural networks equipped with at least one hidden layer. Inspired by recent discoveries in neuroscience, we propose a new neuron model along with a novel activation function that enables the learning of non-linear decision boundaries using a single neuron. We show that a standard neuron followed by the novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy. Furthermore, we conduct experiments on three benchmark data sets from computer vision and natural language processing, namely Fashion-MNIST, UTKFace and MOROCO, showing that the ADA and leaky ADA functions provide superior results to Rectified Linear Units (ReLU) and leaky ReLU for various neural network architectures, e.g. multi-layer perceptrons (MLPs) with one or two hidden layers and convolutional neural networks (CNNs) such as LeNet, VGG, ResNet and Character-level CNN. We obtain further improvements when we replace the standard neuron model with our pyramidal neuron with apical dendrite activations (PyNADA).
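
The XOR claim above rests on the activation being non-monotonic. The exact ADA formula is given in the paper rather than in this abstract, so the short NumPy sketch below uses an illustrative Gaussian-style bump as a stand-in non-monotonic activation to show how a single neuron can then separate XOR; the weights and the bump function are hand-picked for illustration and are not taken from the paper.

import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])  # XOR targets

# A single neuron: pre-activation z = w.x + b (hand-picked weights)
w = np.array([1.0, 1.0])
b = -1.0
z = X @ w + b  # pre-activations: [-1, 0, 0, 1]

def bump(z):
    # Illustrative non-monotonic activation: peaks at z = 0, decays on both sides.
    # This is a stand-in for ADA, whose exact definition is in the paper.
    return np.exp(-z ** 2)

outputs = bump(z)                      # approx. [0.37, 1.00, 1.00, 0.37]
predictions = (outputs > 0.5).astype(int)
print(predictions)                     # [0 1 1 0] -> matches XOR
print("accuracy:", (predictions == y).mean())  # 1.0

A monotonic activation such as ReLU cannot do this with one neuron, because thresholding a monotonic function of w.x + b always yields a linear decision boundary, while XOR is not linearly separable.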

Email Address of Submitting Author

raducu.ionescu@gmail.com

ORCID of Submitting Author

0000-0002-9301-1950

Submitting Author's Institution

University of Bucharest

Submitting Author's Country

Romania
