TechRxiv

Neural Layer Bypassing Network

preprint
posted on 20.10.2021, 04:24 by Amogh Palasamudram

This research introduces and evaluates a new neural network architecture designed to improve the speed and effectiveness of forward propagation in neural networks: the Neural Layer Bypassing Network (NLBN). This paper explains the theory and workings of the architecture and compares it to other methods of increasing the efficacy of deep learning models. The research also includes code examples with three image classification models trained on different datasets and analyzes the impact of the NLBN architecture on forward propagation. It was found that the architecture increases the speed of forward propagation but tends to slightly decrease model accuracy; it also takes longer to train and requires more memory. All in all, this architecture is a potential foundation for using deep learning to teach deep learning models to be more efficient, including skipping and re-propagating through layers to improve a model's overall performance.
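As an illustration of the bypassing idea described above, the following is a minimal sketch in PyTorch. It is a hedged reconstruction, not the paper's actual implementation: the BypassBlock class, the linear gate head, the threshold parameter, and the soft training-time mix are all hypothetical. A small learned gate scores each activation and, at inference time, routes it straight past the wrapped layer when the skip score is high, so forward propagation can skip that layer's computation.

    import torch
    import torch.nn as nn

    class BypassBlock(nn.Module):
        # Hypothetical sketch: wraps a shape-preserving layer with a learned
        # gate that can route the activation past the layer at inference time.
        def __init__(self, layer: nn.Module, features: int, threshold: float = 0.5):
            super().__init__()
            self.layer = layer                   # the layer that may be bypassed
            self.gate = nn.Linear(features, 1)   # gating head (an assumption)
            self.threshold = threshold           # skip when the gate score exceeds this

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            skip_prob = torch.sigmoid(self.gate(x))              # per-sample skip score
            if not self.training and bool((skip_prob > self.threshold).all()):
                return x                                         # bypass: skip the layer
            # Soft mix while training so the gate stays differentiable.
            return skip_prob * x + (1 - skip_prob) * self.layer(x)

    model = nn.Sequential(
        nn.Linear(784, 256), nn.ReLU(),
        BypassBlock(nn.Sequential(nn.Linear(256, 256), nn.ReLU()), features=256),
        nn.Linear(256, 10),
    )
    print(model(torch.randn(32, 784)).shape)  # torch.Size([32, 10])

Under this sketch, the bypassed layer's cost is saved only at inference, while the extra gate parameters and the soft mix add training time and memory, which is consistent with the trade-off reported in the abstract.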


History

Email Address of Submitting Author: amogh.p.214@gmail.com
Submitting Author's Institution: R42 Institute
Submitting Author's Country: United States of America
