TechRxiv

Analysis, Design and Evaluation of a High-Performance Stochastic Multilayer Perceptron: from Mini-Batch Training to Inference

preprint
posted on 2022-10-05, 20:48 authored by Ziheng Wang, Farzad Niknia, Shanshan Liu, Pedro Reviriego, Fabrizio Lombardi

The mini-batch technique is widely used for its efficiency when training neural networks with conventional arithmetic; however, its feasibility and performance in stochastic computing (SC) multilayer perceptrons (MLPs) have rarely been studied. This paper analyzes, by theory and simulation, the performance of the mini-batch technique in SC MLPs; the results show that it potentially offers a larger benefit in SC MLPs than in their conventional-arithmetic counterparts. Moreover, a pipelined SC MLP implementation is also pursued in this paper for performing the inference process. All findings and designs provided in this paper leverage the advantages of the mini-batch technique and SC implementation to design high-performance MLPs.
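For readers unfamiliar with the mini-batch technique the abstract builds on, here is a minimal sketch of mini-batch gradient descent for a conventional-arithmetic linear model. This is illustrative only and is not the paper's SC MLP design; the batch size, learning rate, and synthetic data are arbitrary example choices.

```python
import numpy as np

# Synthetic regression data (illustrative; not from the paper).
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=256)

w = np.zeros(3)
batch_size, lr = 32, 0.1  # example hyperparameters
for epoch in range(50):
    perm = rng.permutation(len(X))  # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # Mean-squared-error gradient estimated on the mini-batch only,
        # rather than the full dataset -- the core of the technique.
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)
        w -= lr * grad

print(np.round(w, 2))
```

Each update uses a gradient estimated from a small batch rather than the whole dataset, trading a noisier gradient for many more (cheaper) updates per epoch; the paper studies how this trade-off behaves under stochastic computing arithmetic.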

History

Email Address of Submitting Author

ssliu@ece.neu.edu

Submitting Author's Institution

New Mexico State University

Submitting Author's Country

United States of America