Beyond Gradient: Subspace Rotation Algorithm
  • Ci Lin
  • Tet Hin Yeap
  • Iluju Kiringa
Corresponding author: Ci Lin ([email protected])

Abstract

This study introduces the Subspace Rotation Algorithm (SRA), a gradient-free method for finding a globally optimal weight matrix. The SRA comprises two component algorithms: the Left Subspace Rotation Algorithm (LSRA) and the Right Subspace Rotation Algorithm (RSRA). Combining them in either order, LSRA-RSRA or RSRA-LSRA, harnesses the strengths of both and yields improved performance. We observe that shallow, wide Multilayer Perceptrons (MLPs) trained with RSRA-LSRA achieve higher training accuracy than those trained with the Backpropagation (BP) algorithm. Moreover, combining SRA with BP has a marked effect on MLP training. Our experiments show that BP can become trapped in local optima, whereas RSRA-LSRA, although able to escape local optima, may not reach its full potential when the number of hidden nodes is limited. Combining RSRA-LSRA with BP exploits the advantages of both approaches, achieving optimal MLP performance with fewer hidden nodes.
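Since the abstract describes the rotation idea only at a high level, the following minimal Python sketch illustrates one plausible reading of a gradient-free subspace-rotation search: small random orthogonal rotations are applied to the left (row) or right (column) subspace of a weight matrix and kept only if they reduce the loss. The parameterization via a skew-symmetric generator, the accept/reject rule, and all function names here are illustrative assumptions, not the authors' algorithm.

```python
# Hypothetical sketch of a gradient-free subspace-rotation search.
# NOT the paper's LSRA/RSRA update rules, which are not given in the abstract.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

def random_rotation(n, scale=0.05):
    """Small random orthogonal matrix: expm of a skew-symmetric generator
    (an assumed parameterization; any orthogonal perturbation would do)."""
    A = rng.standard_normal((n, n))
    S = scale * (A - A.T)          # skew-symmetric => expm(S) is orthogonal
    return expm(S)

def rotate_step(W, loss_fn, side="left"):
    """One accept/reject step: rotate W's left or right subspace and keep
    the rotated candidate only if the loss improves."""
    m, n = W.shape
    if side == "left":
        candidate = random_rotation(m) @ W   # LSRA-style: rotate rows
    else:
        candidate = W @ random_rotation(n)   # RSRA-style: rotate columns
    return candidate if loss_fn(candidate) < loss_fn(W) else W

# Toy usage: fit W to a linear target, alternating RSRA- and LSRA-style steps.
X = rng.standard_normal((100, 8))
Y = X @ rng.standard_normal((8, 4))
loss = lambda W: np.mean((X @ W - Y) ** 2)

W = rng.standard_normal((8, 4))
for _ in range(2000):
    W = rotate_step(W, loss, side="right")   # RSRA-style pass
    W = rotate_step(W, loss, side="left")    # LSRA-style pass
print(f"final loss: {loss(W):.4f}")
```

Because each step is accepted only when the loss decreases, the search is monotone without ever computing a gradient; the SRA-plus-BP hybrid described in the abstract would then correspond to interleaving such rotation passes with ordinary BP epochs.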
Submitted to TechRxiv: 14 Apr 2024
Published in TechRxiv: 18 Apr 2024