
Physical Interpretation of Backpropagated Error in Neural Networks
Anton van Wyk
University of the Witwatersrand

Corresponding Author: [email protected]


Abstract

In this note, we shed light on the physical meaning of the backpropagated error used by the backpropagation training algorithm. Essentially, for a given scalar output of the neural network, the backpropagated error at the output of each neuron is a linear apportionment of the error at that network output, in proportion to the linear gain from the neuron's output to the network output under a linearised-systems view of the network. For multiple outputs, superposition yields the total (net) backpropagated error at the output of each neuron.
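As a minimal formal sketch of this claim (the notation here is our own, assuming a squared-error loss $E = \tfrac{1}{2}\sum_k (y_k - t_k)^2$ with network outputs $y_k$ and targets $t_k$), the backpropagated error at the output $o_j$ of a neuron can be written as

$$\delta_j \;=\; \frac{\partial E}{\partial o_j} \;=\; \sum_k e_k\,\frac{\partial y_k}{\partial o_j}, \qquad e_k = y_k - t_k,$$

where each term $e_k\,\partial y_k/\partial o_j$ is the error at output $k$ apportioned through the linearised gain from neuron $j$'s output to that network output, and the sum over $k$ is the superposition across outputs.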
Subsequently, we present some elementary statistical analysis of the backpropagated errors in the network.
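To illustrate the interpretation numerically, the following is a small sketch (not code from the paper; the two-layer network, its weight names, and the linear output layer are our own assumptions) showing that the backpropagated error at a hidden layer equals the per-output errors apportioned through the linearised gains and superposed:

```python
# Numerical check of the claim, using a hypothetical two-layer tanh
# network with squared-error loss. All names (W1, W2, h, ...) are our
# own illustrative notation.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 4, 2
W1 = rng.standard_normal((n_hid, n_in))
W2 = rng.standard_normal((n_out, n_hid))

x = rng.standard_normal(n_in)
t = rng.standard_normal(n_out)           # targets

# Forward pass: h = tanh(W1 x), y = W2 h (linear output layer).
h = np.tanh(W1 @ x)
y = W2 @ h
e = y - t                                # per-output error, E = 0.5 * ||e||^2

# Backpropagated error at the hidden outputs, as backprop computes it:
delta_h = W2.T @ e                       # dE/dh via the chain rule

# The abstract's interpretation: apportion each output's error e_k in
# proportion to the linearised gain dy_k/dh_j (here exactly W2[k, j],
# since the output layer is linear), then superpose over outputs k.
gain = W2                                # Jacobian dy/dh
apportioned = sum(e[k] * gain[k] for k in range(n_out))

assert np.allclose(delta_h, apportioned)
```

With a linear output layer the linearised gain is exact; for a nonlinear output layer the same check holds with the Jacobian evaluated at the operating point.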