Physical Interpretation of Backpropagated Error in Neural Networks
In this note, we shed light on the physical meaning of the backpropagated error used by the backpropagation training algorithm. Essentially, for a given scalar output of the neural network, the backpropagated error at a neuron's output is a linear apportionment of the error at that network output, in proportion to the linear gain from the neuron's output to the network output under a linearised-systems view of the network. For multiple network outputs, superposition yields the total (net) backpropagated error at each neuron's output.
Subsequently, we present some elementary statistical analysis of the backpropagated errors in the network.
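The apportionment-and-superposition view described above can be illustrated numerically. The following sketch (an illustration under our own assumptions, not code from the paper) uses a small two-layer network with a linear output layer: the backpropagated error at each hidden neuron's output is shown to equal the sum, over network outputs, of each output's error weighted by the linear gain from that neuron to that output.

```python
import numpy as np

# Illustrative sketch: small network, tanh hidden layer, linear output layer.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # input -> hidden weights
W2 = rng.standard_normal((2, 4))   # hidden -> output weights

x = rng.standard_normal(3)         # example input
t = rng.standard_normal(2)         # example targets

h = np.tanh(W1 @ x)                # hidden-neuron outputs
y = W2 @ h                         # network outputs
e = y - t                          # errors at the network outputs

# Standard backpropagated error at the hidden outputs
# (gradient of 0.5 * ||e||^2 with respect to h):
delta_h = W2.T @ e

# Same quantity built by apportionment + superposition:
# the linear gain from hidden neuron j to output k is W2[k, j],
# so each output's error e[k] is apportioned to neuron j as e[k] * W2[k, j],
# and the contributions from all outputs are summed (superposed).
apportioned = sum(e[k] * W2[k, :] for k in range(len(e)))

assert np.allclose(delta_h, apportioned)
print("backpropagated error at hidden outputs:", delta_h)
```

The agreement is exact here because the output layer is linear; for nonlinear output units, the gains would come from the linearisation of the network at the operating point.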
Funding
Carl and Emily Fuchs Foundation
Email Address of Submitting Author: mavanwyk@gmail.com
ORCID of Submitting Author: 0000-0002-4519-1475
Submitting Author's Institution: University of the Witwatersrand
Submitting Author's Country: South Africa