Hopfield Neural Network With Negative Weights

- Carla Cursino,
- Luiz Alberto Vieira Dias

## Abstract

Hopfield and Tank have shown that a neural network can find solutions
to complex optimization problems, although it can be trapped in a local
minimum of the objective function and return a suboptimal solution. When
the problem has constraints, these can be added to the objective function
as penalty terms using Lagrange multipliers. In this paper, we introduce
an approach, inspired by the work of Andrew, Chu, and Kate, that implements
a neural network whose solutions satisfy the linear equality constraints:
the Moore-Penrose pseudoinverse is used to construct a projection matrix
that sends any configuration to the subspace of configuration space
satisfying all the constraints. The objective function of the problem is
modified to include Lagrange-multiplier terms for the constraint equations.
Furthermore, we have found that this condition makes the network converge
to a set of stable states even if some diagonal elements of the weight
matrix are negative. If after several steps the network does not converge
to a stable state, we instead solve the problem using simulated annealing.
We apply this technique to the NP-hard Light Up puzzle.
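The projection step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the linear equality constraints are written as `A x = b`, and all variable and function names are our own. Given the Moore-Penrose pseudoinverse `A+`, the map `x -> x - A+ (A x - b)` sends any configuration to the affine subspace satisfying the constraints.

```python
import numpy as np

def project_onto_constraints(x, A, b):
    """Project x onto {y : A y = b} via the Moore-Penrose pseudoinverse.

    Illustrative sketch only; assumes b lies in the range of A so that
    the constraint system is consistent.
    """
    A_pinv = np.linalg.pinv(A)
    # Subtract the minimum-norm correction that cancels the residual A x - b.
    return x - A_pinv @ (A @ x - b)

# Example: two equality constraints on a 4-dimensional configuration.
A = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0]])
b = np.array([1.0, 1.0])

x = np.random.default_rng(0).uniform(size=4)  # arbitrary starting configuration
x_proj = project_onto_constraints(x, A, b)
print(np.allclose(A @ x_proj, b))  # the projected state satisfies the constraints
```

Because the map is a projection, applying it to an already-feasible configuration leaves it unchanged, which is what lets the network's updates be restricted to the constraint-satisfying subspace.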