FeaSel-Net: A Recursive Feature Selection Callback in Neural Networks
Abstract

Selecting only the relevant subsets from all gathered data has never
been as challenging as it is in the current era of big data and sensor
fusion. Multiple complementary methods have emerged for observing
similar phenomena, and oftentimes several of these techniques are
superimposed in order to make the best possible decisions. A
pathologist, for example, uses microscopic and spectroscopic techniques
to discriminate between healthy and cancerous tissue. Especially in the
field of spectroscopy, a vast number of frequencies are recorded, while
appropriately sized datasets are rarely acquired due to time-intensive
measurements and the scarcity of patients. In order to cope with the
curse of dimensionality in machine learning, it is necessary to reduce
the overhead from irrelevant or redundant features.
In this article, we propose the FeaSel algorithm, which can be embedded
in conventional neural networks. It recursively prunes the input nodes
once the neural network's optimizer achieves satisfactory results.
During the pruning process, the weights of nodes that do not contribute
critical information to the decision making are deleted. We demonstrate
the performance of the feature selection algorithm on different datasets
and compare it to existing feature selection methods. Our algorithm
combines the advantages of neural networks' non-linear learning ability
with the embedding of the feature selection algorithm into the actual
classifier optimization.
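The recursive pruning loop described above can be sketched as follows. This is an illustrative reconstruction under simplifying assumptions, not the authors' implementation: we assume a feature's importance is approximated by the L1 norm of its outgoing first-layer weights, and that each pruning round masks out a fixed number of the least important input features. The names `feature_importance` and `prune_step` are ours.

```python
import numpy as np

def feature_importance(first_layer_weights):
    """Approximate each input feature's contribution by the L1 norm
    of its outgoing weights in the first dense layer (an assumption,
    one of several possible importance measures)."""
    return np.abs(first_layer_weights).sum(axis=1)

def prune_step(mask, first_layer_weights, n_prune):
    """One recursive pruning round: disable the n_prune least
    important features that are still active; their weights are
    effectively deleted by the mask."""
    importance = feature_importance(first_layer_weights)
    importance[~mask] = np.inf          # already pruned features stay pruned
    worst = np.argsort(importance)[:n_prune]
    new_mask = mask.copy()
    new_mask[worst] = False
    return new_mask

# Toy example: 6 input features feeding 3 hidden units.
rng = np.random.default_rng(0)
W = rng.normal(size=(6, 3))
W[2] *= 0.01                            # features 2 and 5 carry
W[5] *= 0.01                            # almost no weight

mask = np.ones(6, dtype=bool)           # all features active at start
mask = prune_step(mask, W, n_prune=2)   # one pruning round
print(mask)                             # features 2 and 5 are masked out
```

In the full callback, this step would be triggered only after the classifier's optimizer has reached satisfactory results, and repeated until the desired number of features remains.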