FeaSel-Net: A Recursive Feature Selection Callback in Neural Networks
  • Felix Fischer ,
  • Alexander Birk ,
  • Karsten Frenner ,
  • Alois Herkommer
Felix Fischer
Institute of Applied Optics

Corresponding author: [email protected]


Selecting only the relevant subsets from all gathered data has never been as challenging as it is in these times of big data and sensor fusion. Multiple complementary methods have emerged for observing similar phenomena, and these techniques are often superimposed to support the best possible decisions. A pathologist, for example, uses microscopic and spectroscopic techniques to discriminate between healthy and cancerous tissue. Especially in spectroscopy, a vast number of frequencies is recorded, while appropriately sized datasets are rarely acquired due to time-intensive measurements and a shortage of patients. To cope with the curse of dimensionality in machine learning, the overhead from irrelevant or redundant features must be reduced. In this article, we propose the FeaSel algorithm, which can be embedded in conventional neural networks. It recursively prunes the input nodes once the optimizer of the neural network achieves satisfactory results. During the pruning process, the weights of nodes that do not contribute critical information to the decision making are deleted. We demonstrate the performance of the feature selection algorithm on different datasets and compare it to existing feature selection methods. Our algorithm combines the advantages of the non-linear learning ability of neural networks with the embedding of the feature selection into the actual classifier optimization.
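The core idea described above, recursively deactivating the least informative input nodes after each converged training phase, can be sketched in a few lines. This is an illustrative NumPy sketch under our own assumptions (ranking inputs by the L2 norm of their outgoing first-layer weights), not the authors' exact FeaSel-Net implementation; the function name and parameters are hypothetical.

```python
import numpy as np

def prune_least_important(W, mask, n_prune=1):
    """One recursive pruning step (illustrative sketch, not the published
    FeaSel-Net code): rank the still-active input nodes by the L2 norm of
    their outgoing first-layer weights and deactivate the weakest ones.

    W      : (n_features, n_hidden) first-layer weight matrix
    mask   : boolean array marking which input nodes are still active
    n_prune: how many active nodes to remove in this step
    """
    importance = np.linalg.norm(W, axis=1)        # one score per input node
    importance[~mask] = np.inf                    # ignore already-pruned nodes
    prune_idx = np.argsort(importance)[:n_prune]  # least important active nodes
    new_mask = mask.copy()
    new_mask[prune_idx] = False
    W_pruned = W.copy()
    W_pruned[prune_idx, :] = 0.0                  # delete their weights
    return W_pruned, new_mask
```

In a callback-style loop, one would retrain the network until the optimizer reaches a satisfactory loss, call this step, and repeat until the desired number of features remains, so the selection is embedded directly in the classifier optimization.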
Published 31 Oct 2022 in Machine Learning and Knowledge Extraction, volume 4, issue 4, pages 968-993. DOI: 10.3390/make4040049