
HyP-ABC: A Novel Automated Hyper-Parameter Tuning Algorithm Using Evolutionary Optimization

preprint
posted on 2021-11-12, 17:47, authored by Leila Zahedi, Farid Ghareh Mohammadi, and M. Hadi Amini

Machine learning (ML) techniques are promising decision-making and analytic tools in a wide range of applications. Each ML algorithm exposes a different set of hyper-parameters, and tailoring a model to work at its best on a specific application requires tuning them; the chosen hyper-parameter values directly affect performance. However, for large-scale search spaces, efficiently exploring the enormous number of hyper-parameter combinations is computationally expensive, and many automated hyper-parameter optimization (HPO) techniques suffer from slow convergence and long experimental run times. In this paper, we propose HyP-ABC, an automatic hybrid hyper-parameter optimization algorithm based on a modified artificial bee colony approach, and evaluate it on the classification accuracy of three ML algorithms: random forest, extreme gradient boosting, and support vector machine. To ensure the robustness of the proposed method, the algorithm searches over a wide range of feasible hyper-parameter values and is tested on a real-world educational dataset. Experimental results show that HyP-ABC is competitive with state-of-the-art techniques. It also has fewer of its own parameters to tune than other population-based algorithms, making it practical for real-world HPO problems.
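
For readers unfamiliar with how an artificial bee colony (ABC) search can drive hyper-parameter tuning, the sketch below shows a generic ABC loop over an illustrative random forest search space. The dataset, parameter ranges, population size, and budget are assumptions chosen for demonstration; this is not the authors' HyP-ABC implementation, only a minimal example of the underlying employed/onlooker/scout structure that the paper's modified ABC builds on.

```python
import random
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Illustrative search space for a random forest (ranges are assumptions,
# not the ones used in the paper): (lower bound, upper bound, type).
SPACE = {
    "n_estimators": (10, 300, int),
    "max_depth": (2, 30, int),
    "min_samples_split": (2, 20, int),
    "max_features": (0.1, 1.0, float),
}

X, y = load_iris(return_X_y=True)  # toy dataset standing in for the real one

def random_solution():
    return {k: (random.randint(lo, hi) if t is int else random.uniform(lo, hi))
            for k, (lo, hi, t) in SPACE.items()}

def fitness(params):
    # Objective: mean cross-validated classification accuracy.
    clf = RandomForestClassifier(random_state=0, **params)
    return cross_val_score(clf, X, y, cv=3).mean()

def neighbor(sol, other):
    # Classic ABC update on one randomly chosen dimension:
    # x' = x + phi * (x - x_partner), clipped to the feasible range.
    new = dict(sol)
    k = random.choice(list(SPACE.keys()))
    lo, hi, t = SPACE[k]
    phi = random.uniform(-1.0, 1.0)
    val = min(max(sol[k] + phi * (sol[k] - other[k]), lo), hi)
    new[k] = int(round(val)) if t is int else val
    return new

def abc_search(n_sources=6, max_iter=10, limit=5):
    sources = [random_solution() for _ in range(n_sources)]
    scores = [fitness(s) for s in sources]
    trials = [0] * n_sources
    i_best = max(range(n_sources), key=lambda i: scores[i])
    best, best_score = dict(sources[i_best]), scores[i_best]

    for _ in range(max_iter):
        # Employed-bee phase: each food source explores one neighbor.
        for i in range(n_sources):
            cand = neighbor(sources[i], sources[random.randrange(n_sources)])
            cand_score = fitness(cand)
            if cand_score > scores[i]:
                sources[i], scores[i], trials[i] = cand, cand_score, 0
            else:
                trials[i] += 1

        # Onlooker-bee phase: revisit sources with probability
        # proportional to their fitness (roulette-wheel selection).
        total = sum(scores)
        for _ in range(n_sources):
            r, acc, i = random.uniform(0, total), 0.0, 0
            for j, sc in enumerate(scores):
                acc += sc
                if acc >= r:
                    i = j
                    break
            cand = neighbor(sources[i], sources[random.randrange(n_sources)])
            cand_score = fitness(cand)
            if cand_score > scores[i]:
                sources[i], scores[i], trials[i] = cand, cand_score, 0
            else:
                trials[i] += 1

        # Scout-bee phase: abandon stagnant sources and re-initialise them.
        for i in range(n_sources):
            if trials[i] >= limit:
                sources[i] = random_solution()
                scores[i] = fitness(sources[i])
                trials[i] = 0

        # Track the best hyper-parameter configuration seen so far.
        i = max(range(n_sources), key=lambda k: scores[k])
        if scores[i] > best_score:
            best, best_score = dict(sources[i]), scores[i]

    return best, best_score

if __name__ == "__main__":
    best, score = abc_search()
    print("best hyper-parameters:", best)
    print("cross-validated accuracy: %.3f" % score)
```

The scout phase is what distinguishes ABC-style search from plain hill climbing: stagnant candidates are discarded and replaced with fresh random configurations, which helps the population escape poor regions of a large hyper-parameter space.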

History

Email Address of Submitting Author

lzahe001@fiu.edu

ORCID of Submitting Author

https://orcid.org/0000-0002-7325-1025

Submitting Author's Institution

Florida International University

Submitting Author's Country

United States of America