Volume 16, Issue 1, PP: 86-101, 2025 | Full Length Article
Ismael Salih Aref 1*, Dheyab Salman Ibrahim 2, Bashar Talib AL-Nuaimi 3
DOI: https://doi.org/10.54216/JISIoT.160108
Feature selection (FS) is a crucial preprocessing step in data mining that eliminates redundant or irrelevant features from high-dimensional data. Many optimization algorithms applied to FS lack balance in their search processes. This paper proposes a hybrid algorithm, the Artificial Hummingbird Algorithm based on the Genetic Algorithm (AHA-GA), to address this imbalance and solve the FS problem. The main goal of AHA-GA is to select the most informative features and thereby improve overall classification performance. Datasets from the UCI repository are used to assess the performance of the proposed FS method. The proposed algorithm was compared with five feature selection optimization algorithms: BWOAHHO, HSGW, WOA-CM, BDA-SA, and ASGW. Across 18 datasets, AHA-GA achieved a classification accuracy of 96%, higher than BWOAHHO (93.2%), HSGW (92.5%), WOA-CM (94.4%), BDA-SA (93%), and ASGW (91.6%). In terms of the number of selected attributes, the average feature subset sizes were: AHA-GA (15.10889), BWOAHHO (16.74222), HSGW (19.43111), WOA-CM (17.05389), BDA-SA (17.275), and ASGW (19.7585). The statistical and experimental tests demonstrated that the proposed AHA-GA outperforms the competing algorithms in selecting effective features.
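As a rough illustration of the wrapper-style evaluation described in the abstract, the sketch below scores a candidate binary feature mask by training a classifier on the selected columns and trading classification accuracy off against subset size. The KNN classifier, the 0.99/0.01 weighting, and the breast-cancer dataset are stand-in assumptions for illustration only; the paper's actual classifier, fitness weights, and the AHA-GA search operators are not specified in the abstract.

```python
# Minimal sketch of a wrapper feature-selection fitness evaluation.
# Assumptions (not taken from the paper): KNN classifier, alpha = 0.99,
# and scikit-learn's breast-cancer data as a stand-in UCI-style dataset.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

def fitness(mask, X_tr, X_te, y_tr, y_te, alpha=0.99):
    """Score a binary feature mask: reward high accuracy and small subsets."""
    if not mask.any():                      # an empty subset is invalid
        return 0.0
    clf = KNeighborsClassifier(n_neighbors=5)
    clf.fit(X_tr[:, mask], y_tr)
    acc = accuracy_score(y_te, clf.predict(X_te[:, mask]))
    ratio = mask.sum() / mask.size          # fraction of features kept
    return alpha * acc + (1 - alpha) * (1 - ratio)

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rng = np.random.default_rng(0)
mask = rng.random(X.shape[1]) > 0.5         # one random candidate subset
print(f"fitness = {fitness(mask, X_tr, X_te, y_tr, y_te):.4f}")
```

In a metaheuristic wrapper such as AHA-GA, a function of this kind would be called once per candidate solution in every iteration, with the search operators generating new binary masks from the best-scoring ones.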
Feature Selection, Artificial Hummingbird Algorithm, Genetic Algorithm, Classification
[1] M. Abdel-Basset, D. El-Shahat, I. El-Henawy, V. H. C. De Albuquerque, and S. Mirjalili, “A new fusion of grey wolf optimizer algorithm with a two-phase mutation for feature selection,” Expert Syst. Appl., vol. 139, p. 112824, 2020.
[2] E. Hancer, B. Xue, D. Karaboga, and M. Zhang, “A binary ABC algorithm based on advanced similarity scheme for feature selection,” Appl. Soft Comput., vol. 36, pp. 334–348, 2015.
[3] C. F. Tsai, K. L. Sue, Y. H. Hu, and A. Chiu, “Combining feature selection, instance selection, and ensemble classification techniques for improved financial distress prediction,” J. Bus. Res., vol. 130, pp. 200–209, 2021.
[4] H. Chantar, M. Tubishat, M. Essgaer, and S. Mirjalili, “Hybrid binary dragonfly algorithm with simulated annealing for feature selection,” SN Comput. Sci., vol. 2, no. 4, p. 295, 2021.
[5] E.-G. Talbi, Metaheuristics: From design to implementation. John Wiley & Sons, 2009.
[6] H. K. Chantar and D. W. Corne, “Feature subset selection for Arabic document categorization using BPSO-KNN,” in Proc. 3rd World Congr. Nature Biologically Inspired Comput., 2011, pp. 546–551.
[7] L.-Y. Chuang, C.-H. Yang, and J.-C. Li, “Chaotic maps based on binary particle swarm optimization for feature selection,” Appl. Soft Comput., vol. 11, no. 1, pp. 239–248, 2011.
[8] E. Emary, H. M. Zawbaa, and A. E. Hassanien, “Binary grey wolf optimization approaches for feature selection,” Neurocomputing, vol. 172, pp. 371–381, 2016.
[9] H. Huang, H.-B. Xie, J.-Y. Guo, and H.-J. Chen, “Ant colony optimization-based feature selection method for surface electromyography signals classification,” Comput. Biol. Med., vol. 42, no. 1, pp. 30–38, 2012.
[10] R. K. Eluri and N. Devarakonda, “Feature selection with a binary flamingo search algorithm and a genetic algorithm,” Multimedia Tools Appl., vol. 82, no. 17, pp. 26679–26730, 2023.
[11] Y. Song, “Research on the application of computer graphic advertisement design based on a genetic algorithm and TRIZ theory,” 2022.
[12] W. Zhao, L. Wang, and S. Mirjalili, “Artificial hummingbird algorithm: A new bio-inspired optimizer with its engineering applications,” Comput. Methods Appl. Mech. Eng., vol. 388, p. 114194, 2022.
[13] A. A. Abdulhussien, M. F. Nasrudin, S. M. Darwish, and Z. A. A. Alyasseri, “Feature selection method based on quantum inspired genetic algorithm for Arabic signature verification,” J. King Saud Univ. Comput. Inf. Sci., vol. 35, no. 3, pp. 141–156, 2023.
[14] A. G. Gad, K. M. Sallam, R. K. Chakrabortty, M. J. Ryan, and A. A. Abohany, “An improved binary sparrow search algorithm for feature selection in data classification,” Neural Comput. Appl., vol. 34, no. 18, pp. 15705–15752, 2022.
[15] L. Fang and X. Liang, “A novel method based on nonlinear binary grasshopper whale optimization algorithm for feature selection,” J. Bionic Eng., vol. 20, no. 1, pp. 237–252, 2023.
[16] A. T. Azar, Z. I. Khan, S. U. Amin, and K. M. Fouad, “Hybrid global optimization algorithm for feature selection,” Comput. Mater. Continua, vol. 74, no. 1, pp. 2021–2037, 2023.
[17] M. H. Nadimi-Shahraki, Z. Asghari Varzaneh, H. Zamani, and S. Mirjalili, “Binary starling murmuration optimizer algorithm to select effective features from medical data,” Appl. Sci., vol. 13, no. 1, p. 564, 2022.
[18] M. H. Nadimi-Shahraki, M. Banaie-Dezfouli, H. Zamani, S. Taghian, and S. Mirjalili, “B-MFO: A binary moth-flame optimization for feature selection from medical datasets,” Comput., vol. 10, no. 11, p. 136, 2021.
[19] N. K. Hussein, S. Damas, M. A. Elaziz, A. Kamil, and A. Darwish, “Enhancing feature selection with GMSMFO: A global optimization algorithm for machine learning with application to intrusion detection,” J. Comput. Des. Eng., vol. 10, no. 4, pp. 1363–1389, 2023.
[20] L. Abualigah and A. J. Dulaimi, “A novel feature selection method for data mining tasks using hybrid sine cosine algorithm and genetic algorithm,” Cluster Comput., vol. 24, pp. 2161–2176, 2021.
[21] Z. Michalewicz and M. Schoenauer, “Evolutionary algorithms for constrained parameter optimization problems,” Evol. Comput., vol. 4, no. 1, pp. 1–32, 1996.
[22] A. Lambora, K. Gupta, and K. Chopra, “Genetic algorithm—A literature review,” in Proc. Int. Conf. Mach. Learn., Big Data, Cloud Parallel Comput. (COMITCon), 2019, pp. 380–384.
[23] T. Alam, S. Qamar, A. Dixit, and M. Benaida, “Genetic algorithm: Reviews, implementations, and applications,” arXiv preprint arXiv:2007.12673, 2020.
[24] Q. Al-Tashi, S. J. A. Kadir, H. M. Rais, S. Mirjalili, and H. Alhussian, “Binary optimization using hybrid grey wolf optimization for feature selection,” IEEE Access, vol. 7, pp. 39496–39508, 2019.
[25] A. Hamdipour, A. Basiri, M. Zaare, and S. Mirjalili, “BAHA: Binary artificial hummingbird algorithm for feature selection,” SSRN preprint 4519771, 2023.