Volume 16, Issue 2, PP: 123-141, 2025 | Full Length Article
El-Sayed M. El-kenawy 1,*, Amel Ali Alhussan 2, Doaa Sami Khafaga 3, Amal H. Alharbi 4, Sarah A. Alzakari 5, Abdelaziz A. Abdelhamid 6, Abdelhameed Ibrahim 7, Marwa M. Eid 8
Doi: https://doi.org/10.54216/JISIoT.160210
The Comment Feedback Optimization Algorithm (CFOA) presents a novel feedback-driven model for solving optimization problems, built on positive and negative feedback loops. Unlike many existing optimization algorithms, CFOA uses feedback adjustments to tune the exploration-exploitation trade-off, making it less sensitive to problem dimensionality and nonlinearity. Its key features include feedback dynamics that adapt the search, parameter control through a decay function, and mechanisms for escaping local optima. CFOA's performance has been benchmarked on the CEC 2005 test functions through extensive evaluations. The results demonstrate faster convergence, better solution quality, and lower computational cost compared with the Sine Cosine Algorithm (SCA), Gravitational Search Algorithm (GSA), and Tunicate Swarm Algorithm (TSA). This efficiency makes CFOA a valuable tool for real-world optimization problems across application domains such as machine learning, engineering, and logistics.
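The abstract does not give CFOA's update equations, so the following is only a minimal Python sketch of the general scheme it describes: positive feedback intensifies search around the best solution after an improvement, negative feedback pushes stagnating agents to explore, a decay function shrinks the step-size parameter over iterations, and a restart mechanism escapes local optima. All names, parameters, and equations here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def feedback_driven_optimizer(objective, dim, bounds, pop_size=30,
                              max_iter=500, stall_limit=20, seed=0):
    """Illustrative feedback-driven metaheuristic (not the authors' CFOA)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.apply_along_axis(objective, 1, pop)
    stall = np.zeros(pop_size, dtype=int)            # iterations without improvement per agent
    best_idx = fitness.argmin()
    best, best_fit = pop[best_idx].copy(), fitness[best_idx]

    for t in range(max_iter):
        step = np.exp(-4.0 * t / max_iter)           # decay function controlling the step size
        for i in range(pop_size):
            if stall[i] > stall_limit:               # escape mechanism: forced random restart
                pop[i] = rng.uniform(lo, hi, size=dim)
                fitness[i] = objective(pop[i])
                stall[i] = 0
                continue
            if stall[i] == 0:                        # positive feedback: exploit toward the best
                candidate = pop[i] + step * rng.random(dim) * (best - pop[i])
            else:                                    # negative feedback: explore around a random partner
                partner = pop[rng.integers(pop_size)]
                candidate = pop[i] + step * rng.normal(size=dim) * (pop[i] - partner)
            candidate = np.clip(candidate, lo, hi)
            cand_fit = objective(candidate)
            if cand_fit < fitness[i]:                # greedy acceptance drives the feedback state
                pop[i], fitness[i], stall[i] = candidate, cand_fit, 0
            else:
                stall[i] += 1
        best_idx = fitness.argmin()
        if fitness[best_idx] < best_fit:
            best, best_fit = pop[best_idx].copy(), fitness[best_idx]
    return best, best_fit

# Example usage: minimize the 10-dimensional sphere function on [-100, 100]^10
best_x, best_f = feedback_driven_optimizer(lambda x: float(np.sum(x**2)),
                                           dim=10, bounds=(-100.0, 100.0))
```

In this sketch the per-agent stall counter plays the role of the feedback signal: improvements reset it (reinforcing exploitation), while repeated failures first shift the agent toward exploration and eventually trigger a restart, which is one common way such an exploration-exploitation balance can be realized.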
Metaheuristic Optimization, Comment Feedback Optimization Algorithm (CFOA), Feedback-driven Optimization, Exploration-Exploitation Balance, High-Dimensional Optimization