Volume 17, Issue 2, PP: 394-408, 2025 | Full Length Article
Nadir Omer 1*
Doi: https://doi.org/10.54216/FPA.170229
Spam e-mail has become a pervasive nuisance in today's digital world, posing significant challenges to efficient communication and information dissemination. Spam filters must deal with huge amounts of data containing irrelevant and redundant features, which leads to high dimensionality. With the growth of internet use, finding a secure e-mail classification system for cloud computing has become an important topic. Moreover, choosing the best algorithm for selecting a subset of features has a major impact on how well automatic e-mail classification works, making it one of the central issues. Among the candidate algorithms is Differential Evolution (DE), which is computationally costly because of its slow convergence rate and evolutionary process. To address these issues, this study offers an intelligent scheme called Opposition Differential Evolution (ODE), which combines Opposition-Based Learning (OBL) with the DE algorithm for effective automated feature subset selection. Its effectiveness is assessed using a support vector machine (SVM) to evaluate the e-mail spam classification rate, and OBL is used to accelerate and increase the convergence rate of traditional DE. The feature subsets selected by ODE are then used to determine which features contribute most to the reliability of e-mail spam classification. To assess the effectiveness of the proposed scheme, extensive experiments are conducted on the "spambase" and "spamassassin" benchmark e-mail datasets, which comprise a diverse collection of spam and non-spam e-mails. The results demonstrate that the ODE algorithm yields superior performance compared to traditional machine learning and evolutionary techniques, displaying its robustness and efficiency in identifying spam e-mails accurately. The ODE algorithm effectively handles high-dimensional feature spaces, enhancing the model's discriminatory power while maintaining computational efficiency. The proposed ODE-SVM technique achieves an accuracy of 96.79 percent, compared with 93.55 percent when all features are used. Additionally, the empirical results demonstrate that our scheme can efficiently reduce the number of features needed while improving the accuracy of e-mail spam classification.
Feature Selection, E-mail spam classification, Opposition-Based Learning (OBL), Differential Evolution (DE), Opposition Differential Evolution (ODE), Support Vector Machine (SVM)
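For illustration, the sketch below shows how an opposition-based DE wrapper for feature subset selection with an SVM fitness function might look. It is a minimal sketch under stated assumptions, not the paper's implementation: the dataset is synthetic (57 features, echoing Spambase), and the function names (to_mask, fitness, ode_feature_selection), parameter values, and the 0.5 binarization threshold are assumptions introduced for this example.

```python
# Minimal sketch (not the paper's exact method) of opposition-based DE (ODE)
# for feature subset selection, scored by cross-validated SVM accuracy.
# All names, parameters, and the synthetic data are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in for a 57-feature corpus such as Spambase.
X, y = make_classification(n_samples=300, n_features=57, n_informative=15,
                           random_state=0)

def to_mask(vec, threshold=0.5):
    """Map a continuous DE vector in [0, 1] to a binary feature mask."""
    return vec > threshold

def fitness(vec):
    """Wrapper fitness: cross-validated SVM accuracy on the selected subset."""
    mask = to_mask(vec)
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(kernel="rbf"), X[:, mask], y, cv=3).mean()

def ode_feature_selection(n_features, pop_size=10, generations=15,
                          F=0.5, CR=0.9, jump_rate=0.3):
    def keep_fitter_half(candidates):
        scores = np.array([fitness(v) for v in candidates])
        return candidates[np.argsort(scores)[-pop_size:]]

    # Opposition-based initialization: evaluate each point and its opposite,
    # then keep the fitter half of the merged set.
    pop = rng.random((pop_size, n_features))
    pop = keep_fitter_half(np.vstack([pop, 1.0 - pop]))

    for _ in range(generations):
        for i in range(pop_size):
            idxs = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idxs, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), 0.0, 1.0)   # DE/rand/1 mutation
            cross = rng.random(n_features) < CR           # binomial crossover
            trial = np.where(cross, mutant, pop[i])
            if fitness(trial) > fitness(pop[i]):          # greedy selection
                pop[i] = trial
        # Generation jumping: with some probability, also evaluate the opposite
        # population, reflected within the current population bounds.
        if rng.random() < jump_rate:
            lo, hi = pop.min(axis=0), pop.max(axis=0)
            pop = keep_fitter_half(np.vstack([pop, lo + hi - pop]))

    return max(pop, key=fitness)

best = ode_feature_selection(X.shape[1])
mask = to_mask(best)
print(f"selected {mask.sum()} of {X.shape[1]} features, "
      f"SVM CV accuracy = {fitness(best):.3f}")
```

The opposition step follows the usual ODE idea: the opposite of each candidate is evaluated both at initialization and, at a given jumping rate, during the run, and only the fitter half of the merged population is retained, which is what accelerates convergence relative to plain DE.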