Full Length Article
Journal of Intelligent Systems and Internet of Things
Volume 9, Issue 2, PP: 149-161, 2023

Intelligent Classification for Credit Scoring Based on a Data Mining algorithm

  Mohammed G. Fathi Al-Obaidi 1 *

1  Department of Computer Science, University of Mosul, Mosul, Iraq
    (mohammedghanim700@gmail.com)


DOI: https://doi.org/10.54216/JISIoT.090211

Received: February 20, 2023; Revised: May 27, 2023; Accepted: September 06, 2023

Abstract:

Credit scoring has grown in importance and has been studied extensively by banks and financial institutions. However, the redundant and irrelevant features present in credit scoring datasets reduce classification accuracy, so effective feature selection methods have become essential. This study proposes a hybrid feature selection approach that combines the pigeon optimization algorithm (POA) with a backpropagation neural network (BPNN) classifier. In the hybrid scheme, the POA searches for candidate feature subsets during the feature selection (FS) process, and the BPNN then evaluates each chosen subset through a fitness function. Experimental results on three benchmark credit scoring datasets show that the proposed hybrid method outperforms competing approaches on the evaluation criteria.
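To illustrate the wrapper scheme the abstract describes — a metaheuristic proposing binary feature masks and a trained classifier scoring each mask through a fitness function — the following minimal Python sketch runs a pigeon-style map-and-compass search with a sigmoid transfer function over a small synthetic dataset. It is not the paper's implementation: a nearest-centroid scorer stands in for the BPNN evaluator, and all parameter values (population size, decay factor R, subset-size penalty) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 6 features, of which only features 0 and 1 carry class signal.
n = 200
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, 6))
X[:, 0] += 2.0 * y
X[:, 1] -= 2.0 * y

def fitness(mask):
    """Score a binary feature mask: accuracy of a nearest-centroid classifier
    on the selected columns (a lightweight stand-in for the BPNN evaluator),
    minus a small penalty per selected feature to favour compact subsets."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    c0, c1 = Xs[y == 0].mean(axis=0), Xs[y == 1].mean(axis=0)
    d0 = ((Xs - c0) ** 2).sum(axis=1)
    d1 = ((Xs - c1) ** 2).sum(axis=1)
    pred = (d1 < d0).astype(int)
    return (pred == y).mean() - 0.01 * mask.sum()

# Binary pigeon-style search: continuous velocities, sigmoid transfer to bits.
pop, dim, iters, R = 20, 6, 30, 0.3
V = rng.normal(size=(pop, dim))           # continuous velocities
masks = rng.random((pop, dim)) < 0.5      # binary positions (feature masks)
best = max((m.copy() for m in masks), key=fitness)
for t in range(1, iters + 1):
    for i in range(pop):
        # Map-and-compass update: decay own velocity, drift toward global best.
        V[i] = V[i] * np.exp(-R * t) + rng.random() * (best.astype(float) - masks[i])
        prob = 1.0 / (1.0 + np.exp(-V[i]))   # sigmoid transfer function
        masks[i] = rng.random(dim) < prob
        if fitness(masks[i]) > fitness(best):
            best = masks[i].copy()

print([int(i) for i in np.flatnonzero(best)])  # indices of selected features
```

With this seed the search settles on the two informative features, since the per-feature penalty in the fitness function discourages keeping noise columns once accuracy stops improving.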

Keywords:

Feature selection; pigeon optimization algorithm; credit scoring; backpropagation neural network; support vector machine.

