Volume 2, Issue 1, PP: 46-54, 2022 | Full Length Article
Khadija Shazly 1*, Nima Khodadadi 2
Doi: https://doi.org/10.54216/JAIM.020105
Credit card use is growing rapidly, driven by the widespread availability of these cards, the ease of electronic transfers, and the ubiquity of online shopping. However, credit card debt poses a serious risk to businesses and governments, as well as to individual savers and investors. Consequently, the need for efficient, timely, and reliable methods of anticipating credit card risk has grown. In this study, we propose a framework that combines three classifiers, namely support vector machines, a multilayer perceptron, and decision trees, to improve classification accuracy. Simulation results show that the proposed strategy is highly competitive with existing approaches.
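As a rough illustration of the ensemble idea described above, the following Python sketch combines the three named classifiers with scikit-learn's VotingClassifier. This is not the authors' exact pipeline: the synthetic dataset, the soft-voting combination rule, and all hyperparameters are placeholder assumptions.

```python
# Minimal sketch of a three-classifier voting ensemble for credit-risk
# classification. The dataset and all hyperparameters are illustrative
# assumptions, not the paper's actual configuration.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import VotingClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a credit-card dataset (imbalanced binary labels,
# e.g., default vs. non-default).
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.8, 0.2], random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

# The three base classifiers named in the abstract; the SVM and MLP are
# scale-sensitive, so each is wrapped with standardization.
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                                  random_state=42))
tree = DecisionTreeClassifier(max_depth=6, random_state=42)

# Soft voting averages the predicted class probabilities of the three
# models; hard (majority) voting is an equally plausible reading.
ensemble = VotingClassifier(
    estimators=[("svm", svm), ("mlp", mlp), ("tree", tree)], voting="soft")
ensemble.fit(X_train, y_train)
print("Ensemble accuracy:",
      accuracy_score(y_test, ensemble.predict(X_test)))
```

On a real credit-scoring dataset, accuracy alone can be misleading under class imbalance, so metrics such as AUC or F1 would typically accompany it.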
Credit scoring, Credit card, Machine learning, Classification, Metaheuristic optimization, K-Nearest neighbor, Random forest, Support vector machines