Volume 11, Issue 2, pp. 30-41, 2024 | Full Length Article
Zahari Md Rodzi 1*, Wan Normila Mohamad 2, Zhang Lu 3, Faisal Al-Sharqi 4, Rawan A. Shlaka 5, Ashraf Al-Quran 6, Ali M. Alorsan Bany Awad 7
DOI: https://doi.org/10.54216/JISIoT.110203
This research employs DEMATEL analysis as a methodological approach to examine the adverse consequences of implementing Artificial Intelligence (AI) among students enrolled at Universiti Teknologi MARA (UiTM) Negeri Sembilan, Malaysia. The analysis draws on three stakeholder groups: student representatives, academic staff, and upper management. Through a systematic analysis of the causal relationships among multiple factors, the study identifies and prioritizes the fundamental elements contributing to the negative consequences of integrating AI. The prominence of privacy and security concerns as a causal factor highlights the importance of implementing strong data protection measures and adhering to ethical AI practices. In addition, factors connected with personal disconnection, restricted adaptability, dependence on technology, and insufficient emotional intelligence shape the adverse outcomes of AI implementation among students. The results underscore the need for focused interventions and strategies to address these difficulties and to ensure a harmonious and beneficial integration of AI into students' educational journeys. By recognizing and proactively addressing these potential limitations, higher education institutions can harness the advantages of AI while safeguarding their students' welfare and educational achievement.
DEMATEL analysis, Artificial Intelligence, Higher Education, Student Welfare, Data Protection.
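For readers unfamiliar with the method, the following is a minimal sketch of the standard DEMATEL computation that underlies this kind of analysis: a direct-influence matrix is normalised, converted into a total-relation matrix T = D(I - D)^{-1}, and the row and column sums of T give each factor's prominence (R + C) and its cause/effect classification (R - C). The sketch assumes NumPy, and the factor names and influence scores are illustrative placeholders only, not the survey data collected at UiTM.

import numpy as np

# Illustrative direct-influence matrix for four hypothetical factors
# (0 = no influence ... 4 = very high influence); NOT the study's survey data.
factors = ["Privacy & security", "Personal disconnection",
           "Technology dependence", "Emotional intelligence gap"]
A = np.array([
    [0, 3, 2, 2],
    [1, 0, 2, 3],
    [2, 2, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

# Step 1: normalise by the largest row or column sum.
s = max(A.sum(axis=1).max(), A.sum(axis=0).max())
D = A / s

# Step 2: total-relation matrix T = D (I - D)^{-1}.
n = A.shape[0]
T = D @ np.linalg.inv(np.eye(n) - D)

# Step 3: prominence (R + C) ranks overall importance;
# relation (R - C) > 0 marks a factor as a cause, < 0 as an effect.
R, C = T.sum(axis=1), T.sum(axis=0)
for name, prominence, relation in zip(factors, R + C, R - C):
    group = "cause" if relation > 0 else "effect"
    print(f"{name:28s} prominence={prominence:.3f} relation={relation:+.3f} ({group})")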