Journal of Intelligent Systems and Internet of Things

Journal DOI

https://doi.org/10.54216/JISIoT

ISSN (Online): 2690-6791 | ISSN (Print): 2769-786X

Volume 15, Issue 2, pp. 55-75, 2025 | Full Length Article

Explainable AI-Driven Gait Analysis Using Wearable Internet of Things (WIoT) and Human Activity Recognition

Ponugoti Kalpana 1 * , Sarangam Kodati 2 , L. Smitha 3 , Dhasaratham 4 , Nara Sreekanth 5 , Aseel Smerat 6 , Muhannad Akram Ahmad 7

  • 1 Assistant Professor, Department of Computer Science and Engineering, AVN Institute of Engineering and Technology, Hyderabad, Telangana, 501510, India - (drkalpanacse@gmail.com)
  • 2 Associate Professor, Department of Information Technology, CVR College of Engineering, Hyderabad, Telangana, 501510, India - (k.sarangam@gmail.com)
  • 3 Assistant Professor, Department of Information Technology, G Narayanamma Institute of Technology and Science, Hyderabad, Telangana, India - (smitha2005sri@gnits.ac.in)
  • 4 Associate Professor, Department of Information Technology, TKR College of Engineering and Technology Hyderabad, Telangana, India - (dasarath.m@gmail.com)
  • 5 Associate Professor, Department of Computer Science and Engineering, BVRIT Hyderabad College of Engineering for Women, Hyderabad, Telangana, India - (nara.sreekanthap@gmail.com)
  • 6 Centre for Research Impact & Outcome, Chitkara University Institute of Engineering and Technology, Chitkara University, Rajpura, 140401, Punjab, India; Applied Science Research Center, Applied Science Private University, Amman 11931, Jordan - (Smerat.2020@gmail.com)
  • 7 Faculty of Economics and Administrative Sciences, Al Albayt University, Mafraq, Jordan - (dr.muhannadahmad@aabu.edu.jo)
  • DOI: https://doi.org/10.54216/JISIoT.150205

    Received: September 25, 2024 | Revised: November 20, 2024 | Accepted: January 10, 2025
    Abstract

    Due to the rapid expansion of the Internet of Things (IoT), supportive systems for healthcare have made significant advancements in both diagnosis and treatment. To provide optimal support in clinical settings and daily activities, these systems must accurately detect human movements, and real-time gait analysis plays a crucial role in developing such advanced supportive systems. While machine learning and deep learning algorithms have significantly improved gait detection accuracy, many existing models focus primarily on enhancing accuracy while neglecting computational overhead, which limits real-time applicability. This paper proposes a novel hybrid combination of Sparse Gated Recurrent Units (SGRUs) and Devil Feared Feed Forward Networks (DFFFN) to recognize human activities from gait data gathered through Wearable Internet of Things (WIoT) devices. The SGRU and DFFFN networks extract spatio-temporal features for classification, enabling accurate gait recognition. Moreover, Explainable Artificial Intelligence (EAI) is used to assess the interpretability, scalability, and reliability of the proposed hybrid deep learning framework. Extensive experiments were conducted on real-time and benchmark datasets, including WHU-Gait and OU-ISIR, to validate the algorithm's efficacy against existing hybrid methods, and SHAP models were employed to evaluate feature importance and gauge interpretability and robustness. The experimental results show that the method, combining Sparse GRUs with Tasmanian Devil Optimization (TDO)-inspired classifiers, achieves superior accuracy and computational efficiency compared to existing models, reaching an AUC of 0.988 on real-time data. These findings suggest that the approach offers practical benefits for real-time gait recognition in clinical settings.
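    The abstract describes a pipeline in which a sparse recurrent encoder extracts spatio-temporal features from wearable sensor streams and a feed-forward head classifies the activity. The paper's actual SGRU and DFFFN architectures are not specified here, so the following is only a minimal NumPy sketch of that general idea; the dimensions, the top-k sparse gating rule, and the random weights are all illustrative assumptions, not the authors' method.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One standard GRU step: update gate z, reset gate r, candidate state."""
    z = sigmoid(Wz @ x + Uz @ h)
    r = sigmoid(Wr @ x + Ur @ h)
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))
    return (1 - z) * h + z * h_tilde

def sparsify(h, keep=4):
    """Illustrative sparse gating: keep only the `keep` largest-magnitude activations."""
    out = np.zeros_like(h)
    idx = np.argsort(np.abs(h))[-keep:]
    out[idx] = h[idx]
    return out

def classify(h, W_out, b_out):
    """Feed-forward head: softmax over activity classes."""
    logits = W_out @ h + b_out
    e = np.exp(logits - logits.max())
    return e / e.sum()

rng = np.random.default_rng(0)
D, H, C = 6, 8, 3  # sensor channels, hidden units, activity classes (all assumed)
# Alternate input-to-hidden (H x D) and hidden-to-hidden (H x H) weights: Wz, Uz, Wr, Ur, Wh, Uh.
params = [rng.standard_normal((H, D) if i % 2 == 0 else (H, H)) * 0.1 for i in range(6)]
W_out, b_out = rng.standard_normal((C, H)) * 0.1, np.zeros(C)

h = np.zeros(H)
for t in range(20):              # 20 time steps of simulated wearable gait data
    x = rng.standard_normal(D)
    h = sparsify(gru_step(x, h, *params), keep=4)

probs = classify(h, W_out, b_out)  # class probabilities, summing to ~1
```

    The sparsification step stands in for the "sparse" aspect of the SGRU: by zeroing most hidden activations each step, it reduces the downstream multiply-accumulate cost, which is the kind of computational-overhead saving the abstract emphasizes for real-time use.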

    Keywords :

    Internet of Things , WIoT , Explainable AI , Devil Feared Feed Forward Networks , Sparse Gated Recurrent Units

    Cite This Article As:
    Kalpana, P., Kodati, S., Smitha, L., Dhasaratham, Sreekanth, N., Smerat, A., and Akram Ahmad, M., "Explainable AI-Driven Gait Analysis Using Wearable Internet of Things (WIoT) and Human Activity Recognition," Journal of Intelligent Systems and Internet of Things, vol. 15, no. 2, pp. 55-75, 2025. DOI: https://doi.org/10.54216/JISIoT.150205