Journal of Artificial Intelligence and Metaheuristics

Journal DOI

https://doi.org/10.54216/JAIM

ISSN (Online): 2833-5597

Volume 2, Issue 2, PP: 08-17, 2022 | Full Length Article

Metaheuristic Optimized Voting Ensemble for Recognizing Daily and Sports Activities

El-Sayed M. El-Kenawy 1*, Abdelhameed Ibrahim 2, Abdelaziz A. Abdelhamid 3, Mohamed Saber 4, Marwa M. Eid 5

  • 1 Department of Communications and Electronics, Delta Higher Institute of Engineering and Technology, Mansoura, 35111, Egypt - (skenawy@ieee.org)
  • 2 Computer Engineering and Control Systems Department, Faculty of Engineering, Mansoura University, Mansoura 35516, Egypt - (afai79@mans.edu.eg)
  • 3 Department of Computer Science, College of Computing and Information Technology, Shaqra University, Shaqra 11961, Saudi Arabia; Department of Computer Science, Faculty of Computer and Information Sciences, Ain Shams University, Cairo 11566, Egypt - (abdelaziz@su.edu.sa)
  • 4 Electronics and Communications Engineering Department, Faculty of Engineering, Delta University for Science and Technology, Gamasa City, Mansoura, Egypt - (mohamed.saber@deltauniv.edu.eg)
  • 5 Faculty of Artificial Intelligence, Delta University for Science and Technology, Mansoura 35712, Egypt - (marwa.3eeed@gmail.com)
  • DOI: https://doi.org/10.54216/JAIM.020201

    Received: May 29, 2022 | Accepted: November 16, 2022
    Abstract

    This research analyzes the effectiveness of several methods for classifying human activities captured by inertial and magnetic sensor units worn on the chest, arms, and legs. Each unit contains tri-axial sensors: a gyroscope, an accelerometer, and a magnetometer. Voting ensemble classification models, in which the votes are weighted and the weights are optimized with a new optimization technique, are proposed to address this classification problem. The optimization technique combines the sine cosine and particle swarm optimization algorithms, and the ensemble consists of three classifiers: support vector machines, decision trees, and a multilayer perceptron. The classifiers' accuracy is assessed using three distinct cross-validation strategies, and their correct classification rates and computational costs are compared to identify the most suitable classifier. Regarding body location, sensor units worn on the legs provide the most informative data. A comparison of the sensor modalities shows that, when only a single sensor type is used, magnetometers yield the best classification results, followed by accelerometers and gyroscopes. Furthermore, the study contrasts the three machine learning models (support vector machines, decision trees, and multilayer perceptron) with respect to their usability, controllability, and classification performance. The results reveal that the suggested method performs well in categorizing both typical daily activities and sports activities.
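
    A minimal sketch of the approach described above (not the authors' exact implementation): a soft-voting ensemble of support vector machine, decision tree, and multilayer perceptron classifiers whose three vote weights are tuned by a simple hybrid sine-cosine / particle-swarm search, scored by cross-validated accuracy. The synthetic dataset, swarm size, and update coefficients are placeholder assumptions standing in for the sensor features and the optimizer settings reported in the paper.

```python
# Sketch only: weighted voting ensemble (SVM + decision tree + MLP) with vote
# weights optimized by a toy hybrid sine-cosine / particle-swarm search.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Placeholder data standing in for features extracted from the wearable sensor units.
X, y = make_classification(n_samples=400, n_features=30, n_informative=10,
                           n_classes=4, random_state=0)

def ensemble_accuracy(weights):
    """Mean 5-fold cross-validated accuracy of the weighted soft-voting ensemble."""
    clf = VotingClassifier(
        estimators=[("svm", SVC(probability=True, random_state=0)),
                    ("dt", DecisionTreeClassifier(random_state=0)),
                    ("mlp", MLPClassifier(max_iter=500, random_state=0))],
        voting="soft", weights=list(weights))
    return cross_val_score(clf, X, y, cv=5).mean()

# Hybrid sine-cosine / PSO search over the three vote weights (illustrative settings).
rng = np.random.default_rng(0)
n_particles, n_iter, dim = 5, 6, 3
pos = rng.uniform(0.1, 1.0, (n_particles, dim))      # candidate weight vectors
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_fit = np.array([ensemble_accuracy(p) for p in pos])
g = int(pbest_fit.argmax())
gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]

for t in range(n_iter):
    r1 = 2.0 * (1.0 - t / n_iter)                     # sine-cosine step size, decays over time
    for i in range(n_particles):
        r2, r3, r4 = rng.uniform(0, 2 * np.pi), rng.uniform(0, 2), rng.random()
        # Sine-cosine exploration step pulled toward the best weights found so far ...
        sca = r1 * (np.sin(r2) if r4 < 0.5 else np.cos(r2)) * np.abs(r3 * gbest - pos[i])
        # ... blended with a PSO velocity update toward personal and global bests.
        vel[i] = 0.5 * vel[i] + rng.random() * (pbest[i] - pos[i]) + rng.random() * (gbest - pos[i])
        pos[i] = np.clip(pos[i] + 0.5 * sca + 0.5 * vel[i], 0.05, 1.0)
        fit = ensemble_accuracy(pos[i])
        if fit > pbest_fit[i]:
            pbest[i], pbest_fit[i] = pos[i].copy(), fit
        if fit > gbest_fit:
            gbest, gbest_fit = pos[i].copy(), fit

print("best vote weights:", np.round(gbest, 3), "| CV accuracy:", round(gbest_fit, 3))
```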

    Keywords:

    human activity classification, accelerometer, gyroscope, inertial sensors, body sensor, wearable sensors, machine learning, metaheuristic optimization algorithms


    Cite This Article As:
    El-Kenawy, E. M., Ibrahim, A., Abdelhamid, A. A., Saber, M., & Eid, M. M. (2022). Metaheuristic Optimized Voting Ensemble for Recognizing Daily and Sports Activities. Journal of Artificial Intelligence and Metaheuristics, 2(2), 08-17. DOI: https://doi.org/10.54216/JAIM.020201