Full Length Article
Neutrosophic and Information Fusion
Volume 2, Issue 2, PP: 42-50, 2023

Title

Inertial Information Fusion for Improved Vehicular Perception Systems

  Durdona Uktamova 1 *

1  Department of International Business Management, Tashkent State University of Economics, Uzbekistan
    (d.uktamova@tsue.uz)


DOI: https://doi.org/10.54216/NIF.020205

Received: June 12, 2023; Accepted: November 25, 2023

Abstract:

Advancing the capabilities of vehicular perception systems through the fusion of sensor data is a central pursuit in the fields of autonomous vehicles and intelligent transportation systems. This study explores the complexities involved in enhancing vehicular perception, with the goal of tackling the challenges of interpreting heterogeneous sensor inputs to gain an understanding of the driving environment. By combining information-fusion techniques with clustering methodologies, this research identifies driving scenarios from patterns observed in sensor data, allowing for a nuanced analysis of environmental variations. Additionally, a classification framework based on Convolutional Neural Networks (CNNs) is employed to classify road surface types, demonstrating how deep learning models can effectively exploit sensor representations for environmental characterization. The methods employed encompass clustered data fusion, in which K-means clustering segments the sensor data into driving scenarios, and CNN classification for accurate identification of road surface types. Using these methodologies, the study obtained distinct clusters characteristic of different driving conditions from aggregated sensor features and demonstrated the CNN's capacity for accurate road surface classification.
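The article does not publish an implementation, but the clustered data fusion step described above can be made concrete with a minimal sketch: per-window statistics are computed from tri-axial accelerometer and gyroscope streams, then grouped with scikit-learn's KMeans. The window length, feature set, and choice of three clusters are illustrative assumptions, not the paper's reported configuration.

```python
# Minimal sketch of clustered data fusion over inertial streams:
# aggregate raw readings into per-window features, then group windows
# into driving scenarios with K-means. Feature choices and k=3 are
# assumptions for illustration, not the paper's published setup.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

def window_features(accel, gyro, window=100):
    """Fuse tri-axial accelerometer/gyroscope streams into
    per-window statistics (mean and std of each axis)."""
    n = min(len(accel), len(gyro)) // window * window
    feats = []
    for start in range(0, n, window):
        a = accel[start:start + window]
        g = gyro[start:start + window]
        feats.append(np.hstack([a.mean(0), a.std(0), g.mean(0), g.std(0)]))
    return np.array(feats)

# Synthetic stand-in for logged inertial data: (samples, 3 axes).
rng = np.random.default_rng(0)
accel = rng.normal(0.0, 1.0, size=(5000, 3))
gyro = rng.normal(0.0, 0.5, size=(5000, 3))

X = StandardScaler().fit_transform(window_features(accel, gyro))
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))  # windows assigned to each driving scenario
```

On real data, the cluster count would be chosen by inspecting a silhouette or elbow curve rather than fixed at three.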
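Similarly, the CNN road-surface classifier can be sketched as a small 1-D convolutional network over fixed-length, 6-channel inertial windows (3 accelerometer + 3 gyroscope axes). The architecture, window size, and the three surface classes are assumptions for illustration; the paper's actual model may differ.

```python
# Hedged sketch of a 1-D CNN that maps a fixed-length window of
# 6-channel inertial data to road-surface classes. Layer sizes and
# the 3 output classes are illustrative assumptions.
import torch
import torch.nn as nn

class RoadSurfaceCNN(nn.Module):
    def __init__(self, channels=6, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global pooling -> (batch, 64, 1)
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):            # x: (batch, channels, window)
        h = self.features(x).squeeze(-1)
        return self.classifier(h)    # raw logits per surface class

model = RoadSurfaceCNN()
batch = torch.randn(8, 6, 100)       # 8 windows of synthetic sensor data
logits = model(batch)
print(logits.shape)                  # torch.Size([8, 3])
```

Global average pooling keeps the network agnostic to the exact window length, which suits variable-rate inertial logs.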

Keywords:

Information Fusion; Inertial Data; Vehicular Perception; Machine Learning.
