Fusion: Practice and Applications


ISSN (Online): 2692-4048 | ISSN (Print): 2770-0070

Volume 21, Issue 2, pp. 476-488, 2026 | Full Length Article

Uncertainty-Aware Radar-LiDAR Fusion for PoE-Constrained Smart Infrastructure Perception with Asynchronous Sensing

Mostafa Borhani 1 *

  • 1 Smart Tech Services, Muscat, Oman - (borhani@iSmartGCC.com)
  • DOI: https://doi.org/10.54216/FPA.210229

    Received: April 18, 2025 Revised: June 28, 2025 Accepted: August 20, 2025
    Abstract

    Infrastructure-based autonomous perception operates under fundamentally different constraints than vehicle-mounted systems: elevated-mounting geometries producing depression-angle-dependent sparse point clouds, a 12.95 W IEEE 802.3af Power-over-Ethernet (PoE) power ceiling, and distributed asynchronous sensing governed by IEEE 1588v2 precision time protocol (PTP) synchronization uncertainty. Existing automotive radar–LiDAR fusion frameworks assume abundant power, dense sensing, and synchronous measurements — assumptions that all fail in fixed infrastructure deployments. This paper presents XADAR, an uncertainty-aware multi-modal fusion framework designed for these infrastructure-specific constraints. XADAR makes three principal contributions: (1) a covariance inflation mechanism that propagates PTP synchronization uncertainty continuously through the fusion pipeline, replacing hard synchronization thresholds with a smooth degradation curve proportional to temporal offset; (2) adaptive sensor-specific fusion weights derived from modality covariance matrices that account for IWR6843 77 GHz FMCW radar Doppler ambiguity and ground-reflection multipath, and TFS20-L ToF LiDAR atmospheric scattering and range-zone limitations; and (3) a complete reproducible architecture including an IEEE 802.3af-compliant power budget (5.78 W maximum concurrent load; 41.6% PoE safety margin), quantitative 77 GHz propagation analysis based on ITU-R P.676-12 and P.838-3 (10.7 dB fade margin at 100 m under 50 mm/hr rain), and an MIL-STD-1629 FMEA covering twelve failure modes with severity classifications. A structured five-stage validation pathway from synthetic temporal-offset experiments to six-month field trials is defined for future empirical work.
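    The abstract does not state the exact functional form of the covariance inflation or the fusion weights; the following is a minimal sketch of the general idea, assuming position uncertainty accumulates with the PTP temporal offset as (velocity · Δt)² and that the two modalities are combined by standard inverse-covariance weighting. The gain `k` and the variable names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def inflate_covariance(R, dt_offset, velocity_var, k=1.0):
    """Inflate a measurement covariance to absorb PTP sync uncertainty.

    Rather than rejecting measurements past a hard synchronization
    threshold, the covariance grows smoothly with the temporal offset:
    position error accumulates roughly as velocity * dt, so its variance
    scales with dt**2.  `k` is an illustrative tuning gain (assumed).
    """
    return R + k * (dt_offset ** 2) * velocity_var * np.eye(R.shape[0])

def fuse(x_radar, R_radar, x_lidar, R_lidar):
    """Inverse-covariance-weighted fusion of two position estimates."""
    W_r = np.linalg.inv(R_radar)
    W_l = np.linalg.inv(R_lidar)
    P = np.linalg.inv(W_r + W_l)                # fused covariance
    x = P @ (W_r @ x_radar + W_l @ x_lidar)     # fused state
    return x, P
```

    With this scheme, a radar return that arrives 40 ms after the LiDAR scan is not dropped: its covariance is inflated in proportion to the offset, so the fused estimate smoothly leans toward the fresher, lower-variance measurement — the "smooth degradation curve" described above.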

    Keywords:

    Multi-sensor fusion, Radar-LiDAR fusion, Uncertainty quantification, Covariance inflation, Power-over-Ethernet, IEEE 1588v2 precision time protocol, Smart infrastructure perception, 77 GHz FMCW radar

    References

    [1] Tesla, Inc., “Autonomy Investor Day,” Palo Alto, CA, USA, Apr. 22, 2019.

     

    [2] P. Sun et al., “Scalability in perception for autonomous driving: Waymo Open Dataset,” in Proc. IEEE/CVF CVPR, 2020, pp. 2446–2454, doi: 10.1109/CVPR42600.2020.00252.

     

    [3] H. Caesar et al., “nuScenes: A multimodal dataset for autonomous driving,” in Proc. IEEE/CVF CVPR, 2020, pp. 11621–11631.

     

    [4] J. Zhang, C. Ge, W. Xiao, M. Tang, J. Mills, B. Coifman, and N. Chen, “Roadside lidar-based scene understanding toward intelligent traffic perception: A comprehensive review,” ISPRS J. Photogramm. Remote Sens., vol. 233, pp. 69–88, 2026, doi: 10.1016/j.isprsjprs.2026.01.012.

     

    [5] K. Dresner and P. Stone, “A multiagent approach to autonomous intersection management,” J. Artif. Intell. Res., vol. 31, pp. 591–656, 2008.

     

    [6] IEEE Standards Association, IEEE 802.3-2022: IEEE Standard for Ethernet. Piscataway, NJ, USA: IEEE, 2022.

     

    [7] W. Zimmer, J. Birkner, M. Brucker, H. T. Nguyen, S. Petrovski, B. Wang, and A. C. Knoll, “InfraDet3D: Multi-modal 3D object detection based on roadside infrastructure camera and LiDAR sensors,” in Proc. IEEE Intell. Vehicles Symp. (IV), 2023, doi: 10.1109/IV55152.2023.10186723.

     

    [8] X. Zhang, Y. Li, J. Wang, X. Qin, Y. Shen, Z. Fan, and X. Tan, “InScope: A new real-world 3D infrastructure-side collaborative perception dataset for open traffic scenarios,” Information Fusion, vol. 128, Art. no. 103951, 2026, doi: 10.1016/j.inffus.2025.103951.

     

    [9] W. Zimmer, G. A. Wardana, S. Sritharan, X. Zhou, R. Song, and A. C. Knoll, “TUMTraf V2X cooperative perception dataset,” in Proc. IEEE/CVF CVPR, 2024, pp. 22668–22677, doi: 10.1109/CVPR52733.2024.02139.

     

    [10] Y. Wang, Q. Mao, H. Zhu, J. Deng, Y. Zhang, J. Ji, H. Li, and Y. Zhang, “Multi-Modal 3D Object Detection in Autonomous Driving: A Survey,” Int. J. Comput. Vis., vol. 131, no. 8, pp. 2122–2152, 2023, doi: 10.1007/s11263-023-01784-z.

     

    [11] J. Song, L. Zhao, and K. A. Skinner, “LiRaFusion: Deep adaptive LiDAR-radar fusion for 3D object detection,” in Proc. IEEE Int. Conf. Robot. Autom. (ICRA), 2024, doi: 10.1109/ICRA57147.2024.10611436.

     

    [12] A. H. Lang, S. Vora, H. Caesar, L. Zhou, J. Yang, and O. Beijbom, “PointPillars: Fast encoders for object detection from point clouds,” in Proc. IEEE/CVF CVPR, 2019, pp. 12697–12705, doi: 10.1109/CVPR.2019.01298.

     

    [13] A. Geiger, P. Lenz, and R. Urtasun, “Are we ready for autonomous driving? The KITTI vision benchmark suite,” in Proc. IEEE/CVF CVPR, 2012, pp. 3354–3361.

     

    [14] IEEE Standards Association, IEEE 1588-2019: IEEE Standard for a Precision Clock Synchronization Protocol for Networked Measurement and Control Systems. Piscataway, NJ, USA: IEEE, 2020.

     

    [15] Texas Instruments, TPS23861 IEEE 802.3at Quad Port Power-over-Ethernet PSE Controller Datasheet, Rev. I, 2023.

     

    [16] Analog Devices, MAX5953A–MAX5953D: IEEE 802.3af PD Interface and PWM Controllers with Integrated Power MOSFETs Data Sheet, Rev. 1, 2006.

     

    [17] Texas Instruments, IWR6843 Single-Chip 60- to 64-GHz Intelligent mmWave Sensor Datasheet, SWRS220D, 2023.

     

    [18] ITU-R, Recommendation ITU-R P.676-12: Attenuation by Atmospheric Gases and Related Effects. Geneva, Switzerland: ITU-R, 2019.

     

    [19] ITU-R, Recommendation ITU-R P.838-3: Specific Attenuation Model for Rain for Use in Prediction Methods. Geneva, Switzerland: ITU-R, 2005.

     

    [20] Microchip Technology, KSZ9031RNX Gigabit Ethernet Transceiver with RGMII Data Sheet, DS00002117K, 2024.

     

    [21] A. Dosovitskiy et al., “CARLA: An open urban driving simulator,” in Proc. Conf. Robot Learn. (CoRL), vol. 78, 2017, pp. 1–16.

     

    [22] U.S. Department of Defense, MIL-STD-1629A: Procedures for Performing a Failure Mode, Effects, and Criticality Analysis. Washington, DC, USA: DoD, 1980.

     

    [23] H. Caesar et al., “nuScenes: A multimodal dataset for autonomous driving,” in Proc. IEEE/CVF CVPR, 2020, pp. 11621–11631.

     

    [24] J. Ajgl and O. Straka, “Fusion of multiple estimates by covariance intersection: Why and how it is suboptimal,” Int. J. Appl. Math. Comput. Sci., vol. 28, no. 3, pp. 521–530, 2018, doi: 10.2478/amcs-2018-0040.

     

    [25] V. A. Sindagi, Y. Zhou, and O. Tuzel, “MVX-Net: Multimodal VoxelNet for 3D object detection,” in Proc. IEEE Int. Conf. Robot. Autom. (ICRA), 2019, pp. 7276–7282, doi: 10.1109/ICRA.2019.8794195.

     

    [26] D. Feng, C. Haase-Schütz, L. Rosenbaum, H. Hertlein, C. Gläser, F. Timm, W. Wiesbeck, and K. Dietmayer, “Deep multimodal object detection and semantic segmentation for autonomous driving: Datasets, methods, and challenges,” IEEE Trans. Intell. Transport. Syst., vol. 22, no. 3, pp. 1341–1360, 2021, doi: 10.1109/TITS.2020.2972974.

     

    [27] M. Borhani, V. Sedghi, and M. M. Nayebi, “A new technique in passive coherent radar signal processing,” in Proc. Eur. Radar Conf. (EuRAD), Paris, France, Oct. 2005, pp. 149–151, doi: 10.1109/EURAD.2005.1605587.

     

    [28] Telecommunications Industry Association, TIA-568.2-D: Balanced Twisted-Pair Telecommunications Cabling and Components Standard. Arlington, VA, USA: TIA, 2018.

     

    [29] M. I. Skolnik, Introduction to Radar Systems, 3rd ed. New York, NY, USA: McGraw-Hill, 2001.

     

    [30] Benewake (Beijing) Co., Ltd., TFS20-L Datasheet, Aug. 2024. [Online]. Available: https://en.benewake.com/uploadfiles/2024/08/20240821162819250.pdf

    Cite This Article As:
    Borhani, Mostafa. "Uncertainty-Aware Radar-LiDAR Fusion for PoE-Constrained Smart Infrastructure Perception with Asynchronous Sensing." Fusion: Practice and Applications, vol. 21, no. 2, 2026, pp. 476-488. DOI: https://doi.org/10.54216/FPA.210229