SENSOR MODALITIES FOR RELATIVE LOCALIZATION OF COOPERATIVE UAV SWARMS IN GNSS-DENIED ENVIRONMENTS

Authors

DOI:

https://doi.org/10.20998/2413-3000.2025.11.8

Keywords:

multi-agent systems, relative localization, computer vision, visual–inertial odometry, cooperative simultaneous localization and mapping, distributed optimization, factor graphs, sensor fusion, active optical markers, GNSS-denied environments, autonomous unmanned aerial vehicles, SWaP constraints

Abstract

The deployment of cooperative swarms of unmanned aerial vehicles in environments without access to global satellite navigation systems represents one of the key challenges in modern autonomous multi-agent systems. Under such conditions, relative localization among agents must provide sufficient accuracy and consistency while operating under strict constraints on size, weight, and power consumption, which are characteristic of mass-produced micro-scale platforms. This paper presents a systematic review of sensor modalities used for relative localization of cooperative swarms in GNSS-denied environments, with a particular focus on their computational properties, observability, and robustness to environmental factors. The study analyzes and classifies three principal classes of sensor modalities: radio-frequency ranging based on ultra-wideband (UWB) communication, passive visual methods including visual–inertial odometry and deep learning–based approaches, and systems employing active optical markers. The analysis demonstrates that radio-frequency ranging methods offer low computational cost but suffer from fundamental observability limitations due to the lack of angular information. Passive visual approaches are capable of achieving high accuracy and global consistency; however, they require substantial computational resources, which restricts their practical applicability on platforms with severe hardware constraints. A critical evaluation indicates that active optical marker systems, when combined with distributed factor-graph optimization methods, constitute a practically viable compromise between localization accuracy, computational efficiency, and robustness under degraded environmental conditions. Particular attention is given to distributed estimation architectures that enable scalability and consistency without centralized processing. The paper concludes by outlining directions for future research aimed at developing hybrid frameworks capable of dynamically switching between full relative pose estimation and bearing-only tracking in challenging operational environments.
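The observability argument at the core of the abstract can be illustrated numerically: a range-only UWB measurement constrains a neighbour's relative position to a circle (sphere in 3D) around the observer, while a single bearing measurement from an optical-marker detection resolves the ambiguity. The sketch below is an illustrative assumption of this kind, not code from the paper or from any cited system; all variable names are hypothetical.

# Minimal sketch of the range-only observability limitation
# (illustrative assumption; not code from the reviewed paper).
import numpy as np

# True relative position of a neighbouring UAV in the observer's frame (2-D for brevity).
p_true = np.array([4.0, 3.0])

# A UWB-style measurement constrains only the range, i.e. the norm of p_true.
range_meas = float(np.linalg.norm(p_true))

# Every candidate obtained by rotating p_true about the observer produces the same
# range, so range-only data cannot distinguish among them (bearing is unobservable).
angles = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
candidates = range_meas * np.column_stack((np.cos(angles), np.sin(angles)))
print(np.round(np.linalg.norm(candidates, axis=1) - range_meas, 9))  # all zeros

# An active optical marker detection adds a bearing (unit direction) measurement;
# range and bearing together determine the relative position uniquely.
bearing_meas = p_true / np.linalg.norm(p_true)
print(range_meas * bearing_meas)  # recovers [4. 3.]

In a factor-graph formulation, the same distinction appears as range factors that constrain only inter-agent distance, whereas marker detections contribute bearing factors that, combined with odometry, render the relative pose observable.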

References

Li S., Shan F., Liu J. et al. Onboard Ranging-based Relative Localization and Stability for Lightweight Aerial Swarms. arXiv, 2024. DOI: 10.48550/arXiv.2003.05853.

Gong B., Wang S., Hao M. et al. Range-based collaborative relative navigation for multiple unmanned aerial vehicles using consensus extended Kalman filter. Aerospace Science and Technology. Vol. 112, 2021. P. 106647.

Luo H., Liu Y., Guo C. et al. SuperVINS: A Real-Time Visual-Inertial SLAM Framework for Challenging Imaging Conditions. arXiv, 2024. DOI: 10.48550/arXiv.2407.21348.

Masiero A., Gurturk M., Toth C. et al. A Test on Collaborative Vision and UWB-based Positioning. ISPRS Archives. Vol. XLVIII-1/W2-2023. P. 1185–1190.

Chakraborty A., Sharma R., Brink K. M. Cooperative Localization for Multirotor Unmanned Aerial Vehicles. AIAA, 2019. DOI: 10.2514/6.2019-0684.

Huang G. Visual-Inertial Navigation: A Concise Review. arXiv, 2019. DOI: 10.48550/arXiv.1906.02650.

Walter V., Saska M., Franchi A. Fast Mutual Relative Localization of UAVs using Ultraviolet LED Markers. Proc. of IEEE ICUAS, 2018. P. 1217–1226.

Walter V., Staub N., Franchi A. et al. UVDAR System for Visual Relative Localization With Application to Leader–Follower Formations of Multirotor UAVs. IEEE RA-L. Vol. 4, Issue 3, 2019. P. 2637–2644.

Jospin L., Stoven-Dubois A., Cucci D. A. Photometric Long-Range Positioning of LED Targets for Cooperative Navigation in UAVs. Drones. Vol. 3, Issue 3, 2019. P. 69.

Li S., Lei H., Zhu C. et al. Bearing-Only Passive Localization and Optimized Adjustment for UAV Formations Under Electromagnetic Silence. Drones. Vol. 9, Issue 11, 2025. P. 767.

Walter V., Staub N., Saska M. et al. Mutual Localization of UAVs based on Blinking Ultraviolet Markers and 3D Time-Position Hough Transform. IEEE CASE, 2018. P. 298–303.

Xu H., Liu P., Chen X. et al. D2SLAM: Decentralized and Distributed Collaborative Visual-inertial SLAM System for Aerial Swarm. arXiv, 2024. DOI: 10.48550/arXiv.2211.01538.

Zhu P., Geneva P., Ren W. et al. Distributed Visual-Inertial Cooperative Localization. Proc. of IEEE/RSJ IROS, 2021. P. 8714–8721.

Published

2026-01-17