Vision-Based Neighbor Selection Method for Occlusion-Resilient Uncrewed Aerial Vehicle Swarm Coordination in Three-Dimensional Environments
DOI: https://doi.org/10.20535/2786-8729.6.2025.331602

Keywords: UAV swarm, vision-based localization, formation control, decentralized coordination, artificial potential field

Abstract
Uncrewed aerial vehicle (UAV) swarms provide superior scalability, reliability, and efficiency compared to individual UAVs, enabling transformative applications in search and rescue, precision agriculture, environmental monitoring, and urban surveillance. However, their dependence on Global Navigation Satellite Systems (GNSS) and wireless communication introduces vulnerabilities such as signal loss, jamming, and scalability constraints, particularly in GNSS-denied environments. This study advances swarm robotics by developing a novel neighbor selection method for occlusion-resilient, vision-based coordination of UAV swarms in three-dimensional (3D) environments, addressing the problem of visual occlusions that disrupt decentralized flocking. Unlike prior research, which focuses on planar settings or communication-dependent systems, we model swarm coordination as an artificial potential field problem. We evaluate performance through three metrics: minimum nearest neighbor distance (collision avoidance), alignment (velocity synchronization), and union (cohesion). Using simulations with both point-mass and realistic quadcopter dynamics (Gazebo with PX4), we assess swarm behavior across dense, default, and sparse configurations. Our findings reveal that occlusions degrade alignment (below 0.9) and inter-agent distances (below 0.5 m) in dense swarms exceeding 70 agents, increasing collision risks. The proposed method, which incorporates metric, topological, and Delaunay neighbor selection strategies, mitigates these effects. Topological selection achieves high alignment (above 0.9) in small swarms (up to 50 agents), while Delaunay selection ensures perfect cohesion (union = 1) and robust alignment across all swarm sizes. Validation under realistic quadcopter dynamics confirms these results. Furthermore, our method enables communication-free coordination that matches or surpasses communication-enabled performance, with topological selection outperforming the communication-enabled baseline (alignment above 0.9 vs. 0.85) in small swarms and Delaunay selection excelling in larger ones. Eliminating inter-agent communication enhances resilience and bandwidth efficiency. These findings establish a scalable, infrastructure-independent framework for UAV swarms, with practical value for autonomous operations in complex, occlusion-prone environments.
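Because the abstract gives no implementation details, the following Python sketch is illustrative only: every function name, the metric radius r, the neighbor count k, the spring-like pairwise potential, the largest-connected-component reading of the union metric, and the use of NumPy/SciPy are assumptions, not the authors' code. It outlines one plausible way to compute the three neighbor selection strategies, an artificial-potential-field velocity command, and the three evaluation metrics for N agents in 3D.

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components
from scipy.spatial import Delaunay


def metric_neighbors(pos, i, r=5.0):
    """All agents within a fixed metric radius r of agent i (r is assumed)."""
    d = np.linalg.norm(pos - pos[i], axis=1)
    return [j for j in range(len(pos)) if j != i and d[j] <= r]


def topological_neighbors(pos, i, k=6):
    """The k nearest agents to agent i, regardless of distance (k is assumed)."""
    d = np.linalg.norm(pos - pos[i], axis=1)
    return [j for j in np.argsort(d) if j != i][:k]


def delaunay_neighbors(pos, i):
    """Agents sharing a Delaunay edge with agent i in the 3D triangulation."""
    tri = Delaunay(pos)  # pos has shape (N, 3); recomputed per call in this sketch
    indptr, indices = tri.vertex_neighbor_vertices
    return list(indices[indptr[i]:indptr[i + 1]])


def apf_velocity(pos, i, neighbors, d_ref=2.0, gain=0.5):
    """Spring-like artificial potential field command (assumed potential form):
    repulsive below the reference distance d_ref, attractive above it.
    Coincident agents (zero distance) are not handled in this sketch."""
    v = np.zeros(3)
    for j in neighbors:
        diff = pos[j] - pos[i]
        dist = np.linalg.norm(diff)
        v += gain * (dist - d_ref) * diff / dist
    return v


def min_nearest_neighbor_distance(pos):
    """Smallest inter-agent distance in the swarm (collision-avoidance metric)."""
    d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return float(d.min())


def alignment(vel):
    """Norm of the mean unit-velocity vector: 1 means perfectly synchronized
    headings (standard flocking order parameter; assumes nonzero speeds)."""
    unit = vel / np.linalg.norm(vel, axis=1, keepdims=True)
    return float(np.linalg.norm(unit.mean(axis=0)))


def union(pos, neighbor_fn):
    """Fraction of agents in the largest connected component of the interaction
    graph; the abstract's 'cohesion' metric is assumed to have this form."""
    n = len(pos)
    rows, cols = [], []
    for i in range(n):
        for j in neighbor_fn(pos, i):
            rows.append(i)
            cols.append(j)
    graph = csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))
    _, labels = connected_components(graph, directed=False)
    return float(np.bincount(labels).max() / n)

Under these assumed definitions, a 3D Delaunay triangulation is connected for any non-degenerate point set, so union(pos, delaunay_neighbors) evaluates to 1.0 by construction, which is consistent with the perfect cohesion (union = 1) the abstract reports for the Delaunay strategy.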
License
Copyright (c) 2025 Information, Computing and Intelligent Systems

This work is licensed under a Creative Commons Attribution 4.0 International License.