Open-Source Visual Target-Tracking System Both on Simulation Environment and Real Unmanned Aerial Vehicles
Green Open Access: No
Publicly Funded: No
Abstract
This work presents an investigation into dynamic target tracking through object detection, with particular emphasis on open-source tools such as PX4, ROS, and YOLO. Achieving real-time object tracking on UAVs in dynamic environments has long been a formidable challenge, requiring either offline computation or substantial onboard processing resources. Contemporary UAVs, however, are now equipped with advanced embedded edge devices, sensors, and cameras, enabling the integration of deep learning-based vision applications. This advancement makes it possible to deploy cutting-edge applications directly on UAVs, expanding their utility in areas such as surveillance, search and rescue, and videography. To fully harness the potential of these vision applications, a communication infrastructure interfacing with the UAV's underlying closed-loop controllers becomes essential. We have developed an integrated visual target-tracking system that connects a flight controller unit with a graphics processing unit by leveraging ROS tools and open-source deep learning packages. The overall integrated system, based on ROS, deep learning applications, and custom PID controllers, is shared on GitHub as an open-source software package for the benefit of anyone interested: https://github.com/miralab-ai/vision-ROS. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.
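The abstract mentions custom PID controllers that translate vision output into flight commands. The following minimal sketch shows one common pattern for such a loop: converting the pixel offset of a detected bounding-box center into a bounded yaw-rate setpoint. All class names, parameter names, and gains here are illustrative assumptions, not the actual code in the linked repository.

```python
# Illustrative PID sketch: horizontal pixel offset of a detected target
# -> yaw-rate command. Gains and structure are hypothetical, not taken
# from the vision-ROS package.

class PID:
    def __init__(self, kp, ki, kd, limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.limit = limit          # symmetric clamp on the output command
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(-self.limit, min(self.limit, out))


def yaw_rate_from_bbox(bbox_cx, image_width, pid, dt):
    """Normalize the target's horizontal offset to [-1, 1], then run the PID."""
    error = (bbox_cx - image_width / 2) / (image_width / 2)
    return pid.update(error, dt)


pid = PID(kp=0.8, ki=0.05, kd=0.1, limit=1.0)
# Target detected 80 px right of center in a 640-px-wide frame at 20 Hz:
cmd = yaw_rate_from_bbox(400, 640, pid, dt=0.05)
```

In a ROS-based system like the one described, such a command would typically be published as a velocity setpoint that the flight controller (e.g. PX4 via MAVROS) tracks with its own inner loops.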
Keywords
Computer Vision, ROS, Sim2Real, UAV, Visual Target Tracking, YOLOv7-Tiny
Pages: 147–159
PlumX Metrics
Citations (Scopus): 1
Captures (Mendeley Readers): 3