Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
Permanent URI for this collection: https://hdl.handle.net/11147/7148
Now showing 1 - 2 of 2
Conference Object | Citations (Scopus): 4
Real Time Computer Vision Based Robotic Arm Controller With Ros and Gazebo Simulation Environment (Institute of Electrical and Electronics Engineers Inc., 2023)
Aksoy, E.; Çakir, A.D.; Erol, B.A.; Gumus, A.
Robotic arms are widely prevalent and find utility in a variety of applications. However, a significant and widespread challenge is their inability to replicate the intricate functionality of a human hand, primarily due to the hand's distinct structure. The primary objective of this work is to create a simulation of a robotic arm that replicates the movements and functions of a human hand in real time. Data obtained from hand and arm gestures detected with Mediapipe are transferred in real time to a robotic arm visualized and simulated in ROS and Gazebo, so that the hand and arm movements of a user in front of the camera drive the manipulable joints in real time. This advancement holds the potential to facilitate the construction of a robot capable of emulating both the hand and the arm of a human with high fidelity, thereby enabling comprehensive real-time control over the robotic arm's actions. © 2023 IEEE.

Conference Object | Citations (Scopus): 1
Open-Source Visual Target-Tracking System Both on Simulation Environment and Real Unmanned Aerial Vehicles (Springer Science and Business Media Deutschland GmbH, 2024)
Yılmaz, C.; Ozgun, A.; Erol, B.A.; Gumus, A.
This work presents an investigation into dynamic target tracking through object detection, with particular emphasis on open-source tools such as PX4, ROS, and YOLO. Over the years, achieving real-time object tracking on UAVs in dynamic environments has been a formidable challenge, necessitating offline computation or substantial onboard processing resources.
However, contemporary UAVs are now equipped with advanced embedded edge devices, sensors, and cameras, enabling the integration of deep-learning-based vision applications. This advancement offers the prospect of deploying cutting-edge applications directly onto UAVs, thereby expanding their utility in areas such as surveillance, search and rescue, and videography. To fully harness the potential of these vision applications, a communication infrastructure interfacing with the UAV's underlying closed-loop flight controllers becomes imperative. We developed an integrated visual target-tracking system that connects a flight controller unit with a graphical processing unit by leveraging ROS tools and open-source deep learning packages. The overall integrated system, based on ROS, deep learning applications, and custom PID controllers, is shared on GitHub as an open-source software package for anyone interested: https://github.com/miralab-ai/vision-ROS. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.
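The first entry describes mapping Mediapipe hand landmarks to joint commands for a simulated arm. A minimal sketch of that mapping step is shown below; the function names, the joint range, and the flexion convention are illustrative assumptions, not the authors' actual code, and the full pipeline would additionally feed these commands to ROS/Gazebo joint controllers.

```python
import math

def landmark_angle(a, b, c):
    """Angle at landmark b (degrees) formed by the triple a-b-c.

    Landmarks are (x, y) tuples in normalized image coordinates,
    the format produced by hand-tracking libraries such as Mediapipe.
    """
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))

def angle_to_joint(angle_deg, joint_min=0.0, joint_max=1.57):
    """Map a finger flexion angle (180 deg = straight, 0 deg = fully
    bent) onto a joint command in radians for a simulated gripper.
    The joint limits here are illustrative placeholders."""
    t = (180.0 - angle_deg) / 180.0  # 0 = open hand, 1 = closed fist
    return joint_min + t * (joint_max - joint_min)
```

In a full system, `angle_to_joint` outputs would be published on the arm's joint-command topics each camera frame, which is what makes the user's gestures drive the simulated joints in real time.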
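The second entry mentions custom PID controllers that steer the UAV so a detected target stays centered in the camera frame. A minimal PID sketch of that control step follows; the class name, gains, and output limit are assumptions for illustration and do not reproduce the controllers in the linked repository.

```python
class PID:
    """Minimal PID controller: maps a tracking error (e.g. the
    normalized offset of a YOLO detection from image center) to a
    bounded rate command for the flight controller."""

    def __init__(self, kp, ki, kd, limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.limit = limit          # symmetric output saturation
        self._integral = 0.0
        self._prev_error = None

    def update(self, error, dt):
        """Advance the controller by one timestep of length dt."""
        self._integral += error * dt
        if self._prev_error is None:
            deriv = 0.0             # no derivative on the first sample
        else:
            deriv = (error - self._prev_error) / dt
        self._prev_error = error
        out = self.kp * error + self.ki * self._integral + self.kd * deriv
        return max(-self.limit, min(self.limit, out))
```

A typical use is one such controller per axis: the horizontal pixel offset of the target drives a yaw-rate command, the vertical offset a pitch or gimbal command, each clamped so the vehicle never receives an unsafe setpoint.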
