Erol, Berat Alper

Name Variants: Erol, B. A.; Erol, B. Alper; Berat A. Erol
Email Address: beraterol@iyte.edu.tr
Main Affiliation: 03.04. Department of Computer Engineering
Status: Current Staff

Documents: 18
Citations: 349
h-index: 11
Scholarly Output: 3
Articles: 1
Views / Downloads: 632 / 116
Supervised MSc Theses: 0
Supervised PhD Theses: 0
WoS Citation Count: 15
Scopus Citation Count: 22
Patents: 0
Projects: 0
WoS Citations per Publication: 5.00
Scopus Citations per Publication: 7.33
Open Access Source: 1
Supervised Theses: 0

Journal (Count)
  • 14th International Conference on Electrical and Electronics Engineering, ELECO 2023 - Proceedings (30 November - 2 December 2023, Virtual, Bursa)
  • EAI/Springer Innovations in Communication and Computing, 2nd International Congress of Electrical and Computer Engineering, ICECENG 2023 (22-25 November 2023, Bandirma)
  • Robotics and Autonomous Systems (1)


Scholarly Output Search Results

Now showing 1 - 3 of 3
  • Conference Object
    Citation - Scopus: 1
    Open-Source Visual Target-Tracking System Both on Simulation Environment and Real Unmanned Aerial Vehicles
    (Springer Science and Business Media Deutschland GmbH, 2024) Yılmaz,C.; Ozgun,A.; Erol,B.A.; Gumus,A.
    This work presents an investigation into the domain of dynamic target tracking through object detection, particularly emphasizing the context of open-source applications like PX4, ROS, and YOLO. Over the years, achieving real-time object tracking on UAVs in dynamic environments has been a formidable challenge, necessitating offline computations or substantial onboard processing resources. However, contemporary UAVs are now equipped with advanced edge embedded devices, sensors, and cameras, enabling the integration of deep learning-based vision applications. This advancement offers the prospect of directly deploying cutting-edge applications onto UAVs, thereby expanding their utility in areas such as surveillance, search and rescue, and videography. To fully harness the potential of these vision applications, a communication infrastructure interfacing with the UAV's underlying closed-loop controllers becomes imperative. We have developed an integrated visual target-tracking system that connects a flight controller unit with a graphical unit by leveraging ROS tools and open-source deep learning packages. The overall integrated system based on ROS, deep learning applications, and custom PID controllers is shared on GitHub as an open-source software package in a way that benefits everyone interested: https://github.com/miralab-ai/vision-ROS. © The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.
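The "custom PID controllers" mentioned in the abstract can be illustrated with a minimal sketch: a PID loop that turns the pixel offset between a detected bounding-box center and the image center into a velocity setpoint. All names, gains, and values below are hypothetical illustrations, not taken from the vision-ROS repository.

```python
# Illustrative PID controller for visual target tracking: the pixel error of
# a detected bounding-box center relative to the frame center is converted
# into x/y velocity commands. Gains and names here are made up for the sketch.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def track_step(bbox_center, frame_size, pid_x, pid_y):
    """Map the bounding-box offset from the frame center to velocity commands."""
    cx, cy = frame_size[0] / 2, frame_size[1] / 2
    # Normalize the pixel error to [-1, 1] so gains are resolution-independent.
    err_x = (bbox_center[0] - cx) / cx
    err_y = (bbox_center[1] - cy) / cy
    return pid_x.step(err_x), pid_y.step(err_y)


pid_x = PID(kp=0.8, ki=0.0, kd=0.1, dt=0.05)
pid_y = PID(kp=0.8, ki=0.0, kd=0.1, dt=0.05)
vx, vy = track_step(bbox_center=(480, 270), frame_size=(640, 480),
                    pid_x=pid_x, pid_y=pid_y)
```

In a real system the resulting `(vx, vy)` pair would be published to the flight controller (e.g. as an offboard velocity setpoint) at the camera frame rate.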
  • Article
    Citation - WoS: 15
    Citation - Scopus: 17
    A Novel Search and Survey Technique for Unmanned Aerial Systems in Detecting and Estimating the Area for Wildfires
    (Elsevier B.V., 2021) Sarkar, M.; Yan, X.; Erol, B.A.; Raptis, I.; Homaifar, A.
    In recent years, Unmanned Aerial Vehicles (UAVs) have progressively been utilized for wildfire management, and are especially prevalent in forest fire monitoring missions. To ensure the fast detection and accurate area estimation of forest fires, a two-step search and survey algorithm for multi-UAV systems is proposed to address these fire scenarios. Initially, a grid-based partition method is applied to divide the area-of-interest into several search areas. Then, an archetype search pattern is used to provide timely UAV exploration within those sub-areas. Once the fire zones are detected, a novel survey strategy is employed for UAVs to discover the boundary points of the fire zones, so that the area of the fire zones can be estimated using the sampled boundary points. In addition, the effect of wind is accounted for to improve fire zone boundary estimates. The proposed search-and-survey procedure is validated on multiple simulated scenarios using the U.S. Air Force's mission-realistic Aerospace Multi-Agent Simulation Environment (AMASE) software. Simulation results showcase that the proposed search pattern can effectively discover the seeded fire zones within 40 min of the mission. This is relatively faster than the other two well-known search patterns. Moreover, the proposed survey technique provides a coverage estimate with at least 85% accuracy for the area of interest within 90 min of the mission. © 2021 Elsevier B.V.
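The first step of the abstract, grid-based partitioning of the area of interest followed by a sweep pattern inside each sub-area, can be sketched generically. The paper's actual "archetype" search pattern may differ; the lawnmower sweep below is a common stand-in, and all dimensions are invented for illustration.

```python
# Sketch of the two-step idea: (1) partition the area of interest into a grid
# of rectangular search sub-areas, (2) generate back-and-forth (lawnmower)
# sweep waypoints inside a cell. Generic illustration, not the paper's code.

def partition(width, height, n_cols, n_rows):
    """Split a width x height area into n_cols * n_rows rectangular cells,
    each given as (x_origin, y_origin, cell_width, cell_height)."""
    cw, ch = width / n_cols, height / n_rows
    return [(c * cw, r * ch, cw, ch) for r in range(n_rows) for c in range(n_cols)]

def lawnmower(cell, spacing):
    """Back-and-forth sweep waypoints covering one rectangular cell."""
    x0, y0, w, h = cell
    waypoints, y, left_to_right = [], y0, True
    while y <= y0 + h:
        row = [(x0, y), (x0 + w, y)]
        waypoints.extend(row if left_to_right else row[::-1])
        left_to_right = not left_to_right
        y += spacing
    return waypoints

cells = partition(width=1000.0, height=600.0, n_cols=2, n_rows=2)
wps = lawnmower(cells[0], spacing=100.0)
```

Each UAV in a multi-UAV team would then be assigned one or more cells, and the survey phase would refine the boundary of any fire zone detected during the sweep.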
  • Conference Object
    Citation - Scopus: 4
    Real Time Computer Vision Based Robotic Arm Controller With ROS and Gazebo Simulation Environment
    (Institute of Electrical and Electronics Engineers Inc., 2023) Aksoy,E.; Çakir,A.D.; Erol,B.A.; Gumus,A.
    Robotic arms are widely prevalent and find utility in a variety of applications. However, a significant and widespread challenge faced by these arms is their inability to replicate the intricate functionalities of a human hand, primarily due to the distinct structure of the hand. The primary objective of this work is to create a simulation of a robotic arm that can replicate the movements and functions of a human hand in real time. Data obtained from hand and arm gestures captured with MediaPipe is transferred in real time to the robotic arm, which is visualized and simulated in ROS and Gazebo. Thus, the hand and arm movements of the user in front of the camera drive the manipulable joints in real time. This advancement holds the potential to facilitate the construction of a robot capable of emulating both the hand and the arm of humans with high fidelity, thereby enabling comprehensive control over the robotic arm's actions in real time. © 2023 IEEE.
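The core geometric step behind mapping tracked landmarks to a robot joint, as described in the abstract above, is computing the angle at the middle of three landmark points (e.g. shoulder, elbow, wrist) and sending it as a joint command. The sketch below uses invented coordinates and does not require MediaPipe itself; it only shows the angle computation.

```python
# Sketch: the angle at landmark b, formed by segments b->a and b->c, becomes
# the command for one manipulable joint. Coordinates here are hypothetical.
import math

def joint_angle(a, b, c):
    """Angle in radians at point b formed by the segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.acos(dot / (n1 * n2))

# A right angle at the "elbow" landmark:
angle = joint_angle(a=(0.0, 1.0), b=(0.0, 0.0), c=(1.0, 0.0))
```

In the full pipeline, each computed angle would be published per camera frame to the corresponding joint of the simulated arm in ROS/Gazebo.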