Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection

Permanent URI for this collection: https://hdl.handle.net/11147/7148

Search Results

Now showing 1 - 3 of 3
  • Article
    Citation - WoS: 5
    Citation - Scopus: 5
    Automated Deep Learning Model Development Based on Weight Sensitivity and Model Selection Statistics
    (Pergamon-Elsevier Science Ltd, 2025) Yalcin, Damla; Deliismail, Ozgun; Tuncer, Basak; Boy, Onur Can; Bayar, Ibrahim; Kayar, Gizem; Sildir, Hasan
    Current sustainable production and consumption processes call for technological integration with computational modeling, especially in the form of sophisticated data-driven architectures. Advanced mathematical formulations are essential for deep learning approaches to reveal patterns in nonlinear and complex interactions, enabling better prediction capabilities for subsequent optimization and control tasks. Unlike traditional methods, which minimize the training error under a fixed architecture, the Bayesian Information Criterion and Akaike Information Criterion are introduced as additional constraints to a mixed-integer training problem that employs a parameter-sensitivity-related objective function. The resulting comprehensive optimization formulation is flexible, as a simultaneous approach is introduced through algorithmic differentiation to benefit from advanced solvers that handle computational challenges and theoretical issues. The proposed formulation delivers a 40% reduction in architecture size with high accuracy. The performance of the approach is compared to fully connected traditional methods on two case studies from large-scale chemical plants.
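The abstract names the two information criteria without reproducing them; for reference, their standard definitions are (with \(k\) the number of free parameters, \(n\) the number of training samples, and \(\hat{L}\) the maximized likelihood — how the paper maps \(k\) to active network weights is not stated here):

```latex
\mathrm{AIC} = 2k - 2\ln\hat{L}, \qquad
\mathrm{BIC} = k\ln n - 2\ln\hat{L}
```

Both terms penalize model size; the BIC penalty exceeds the AIC penalty once \(n > e^2 \approx 7.4\), so imposing them as constraints biases the mixed-integer training problem toward smaller architectures.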
  • Conference Object
    Citation - WoS: 1
    Konteyner Görüntülerini Kullanarak Hasar Tespiti ve Sınıflandırması [Damage Detection and Classification Using Container Images]
    (IEEE, 2020) Imamoglu, Zeynep Ekici; Tuglular, Tugkan; Bastanlar, Yalin
    In the logistics sector, digital transformation is of great importance for competitiveness. Currently, container warehouse entry/exit operations, including container damage detection, are carried out manually by logistics personnel. During the entry/exit process, damaged containers are identified by the personnel, and several minutes are required to upload the results to the IT system. The aim of our work is to automate the detection of damaged containers; this eliminates mistakes made by the personnel and accelerates the process. In this work, we propose a convolutional neural network (CNN) that takes container images and classifies them as damaged or undamaged. We modeled the problem as binary classification and employed different CNN models. Our results show that there is no single best method for this classification. We also describe how the dataset was created and how the parameters of the layered structures affect the models employed in this study.
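The abstract does not specify which CNN models were employed; as a minimal, self-contained sketch of the binary-classification idea (one convolution, ReLU, max-pooling, and a sigmoid output), written in NumPy with randomly initialized toy weights rather than any trained model:

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one kernel."""
    h, w = img.shape
    kh, kw = kernel.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max-pooling, cropping any remainder rows/columns."""
    h2, w2 = x.shape[0] // size, x.shape[1] // size
    return x[:h2 * size, :w2 * size].reshape(h2, size, w2, size).max(axis=(1, 3))

def cnn_forward(img, kernel, weights, bias):
    """One conv -> ReLU -> max-pool -> dense -> sigmoid pass: P(damaged)."""
    feat = np.maximum(conv2d(img, kernel), 0.0)   # ReLU activation
    flat = max_pool(feat).ravel()
    return 1.0 / (1.0 + np.exp(-(flat @ weights + bias)))

rng = np.random.default_rng(0)
img = rng.random((16, 16))                  # stand-in for a container photo
kernel = rng.standard_normal((3, 3)) * 0.1  # untrained toy filter
weights = rng.standard_normal(7 * 7) * 0.1  # (16-3+1)//2 = 7 per side after pooling
p = cnn_forward(img, kernel, weights, bias=0.0)
label = "damaged" if p >= 0.5 else "undamaged"
```

In practice each study like this one stacks many such conv/pool layers and learns the kernels and weights from labeled images; the sketch only fixes the data flow of a single forward pass.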
  • Article
    Citation - Scopus: 20
    Estrus Detection and Dairy Cow Identification With Cascade Deep Learning for Augmented Reality-Ready Livestock Farming
    (Multidisciplinary Digital Publishing Institute (MDPI), 2023) Arıkan, İ.; Ayav, T.; Seçkin, A.Ç.; Soygazi, F.
    Accurate prediction of the estrus period is crucial for optimizing insemination efficiency and reducing costs in animal husbandry, a vital sector for global food production. Precise estrus period determination is essential to avoid economic losses such as reduced milk production, delayed calf births, and disqualification from government support. The proposed method integrates estrus period detection with cow identification using augmented reality (AR). It begins with deep learning-based mounting detection, followed by identification of the mounting region of interest (ROI) using YOLOv5. The ROI is then cropped with padding, and cow ID detection is executed with YOLOv5 on the cropped ROI. The system subsequently records the identified cow IDs. The proposed system detects mounting behavior with 99% accuracy, identifies the ROI where mounting occurs with 98% accuracy, and detects the mounting couple with 94% accuracy. The high success of all operations demonstrates the system's potential contribution to AR and artificial intelligence applications in livestock farming. © 2023 by the authors.
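The cascade described above (mounting ROI first, then ID detection inside the padded crop) can be sketched as plain control flow. The detector functions, padding value, image size, and cow IDs below are all hypothetical stand-ins; the actual system runs YOLOv5 models at both stages:

```python
def crop_with_padding(box, pad, img_w, img_h):
    """Expand an (x1, y1, x2, y2) box by `pad` pixels, clamped to the image."""
    x1, y1, x2, y2 = box
    return (max(0, x1 - pad), max(0, y1 - pad),
            min(img_w, x2 + pad), min(img_h, y2 + pad))

def cascade_identify(frame, detect_mounting, detect_cow_id,
                     pad=20, img_w=1920, img_h=1080):
    """Two-stage cascade: locate the mounting ROI, then read cow IDs inside it."""
    roi = detect_mounting(frame)          # stage 1: mounting ROI, or None
    if roi is None:
        return []                         # no mounting event in this frame
    padded = crop_with_padding(roi, pad, img_w, img_h)
    return detect_cow_id(frame, padded)   # stage 2: IDs within the padded ROI

# Toy stand-ins for the two YOLOv5 stages (illustration only):
detect_mounting = lambda frame: (400, 300, 900, 700)
detect_cow_id = lambda frame, roi: ["TR-0421", "TR-0387"]
ids = cascade_identify(None, detect_mounting, detect_cow_id)
```

Restricting the second detector to the padded ROI is the point of the cascade: the ID model only ever sees regions where mounting was already found, which cuts false identifications elsewhere in the frame.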