Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
Permanent URI for this collection: https://hdl.handle.net/11147/7148
Search Results (9 results)
Book Part (Citation - Scopus: 1)
Automated Analysis of Phase-Contrast Optical Microscopy Time-Lapse Images: Application to Wound Healing and Cell Motility Assays of Breast Cancer (Elsevier, 2023)
Erdem, Yusuf Sait; Ayanzadeh, Aydın; Mayalı, Berkay; Balıkçı, Muhammed; Belli, Özge Nur; Uçar, Mahmut; Yalçın Özuysal, Özden; Pesen Okvur, Devrim; Önal, Sevgi; Morani, Kenan; Iheme, Leonardo Obinna; Töreyin, Behçet Uğur
This chapter describes a workflow for analyzing phase-contrast microscopy (PCM) data from two fundamental types of biomedical assays: cell motility assays and wound healing assays. The analysis workflow comprises methods for acquiring, restoring, segmenting, and quantifying biomedical data. The literature offers separate methods aimed at specific stages of PCM data analysis, but no complete workflow covering all stages. This work proposes an end-to-end workflow spanning the image pre-processing, deep learning segmentation, tracking, and quantification stages of cell motility and wound healing assay analyses. The findings indicate that domain knowledge can be used to make simple but significant improvements to the results of state-of-the-art methods. Furthermore, even for deep learning-based methods, pre-processing is clearly a necessary step in the workflow. © 2023 Elsevier Inc. All rights reserved.

Conference Object (Citation - Scopus: 2)
Yara İyileşmesi Mikroskopi Görüntü Serilerinin Otomatik Analizi - Bir Ön-çalışma [Automated Analysis of Wound Healing Microscopy Image Series - A Preliminary Study] (IEEE, 2020)
Mayalı, Berkay; Şaylığ, Orkun; Yalçın Özuysal, Özden; Pesen Okvur, Devrim; Töreyin, Behçet Uğur; Ünay, Devrim
Collective cell analysis from microscopy image series is important for wound healing research. Computer-based automation of such analyses may help in rapid acquisition of reliable and reproducible results.
In this study, phase-contrast optical microscopy image series of an in-vitro wound healing assay are manually delineated by two experts and analyzed; traditional image processing and deep learning based approaches for automated segmentation of the wound area are developed, and their performance is compared.

Conference Object (Citation - Scopus: 1)
A Preliminary Study on Cell Motility Analysis From Phase-Contrast Microscopy Image Series (IEEE, 2020)
Kayan, Emre; Kavuşan, Tarık; Önal, Sevgi; Pesen Okvur, Devrim; Yalçın Özuysal, Özden; Töreyin, Behçet Uğur; Ünay, Devrim
Analyses of the morphology, polarity, and motility of cells are important for cell biology research, including the metastatic and invasive capacity of cells, wound healing, and embryonic development. Automating such analyses using image series from phase-contrast optical microscopy, which allows label-free imaging of live cells in their living environment, is needed. With this purpose, in this study image series of a cell motility experiment are manually annotated, and an automation algorithm realizing motion and shape analyses of cells using the annotated data is developed. In addition, due to the low number of annotated data at hand, a U-Net based solution is devised for automated segmentation of the cells, and its performance is evaluated.

Article (Citation - WoS: 3, Citation - Scopus: 3)
Elimination of Useless Images From Raw Camera-Trap Data (Türkiye Klinikleri Journal of Medical Sciences, 2019)
Tekeli, Ulaş; Baştanlar, Yalın
Camera-traps are motion-triggered cameras used to observe animals in nature. The number of images collected from camera-traps has increased significantly with their widening use, thanks to advances in digital technology. Grouping and labeling these images requires a great workload from wildlife researchers. We propose a system to decrease the amount of time spent by researchers by eliminating useless images from raw camera-trap data.
These images are too bright, too dark, blurred, or contain no animals. To eliminate bright, dark, and blurred images, we employ techniques based on image histograms and the fast Fourier transform. To eliminate the images without animals, we propose a system combining convolutional neural networks and background subtraction. We experimentally show that the proposed approach keeps 99% of photos with animals while eliminating more than 50% of photos without animals. We also present a software prototype that employs the developed algorithms to eliminate useless images.

Conference Object (Citation - WoS: 3, Citation - Scopus: 5)
Image Processing Based Stiffness Mapping of a Haptic Device (Springer, 2017)
Taner, Barış; Dede, Mehmet İsmet Can
The widely accepted performance criteria of haptic devices, transparency and z-width, are affected by the stiffness characteristics of the haptic device’s mechanism. In addition, indirect measurement of the handle pose of a haptic device is also affected by the stiffness characteristic. In this study, image processing techniques are used in the experimental setup to develop a stiffness map of a haptic device. The experimentally developed stiffness map is presented and the results are discussed by addressing future works.

Article (Citation - WoS: 3, Citation - Scopus: 3)
Texture Analysis of Polymer Modified Bitumen Images (Carl Hanser Verlag GmbH & Co. KG, 2011)
Gümüştekin, Şevket; Topal, Ali; Şengöz, Burak
This study aims to analyze the textural features extracted from microscopic images of elastomeric and plastomeric type polymer modified bitumen (PMB), including five different types and contents of polymers. Fluorescence microscopy was used to capture microscopic images from thin films of PMB samples at different magnification scales (400×, 100×, and 40×). Gabor filters were utilized to extract the textural features of bitumen images. The features were used in three different query tests to quantify their representation capacity.
A k-nearest-neighbor classifier was tested using leave-one-out cross validation. Textural analysis of the captured images provided numerical results that are in compliance with subjective visual tests. © 2011 Carl Hanser Verlag, Munich, Germany.

Article (Citation - WoS: 246, Citation - Scopus: 245)
Step-by-Step Quantitative Analysis of Focal Adhesions (Elsevier Ltd., 2014)
Horzum, Utku; Özdil, Berrin; Pesen Okvur, Devrim
Focal adhesions (FAs) are specialized adhesive structures that serve as cellular communication units between cells and the surrounding extracellular matrix. FAs are involved in signal transduction and actin cytoskeleton organization. FAs mediate cell adhesion, a critical phenomenon in cancer research. Since cells can form many FAs at the micrometer scale, their quantitative analysis demands well-optimized image analysis approaches [1-3]. Here, we have optimized the analysis of FAs of MDA-MB-231 breast cancer cells. The optimization is based on proper processing of immunofluorescence images of vinculin, one of the markers of FAs. All image processing steps are carried out using the ImageJ software, which is freely available and in the public domain. The advantages of our method are:
- The analysis steps are simplified by combining different plugins of the ImageJ program.
- FAs are better detected with minimal false negatives due to optimized processing of fluorescent images.
- This approach can be applied to quantify a variety of fluorescent images comprising focal and/or localized signals within a high background, such as FAs, one of the many complex signaling structures in a cell.

Article (Citation - WoS: 12, Citation - Scopus: 19)
Quality Evaluation of Alaska Pollock (Theragra chalcogramma) Roe by Image Analysis. Part I: Weight Prediction (Taylor and Francis Ltd., 2012)
Balaban, Murat Ömer; Chombeau, Melanie; Gümüş, Bahar; Cırban, Dilşat
Roe is an important product of the Alaska pollock (Theragra chalcogramma) industry.
About 31% of the value of all pollock products comes from roe, yet roe is only 5% of the weight of the fish. Currently, the size (weight), color, and maturity of the roe are subjectively evaluated. The objective of this study was to develop methods to predict the weight of Alaska pollock roe based on its view area from a camera and to differentiate between single and double roes. One hundred and forty-two pollock roes were picked from a processing line in a Kodiak, AK plant. Each roe was weighed and placed in a light box equipped with a digital video camera; images were taken at two different angles from one side, then the roe was turned over and imaged at two different angles again (four images for each roe). A reference square of known surface area was placed by the roe. The following equations were used to fit the view area (X) versus weight (Y) data: linear, power, and second-order polynomial. Error rates for the classification of roes by weight decreased significantly when weight prediction equations for single and double roes were developed separately. A turn angle method, a box method, and a modified box method were tested to differentiate single and double roes by image analysis. Machine vision can accurately determine the weight of pollock roe.
Practical Application: An image analysis method to accurately determine whether a pollock roe is a single or a double was developed. Then, view area versus weight correlations were found for single and double roes that reduced incorrect weight classification rates to half those of human graders. © 2012 Copyright Taylor and Francis Group, LLC.

Article (Citation - WoS: 57, Citation - Scopus: 69)
Prediction of the Weight of Alaskan Pollock Using Image Analysis (John Wiley and Sons Inc., 2010)
Balaban, Murat Ömer; Chombeau, Melanie; Cırban, Dilşat; Gümüş, Bahar
Determining the size and quality attributes of fish by machine vision is gaining acceptance and increasing use in the seafood industry.
Objectivity, speed, and record keeping are advantages of this method. The objective of this work was to develop mathematical correlations to predict the weight of whole Alaskan pollock (Theragra chalcogramma) based on its view area from a camera. One hundred and sixty whole pollock were obtained fresh, within 2 d of catch, from a Kodiak, Alaska, processing plant. The fish were first weighed, then placed in a light box equipped with a Nikon D200 digital camera. A reference square of known surface area was placed by the fish. The obtained image was analyzed to calculate the view area of each fish. The following equations were used to fit the view area (X) versus weight (Y) data: linear, power, and second-order polynomial. The power fit (Y = A·X^B) gave the highest R² (0.99). The effect of fins and tail on the accuracy of the weight prediction using view area was evaluated. Removing fins and tails did not improve prediction accuracy. Machine vision can accurately predict the weight of whole pollock. © 2010 Institute of Food Technologists®.
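The power-law fit described in the last two abstracts, predicting weight Y from view area X via Y = A·X^B, amounts to a linear regression in log-log space. The sketch below illustrates this on synthetic data; it is not the studies' implementation, and all function names, coefficients, and data values are assumptions for illustration only.

```python
# Illustrative sketch of fitting Y = A * X**B (view area -> weight).
# Synthetic data only; the coefficients 0.05 and 1.4 are made up.
import numpy as np

def fit_power_law(areas, weights):
    """Fit Y = A * X**B by linear regression in log-log space."""
    logx, logy = np.log(areas), np.log(weights)
    B, logA = np.polyfit(logx, logy, 1)  # slope = B, intercept = log(A)
    return np.exp(logA), B

def predict_weight(area, A, B):
    """Predict weight from view area using the fitted power law."""
    return A * area ** B

# Synthetic example: 160 view areas following a known power law.
rng = np.random.default_rng(0)
areas = rng.uniform(50.0, 300.0, size=160)  # view areas, arbitrary units
weights = 0.05 * areas ** 1.4               # ground-truth relation

A, B = fit_power_law(areas, weights)
print(A, B)  # recovers approximately A = 0.05, B = 1.4
```

On noiseless data the fit recovers the generating coefficients exactly; on real measurements the R² of the log-log regression plays the role of the goodness-of-fit the abstract reports.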

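The brightness and blur checks described in the camera-trap abstract above (image histograms and the fast Fourier transform) could be sketched as follows. This is a hypothetical illustration, not the paper's code: the thresholds, function names, and synthetic frames are all assumptions.

```python
# Hypothetical sketch of "useless image" checks for camera-trap frames:
# brightness via mean intensity, blur via the fraction of spectral
# energy outside a low-frequency disc. All thresholds are illustrative.
import numpy as np

def is_too_bright_or_dark(gray, low=30.0, high=225.0):
    """Flag frames whose mean intensity falls outside [low, high]."""
    m = float(gray.mean())
    return m < low or m > high

def is_blurred(gray, radius_frac=0.1, energy_thresh=0.01):
    """Flag frames with little energy outside a low-frequency disc."""
    spectrum = np.fft.fftshift(np.fft.fft2(gray))
    power = np.abs(spectrum) ** 2
    h, w = gray.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)           # distance from DC
    high_freq = power[r > radius_frac * min(h, w)].sum()
    return high_freq / power.sum() < energy_thresh

# Synthetic checks: a flat dark frame and a sharp random-noise frame.
dark = np.full((64, 64), 5.0)
noisy = np.random.default_rng(1).uniform(0.0, 255.0, (64, 64))
print(is_too_bright_or_dark(dark), is_blurred(dark))    # dark frame is flagged
print(is_too_bright_or_dark(noisy), is_blurred(noisy))  # noise frame passes
```

A flat frame concentrates all spectral energy at DC, so both checks flag it, while the noise frame has a broad spectrum and mid-range brightness and passes both.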