Improved Cell Segmentation Using Deep Learning in Label-Free Optical Microscopy Images

dc.contributor.author Ayanzadeh, Aydın
dc.contributor.author Yalçın Özuysal, Özden
dc.contributor.author Pesen Okvur, Devrim
dc.contributor.author Önal, Sevgi
dc.contributor.author Töreyin, Behçet Uğur
dc.contributor.author Ünay, Devrim
dc.date.accessioned 2021-12-02T18:16:17Z
dc.date.available 2021-12-02T18:16:17Z
dc.date.issued 2021
dc.description.abstract The recently popular deep neural networks (DNNs) have significantly improved segmentation accuracy over conventional methods, in terms of both robustness and completeness. We observed that the naive U-Net falls short in several respects and leaves considerable room for improvement, so we modified the model at several points to address these shortcomings. Building on this opportunity, we developed a novel architecture that uses an alternative feature extractor in the encoder of the U-Net and replaces the plain blocks in the decoder with residual blocks. These changes make the model superconvergent and yield improved performance on two challenging optical microscopy image series: a phase-contrast dataset of our own (MDA-MB-231) and a brightfield dataset from a well-known challenge (DSB2018). Specifically, we used a U-Net with a pretrained ResNet-18 as the encoder for the segmentation task. Following these modifications, we also redesigned the skip connections to reduce the semantic gap between the encoder and the decoder; the proposed skip connections increase the accuracy of the model on both datasets. The proposed segmentation approach achieves Jaccard Index values of 85.0% and 89.2% on the DSB2018 and MDA-MB-231 datasets, respectively. These results show that our method is competitive with state-of-the-art approaches and surpasses the baseline approaches. en_US
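For reference, the Jaccard Index reported in the abstract is the intersection-over-union between a predicted segmentation mask and a ground-truth mask. A minimal pure-Python sketch of the metric (the function name and toy masks below are illustrative, not taken from the paper):

```python
def jaccard_index(pred, target):
    """Jaccard Index (intersection over union) between two binary masks.

    `pred` and `target` are equally sized 2-D lists of 0/1 pixel labels.
    """
    intersection = 0
    union = 0
    for pred_row, target_row in zip(pred, target):
        for p, t in zip(pred_row, target_row):
            if p and t:
                intersection += 1
            if p or t:
                union += 1
    # Convention: two empty masks agree perfectly.
    return intersection / union if union else 1.0

# Two toy 4x4 binary masks: 3 pixels overlap, 5 pixels in the union.
mask_a = [[0, 1, 1, 0],
          [0, 1, 1, 0],
          [0, 0, 0, 0],
          [0, 0, 0, 0]]
mask_b = [[0, 1, 1, 0],
          [0, 0, 1, 0],
          [0, 0, 1, 0],
          [0, 0, 0, 0]]
print(jaccard_index(mask_a, mask_b))  # 3 / 5 = 0.6
```

In practice the metric is computed per image over the full-resolution predicted and reference masks and then averaged across the dataset.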
dc.description.sponsorship This work has been supported by the Scientific and Technological Research Council of Turkey (TUBITAK) under Grant 119E578. The data used in this study were collected under the Marie Curie IRG grant (no: FP7 PIRG08-GA-2010-27697). Aydin Ayanzadeh's work is supported, in part, by Vodafone Turkey, under project no. ITUVF20180901P04 within the context of the ITU Vodafone Future Lab R&D program. en_US
dc.identifier.doi 10.3906/elk-2105-244
dc.identifier.issn 1300-0632
dc.identifier.issn 1303-6203
dc.identifier.scopus 2-s2.0-85117246190
dc.identifier.uri https://doi.org/10.3906/elk-2105-244
dc.identifier.uri https://hdl.handle.net/11147/11832
dc.identifier.uri https://search.trdizin.gov.tr/yayin/detay/526977
dc.language.iso en en_US
dc.publisher TÜBİTAK - Türkiye Bilimsel ve Teknolojik Araştırma Kurumu en_US
dc.relation.ispartof Turkish Journal of Electrical Engineering and Computer Sciences en_US
dc.rights info:eu-repo/semantics/openAccess en_US
dc.subject Segmentation en_US
dc.subject Breast cancer en_US
dc.subject Convolutional neural networks en_US
dc.subject Optical microscopy en_US
dc.subject Phase-contrast microscopy en_US
dc.subject Brightfield en_US
dc.title Improved Cell Segmentation Using Deep Learning in Label-Free Optical Microscopy Images en_US
dc.type Article en_US
dspace.entity.type Publication
gdc.author.id 0000-0003-0552-368X
gdc.author.id 0000-0001-8333-4193
gdc.author.institutional Yalçın Özuysal, Özden
gdc.author.institutional Pesen Okvur, Devrim
gdc.author.institutional Önal, Sevgi
gdc.bip.impulseclass C5
gdc.bip.influenceclass C5
gdc.bip.popularityclass C4
gdc.coar.access open access
gdc.coar.type text::journal::journal article
gdc.collaboration.industrial false
gdc.description.department İzmir Institute of Technology. Molecular Biology and Genetics en_US
gdc.description.endpage 2868 en_US
gdc.description.publicationcategory Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı (Article - International Refereed Journal - Institutional Faculty Member) en_US
gdc.description.scopusquality Q2
gdc.description.startpage 2855 en_US
gdc.description.volume 29 en_US
gdc.description.wosquality Q3
gdc.identifier.openalex W3205191126
gdc.identifier.trdizinid 526977
gdc.identifier.wos WOS:000709712800006
gdc.index.type WoS
gdc.index.type Scopus
gdc.index.type TR-Dizin
gdc.oaire.accesstype GOLD
gdc.oaire.diamondjournal false
gdc.oaire.impulse 3.0
gdc.oaire.influence 2.919005E-9
gdc.oaire.isgreen true
gdc.oaire.popularity 4.755305E-9
gdc.oaire.publicfunded false
gdc.oaire.sciencefields 0202 electrical engineering, electronic engineering, information engineering
gdc.oaire.sciencefields 02 engineering and technology
gdc.openalex.fwci 0.65058598
gdc.openalex.normalizedpercentile 0.72
gdc.opencitations.count 3
gdc.plumx.mendeley 11
gdc.plumx.scopuscites 6
gdc.scopus.citedcount 6
gdc.wos.citedcount 3
relation.isAuthorOfPublication.latestForDiscovery 8e1732f3-2bf8-4231-b7a4-7e94b485eb97
relation.isOrgUnitOfPublication.latestForDiscovery 9af2b05f-28ac-4013-8abe-a4dfe192da5e

Files

Original bundle

Name: elk-29-si-1-18-2105-244.pdf
Size: 3.12 MB
Format: Adobe Portable Document Format
Description: Article (Makale)