Improved Colorectal Gland Segmentation in Histopathology Images with Adaptive Resizer-Enhanced U-Net Models
| dc.contributor.author | Fidan, E. | |
| dc.contributor.author | Gumus, A. | |
| dc.date.accessioned | 2026-02-25T15:01:55Z | |
| dc.date.available | 2026-02-25T15:01:55Z | |
| dc.date.issued | 2026 | |
| dc.description.abstract | Utilizing low-resolution images for computer vision tasks such as classification and segmentation can sometimes hinder the model’s ability to accurately learn essential features. While using high-resolution images and designing compatible models might seem like viable solutions, they are not always feasible due to energy efficiency and graphical computation constraints. Downsizing images for model training and application is an effective approach for improving computational efficiency and optimizing model performance. The bilinear resizing method, commonly employed for this purpose, inherently causes information loss due to its numerical approach, which relies solely on the four nearest pixel values to compute each target pixel. This limitation becomes more pronounced with high-resolution images, where the downsampling process intensifies the loss of critical information. However, recent advancements have introduced adaptive resizer modules, which dynamically adjust image dimensions to better preserve essential features before processing by deep learning models. In this study, an adaptive resizer-based segmentation framework is proposed for the gland segmentation task, which is crucial for accurate disease diagnosis, particularly in cancer analysis. Three distinct encoder-decoder segmentation architectures are assessed using the Colorectal Adenocarcinoma Gland (CRAG) segmentation database. Each architecture is tested separately, employing six different backbone encoders pretrained on the ImageNet dataset. The comparative analysis shows that the adaptive resizer improves segmentation performance, increasing the Intersection over Union (IoU) metric by an average of 5.6%. This enhancement raises the lowest IoU from 62% to 70% and the highest to 78%. The code is available on GitHub at https://github.com/miralab-ai/adaptive-resizer-segmentation. © The Author(s) 2026. | en_US |
| dc.identifier.doi | 10.1007/s00521-025-11817-y | |
| dc.identifier.issn | 0941-0643 | |
| dc.identifier.scopus | 2-s2.0-105028379603 | |
| dc.identifier.uri | https://doi.org/10.1007/s00521-025-11817-y | |
| dc.identifier.uri | https://hdl.handle.net/11147/18959 | |
| dc.language.iso | en | en_US |
| dc.publisher | Springer Science and Business Media Deutschland GmbH | en_US |
| dc.relation.ispartof | Neural Computing and Applications | en_US |
| dc.rights | info:eu-repo/semantics/openAccess | en_US |
| dc.subject | Adaptive Resizer | en_US |
| dc.subject | Gland Database | en_US |
| dc.subject | Learnable Image Processing | en_US |
| dc.subject | Segmentation Framework | en_US |
| dc.title | Improved Colorectal Gland Segmentation in Histopathology Images with Adaptive Resizer-Enhanced U-Net Models | en_US |
| dc.type | Article | en_US |
| dspace.entity.type | Publication | |
| gdc.author.scopusid | 60347440200 | |
| gdc.author.scopusid | 35315599800 | |
| gdc.description.department | İzmir Institute of Technology | en_US |
| gdc.description.departmenttemp | [Fidan] Ekrem, Department of Electrical and Electronic Engineering, Izmir Yüksek Teknoloji Enstitüsü, Izmir, Turkey; [Gumus] Abdurrahman, Department of Electrical and Electronic Engineering, Izmir Yüksek Teknoloji Enstitüsü, Izmir, Turkey, Department of Computer Engineering, Isparta University of Applied Sciences, Isparta, Isparta, Turkey | en_US |
| gdc.description.issue | 1 | en_US |
| gdc.description.publicationcategory | Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı | en_US |
| gdc.description.scopusquality | N/A | |
| gdc.description.volume | 38 | en_US |
| gdc.description.wosquality | N/A | |
| gdc.index.type | Scopus | |
| relation.isAuthorOfPublication.latestForDiscovery | ce5ce1e2-17ef-4da2-946d-b7a26e44e461 | |
| relation.isOrgUnitOfPublication.latestForDiscovery | 9af2b05f-28ac-4018-8abe-a4dfe192da5e |
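The abstract notes that bilinear resizing computes each target pixel from only the four nearest source pixels, which is the root of the information loss it describes. As an illustration only (not code from the paper's repository), the following minimal pure-Python sketch shows that four-neighbour weighting; the function name and align-corners coordinate mapping are assumptions for demonstration.

```python
def bilinear_resize(img, out_h, out_w):
    """Resize a 2D grid with bilinear interpolation: each output pixel is a
    distance-weighted average of its four nearest input pixels (illustrative
    sketch; align-corners mapping assumed)."""
    in_h, in_w = len(img), len(img[0])
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            # Map output coordinates back into input space.
            y = i * (in_h - 1) / (out_h - 1) if out_h > 1 else 0.0
            x = j * (in_w - 1) / (out_w - 1) if out_w > 1 else 0.0
            y0, x0 = int(y), int(x)
            y1, x1 = min(y0 + 1, in_h - 1), min(x0 + 1, in_w - 1)
            dy, dx = y - y0, x - x0
            # Only these four neighbours contribute; every other source pixel
            # is ignored, which is the information loss the abstract refers to.
            out[i][j] = (img[y0][x0] * (1 - dy) * (1 - dx)
                         + img[y0][x1] * (1 - dy) * dx
                         + img[y1][x0] * dy * (1 - dx)
                         + img[y1][x1] * dy * dx)
    return out
```

An adaptive (learnable) resizer, by contrast, replaces this fixed four-pixel rule with trained convolutional filters, letting the downsampling step itself learn which features to preserve for the downstream segmentation model.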
