WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Permanent URI for this collection: https://hdl.handle.net/11147/7150
Search Results (2)
Article
Ggnn: Group-Guided Nearest Neighbors for Efficient Image Matching (Springer, 2025)
Cine, Ersin; Bastanlar, Yalin; Ozuysal, Mustafa
The widely adopted image matching approach remains dependent on exhaustive matching of local features across images. Existing methods aiming to improve efficiency either approximate nearest neighbor (NN) search, compromising accuracy, or apply filtering only after establishing tentative matches, which restricts potential efficiency gains. We challenge the assumption that exhaustive NN search is necessary by proposing a more efficient hierarchical approach that maintains matching accuracy without relying on full-scale NN search. Our key insight is that efficiently identifying sufficiently similar, geometrically meaningful feature matches, rather than the most similar but geometrically random ones, can improve or maintain performance at a lower computational cost. We propose a novel method, Group-Guided Nearest Neighbors (GGNN), which matches groups of features first and then matches individual features only within these matched groups. This hierarchical pipeline reduces the computational complexity of feature matching from $\Theta(n^2)$ to $\Theta(n\sqrt{n})$, significantly improving efficiency. Experimental results on homography estimation demonstrate that GGNN outperforms standard NN search while achieving performance comparable to state-of-the-art methods. Additionally, we formulate GGNN as a general framework, where conventional NN search is a special case with a single global feature group. This formulation provides a continuum of feature matching methods with varying computational costs, enabling automatic selection based on a given time budget.
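To make the group-then-feature idea concrete, here is a minimal, hypothetical NumPy sketch of hierarchical matching: descriptors are clustered into roughly sqrt(n) groups, group centers are matched first, and exhaustive NN search runs only inside matched group pairs. The choice of k-means grouping, the group count, and squared-distance matching are illustrative assumptions, not the paper's actual GGNN implementation.

```python
# A hedged sketch of group-guided matching, NOT the authors' GGNN code.
import numpy as np


def kmeans(desc, k, iters=10, seed=0):
    """Plain k-means on descriptors (assumption: groups come from
    clustering; the paper may form groups differently)."""
    rng = np.random.default_rng(seed)
    centers = desc[rng.choice(len(desc), k, replace=False)]
    for _ in range(iters):
        # Assign each descriptor to its nearest center.
        labels = np.argmin(((desc[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = desc[labels == j].mean(0)
    return centers, labels


def ggnn_match(desc1, desc2):
    """Match group centers first, then features within matched groups."""
    k = max(1, int(np.sqrt(min(len(desc1), len(desc2)))))  # ~sqrt(n) groups
    c1, l1 = kmeans(desc1, k)
    c2, l2 = kmeans(desc2, k)
    # NN search over group centers: k x k distances instead of n x n.
    group_nn = np.argmin(((c1[:, None, :] - c2[None]) ** 2).sum(-1), axis=1)
    matches = []
    for g1, g2 in enumerate(group_nn):
        i1 = np.where(l1 == g1)[0]
        i2 = np.where(l2 == g2)[0]
        if len(i1) == 0 or len(i2) == 0:
            continue
        # Exhaustive NN restricted to the matched group pair.
        d = ((desc1[i1][:, None, :] - desc2[i2][None]) ** 2).sum(-1)
        matches += [(i1[a], i2[d[a].argmin()]) for a in range(len(i1))]
    return matches
```

With k groups of roughly n/k features each, the inner searches cost about n·(n/k) comparisons, which is Θ(n√n) at k ≈ √n. Setting k = 1 collapses the sketch to conventional exhaustive NN search, mirroring the special case the abstract describes.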
Article | Citation - WoS: 4 | Citation - Scopus: 4
Organolabeler: a Quick and Accurate Annotation Tool for Organoid Images (Amer Chemical Soc, 2024)
Kahveci, Burak; Polatli, Elifsu; Bastanlar, Yalin; Guven, Sinan
Organoids are self-assembled 3D cellular structures that resemble organs structurally and functionally, providing in vitro platforms for molecular and therapeutic studies. Generation of organoids from human cells often requires long and costly procedures with arguably low efficiency. Prediction and selection of cellular aggregates that result in healthy and functional organoids can be achieved by using artificial intelligence-based tools. Transforming images of 3D cellular constructs into digitally processable data sets for training deep learning models requires labeling of morphological boundaries, which is often performed manually. Here, we report an application named OrganoLabeler, which can create large image-based data sets in a consistent, reliable, fast, and user-friendly manner. OrganoLabeler can create segmented versions of images with combinations of contrast adjustment, K-means clustering, CLAHE, and binary and Otsu thresholding methods. We created embryoid body and brain organoid data sets, whose segmented images were manually created by human researchers and compared with OrganoLabeler. Validation was performed by training U-Net models, deep learning models specialized in image segmentation. U-Net models trained with images segmented by OrganoLabeler achieved segmentation accuracies similar to or better than those of models trained with manually labeled reference images. OrganoLabeler can replace manual labeling, providing faster and more accurate results for organoid research, free of charge.
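The abstract lists the building blocks of the segmentation pipeline (contrast adjustment, CLAHE, K-means, binary and Otsu thresholding). The sketch below combines some of them with OpenCV to show the general shape of such a pipeline; the function name, parameter values (clip limit, tile size, two clusters), and the decision to treat the brighter cluster as foreground are illustrative assumptions, not OrganoLabeler's actual settings or code.

```python
# A minimal sketch of a CLAHE + Otsu / K-means segmentation pipeline,
# NOT the OrganoLabeler implementation.
import cv2
import numpy as np


def segment_organoid(path, use_kmeans=False):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    # CLAHE: contrast-limited adaptive histogram equalization.
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)
    if use_kmeans:
        # K-means on pixel intensities into two clusters.
        data = enhanced.reshape(-1, 1).astype(np.float32)
        criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 10, 1.0)
        _, labels, centers = cv2.kmeans(
            data, 2, None, criteria, 3, cv2.KMEANS_PP_CENTERS)
        fg = int(np.argmax(centers))  # assume brighter cluster is foreground
        mask = (labels.reshape(enhanced.shape) == fg).astype(np.uint8) * 255
    else:
        # Otsu picks the binarization threshold automatically.
        _, mask = cv2.threshold(
            enhanced, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask
```

A binary mask produced this way can serve directly as a segmentation label for training a U-Net, which is the validation route the abstract describes.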
