An Efficient Algorithm for Large-Scale Quasi-Supervised Learning
Authors
Karaçalı, Bilge
Open Access Color
BRONZE
Green Open Access
Yes
Publicly Funded
No
Abstract
We present a novel formulation for quasi-supervised learning that extends the learning paradigm to large datasets. Quasi-supervised learning computes the posterior probabilities of overlapping datasets at each sample and labels those samples that are highly specific to their respective datasets. The proposed formulation partitions the data into sample groups so that the dataset posterior probabilities can be computed at a lower computational complexity. In experiments on synthetic as well as real datasets, the proposed algorithm achieved a significant reduction in computation time at comparable recognition performance relative to the original algorithm, effectively generalizing the quasi-supervised learning paradigm to applications characterized by very large datasets.
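The core idea the abstract describes — estimating, at each sample, the posterior probability of the overlapping datasets and flagging samples that are highly specific to one of them — can be illustrated with a minimal nearest-neighbor sketch. This is not the paper's algorithm (in particular, it omits the sample-group partitioning that gives the proposed method its lower complexity); the function name, the choice of k, and the specificity thresholds are illustrative assumptions.

```python
import random
from math import dist

def knn_posteriors(data0, data1, k=5):
    """Illustrative sketch: estimate, for every sample, the posterior
    probability of originating from dataset 1 as the fraction of its
    k nearest neighbors (in the pooled data) carrying the dataset-1
    label. This is the k-NN rule, not the paper's efficient variant."""
    pooled = [(x, 0) for x in data0] + [(x, 1) for x in data1]
    posteriors = []
    for x, _ in pooled:
        # Rank all other samples by Euclidean distance, take the k closest.
        neighbors = sorted(
            (p for p in pooled if p[0] is not x),
            key=lambda p: dist(x, p[0]),
        )[:k]
        posteriors.append(sum(label for _, label in neighbors) / k)
    return posteriors

# Toy example: two overlapping 2-D Gaussian clouds (hypothetical data).
random.seed(0)
data0 = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(50)]
data1 = [(random.gauss(3, 1), random.gauss(3, 1)) for _ in range(50)]
post = knn_posteriors(data0, data1, k=7)

# Samples with posteriors near 0 or 1 are "highly specific" to their
# dataset; posteriors near 0.5 indicate the overlap region.
specific = [p for p in post if p < 0.2 or p > 0.8]
print(f"{len(specific)} of {len(post)} samples confidently labeled")
```

The brute-force pass above compares every sample against all others, which is quadratic in the pooled dataset size; the abstract's partitioning into sample groups targets exactly this cost.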
Keywords
Large-scale pattern recognition, Nearest neighbor rule, Posterior probability estimation, Quasi-supervised learning, Transductive inference
Fields of Science
0202 electrical engineering, electronic engineering, information engineering, 02 engineering and technology
Citation
Karaçalı, B. (2016). An efficient algorithm for large-scale quasi-supervised learning. Pattern Analysis and Applications, 19(2), 311-323. doi:10.1007/s10044-014-0401-y
OpenCitations Citation Count
1
Volume
19
Issue
2
Start Page
311
End Page
323
PlumX Metrics
Citations
Scopus : 1
Captures
Mendeley Readers : 4
SCOPUS™ Citations
1
Page Views
1032
Downloads
506
checked on Apr 30, 2026