Progressively Select and Reject Pseudo-labelled Samples for Open-Set Domain Adaptation

Qian Wang, Fanlin Meng, Toby P. Breckon

Research output: Contribution to journal › Article › peer-review



Domain adaptation solves image classification problems in the target domain by taking advantage of labelled source data and unlabelled target data. Usually, the source and target domains share the same set of classes. As a special case, Open-Set Domain Adaptation (OSDA) assumes the target domain contains additional classes that are not present in the source domain. To solve such a domain adaptation problem, our proposed method learns discriminative common subspaces for the source and target domains using a novel Open-Set Locality Preserving Projection (OSLPP) algorithm. The source and target domain data are aligned class-wise in the learned common subspace. To handle the open-set classification problem, our method progressively selects target samples to be pseudo-labelled as known classes, rejects outliers detected as belonging to unknown classes, and leaves the remaining target samples as uncertain. The common subspace learning algorithm OSLPP simultaneously aligns the labelled source data with the pseudo-labelled target data from known classes and pushes the rejected target data away from the known classes. The common subspace learning and the pseudo-labelled sample selection/rejection facilitate each other in an iterative learning framework and achieve state-of-the-art performance on four benchmark datasets (Office-31, Office-Home, VisDA17, and Syn2Real-O), with average HOS of 87.6%, 67.0%, 76.1%, and 65.6%, respectively.
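The progressive selection/rejection strategy described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a precomputed matrix of class-similarity scores for the target samples (in the full method these would be recomputed in the re-learned OSLPP subspace at every iteration), and the selection/rejection fractions and the sentinel label values are hypothetical choices for the sketch.

```python
import numpy as np

def progressive_select_reject(scores, n_iters=5, sel_frac=0.5, rej_frac=0.2):
    """Sketch of progressive pseudo-label selection and rejection.

    scores: (n_target, n_known) array of class-similarity scores per
    target sample (assumed given; the paper computes these in the
    learned OSLPP common subspace, which is not reproduced here).

    Returns labels: predicted known-class index for selected samples,
    -1 for samples rejected as unknown, -2 for samples left uncertain.
    """
    n = scores.shape[0]
    conf = scores.max(axis=1)      # confidence = best class similarity
    pred = scores.argmax(axis=1)   # tentative known-class assignment
    labels = np.full(n, -2)
    order = np.argsort(-conf)      # most- to least-confident samples
    for t in range(1, n_iters + 1):
        frac = t / n_iters                 # enlarge both pools each iteration
        k_sel = int(frac * sel_frac * n)   # pseudo-label the most confident
        k_rej = int(frac * rej_frac * n)   # reject the least confident
        labels[:] = -2                     # everything else stays uncertain
        labels[order[:k_sel]] = pred[order[:k_sel]]
        if k_rej:
            labels[order[n - k_rej:]] = -1
        # In the full method, the OSLPP subspace would be re-learned here
        # from the selected/rejected samples, and `scores`, `conf`, and
        # `pred` recomputed before the next iteration.
    return labels
```

The key idea mirrored here is that the selected (known) and rejected (unknown) pools grow gradually, so early, high-confidence decisions guide the later, harder ones.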
Original language: English
Pages (from-to): 1-12
Journal: IEEE Transactions on Artificial Intelligence
Early online date: 25 Mar 2024
Publication status: E-pub ahead of print - 25 Mar 2024


Keywords:
  • Open-set domain adaptation
  • Pseudo-labelling
  • Locality preserving projection


