Testing and learning on distributions with symmetric noise invariance

Ho Chung Leon Law, Christopher Yau, Dino Sejdinovic

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution (peer-reviewed)

Abstract

Kernel embeddings of distributions, and the Maximum Mean Discrepancy (MMD) distance between distributions that they induce, are useful tools for fully nonparametric two-sample testing and learning on distributions. However, it is rare that all possible differences between samples are of interest: discovered differences can be due to different types of measurement noise, data collection artefacts, or other irrelevant sources of variability. We propose distances between distributions which encode invariance to additive symmetric noise, aimed at testing whether the assumed true underlying processes differ. Moreover, we construct invariant features of distributions, leading to learning algorithms robust to the corruption of the input distributions with symmetric additive noise.
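The invariance described in the abstract can be illustrated with phase features of a distribution, i.e. its empirical characteristic function normalised to unit modulus. Since additive symmetric noise has a real-valued characteristic function, it cancels under this normalisation. The sketch below is illustrative only and is not the authors' implementation; all function and variable names are my own.

```python
import numpy as np

rng = np.random.default_rng(0)

def phase_features(x, freqs):
    """Empirical characteristic function of sample x at the given
    frequencies, normalised to unit modulus (the 'phase')."""
    cf = np.exp(1j * np.outer(freqs, x)).mean(axis=1)
    return cf / np.abs(cf)

# Frequencies at which to probe the characteristic function.
freqs = 0.5 * rng.normal(size=50)

x = rng.normal(loc=1.0, size=20000)      # underlying sample
noise = rng.laplace(size=20000)          # symmetric about zero

f_clean = phase_features(x, freqs)
f_noisy = phase_features(x + noise, freqs)

# The Laplace noise has a real, positive characteristic function,
# so it divides out of the phase: features of X and X + noise
# nearly coincide (up to sampling error).
print(np.max(np.abs(f_clean - f_noisy)))
```

A distance or learning algorithm built on such features then compares the assumed underlying processes rather than the noise, which is the robustness property the abstract refers to.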

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 30 (NIPS 2017)
Editors: I Guyon, U Von Luxburg, S Bengio, H Wallach, R Fergus, S Vishwanathan, R Garnett
Place of Publication: Burlington, MA
Publisher: Morgan Kaufmann Publishers
Pages: 1344-1354
Number of pages: 11
ISBN (Print): 9781510860964
Publication status: Published - Dec 2017
Event: 31st Annual Conference on Neural Information Processing Systems - Long Beach, United States
Duration: 4 Dec 2017 - 9 Dec 2017

Conference

Conference: 31st Annual Conference on Neural Information Processing Systems
Abbreviated title: NIPS 2017
Country/Territory: United States
City: Long Beach
Period: 4/12/17 - 9/12/17

