Abstract
Kernel embeddings of distributions, and the Maximum Mean Discrepancy (MMD), the resulting distance between distributions, are useful tools for fully nonparametric two-sample testing and for learning on distributions. However, it is rare that all possible differences between samples are of interest: discovered differences can be due to different types of measurement noise, data-collection artefacts, or other irrelevant sources of variability. We propose distances between distributions which encode invariance to additive symmetric noise, aimed at testing whether the assumed true underlying processes differ. Moreover, we construct invariant features of distributions, leading to learning algorithms that are robust to the corruption of the input distributions with symmetric additive noise.
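The MMD referred to in the abstract compares two samples through kernel mean embeddings. As a minimal illustration of the standard (not noise-invariant) quantity, the following sketch computes the common unbiased estimator of the squared MMD with a Gaussian kernel; the function names and the bandwidth parameter `sigma` are illustrative choices, not part of the paper:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2_unbiased(X, Y, sigma=1.0):
    # Unbiased estimator of squared MMD between samples X (m x d) and Y (n x d):
    # diagonal (self-comparison) terms are excluded from the within-sample sums.
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2.0 * Kxy.mean()
```

In a two-sample test, this statistic is large when the samples come from clearly different distributions and close to zero when they come from the same one; the paper's contribution is to modify such distances so that differences attributable purely to additive symmetric noise do not register.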
Original language | English |
---|---|
Title of host publication | Advances in Neural Information Processing Systems 30 (NIPS 2017) |
Editors | I Guyon, U von Luxburg, S Bengio, H Wallach, R Fergus, S Vishwanathan, R Garnett |
Place of Publication | Burlington, MA |
Publisher | Morgan Kaufmann Publishers |
Pages | 1344-1354 |
Number of pages | 11 |
ISBN (Print) | 9781510860964 |
Publication status | Published - Dec 2017 |
Event | 31st Annual Conference on Neural Information Processing Systems, Long Beach, United States, 4 Dec 2017 → 9 Dec 2017 |
Conference
Conference | 31st Annual Conference on Neural Information Processing Systems |
---|---|
Abbreviated title | NIPS 2017 |
Country/Territory | United States |
City | Long Beach |
Period | 4/12/17 → 9/12/17 |