Direct Gaussian process quantile regression using expectation propagation

Alexis Boukouvalas*, Remi Barillec, Dan Cornford

*Corresponding author for this work

Research output: Chapter in Book/Conference proceeding › Conference contribution › peer-review

Abstract

Direct quantile regression involves estimating a given quantile of a response variable as a function of input variables. We present a new framework for direct quantile regression in which a Gaussian process model is learned by minimising the expected tilted loss function. The integration required in learning is not analytically tractable, so to speed up the learning we employ the Expectation Propagation algorithm. We describe how this work relates to other quantile regression methods and apply the method to both synthetic and real data sets. The method is shown to be competitive with state-of-the-art methods whilst allowing the full Gaussian process probabilistic framework to be leveraged.
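
The abstract centres on the tilted (pinball) loss. As a loose illustration only, and not the authors' Gaussian process / Expectation Propagation implementation, the sketch below defines that loss and fits a toy linear quantile model by minimising it directly; the data, model form, and optimiser choice are all illustrative assumptions.

```python
# Minimal sketch of the tilted (pinball) loss used in direct quantile regression.
# Assumed toy example: a linear-in-x quantile model, NOT the paper's GP/EP method.
import numpy as np
from scipy.optimize import minimize

def tilted_loss(residual, tau):
    """Pinball / tilted loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return np.maximum(tau * residual, (tau - 1.0) * residual)

# Toy data with input-dependent noise, so different quantiles have different shapes.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2.0 * np.pi * x) + (0.1 + 0.3 * x) * rng.standard_normal(x.size)

def objective(w, tau):
    # Illustrative quantile model f(x) = w0 + w1 * x.
    f = w[0] + w[1] * x
    return tilted_loss(y - f, tau).sum()

tau = 0.9
w_hat = minimize(objective, x0=np.zeros(2), args=(tau,), method="Nelder-Mead").x
print(f"Estimated {tau:.0%} quantile line: f(x) = {w_hat[0]:.2f} + {w_hat[1]:.2f} x")
```

In the paper this loss enters through a likelihood whose expectation under the Gaussian process posterior is not analytically tractable, which is where Expectation Propagation is used; the sketch above only shows the loss being minimised pointwise.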

Original language: English
Title of host publication: Proceedings of the 29th International Conference on Machine Learning, ICML 2012
Place of Publication: Madison, WI
Publisher: Omnipress
Pages: 1695-1702
Number of pages: 8
ISBN (Print): 9781450312851
Publication status: Published - 26 Jun 2012
Event: 29th International Conference on Machine Learning, ICML 2012 - Edinburgh, United Kingdom
Duration: 26 Jun 2012 - 1 Jul 2012

Publication series

Name: Proceedings of the 29th International Conference on Machine Learning, ICML 2012
Volume: 2

Conference

Conference: 29th International Conference on Machine Learning, ICML 2012
Country/Territory: United Kingdom
City: Edinburgh
Period: 26/06/12 - 01/07/12
