Sparse Overlapping Sets Lasso for multitask learning and its applications to fMRI analysis

Nikhil Rao, Christopher Cox, Robert Nowak, Timothy T. Rogers

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Multitask learning can be effective when features useful in one task are also useful for other tasks, and the group lasso is a standard method for selecting a common subset of features. In this paper, we are interested in a less restrictive form of multitask learning, in which (1) the available features can be organized into subsets according to a notion of similarity, and (2) features useful in one task are similar, but not necessarily identical, to the features best suited for other tasks. The main contribution of this paper is a new procedure called the Sparse Overlapping Sets (SOS) lasso, a convex optimization that automatically selects similar features for related learning tasks. Error bounds are derived for SOSlasso, and its consistency is established for squared error loss. SOSlasso is motivated by multisubject fMRI studies in which functional activity is classified using brain voxels as features. Experiments with real and synthetic data demonstrate the advantages of SOSlasso compared to the lasso and group lasso.
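To make the flavor of the penalty concrete: SOSlasso encourages both group-level sparsity (few sets active) and within-group sparsity (few features active inside each set). As a rough illustration only, not the paper's exact formulation (which handles overlapping sets via a latent decomposition), the following sketch implements the proximal operator of the simpler sparse group lasso penalty λ₁‖w‖₁ + λ₂ Σ_g ‖w_g‖₂ for non-overlapping groups; the function name and group representation are hypothetical:

```python
import numpy as np

def prox_sparse_group_lasso(w, groups, lam1, lam2):
    """Proximal operator of lam1*||w||_1 + lam2*sum_g ||w_g||_2,
    assuming the groups partition the coordinates (no overlap).

    For this penalty the prox factorizes: first apply elementwise
    soft-thresholding (the l1 part), then group-wise shrinkage
    (the l2-norm part), zeroing out groups whose norm is small.
    """
    # Elementwise soft-threshold at lam1
    v = np.sign(w) * np.maximum(np.abs(w) - lam1, 0.0)
    out = np.zeros_like(v)
    for g in groups:
        norm = np.linalg.norm(v[g])
        if norm > lam2:
            # Shrink the whole group toward zero; groups with
            # norm <= lam2 are eliminated entirely.
            out[g] = (1.0 - lam2 / norm) * v[g]
    return out

# Example: two groups of two coordinates each.
w = np.array([3.0, 0.1, -2.0, 0.05])
groups = [np.array([0, 1]), np.array([2, 3])]
prox_sparse_group_lasso(w, groups, 0.5, 0.5)
```

Plugging this prox into a standard proximal-gradient loop on the squared error loss would give a basic solver for this simplified penalty; the overlapping-set case in the paper requires duplicating features across groups before applying the same idea.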
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 26
Editors: C. Burges, L. Bottou, M. Welling, Z. Ghahramani, K. Weinberger
Publication status: Published - 2013

