CSE


AI Seminar

Convex Methods for Latent Representation Learning

Dale Schuurmans


Professor of Computer Science
University of Alberta
 
Tuesday, September 15, 2015
3:00pm - 4:15pm
3725 BBB


About the Event

Automated feature discovery is a fundamental problem in data analysis. Although classical feature learning methods fail to guarantee optimal solutions in general, convex reformulations have been developed for a number of such problems. Most of these reformulations are based on one of two key strategies: relaxing pairwise representations, or exploiting induced matrix norms. Despite their use of relaxation, convex reformulations can demonstrate significant improvements in solution quality by eliminating local minima. I will discuss a few recent convex reformulations for representative learning problems, including robust regression, hidden-layer network training, and multi-view learning, demonstrating how latent representation discovery can co-occur with parameter optimization while admitting globally optimal relaxed solutions. In some cases, meaningful rounding guarantees can also be achieved.
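To illustrate the second strategy mentioned in the abstract, here is a minimal sketch (not from the talk itself) of how an induced matrix norm can convexify a latent-representation problem. A rank constraint on a latent factorization is nonconvex, but replacing it with a nuclear-norm penalty yields the convex problem min_X 0.5*||X - Y||_F^2 + lam*||X||_*, whose global optimum has a closed form via singular value soft-thresholding. The function name `svt` and the parameter choices below are illustrative assumptions, not part of the abstract.

```python
import numpy as np

def svt(Y, lam):
    """Singular value thresholding.

    Closed-form global minimizer of the convex relaxation
        min_X 0.5 * ||X - Y||_F^2 + lam * ||X||_*
    where ||.||_* is the nuclear norm (sum of singular values),
    a convex surrogate for rank.
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    # Soft-threshold the singular values; small ones are zeroed,
    # so the solution is low-rank without any nonconvex rank constraint.
    s_thr = np.maximum(s - lam, 0.0)
    return U @ np.diag(s_thr) @ Vt

rng = np.random.default_rng(0)
# Synthetic data: rank-3 latent structure plus small Gaussian noise.
L = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 20))
Y = L + 0.1 * rng.standard_normal((30, 20))
X = svt(Y, lam=2.0)  # recovers a low-rank estimate, globally optimally
```

Because the relaxed objective is convex, the solution above is a global optimum, in contrast to alternating minimization over explicit factors, which can stall in local minima.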

Additional Information

Sponsor(s): Toyota

Faculty Sponsor: Michael Wellman

Open to: Public