Monday, May 28, 2012

Reading List for ICML 2012

The list of accepted papers for ICML 2012 is out, and following some of my colleagues, I'll post the papers that caught my eye at first glance:

(Disclaimer: since my research tends to be on nonparametric statistics, I gravitate towards papers on those topics.)



Gaussian Process Regression Networks
Andrew Wilson, David Knowles, Zoubin Ghahramani

Abstract: We introduce a new regression framework, Gaussian process regression networks (GPRN), which combines the structural properties of Bayesian neural networks with the nonparametric flexibility of Gaussian processes. GPRN accommodates input (predictor) dependent signal and noise correlations between multiple output (response) variables, input dependent length-scales and amplitudes, and heavy-tailed predictive distributions. We derive both elliptical slice sampling and variational Bayes inference procedures for GPRN. We apply GPRN as a multiple output regression and multivariate volatility model, demonstrating substantially improved performance over eight popular multiple output (multi-task) Gaussian process models and three multivariate volatility models on real datasets, including a 1000 dimensional gene expression dataset.

Quick Opinion: Based on the abstract and a quick reading of the arXiv version, this looks like a nice extension of Gaussian processes. Merging Bayesian neural networks with GPs also seems like a good idea for specific problems such as gene regulatory network inference, given that some people have been using recursive neural networks for such tasks.
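
To make the idea a bit more concrete, here is a minimal NumPy sketch of the kind of construction the abstract describes: a few latent GP functions mixed through input-dependent weights that are themselves GP draws, so the correlations between outputs change with the input. The kernels, dimensions, and noise levels below are my own illustrative assumptions, not the authors' exact parameterization.

import numpy as np

def rbf_kernel(x, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix for 1-D inputs x."""
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
n, q, p = 200, 2, 3              # inputs, latent GPs, outputs (assumed sizes)
x = np.linspace(0, 10, n)
jitter = 1e-6 * np.eye(n)

K_f = rbf_kernel(x, lengthscale=1.0)   # kernel for the latent "node" functions
K_w = rbf_kernel(x, lengthscale=3.0)   # smoother kernel for the mixing weights

# q latent functions f_j(x) ~ GP(0, K_f), stored as rows of shape (q, n)
f = rng.multivariate_normal(np.zeros(n), K_f + jitter, size=q)

# p x q weight functions W_ij(x) ~ GP(0, K_w), shape (p, q, n)
W = rng.multivariate_normal(np.zeros(n), K_w + jitter, size=(p, q))

# Outputs y_i(x) = sum_j W_ij(x) f_j(x) + observation noise, so both the
# signal amplitude and the between-output correlations vary with x.
y = np.einsum('pqn,qn->pn', W, f) + 0.1 * rng.standard_normal((p, n))
print(y.shape)   # (3, 200): three correlated output series over the input grid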

Modeling Images using Transformed Indian Buffet Processes
Ke Zhai, Yuening Hu, Jordan Boyd-Graber, Sinead Williamson


Abstract: Latent feature models are attractive for image modeling; images generally contain multiple objects. However, many latent feature models ignore that objects can appear at different locations, or require pre-segmentation of images. While the transformed Indian buffet process (tIBP) provides a method for modeling transformation-invariant features in simple, unsegmented binary images, in its current form it is inappropriate for real images because of computational constraints and modeling assumptions. We combine the tIBP with likelihoods appropriate for real images. We also develop an efficient inference scheme using the cross-correlation between images and features that is both theoretically and empirically faster than existing inference techniques. We demonstrate that, using our method, we are able to discover reasonable components and achieve effective image reconstruction in natural images.

Quick Opinion: I could not find the PDF, so judging only from the abstract, the paper seems pretty interesting, although I'm curious about exactly how they extended the tIBP with likelihoods suited to real images.
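
For readers meeting the Indian buffet process for the first time, a minimal sketch of the plain IBP prior (the untransformed version the paper builds on) may help: each image picks up previously used latent features with probability proportional to their popularity, then adds a Poisson number of brand-new ones. This is only the standard "buffet" construction written in Python for illustration; it says nothing about the transformations or likelihoods the paper adds.

import numpy as np

def sample_ibp(num_images, alpha=2.0, seed=0):
    """Draw a binary matrix Z (images x latent features) from an IBP prior.

    Row i reuses an existing feature k with probability m_k / (i + 1), where
    m_k counts how many previous images used it, then adds
    Poisson(alpha / (i + 1)) brand-new features.
    """
    rng = np.random.default_rng(seed)
    rows = []       # each row: 0/1 ownership over the features seen so far
    counts = []     # counts[k] = number of images owning feature k
    for i in range(num_images):
        row = [int(rng.random() < counts[k] / (i + 1)) for k in range(len(counts))]
        for k, z in enumerate(row):
            counts[k] += z
        new = rng.poisson(alpha / (i + 1))
        row.extend([1] * new)
        counts.extend([1] * new)
        rows.append(row)
    Z = np.zeros((num_images, len(counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z

Z = sample_ibp(10)
print(Z)   # each column is a latent feature; each row an image's feature set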



Revisiting k-means: New Algorithms via Bayesian Nonparametrics
Brian Kulis, Michael I. Jordan

Abstract: Bayesian models offer great flexibility for clustering applications—Bayesian nonparametrics can be used for modeling infinite mixtures, and hierarchical Bayesian models can be utilized for shared clusters across multiple data sets. For the most part, such flexibility is lacking in classical clustering methods such as k-means. In this paper, we revisit the k-means clustering algorithm from a Bayesian nonparametric viewpoint. Inspired by the asymptotic connection between k-means and mixtures of Gaussians, we show that a Gibbs sampling algorithm for the Dirichlet process mixture approaches a hard clustering algorithm in the limit, and further that the resulting algorithm monotonically minimizes an elegant underlying k-means-like clustering objective that includes a penalty for the number of clusters. We generalize this analysis to the case of clustering multiple data sets through a similar asymptotic argument with the hierarchical Dirichlet process. We also discuss further extensions that highlight the benefits of our analysis: i) a spectral relaxation involving thresholded eigenvectors, and ii) a normalized cut graph clustering algorithm that does not fix the number of clusters in the graph.

Quick Opinion: I think this extension was something that was missing in ML, and I'm very intrigued by this paper in particular; I remember reading about how k-means arises as the small-variance limit of a mixture of spherical ("circular") Gaussians.
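
The "k-means-like objective with a penalty for the number of clusters" is easy to caricature in code. Below is a rough Python sketch of that kind of hard-assignment rule, assuming a squared-distance threshold lam plays the role of the penalty: a point joins its nearest center unless every center is farther than lam, in which case it opens a new cluster. This is only my reading of the abstract, not necessarily the authors' exact algorithm.

import numpy as np

def penalized_kmeans(X, lam, max_iter=100, seed=0):
    """Hard clustering with a new-cluster penalty (sketch based on the
    abstract): points farther than lam from every center open a new one."""
    rng = np.random.default_rng(seed)
    centers = [X[rng.integers(len(X))].copy()]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(max_iter):
        # Assignment step: nearest center, or a new cluster if all are too far.
        for i, x in enumerate(X):
            d2 = [np.sum((x - c) ** 2) for c in centers]
            k = int(np.argmin(d2))
            if d2[k] > lam:
                centers.append(x.copy())
                k = len(centers) - 1
            labels[i] = k
        # Update step: recompute centers as cluster means, dropping empty ones.
        new_centers = [X[labels == k].mean(axis=0)
                       for k in range(len(centers)) if np.any(labels == k)]
        converged = (len(new_centers) == len(centers) and
                     all(np.allclose(a, b) for a, b in zip(new_centers, centers)))
        centers = new_centers
        if converged:
            break
    return np.array(centers), labels

# Toy example: three well-separated blobs should yield roughly three clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(50, 2)) for m in (0.0, 5.0, 10.0)])
centers, labels = penalized_kmeans(X, lam=4.0)
print(len(centers), "clusters found")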


   
