Variational inference of penalized regression with submodular functions

Koh Takeuchi, Yuichi Yoshida, Yoshinobu Kawahara

Research output: Contribution to conference › Paper › peer-review

Abstract

Various regularizers that induce structured sparsity can be constructed as Lovász extensions of submodular functions. In this paper, we consider a hierarchical probabilistic model of linear regression, and its kernel extension, with this type of regularization, and develop a variational inference scheme for posterior estimation in this model. We derive an upper bound on the partition function with an approximation guarantee, and show that minimizing this bound is equivalent to minimizing a quadratic function over the polyhedron determined by the corresponding submodular function, which can be solved efficiently by the proximal gradient algorithm. Our scheme naturally extends the Bayesian Lasso model for maximum a posteriori (MAP) estimation to a variety of structured-sparsity-inducing regularizers, and thus provides a principled way to transfer the advantages of the Bayesian formulation to those models. Finally, we investigate the empirical performance of our scheme with several Bayesian variants of widely known models such as the Lasso, generalized fused Lasso, and non-overlapping group Lasso.
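
As an illustrative aside, not the paper's actual implementation: the quadratic-over-polyhedron subproblem mentioned in the abstract is handled by a proximal gradient method, and its simplest instance arises from the cardinality function f(S) = |S|, whose associated Lovász-extension norm is the ℓ1 norm, recovering the Lasso. Below is a minimal ISTA sketch for that special case (the names `soft_threshold` and `proximal_gradient_lasso` are hypothetical helpers, not from the paper):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1, the norm induced by the
    # cardinality submodular function f(S) = |S|.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(X, y, lam, n_iter=500):
    # Minimize (1/2)||y - Xw||^2 + lam * ||w||_1 by ISTA; a sketch of
    # the proximal gradient step only, not the paper's variational scheme.
    n, d = X.shape
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the smooth part
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)   # gradient of the quadratic term
        w = soft_threshold(w - grad / L, lam / L)
    return w

# Toy usage on synthetic sparse data.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.1 * rng.standard_normal(50)
w_hat = proximal_gradient_lasso(X, y, lam=1.0)
```

For a general submodular penalty, the soft-thresholding step would be replaced by the proximal operator of the corresponding Lovász extension, which can itself be computed via submodular minimization.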

Original language: English
Publication status: Published - Jan 1 2019
Event: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019 - Tel Aviv, Israel
Duration: Jul 22 2019 - Jul 25 2019

Conference

Conference: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019
Country/Territory: Israel
City: Tel Aviv
Period: 7/22/19 - 7/25/19

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
