Variational inference of penalized regression with submodular functions

Koh Takeuchi, Yuichi Yoshida, Yoshinobu Kawahara

Research output: Contribution to conference › Paper

Abstract

Various regularizers that induce structured sparsity can be constructed as Lovász extensions of submodular functions. In this paper, we consider a hierarchical probabilistic model of linear regression, and its kernel extension, with this type of regularization, and develop a variational inference scheme for posterior estimation in this model. We derive an upper bound on the partition function with an approximation guarantee, and then show that minimizing this bound is equivalent to minimizing a quadratic function over the polyhedron determined by the corresponding submodular function, which can be solved efficiently by the proximal gradient algorithm. Our scheme naturally extends the Bayesian Lasso model for maximum a posteriori (MAP) estimation to a variety of regularizers that induce structured sparsity, and thus provides a principled way to transfer the advantages of the Bayesian formulation to those models. Finally, we investigate the empirical performance of our scheme with several Bayesian variants of widely known models, such as the Lasso, the generalized fused Lasso, and the non-overlapping group Lasso.
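For readers unfamiliar with the construction the abstract builds on, the minimal sketch below illustrates the two ingredients in their simplest special case: evaluating the Lovász extension of a set function via the greedy formula, and solving the resulting l1-penalized regression by proximal gradient descent. This is an illustration only, not the paper's variational inference scheme (whose bound and polyhedral subproblem are more involved); the function names lovasz_extension and ista, and all data and parameter values, are hypothetical choices made for the example.

import numpy as np

def lovasz_extension(F, w):
    # Evaluate the Lovasz extension f(w) of a set function F with
    # F(frozenset()) == 0, using the greedy formula. For submodular F,
    # f is convex; applied to |beta|, it yields structured-sparsity
    # penalties of the kind discussed in the abstract.
    order = np.argsort(-w)              # coordinates in decreasing order
    value, prev, S = 0.0, 0.0, set()
    for i in order:
        S.add(int(i))
        cur = F(frozenset(S))
        value += w[i] * (cur - prev)    # weight by the marginal gain of F
        prev = cur
    return value

def ista(X, y, lam, step, iters=1000):
    # Proximal gradient (ISTA) for the Lasso special case: the
    # cardinality function F(S) = |S| has the l1 norm as its Lovasz
    # extension, so the proximal operator is soft-thresholding.
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ beta - y)     # gradient of 0.5 * ||y - X beta||^2
        z = beta - step * grad
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return beta

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
true_beta = np.zeros(10)
true_beta[:3] = [2.0, -1.5, 1.0]
y = X @ true_beta + 0.1 * rng.standard_normal(50)

step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
beta_hat = ista(X, y, lam=1.0, step=step)

card = lambda S: float(len(S))          # F(S) = |S|
assert np.isclose(lovasz_extension(card, np.abs(beta_hat)), np.abs(beta_hat).sum())

For F(S) = |S|, the Lovász extension of |beta| is exactly the l1 norm, which is why soft-thresholding serves as the proximal operator above. The paper's scheme concerns general submodular F, for which the corresponding quadratic subproblem is solved over the polyhedron associated with F.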

Original language: English
Publication status: Published - Jan 1 2019
Event: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019 - Tel Aviv, Israel
Duration: Jul 22 2019 - Jul 25 2019

Conference

Conference: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019
Country: Israel
City: Tel Aviv
Period: 7/22/19 - 7/25/19

Fingerprint

  • Linear regression
  • Statistical Models

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence

Cite this

APA

Takeuchi, K., Yoshida, Y., & Kawahara, Y. (2019). Variational inference of penalized regression with submodular functions. Paper presented at 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019, Tel Aviv, Israel.

Standard

Variational inference of penalized regression with submodular functions. / Takeuchi, Koh; Yoshida, Yuichi; Kawahara, Yoshinobu.

2019. Paper presented at 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019, Tel Aviv, Israel.

Research output: Contribution to conference › Paper

Harvard

Takeuchi, K, Yoshida, Y & Kawahara, Y 2019, 'Variational inference of penalized regression with submodular functions', Paper presented at 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019, Tel Aviv, Israel, 7/22/19 - 7/25/19.

Vancouver

Takeuchi K, Yoshida Y, Kawahara Y. Variational inference of penalized regression with submodular functions. 2019. Paper presented at 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019, Tel Aviv, Israel.

Author

Takeuchi, Koh ; Yoshida, Yuichi ; Kawahara, Yoshinobu. / Variational inference of penalized regression with submodular functions. Paper presented at 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019, Tel Aviv, Israel.
BibTeX

@conference{cff45fedd9834226bdb2ce4bef2c06e9,
title = "Variational inference of penalized regression with submodular functions",
abstract = "Various regularizers inducing structured-sparsity are constructed as Lov{\'a}sz extensions of submodular functions. In this paper, we consider a hierarchical probabilistic model of linear regression and its kernel extension with this type of regularization, and develop a variational inference scheme for the posterior estimate on this model. We derive an upper bound on the partition function with an approximation guarantee, and then show that minimizing this bound is equivalent to the minimization of a quadratic function over the polyhedron determined by the corresponding submodular function, which can be solved efficiently by the proximal gradient algorithm. Our scheme gives a natural extension of the Bayesian Lasso model for the maximum a posteriori (MAP) estimation to a variety of regularizers inducing structured sparsity, and thus this work provides a principled way to transfer the advantages of the Bayesian formulation into those models. Finally, we investigate the empirical performance of our scheme with several Bayesian variants of widely known models such as Lasso, generalized fused Lasso, and non-overlapping group Lasso.",
author = "Koh Takeuchi and Yuichi Yoshida and Yoshinobu Kawahara",
year = "2019",
month = "1",
day = "1",
language = "English",
note = "35th Conference on Uncertainty in Artificial Intelligence, UAI 2019 ; Conference date: 22-07-2019 Through 25-07-2019",

}

RIS

TY - CONF

T1 - Variational inference of penalized regression with submodular functions

AU - Takeuchi, Koh

AU - Yoshida, Yuichi

AU - Kawahara, Yoshinobu

PY - 2019/1/1

Y1 - 2019/1/1

N2 - Various regularizers inducing structured-sparsity are constructed as Lovász extensions of submodular functions. In this paper, we consider a hierarchical probabilistic model of linear regression and its kernel extension with this type of regularization, and develop a variational inference scheme for the posterior estimate on this model. We derive an upper bound on the partition function with an approximation guarantee, and then show that minimizing this bound is equivalent to the minimization of a quadratic function over the polyhedron determined by the corresponding submodular function, which can be solved efficiently by the proximal gradient algorithm. Our scheme gives a natural extension of the Bayesian Lasso model for the maximum a posteriori (MAP) estimation to a variety of regularizers inducing structured sparsity, and thus this work provides a principled way to transfer the advantages of the Bayesian formulation into those models. Finally, we investigate the empirical performance of our scheme with several Bayesian variants of widely known models such as Lasso, generalized fused Lasso, and non-overlapping group Lasso.

UR - http://www.scopus.com/inward/record.url?scp=85073246566&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85073246566&partnerID=8YFLogxK

M3 - Paper

AN - SCOPUS:85073246566

ER -