Multi-Task Learning for Compositional Data via Sparse Network Lasso

Akira Okazaki, Shuichi Kawano

Research output: Contribution to journal › Article › peer-review

Abstract

Multi-task learning is a statistical methodology that aims to improve the generalization performance of estimation and prediction by sharing common information among multiple tasks. Compositional data, by contrast, consist of proportions whose components sum to one. Because the components of compositional data depend on each other, existing multi-task learning methods cannot be applied to them directly. Within the multi-task learning framework, network lasso regularization allows each sample to be treated as a single task, so that a different model is constructed for each one. In this paper, we propose a multi-task learning method for compositional data using a sparse network lasso. We focus on a symmetric form of the log-contrast model, a regression model with compositional covariates. The proposed method extracts latent clusters and relevant variables for compositional data by taking relationships among samples into account. Its effectiveness is evaluated through simulation studies and an application to gut microbiome data. Both results show that the prediction accuracy of the proposed method is better than that of existing methods when information about relationships among samples is appropriately obtained.
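The symmetric log-contrast model mentioned in the abstract regresses a response on the logarithms of compositional covariates under a zero-sum constraint on the coefficients, which makes the fit invariant to the unit-sum scaling of the composition. The sketch below is a minimal illustration of that base model only (fitted by equality-constrained least squares via the KKT system); it is not the authors' method, and the sparse network lasso penalty and per-sample models are omitted. All variable names are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 5

# Simulate compositional covariates: each row is a vector of
# proportions summing to one (hypothetical data for illustration).
raw = rng.gamma(shape=2.0, size=(n, p))
X = raw / raw.sum(axis=1, keepdims=True)

# Log-contrast model: y = sum_j beta_j * log(x_j) + noise,
# with the zero-sum constraint sum_j beta_j = 0.
beta_true = np.array([1.0, -1.0, 0.5, -0.5, 0.0])
Z = np.log(X)
y = Z @ beta_true + 0.1 * rng.standard_normal(n)

# Equality-constrained least squares via the KKT system:
#   [ Z'Z  1 ] [ beta   ]   [ Z'y ]
#   [ 1'   0 ] [ lambda ] = [ 0   ]
ones = np.ones(p)
kkt = np.block([[Z.T @ Z, ones[:, None]],
                [ones[None, :], np.zeros((1, 1))]])
rhs = np.concatenate([Z.T @ y, [0.0]])
beta_hat = np.linalg.solve(kkt, rhs)[:p]

# The estimated coefficients satisfy the zero-sum constraint.
print(beta_hat.sum())
```

The zero-sum constraint is what makes the coefficients interpretable as log-contrasts between components; the paper's contribution is to add a sparse network lasso penalty on top of this base model so that samples connected in a network share similar coefficient vectors.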

Original language: English
Article number: 1839
Journal: Entropy
Volume: 24
Issue number: 12
DOIs
Publication status: Published - Dec 2022

All Science Journal Classification (ASJC) codes

  • Information Systems
  • Mathematical Physics
  • Physics and Astronomy (miscellaneous)
  • Electrical and Electronic Engineering
