Feature-based inductive transfer learning through minimum encoding

Hao Shao, Einoshin Suzuki

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

This paper proposes an Extended Minimum Description Length Principle (EMDLP) for feature-based inductive transfer learning, in which both the source and the target data sets contain class labels and relevant features are transferred from the source domain to the target one. Despite numerous works on this topic, few of them are built on a solid theoretical framework and are parameter-free. Our EMDLP overcomes these flaws and allows us to evaluate the inferiority of the results of transfer learning with the sum of the code lengths of five components: the corresponding two hypotheses, the two data sets encoded with the help of the hypotheses, and the set of the transferred features. We design a code book to build the connections between the source and the target tasks. Extensive experiments using both real and artificial data sets show that EMDLP is robust against noise and achieves higher classification accuracy than state-of-the-art methods.
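
As a rough sketch of the minimum-encoding criterion described above (the notation here is assumed for illustration and is not taken from the paper itself), the quantity EMDLP evaluates would be the sum of five code lengths,

L_total = L(h_S) + L(h_T) + L(D_S | h_S) + L(D_T | h_T) + L(F),

where h_S and h_T denote the source and target hypotheses, D_S and D_T are the two labelled data sets encoded with the help of those hypotheses, and F is the set of features transferred from the source domain to the target one via the code book; under a minimum-encoding view, the candidate with the smaller total code length is the preferred one.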

Original language: English
Title of host publication: Proceedings of the 11th SIAM International Conference on Data Mining, SDM 2011
Pages: 259-270
Number of pages: 12
Publication status: Published - 2011
Event: 11th SIAM International Conference on Data Mining, SDM 2011 - Mesa, AZ, United States
Duration: Apr 28, 2011 - Apr 30, 2011

Other

Other: 11th SIAM International Conference on Data Mining, SDM 2011
Country: United States
City: Mesa, AZ
Period: 4/28/11 - 4/30/11


All Science Journal Classification (ASJC) codes

  • Software

Cite this

Shao, H., & Suzuki, E. (2011). Feature-based inductive transfer learning through minimum encoding. In Proceedings of the 11th SIAM International Conference on Data Mining, SDM 2011 (pp. 259-270).

@inproceedings{ae4cedb1beb9417995025439d82ae398,
title = "Feature-based inductive transfer learning through minimum encoding",
abstract = "This paper proposes an Extended Minimum Description Length Principle (EMDLP) for feature-based inductive transfer learning, in which both the source and the target data sets contain class labels and relevant features are transferred from the source domain to the target one. Despite numerous works on this topic, few of them have a solid theoretical framework and are parameter-free. Our EMDLP overcomes these flaws and allows us to evaluate the inferiority of the results of transfer learning with the add-sum of the code lengths of five components: the corresponding two hypotheses, the two data sets with the help of the hypotheses, and the set of the transferred features. We design a code book to build the connections between the source and the target tasks. Extensive experiments using both real and artificial data sets show that EMDLP is robust against noise and performs better on the classification accuracy than the state-of-the-art methods.",
author = "Hao Shao and Einoshin Suzuki",
year = "2011",
language = "English",
isbn = "9780898719925",
pages = "259--270",
booktitle = "Proceedings of the 11th SIAM International Conference on Data Mining, SDM 2011",

}

TY - GEN

T1 - Feature-based inductive transfer learning through minimum encoding

AU - Shao, Hao

AU - Suzuki, Einoshin

PY - 2011

Y1 - 2011

N2 - This paper proposes an Extended Minimum Description Length Principle (EMDLP) for feature-based inductive transfer learning, in which both the source and the target data sets contain class labels and relevant features are transferred from the source domain to the target one. Despite numerous works on this topic, few of them have a solid theoretical framework and are parameter-free. Our EMDLP overcomes these flaws and allows us to evaluate the inferiority of the results of transfer learning with the add-sum of the code lengths of five components: the corresponding two hypotheses, the two data sets with the help of the hypotheses, and the set of the transferred features. We design a code book to build the connections between the source and the target tasks. Extensive experiments using both real and artificial data sets show that EMDLP is robust against noise and performs better on the classification accuracy than the state-of-the-art methods.

AB - This paper proposes an Extended Minimum Description Length Principle (EMDLP) for feature-based inductive transfer learning, in which both the source and the target data sets contain class labels and relevant features are transferred from the source domain to the target one. Despite numerous works on this topic, few of them have a solid theoretical framework and are parameter-free. Our EMDLP overcomes these flaws and allows us to evaluate the inferiority of the results of transfer learning with the add-sum of the code lengths of five components: the corresponding two hypotheses, the two data sets with the help of the hypotheses, and the set of the transferred features. We design a code book to build the connections between the source and the target tasks. Extensive experiments using both real and artificial data sets show that EMDLP is robust against noise and performs better on the classification accuracy than the state-of-the-art methods.

UR - http://www.scopus.com/inward/record.url?scp=84872715330&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84872715330&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9780898719925

SP - 259

EP - 270

BT - Proceedings of the 11th SIAM International Conference on Data Mining, SDM 2011

ER -