This paper proposes an Extended Minimum Description Length Principle (EMDLP) for feature-based inductive transfer learning, in which both the source and the target data sets contain class labels and relevant features are transferred from the source domain to the target one. Despite numerous works on this topic, few rest on a solid theoretical framework, and few are parameter-free. Our EMDLP overcomes these flaws and allows us to evaluate the quality of the results of transfer learning as the sum of the code lengths of five components: the two hypotheses, the two data sets encoded with the help of those hypotheses, and the set of transferred features. We design a code book to build the connections between the source and the target tasks. Extensive experiments on both real and artificial data sets show that EMDLP is robust against noise and achieves higher classification accuracy than state-of-the-art methods.
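The five-component description length sketched in the abstract can be written schematically as follows (the notation here is ours, not necessarily the paper's: $h_S$ and $h_T$ denote the source and target hypotheses, $D_S$ and $D_T$ the source and target data sets, and $F$ the set of transferred features; the exact conditioning of each term is an assumption):

```latex
L_{\mathrm{EMDLP}}
  = L(h_S) + L(h_T)
  + L(D_S \mid h_S) + L(D_T \mid h_T)
  + L(F)
```

In the MDL spirit, a smaller total code length indicates a better combination of hypotheses and transferred features.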