Theory and algorithm for learning with dissimilarity functions

Liwei Wang, Masashi Sugiyama, Cheng Yang, Kohei Hatano, Jufu Feng

Research output: Contribution to journal › Letter › peer-review

15 Citations (Scopus)


We study the problem of classification when only a dissimilarity function between objects is accessible. That is, data samples are represented not by feature vectors but in terms of their pairwise dissimilarities. We establish sufficient conditions for dissimilarity functions to allow building accurate classifiers. The theory immediately suggests a learning paradigm: construct an ensemble of simple classifiers, each depending on a pair of examples; then find a convex combination of them to achieve a large margin. We next develop a practical algorithm referred to as dissimilarity-based boosting (DBoost) for learning with dissimilarity functions under theoretical guidance. Experiments on a variety of databases demonstrate that the DBoost algorithm is promising for several dissimilarity measures widely used in practice.
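The paradigm described in the abstract can be sketched in code. The following is a minimal illustration, not the paper's actual DBoost algorithm: it assumes each weak classifier is built from a pair of training examples (i, j), predicting the label of whichever example a query point is less dissimilar to, and combines the pool of such classifiers with AdaBoost-style reweighting. Normalizing the resulting weights would yield the convex combination mentioned above without changing the sign of the prediction. All function and variable names here are hypothetical.

```python
import numpy as np

def dboost_sketch(D, y, pairs, T=10):
    """Boost pair-based weak classifiers from a dissimilarity matrix.

    D: (n, n) matrix of pairwise dissimilarities among training points.
    y: labels in {-1, +1}.
    pairs: candidate (i, j) index pairs defining weak classifiers.
    Returns the classifier weights and the chosen pairs.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)                      # sample weights
    alphas, chosen = [], []
    for _ in range(T):
        best = None
        for (i, j) in pairs:
            # weak classifier: vote y[i] if the point is closer to i than to j
            h = np.where(D[:, i] < D[:, j], y[i], y[j])
            err = w[h != y].sum()
            if best is None or err < best[0]:
                best = (err, (i, j), h)
        err, pair, h = best
        err = np.clip(err, 1e-10, 1 - 1e-10)      # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)     # standard AdaBoost weight
        w *= np.exp(-alpha * y * h)               # upweight mistakes
        w /= w.sum()
        alphas.append(alpha)
        chosen.append(pair)
    return alphas, chosen

def dboost_predict(D_test, y, alphas, chosen):
    """D_test: (m, n) dissimilarities from test points to training points."""
    score = np.zeros(D_test.shape[0])
    for a, (i, j) in zip(alphas, chosen):
        score += a * np.where(D_test[:, i] < D_test[:, j], y[i], y[j])
    return np.sign(score)
```

For example, on two well-separated clusters on the real line with D(x, x') = |x - x'|, boosting over cross-label pairs separates the classes after a few rounds.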

Original language: English
Pages (from-to): 1459-1484
Number of pages: 26
Journal: Neural Computation
Issue number: 5
Publication status: Published - May 2009

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience

