Minimum Description Length Principle applied to structure adaptation for classification under concept drift

Link:
Author:
Publisher/Corporate body:
Hamburg University of Technology
Year of publication:
2016
Media type:
Text
Description:
  • Traditional supervised machine learning evaluates a learned classifier on data drawn from the same distribution as the data used for learning. In practice, this assumption does not always hold, and the learned classifier has to be transferred from the space of the learning data (also called source data) to the space of the test data (also called target data), where it is not directly applicable. To perform this transfer, several methods aim at extracting structural features common to the source and the target. Our approach employs a neural model to encode the structure of the data; such a model is shown to compress the information in the sense of Kolmogorov's theory of information. To transfer from source to target, we adapt a result established for analogical reasoning: the structures of the source and target models are learned by applying the Minimum Description Length Principle, which prefers the transformation with the shortest symbolic description on a universal Turing machine. This leads to a minimization problem over the source and target models. To describe the transfer, we develop a multi-level description of the model transformation, which is used directly in the minimization of the description length. Our approach has been tested on toy examples whose difficulty is easily controlled by a one-dimensional parameter, and it is shown to work efficiently on a wide range of problems.
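  As an illustration of the two-part Minimum Description Length criterion the abstract refers to, the sketch below selects, among candidate models, the one minimizing L(model) + L(data | model). It is a minimal illustration under simplifying assumptions, not the paper's method; all names (model_cost_bits, mdl_select, the 32-bits-per-parameter cost) are hypothetical.

    import math

    def model_cost_bits(params):
        # Crude model cost: a fixed bit budget per parameter
        # (assumption: 32 bits per real-valued parameter).
        return 32 * len(params)

    def data_cost_bits(probs_of_observed):
        # Shannon code length of the data under the model:
        # -log2 P(data | model), summed over i.i.d. observations.
        return -sum(math.log2(p) for p in probs_of_observed)

    def mdl_select(candidates, data):
        # Pick the candidate whose two-part code length
        # L(model) + L(data | model) is shortest: the MDL criterion.
        # candidates: iterable of (params, predict_proba) pairs, where
        # predict_proba(x) returns a class -> probability mapping.
        # data: list of (x, y) observations.
        best, best_bits = None, float("inf")
        for params, predict_proba in candidates:
            bits = model_cost_bits(params) + data_cost_bits(
                predict_proba(x)[y] for x, y in data)
            if bits < best_bits:
                best, best_bits = (params, predict_proba), bits
        return best, best_bits

  In the setting of the abstract, the candidates would correspond to multi-level descriptions of the source-to-target model transformation, and the code lengths would be minimized over the source and target models jointly.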
Relations:
DOI 10.1109/IJCNN.2016.7727558
Source system:
TUHH Open Research

Internal metadata
Source record
oai:tore.tuhh.de:11420/15266