Predictive complexity and information

Michael V. Vyugin, Vladimir V. V'Yugin

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

The notions of predictive complexity and of the corresponding amount of information are considered. Predictive complexity is a generalization of Kolmogorov complexity which bounds the ability of any algorithm to predict elements of a sequence of outcomes. We consider predictive complexity for a wide class of bounded loss functions that generalize the square-loss function. Relations between the unconditional predictive complexity KG(x) and the conditional predictive complexity KG(x|y) are studied. We define an algorithm with an "expanding property": with positive probability it transforms sequences of a given predictive complexity into sequences of essentially greater predictive complexity. The concept of the amount of predictive information IG(y:x) is studied. We show that this information is noncommutative in a very strong sense and present asymptotic relations between the values IG(y:x), IG(x:y), KG(x), and KG(y).
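For orientation, the quantities named in the abstract are usually stated in the loss-game framework sketched below in LaTeX. The symbols λ, p_i and Loss_S are illustrative labels chosen here, not notation quoted from the article, and the inequalities are given only up to the additive constants typical of this framework.

% A minimal sketch of the standard definitions (our notation, assumed, not verbatim from the paper):
% a prediction strategy S outputs p_i after seeing x_1 ... x_{i-1};
% its cumulative loss under a loss function \lambda is compared with KG.
\begin{align*}
  \mathrm{Loss}_S(x_1 \dots x_n) &= \sum_{i=1}^{n} \lambda(x_i, p_i),
    && \text{e.g. } \lambda(x,p) = (x-p)^2 \text{ for square loss;}\\
  KG(x) &\le \mathrm{Loss}_S(x) + c_S
    && \text{for every computable strategy } S, \text{ with } c_S \text{ independent of } x;\\
  IG(y\colon x) &= KG(x) - KG(x \mid y)
    && \text{the amount of predictive information in } y \text{ about } x.
\end{align*}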

Original language: English
Pages (from-to): 539-554
Number of pages: 16
Journal: Journal of Computer and System Sciences
Volume: 70
Issue number: 4
DOIs
Publication status: Published - Jun 2005
Externally published: Yes

Keywords

  • Algorithmic prediction
  • Expanding property
  • Kolmogorov complexity
  • Loss functions
  • Machine learning
  • Predictive complexity
  • Predictive information
