Predictive complexity and information

Michael V. Vyugin, Vladimir V. V'yugin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)

Abstract

A new notion of predictive complexity and a corresponding amount of information are considered. Predictive complexity is a generalization of Kolmogorov complexity which bounds the ability of any algorithm to predict elements of a sequence of outcomes. We consider predictive complexity for a wide class of bounded loss functions which generalize the square-loss function. Relations between the unconditional predictive complexity KG(x) and the conditional predictive complexity KG(x|y) are studied. We define an algorithm with an "expanding property": with positive probability, it transforms sequences of a given predictive complexity into sequences of essentially greater predictive complexity. The notion of the amount of predictive information IG(y : x) is studied. We show that this information is non-commutative in a very strong sense, and we present asymptotic relations between the values IG(y : x), IG(x : y), KG(x) and KG(y).
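By analogy with the algorithmic mutual information I(y : x) = K(x) − K(x|y) from Kolmogorov complexity theory, the quantity IG(y : x) in the abstract can be read as the following sketch; this is an assumed reading of the notation, and the paper itself gives the precise definition (including the treatment of additive constants):

```latex
% Sketch, assumed by analogy with algorithmic information:
% the amount of predictive information that y carries about x
% is the drop in predictive complexity of x once y is given.
\[
  I_G(y : x) \;=\; K_G(x) \;-\; K_G(x \mid y)
\]
```

Under this reading, the paper's non-commutativity result says that, unlike in the classical Kolmogorov setting, IG(y : x) and IG(x : y) can differ drastically.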

Original language: English
Title of host publication: Computational Learning Theory - 15th Annual Conference on Computational Learning Theory, COLT 2002, Proceedings
Editors: Jyrki Kivinen, Robert H. Sloan
Publisher: Springer Verlag
Pages: 90-105
Number of pages: 16
ISBN (Electronic): 354043836X, 9783540438366
Publication status: Published - 2002
Externally published: Yes
Event: 15th Annual Conference on Computational Learning Theory, COLT 2002 - Sydney, Australia
Duration: 8 Jul 2002 - 10 Jul 2002

Publication series

Name: Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science)
Volume: 2375
ISSN (Print): 0302-9743

Conference

Conference: 15th Annual Conference on Computational Learning Theory, COLT 2002
Country/Territory: Australia
City: Sydney
Period: 8/07/02 - 10/07/02

