## Abstract

One of the most efficient ways of improving the performance of learning algorithms is "snooping", i.e. using some information about the data to be predicted to choose the parameters of the learning algorithm, or the learning algorithm itself. Allowing different degrees of snooping α makes it possible to attain a better performance Loss_P(x) of a prediction strategy P on a given data set x. We study the "snooping curves" L_x(α) = inf_{K(P)≤α} Loss_P(x), where K(P) is the Kolmogorov complexity of the prediction strategy P. We prove that every non-increasing function can be approximated with arbitrary precision by some snooping curve L_x. Our framework is that of on-line prediction; for simplicity we assume that the sequences x are binary.
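The trade-off behind the snooping curve can be illustrated with a toy sketch. Kolmogorov complexity K(P) is uncomputable, so the following is only an informal illustration, not the paper's construction: we compare three hypothetical prediction strategies of increasing description length on one binary sequence, using log loss (in bits) as Loss_P(x), and observe that the attainable loss is non-increasing as the allowed complexity grows.

```python
import math

def log_loss(preds, x):
    """Cumulative on-line log loss in bits.

    preds[t] is the strategy's predicted probability that x[t] == '1'.
    """
    return sum(-math.log2(p if b == '1' else 1 - p) for p, b in zip(preds, x))

x = '1101110111'  # a biased binary sequence (8 ones out of 10)

# No snooping: the uniform strategy predicts 1/2 everywhere
# (a very short program, so small K(P)).
uniform = [0.5] * len(x)

# Partial snooping: the strategy is told the fraction of ones in x,
# one extra parameter, so a slightly longer description.
freq = x.count('1') / len(x)
biased = [freq] * len(x)

# Full snooping: the strategy effectively encodes x itself, so its
# description length is about |x|; probabilities are clipped away
# from 0 and 1 to keep the log loss finite.
eps = 1e-6
perfect = [1 - eps if b == '1' else eps for b in x]

losses = [log_loss(p, x) for p in (uniform, biased, perfect)]
print(losses)
# The loss decreases as the allowed complexity bound alpha grows,
# mirroring the non-increasing shape of the snooping curve L_x(alpha).
assert losses[0] > losses[1] > losses[2]
```

The uniform strategy pays exactly one bit per symbol (10 bits total here), the frequency-aware strategy pays fewer bits on a biased sequence, and the fully snooping strategy pays almost nothing, which is the qualitative shape of L_x(α).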

Original language | English
---|---
Pages (from-to) | 407-415
Number of pages | 9
Journal | Theoretical Computer Science
Volume | 276
Issue number | 1-2
DOIs |
Publication status | Published - 6 Apr 2002
Externally published | Yes

## Keywords

- Computer learning
- Computer prediction
- Kolmogorov complexity
- Snooping