PhilSci Archive

Information, learning and falsification

Balduzzi, David (2011) Information, learning and falsification. In: Philosophy of Machine Learning workshop, Advances in Neural Information Processing Systems (NIPS), December 2011.

PDF: ilf.pdf - Accepted Version (84kB)

Abstract

There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out. The third, statistical learning theory, has introduced measures of capacity that control (in part) the expected risk of classifiers. These capacities quantify the expectations regarding future data that learning algorithms embed into classifiers.
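
For reference, the standard textbook forms of the three quantities sketched in the paragraph above can be written as follows; the notation is conventional and assumed here, not taken from the paper itself:

    % Kolmogorov complexity of a string s, relative to a universal
    % Turing machine U: the length of the shortest program producing s.
    K_U(s) = \min \{\, |p| \;:\; U(p) = s \,\}

    % Shannon information (surprisal) of an event x with probability p(x):
    % the log-count of equally likely alternatives ruled out by observing x.
    I(x) = -\log p(x)

    % Empirical Rademacher complexity of a hypothesis class H on a sample
    % S = (x_1, ..., x_n), with i.i.d. uniform signs \sigma_i in {-1, +1}:
    \hat{\mathfrak{R}}_S(H) = \mathbb{E}_{\sigma}\Big[ \sup_{h \in H} \frac{1}{n} \sum_{i=1}^{n} \sigma_i\, h(x_i) \Big]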

This note describes a new method of quantifying information, effective information, that links algorithmic information to Shannon information, and also links both to capacities arising in statistical learning theory. After introducing the measure, we show that it provides a non-universal analog of Kolmogorov complexity. We then apply it to derive basic capacities in statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. A nice byproduct of our approach is an interpretation of the explanatory power of a learning algorithm in terms of the number of hypotheses it falsifies, counted in two different ways for the two capacities. We also discuss how effective information relates to information gain, Shannon and mutual information.
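
As a rough illustration of the two capacities named in the abstract, the following Python sketch computes the empirical VC-entropy and a Monte Carlo estimate of the empirical Rademacher complexity for a toy finite class of threshold classifiers. The setup (point sample, threshold grid, number of sign draws) is an illustrative assumption and is unrelated to the paper's effective-information construction:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy sample of n points on the line, and a finite hypothesis class of
    # threshold classifiers h_t(x) = sign(x - t). All names are illustrative.
    n = 8
    xs = np.sort(rng.uniform(-1.0, 1.0, size=n))
    thresholds = np.linspace(-1.2, 1.2, 25)

    # Row i is the +/-1 labeling that hypothesis i assigns to the sample.
    labelings = np.sign(xs[None, :] - thresholds[:, None])
    labelings[labelings == 0] = 1.0

    # Empirical VC-entropy: log of the number of distinct dichotomies
    # (labelings) the class actually realizes on this particular sample.
    dichotomies = {tuple(row) for row in labelings}
    vc_entropy = np.log2(len(dichotomies))

    # Empirical Rademacher complexity, estimated by Monte Carlo over random
    # sign vectors sigma: E_sigma[ sup_h (1/n) sum_i sigma_i * h(x_i) ].
    num_draws = 2000
    sigmas = rng.choice([-1.0, 1.0], size=(num_draws, n))
    correlations = sigmas @ labelings.T / n      # shape (num_draws, |H|)
    rademacher = correlations.max(axis=1).mean()

    print(f"dichotomies realized: {len(dichotomies)}")
    print(f"empirical VC-entropy: {vc_entropy:.2f} bits")
    print(f"empirical Rademacher complexity: {rademacher:.3f}")

Roughly, in the abstract's terms, the empirical VC-entropy counts the dichotomies the class can still realize on the observed sample; the unrealized labelings play the role of falsified alternatives.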



Item Type: Other
Creators: Balduzzi, David (dbalduzzi@gmail.com)
Keywords: falsification, information theory, statistical learning theory, Popper
Subjects: General Issues > Confirmation/Induction
General Issues > Formal Learning Theory
Specific Sciences > Probability/Statistics
Depositing User: Dr David Balduzzi
Date Deposited: 19 Jan 2012 03:35
Last Modified: 19 Jan 2012 03:35
Item ID: 8989
Journal or Publication Title: Philosophy of Machine Learning workshop
Publisher: Advances in Neural Information Processing Systems (NIPS)
Date: December 2011
URI: https://philsci-archive.pitt.edu/id/eprint/8989
