PhilSci Archive

In What Sense is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour? - Bridging the Gap Between Dynamical Systems Theory and Communication Theory

Frigg, Roman (2003) In What Sense is the Kolmogorov-Sinai Entropy a Measure for Chaotic Behaviour? - Bridging the Gap Between Dynamical Systems Theory and Communication Theory.

PDF (227 kB)

    Abstract

    On an influential account, chaos is explained in terms of random behaviour; and random behaviour in turn is explained in terms of having positive Kolmogorov-Sinai entropy (KSE). Though intuitively plausible, the association of the KSE with random behaviour needs justification since the definition of the KSE does not make reference to any notion that is connected to randomness. I provide this justification for the case of Hamiltonian systems by proving that the KSE is equivalent to a generalized version of Shannon's communication-theoretic entropy under certain plausible assumptions. I then discuss consequences of this equivalence for randomness in chaotic dynamical systems.
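
    For readers unfamiliar with the two quantities compared in the abstract, the standard textbook definitions are sketched below. These are the usual formulations of Shannon entropy and Kolmogorov-Sinai entropy, not quotations from the paper itself.

    Shannon entropy of a discrete probability distribution $p = (p_1, \ldots, p_n)$:

        $H(p) = -\sum_{i=1}^{n} p_i \log p_i$

    Kolmogorov-Sinai entropy of a measure-preserving dynamical system $(X, \Sigma, \mu, T)$: for a finite partition $\alpha$ of $X$, with $H(\beta) = -\sum_{B \in \beta} \mu(B) \log \mu(B)$,

        $h(T, \alpha) = \lim_{n \to \infty} \frac{1}{n}\, H\!\left( \bigvee_{k=0}^{n-1} T^{-k}\alpha \right), \qquad h_{KS}(T) = \sup_{\alpha} h(T, \alpha).$

    On the account discussed in the abstract, a system counts as random in the relevant sense when $h_{KS}(T) > 0$.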


    Item Type: Other
    Keywords: Chaos, randomness, Kolmogorov-Sinai entropy, Shannon, communication theory, information.
    Subjects: Specific Sciences > Physics > Statistical Mechanics/Thermodynamics
    Depositing User: Roman Frigg
    Date Deposited: 29 Sep 2006
    Last Modified: 07 Oct 2010 11:14
    Item ID: 2929
    URI: http://philsci-archive.pitt.edu/id/eprint/2929
