A Measure-Theoretic Perspective on Multivariate Information with Applications to Data Science
by Kyle Schick-Poland
Date of oral examination: 2024-10-23
Published: 2025-09-17
Supervisor: Prof. Dr. Michael Wibral
Referee: Prof. Dr. Fred Wolf
Referee: Prof. Dr. Anja Sturm
Referee: Prof. Dr. Jens Niemeyer
Referee: Prof. Dr. Stefan Klumpp
Referee: Prof. Dr. Jörg Enderlein
Files
Name: PhD_thesis.pdf
Size: 2.30 MB
Format: PDF
Abstract
English
This dissertation establishes a measure-theoretic foundation for information theory by formulating entropy, mutual information, and related quantities on a common class of probability spaces. Using Radon–Nikodym derivatives of induced probability measures and Lebesgue integration, it extends Shannon’s framework beyond discrete variables to continuous and mixed distributions in a mathematically consistent manner. The main contribution is the introduction of Partial Information Decomposition (PID) within this framework. While continuous PID measures exist, they suffer from limited generality. Here, unique, redundant, and synergistic information are defined as functionals on measurable spaces, expressed via Radon–Nikodym derivatives of conditional probability measures with respect to product measures on σ-algebras. This establishes a rigorous, redundancy-based PID on arbitrary measurable spaces. The measure-theoretic extension of transfer entropy (TE) is expressed in terms of conditional Radon–Nikodym derivatives on product σ-algebras, yielding a generalized quantification of directional information flow in stochastic processes. Building on this, new k-nearest-neighbour estimators are derived for PID and TE. These estimators extend the Kraskov–Stögbauer–Grassberger (KSG) method to estimate the respective information terms in continuous and mixed distributions, with proofs of convergence relying on the law of large numbers. By unifying discrete, continuous, and mixed settings under a σ-algebraic formalism and deriving provably convergent estimators, this work advances the mathematical foundations of information decomposition and information measures, providing a rigorous framework for future developments in modern information theory.
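To make the σ-algebraic formulation concrete, the following LaTeX display is a minimal sketch of how mutual information and transfer entropy read in this setting; the notation (induced measures P_{(X,Y)}, past states Y_t^-, X_t^-) is illustrative and not taken verbatim from the thesis:

% Mutual information as a Lebesgue integral of a Radon–Nikodym derivative,
% assuming the joint measure is absolutely continuous w.r.t. the product measure.
I(X;Y) \;=\; \int_{\mathcal{X} \times \mathcal{Y}} \log \frac{\mathrm{d}P_{(X,Y)}}{\mathrm{d}\left(P_X \otimes P_Y\right)} \, \mathrm{d}P_{(X,Y)}, \qquad P_{(X,Y)} \ll P_X \otimes P_Y

% Transfer entropy as an expected log-ratio of conditional (regular)
% Radon–Nikodym derivatives on the product σ-algebra of past states.
T_{X \to Y} \;=\; \mathbb{E}\!\left[\, \log \frac{\mathrm{d}P_{Y_{t+1} \mid Y_t^-,\, X_t^-}}{\mathrm{d}P_{Y_{t+1} \mid Y_t^-}} \,\right]

Here Y_t^- and X_t^- denote the pasts of the target and source processes; both expressions reduce to the familiar Shannon sums in the discrete case and to differential-entropy integrals in the continuous case.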
Keywords: Information Theory; Partial Information Decomposition; Transfer Entropy; Estimation; Measure Theory
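For orientation, below is a minimal sketch of the classical KSG estimator (algorithm 1 of Kraskov et al., 2004) that the thesis's PID and TE estimators generalize; the function name and defaults are illustrative, and the sketch assumes NumPy and SciPy rather than the thesis's own code:

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=4):
    """KSG (algorithm 1) estimate of I(X; Y) in nats from paired samples.

    x, y : arrays of shape (n_samples, d_x) and (n_samples, d_y).
    k    : number of nearest neighbours in the joint space.
    """
    x = x.reshape(len(x), -1)
    y = y.reshape(len(y), -1)
    n = len(x)
    joint = np.hstack((x, y))

    # Distance to the k-th nearest neighbour in the joint space, in the
    # Chebyshev (max-)norm; neighbour 0 is the point itself, hence k + 1.
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]

    # Count marginal neighbours strictly closer than eps (np.nextafter
    # shrinks the radius to enforce the strict inequality), excluding
    # the point itself.
    x_tree, y_tree = cKDTree(x), cKDTree(y)
    nx = [len(x_tree.query_ball_point(x[i], np.nextafter(eps[i], 0), p=np.inf)) - 1
          for i in range(n)]
    ny = [len(y_tree.query_ball_point(y[i], np.nextafter(eps[i], 0), p=np.inf)) - 1
          for i in range(n)]

    # KSG algorithm 1: psi(k) + psi(N) - <psi(n_x + 1) + psi(n_y + 1)>.
    return digamma(k) + digamma(n) - np.mean(digamma(np.array(nx) + 1)
                                             + digamma(np.array(ny) + 1))

# Example: for a bivariate Gaussian with correlation rho = 0.8, the true
# value is I(X;Y) = -0.5 * log(1 - rho**2) ≈ 0.51 nats.
rng = np.random.default_rng(0)
xy = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=5000)
print(ksg_mutual_information(xy[:, :1], xy[:, 1:]))

The estimators described in the abstract additionally handle mixed discrete–continuous distributions and the PID and TE terms; this plain KSG form covers only the continuous mutual-information case.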
