
A Measure-Theoretic Perspective on Multivariate Information with Applications to Data Science

dc.contributor.advisor: Wibral, Michael Prof. Dr.
dc.contributor.author: Schick-Poland, Kyle
dc.date.accessioned: 2025-09-17T17:53:48Z
dc.date.available: 2025-09-23T00:50:07Z
dc.date.issued: 2025-09-17
dc.identifier.uri: http://resolver.sub.uni-goettingen.de/purl?ediss-11858/16236
dc.identifier.uri: http://dx.doi.org/10.53846/goediss-11518
dc.format.extent: 129 [de]
dc.language.iso: eng [de]
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject.ddc: 530 [de]
dc.title: A Measure-Theoretic Perspective on Multivariate Information with Applications to Data Science [de]
dc.type: doctoralThesis [de]
dc.contributor.referee: Wolf, Fred Prof. Dr.
dc.date.examination: 2024-10-23 [de]
dc.subject.gok: Physik (PPN621336750) [de]
dc.description.abstracteng: This dissertation establishes a measure-theoretic foundation for information theory by formulating entropy, mutual information, and related quantities on a common class of probability spaces. Using Radon–Nikodym derivatives of induced probability measures and Lebesgue integration, it extends Shannon's framework beyond discrete variables to continuous and mixed distributions in a mathematically consistent manner. The main contribution is the introduction of Partial Information Decomposition (PID) within this framework. While continuous PID measures exist, they suffer from limited generality. Here, unique, redundant, and synergistic information are defined as functionals on measurable spaces, expressed via Radon–Nikodym derivatives of conditional probability measures with respect to product measures on σ-algebras. This establishes a redundancy-based PID rigorously across arbitrary measurable spaces. Likewise, a measure-theoretic extension of Transfer Entropy (TE) is expressed in terms of conditional Radon–Nikodym derivatives on product σ-algebras, yielding a generalized quantification of directional information flow in stochastic processes. Building on this, new k-nearest-neighbour estimators are derived for PID and TE. These estimators extend the Kraskov–Stögbauer–Grassberger (KSG) method to assess the respective information terms in continuous and mixed distributions, with convergence proofs relying on the law of large numbers. By unifying discrete, continuous, and mixed settings under a σ-algebraic formalism and deriving provably convergent estimators, this work advances the mathematical foundations of information decomposition and information measures, providing a rigorous framework for future developments in modern information theory. [de]
dc.contributor.coReferee: Sturm, Anja Prof. Dr.
dc.contributor.thirdReferee: Niemeyer, Jens Prof. Dr.
dc.contributor.thirdReferee: Klumpp, Stefan Prof. Dr.
dc.contributor.thirdReferee: Enderlein, Jörg Prof. Dr.
dc.subject.eng: Information Theory [de]
dc.subject.eng: Partial Information Decomposition [de]
dc.subject.eng: Transfer Entropy [de]
dc.subject.eng: Estimation [de]
dc.subject.eng: Measure Theory [de]
dc.identifier.urn: urn:nbn:de:gbv:7-ediss-16236-8
dc.affiliation.institute: Fakultät für Physik [de]
dc.description.embargoed: 2025-09-22 [de]
dc.identifier.ppn: 1936215853
dc.notes.confirmationsent: Confirmation sent 2025-09-17T19:45:01 [de]
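
As background to the abstract above: the measure-theoretic formulation it describes rests, in its standard textbook form, on the Radon–Nikodym derivative of the joint law with respect to the product of the marginal laws. A minimal sketch of that definition (the dissertation's exact notation may differ): for random variables X and Y with joint law P_{XY} and marginals P_X, P_Y such that P_{XY} is absolutely continuous with respect to P_X ⊗ P_Y,

\[
I(X;Y) \;=\; \int \log \frac{\mathrm{d}P_{XY}}{\mathrm{d}\!\left(P_X \otimes P_Y\right)} \, \mathrm{d}P_{XY},
\qquad P_{XY} \ll P_X \otimes P_Y,
\]

which reduces to Shannon's discrete sum when the measures are atomic, to the differential form when densities exist, and remains well defined for mixed distributions.

The abstract also refers to the Kraskov–Stögbauer–Grassberger (KSG) k-nearest-neighbour method that the dissertation's PID and TE estimators extend. The following is a minimal sketch of the classic KSG mutual-information estimator (Kraskov et al. 2004, algorithm 1), not the thesis's own code; the function name and interface are illustrative assumptions.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=4):
    """KSG estimate of I(X;Y) in nats from paired samples.

    x, y : arrays of shape (N, d_x) and (N, d_y).
    k    : number of nearest neighbours in the joint space.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])

    # Distance to the k-th nearest neighbour in the joint space under the
    # max-norm; k + 1 because each query point is its own 0th neighbour.
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]

    # Count neighbours strictly inside eps in each marginal space (the radius
    # is shrunk slightly to enforce the strict inequality), then drop the
    # point itself from the count.
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf, return_length=True) - 1

    # Estimator of Kraskov et al. (2004), algorithm 1.
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

As a quick sanity check, for a few thousand samples of a bivariate Gaussian with correlation 0.8 the estimate should come out near the analytic value -0.5 * ln(1 - 0.8^2) ≈ 0.511 nats.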

