
A Measure-Theoretic Perspective on Multivariate Information with Applications to Data Science

by Kyle Schick-Poland
Doctoral thesis
Date of Examination: 2024-10-23
Date of issue: 2025-09-17
Advisor: Prof. Dr. Michael Wibral
Referee: Prof. Dr. Fred Wolf
Referee: Prof. Dr. Anja Sturm
Referee: Prof. Dr. Jens Niemeyer
Referee: Prof. Dr. Stefan Klumpp
Referee: Prof. Dr. Jörg Enderlein
Persistent Address: http://dx.doi.org/10.53846/goediss-11518

Files in this item

Name: PhD_thesis.pdf
Size: 2.30 MB
Format: PDF

Abstract


This dissertation establishes a measure-theoretic foundation for information theory by formulating entropy, mutual information, and related quantities on a common class of probability spaces. Using Radon–Nikodym derivatives of induced probability measures and Lebesgue integration, it extends Shannon’s framework beyond discrete variables to continuous and mixed distributions in a mathematically consistent manner. The main contribution is the introduction of Partial Information Decomposition (PID) within this framework. While continuous PID measures exist, they suffer from limited generality. Here, unique, redundant, and synergistic information are defined as functionals on measurable spaces, expressed via Radon–Nikodym derivatives of conditional probability measures with respect to product measures on σ-algebras. This establishes a redundancy-based PID rigorously on arbitrary measurable spaces. The measure-theoretic extension of transfer entropy (TE) is expressed in terms of conditional Radon–Nikodym derivatives on product σ-algebras, yielding a generalized quantification of directional information flow in stochastic processes. Building on this, new k-nearest-neighbour estimators are derived for PID and TE. These estimators extend the Kraskov–Stögbauer–Grassberger (KSG) method to estimate the respective information terms in continuous and mixed distributions, with proofs of convergence relying on the law of large numbers. By unifying discrete, continuous, and mixed settings under a σ-algebraic formalism and deriving provably convergent estimators, this work advances the mathematical foundations of information decomposition and information measures, providing a rigorous framework for future developments in modern information theory.
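
For orientation, the measure-theoretic formulation of mutual information that the abstract builds on can be stated explicitly; the following is the standard form, not a quotation from the thesis. Whenever the joint law P_XY is absolutely continuous with respect to the product of the marginals P_X and P_Y, the Radon–Nikodym derivative exists and

    I(X;Y) = \int_{\mathcal{X} \times \mathcal{Y}} \log \frac{\mathrm{d}P_{XY}}{\mathrm{d}\left(P_X \otimes P_Y\right)} \, \mathrm{d}P_{XY}

This single expression recovers the discrete Shannon formula when the measures are purely atomic, the density-based formula in the absolutely continuous case, and remains well defined for mixed discrete-continuous distributions.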
Keywords: Information Theory; Partial Information Decomposition; Transfer Entropy; Estimation; Measure Theory
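
As a concrete illustration of the estimation technique the abstract extends, below is a minimal sketch of the classical KSG mutual-information estimator (algorithm 1 of Kraskov, Stögbauer and Grassberger, 2004), written in Python with NumPy and SciPy. It is not the thesis’s PID or TE estimator; the function name and the small tolerance used to emulate the strict inequality in the neighbour counts are illustrative choices.

    import numpy as np
    from scipy.special import digamma
    from scipy.spatial import cKDTree

    def ksg_mutual_information(x, y, k=4):
        """KSG algorithm 1: I(X;Y) ~ psi(k) + psi(N) - <psi(n_x+1) + psi(n_y+1)>, in nats."""
        x = np.asarray(x, dtype=float).reshape(len(x), -1)
        y = np.asarray(y, dtype=float).reshape(len(y), -1)
        n = len(x)
        joint = np.hstack([x, y])
        # Chebyshev (max-norm) distance to the k-th nearest neighbour in the
        # joint space, excluding the query point itself (hence k + 1).
        eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
        tree_x, tree_y = cKDTree(x), cKDTree(y)
        # n_x(i), n_y(i): points strictly within eps(i) of point i in each
        # marginal space; the offset emulates the strict inequality, and the
        # "- 1" removes the point itself from its own count.
        nx = np.array([len(tree_x.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                       for i in range(n)])
        ny = np.array([len(tree_y.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                       for i in range(n)])
        return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

    # Example: bivariate Gaussian with correlation rho, for which the analytic
    # mutual information is -0.5 * log(1 - rho**2) = 0.223 nats at rho = 0.6.
    rng = np.random.default_rng(0)
    rho = 0.6
    z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=5000)
    print(ksg_mutual_information(z[:, 0], z[:, 1]))  # close to 0.223

The thesis’s contribution, per the abstract, is to extend this style of k-nearest-neighbour estimation to the PID and TE terms on mixed discrete-continuous distributions, with convergence guarantees; the sketch above only covers the classical mutual-information case.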
 
