Application of offset estimator of differential entropy and mutual information with multivariate data

Ivan Marin-Franch, Martín Sanz-Sabater, David Foster

Research output: Contribution to journal › Article › peer-review

Abstract

Numerical estimators of differential entropy and mutual information can be slow to converge as sample size increases. The offset Kozachenko–Leonenko (KLo) method described here implements an offset version of the Kozachenko–Leonenko estimator that can markedly improve convergence. Its use is illustrated in applications to the comparison of trivariate data from successive scene color images and the comparison of univariate data from stereophonic music tracks. Publicly available code for KLo estimation of both differential entropy and mutual information is provided for R, Python, and MATLAB computing environments at https://github.com/imarinfr/klo.
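The paper's offset (KLo) estimator is available in the linked repository. As background, the plain Kozachenko–Leonenko estimator it builds on can be sketched in a few lines: the differential entropy is estimated from the distance of each sample to its k-th nearest neighbour. The sketch below is an assumption-laden illustration of that classical estimator (not the authors' offset version), using SciPy; function names `kl_entropy` and `kl_mi` are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln


def kl_entropy(x, k=3):
    """Plain Kozachenko-Leonenko k-NN estimate of differential entropy (nats).

    H_hat = psi(N) - psi(k) + log V_d + (d/N) * sum_i log r_i,
    where r_i is the distance from sample i to its k-th nearest neighbour
    and V_d is the volume of the d-dimensional unit ball.
    """
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    # Distance to the k-th nearest neighbour, excluding the point itself
    # (the first returned neighbour is the point at distance 0).
    dist, _ = cKDTree(x).query(x, k=k + 1)
    r = dist[:, -1]
    # log volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1)
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r))


def kl_mi(x, y, k=3):
    """Naive plug-in mutual information: I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    return kl_entropy(x, k) + kl_entropy(y, k) - kl_entropy(np.hstack([x, y]), k)


# Usage: for a standard normal sample the true entropy is
# 0.5 * log(2 * pi * e) ~= 1.419 nats.
rng = np.random.default_rng(0)
sample = rng.standard_normal(2000)
print(kl_entropy(sample, k=3))
print(kl_mi(sample, rng.standard_normal(2000), k=3))  # independent -> near 0
```

Note that this naive plug-in for mutual information is exactly the slow-converging setting the KLo paper addresses; the repository's offset estimator adds a correction term to improve convergence.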
Original language: English
Article number: e16
Journal: Experimental Results
Volume: 3
DOIs
Publication status: Published - 5 Sept 2022

Keywords

  • R
  • Kozachenko–Leonenko estimator
  • information theory
  • mutual information
  • nonparametric statistics
  • MATLAB
  • Python

