A measure of the information content of EIT data.

Andy Adler, Richard Youmaran, William R B Lionheart

    Research output: Contribution to journal › Article › peer-review

    Abstract

    We ask: how many bits of information (in the Shannon sense) do we get from a set of EIT measurements? Here, the term information in measurements (IM) is defined as the decrease in uncertainty about the contents of a medium due to a set of measurements. This decrease in uncertainty is quantified by the change from the inter-class model, q, defined by the prior information, to the intra-class model, p, given by the measured data (corrupted by noise). IM is measured by the expected relative entropy (Kullback-Leibler divergence) between distributions q and p, and corresponds to the channel capacity in an analogous communications system. Based on a Gaussian model of the measurement noise, $\Sigma_n$, and a prior model of the image element covariances, $\Sigma_x$, we calculate $IM = \frac{1}{2}\sum_i \log_2\left([\mathrm{SNR}]_i + 1\right)$, where $[\mathrm{SNR}]_i$ is the signal-to-noise ratio for each independent measurement, calculated from the prior and noise models. As an example, we consider saline tank measurements from a 16-electrode EIT system with a 2 cm radius non-conductive target, and calculate IM = 179 bits. Temporal sequences of frames are considered, and formulae for IM as a function of temporal image element correlations are derived. We suggest that this measure may allow novel insights into questions such as distinguishability limits, optimal measurement schemes and data fusion.
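
    As a minimal numerical sketch of the formula above, the Python snippet below evaluates $IM = \frac{1}{2}\sum_i \log_2([\mathrm{SNR}]_i + 1)$ for a hypothetical vector of per-measurement SNRs. The function name and the example SNR values are illustrative assumptions, not taken from the paper; in practice each $[\mathrm{SNR}]_i$ would be derived from the prior covariance $\Sigma_x$ and the noise covariance $\Sigma_n$.

        import numpy as np

        def information_in_measurements(snr):
            """Information in measurements, in bits: IM = 1/2 * sum_i log2(SNR_i + 1)."""
            snr = np.asarray(snr, dtype=float)
            return 0.5 * np.sum(np.log2(1.0 + snr))

        # Hypothetical per-measurement SNRs (illustrative values only, not from the paper)
        snr_values = [1e3, 1e2, 10.0, 1.0, 0.1]
        print(f"IM = {information_in_measurements(snr_values):.2f} bits")

    Note that measurements with $\mathrm{SNR} \ll 1$ contribute almost no bits, while for large SNR each factor-of-4 increase adds roughly one bit per measurement, mirroring the channel-capacity analogy drawn in the abstract.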
    Original language: English
    Pages (from-to): S101-S109
    Journal: Physiological Measurement
    Volume: 29
    Issue number: 6
    Publication status: Published - Jun 2008

    Keywords

    • Electrical impedance tomography
    • Kullback-Leibler divergence
    • Measurement information
