Covariance structure regularization via entropy loss function

    Research output: Contribution to journal › Article › peer-review

    Abstract

    The need to estimate structured covariance matrices arises in a variety of applications and the problem is widely studied in statistics. A new method is proposed for regularizing the covariance structure of a given covariance matrix whose underlying structure has been blurred by random noise, particularly when the dimension of the covariance matrix is high. The regularization is performed by choosing an optimal structure from an available class of covariance structures so as to minimize the discrepancy, defined via the entropy loss function, between the given matrix and the class. A range of candidate structures is considered, comprising tridiagonal Toeplitz, compound symmetry, AR(1), and banded Toeplitz. It is shown that for the first three structures local or global minimizers of the discrepancy can be computed by one-dimensional optimization, while for the fourth structure Newton's method enables efficient computation of the global minimizer. Simulation studies show that the new approach provides a reliable way to regularize covariance structures, and an application to real data demonstrates its usefulness in practice. © 2013 Elsevier B.V. All rights reserved.
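    The abstract's fitting step can be illustrated with a short sketch. This is not the paper's implementation; it assumes the standard entropy (Stein) loss, L(S, Σ) = tr(Σ⁻¹S) − log det(Σ⁻¹S) − p, and fits only the AR(1) correlation parameter by one-dimensional optimization, with the scale fixed at 1 for simplicity (the paper also treats the other structures and the scale parameter).

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def entropy_loss(S, Sigma):
        # Entropy (Stein) loss between a given matrix S and a candidate Sigma:
        # L(S, Sigma) = tr(Sigma^{-1} S) - log det(Sigma^{-1} S) - p
        p = S.shape[0]
        A = np.linalg.solve(Sigma, S)          # Sigma^{-1} S
        _, logdet = np.linalg.slogdet(A)
        return np.trace(A) - logdet - p

    def ar1_cov(rho, p):
        # AR(1) structure: Sigma_ij = rho^{|i-j|} (unit scale for simplicity)
        idx = np.arange(p)
        return rho ** np.abs(idx[:, None] - idx[None, :])

    def fit_ar1(S):
        # One-dimensional search for the AR(1) parameter minimizing the loss
        p = S.shape[0]
        res = minimize_scalar(lambda r: entropy_loss(S, ar1_cov(r, p)),
                              bounds=(-0.99, 0.99), method="bounded")
        return res.x

    rng = np.random.default_rng(0)
    p, n, true_rho = 10, 500, 0.6
    X = rng.multivariate_normal(np.zeros(p), ar1_cov(true_rho, p), size=n)
    S = X.T @ X / n      # noisy sample covariance blurring the AR(1) structure
    rho_hat = fit_ar1(S)
    print(round(rho_hat, 2))
    ```

    The regularized estimate is then the structured matrix `ar1_cov(rho_hat, p)`, which replaces the noisy sample covariance `S`.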
    Original language: English
    Pages (from-to): 315-327
    Number of pages: 12
    Journal: Computational Statistics and Data Analysis
    Volume: 72
    Early online date: 16 Oct 2013
    DOIs
    Publication status: Published - 1 Apr 2014

    Keywords

    • Covariance estimation
    • Covariance structure
    • Entropy loss function
    • Kullback-Leibler divergence
    • Regularization

