Universal entropy of word ordering across linguistic families

Marcelo A. Montemurro, Damián H. Zanette

    Research output: Contribution to journal › Article › peer-review

    Abstract

    Background: The language faculty is probably the most distinctive feature of our species, and endows us with a unique ability to exchange highly structured information. In written language, information is encoded by the concatenation of basic symbols under grammatical and semantic constraints. As is also the case in other natural information carriers, the resulting symbolic sequences show a delicate balance between order and disorder. That balance is determined by the interplay between the diversity of symbols and their specific ordering in the sequences. Here we used entropy to quantify the contribution of different organizational levels to the overall statistical structure of language.

    Methodology/Principal Findings: We computed a relative entropy measure to quantify the degree of ordering in word sequences from languages belonging to several linguistic families. While a direct estimation of the overall entropy of language yielded values that varied across the families considered, the relative entropy quantifying word ordering presented an almost constant value for all of them.

    Conclusions/Significance: Our results indicate that, despite the differences in the structure and vocabulary of the languages analyzed, the impact of word ordering on the structure of language is a statistical linguistic universal. © 2011 Montemurro, Zanette.
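    The relative entropy described above can be illustrated with a minimal sketch: compare an entropy estimate that is sensitive to word order against one computed from word frequencies alone (which is what remains after shuffling the words). The sketch below is an illustration under simplifying assumptions, not the paper's actual estimator; the study uses more careful entropy estimation on full texts and their randomly shuffled versions, whereas here a crude bigram conditional entropy stands in for the ordered-text entropy. All function names and the toy text are hypothetical.

    ```python
    import math
    from collections import Counter

    def unigram_entropy(words):
        """Shannon entropy (bits/word) of the word-frequency distribution.
        Insensitive to word order, so it plays the role of the shuffled baseline."""
        counts = Counter(words)
        n = len(words)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def bigram_entropy_rate(words):
        """Crude conditional entropy H(w_i | w_{i-1}) from bigram counts.
        Captures part of the ordering structure; a toy stand-in for the
        more careful entropy estimators used in the published study."""
        pair_counts = Counter(zip(words, words[1:]))
        prev_counts = Counter(words[:-1])
        n_pairs = len(words) - 1
        h = 0.0
        for (w1, w2), c in pair_counts.items():
            p_pair = c / n_pairs            # joint probability of the bigram
            p_cond = c / prev_counts[w1]    # probability of w2 given w1
            h -= p_pair * math.log2(p_cond)
        return h

    def ordering_entropy_gap(words):
        """Relative entropy due to word ordering, approximated here as
        D = H(shuffled baseline) - H(ordered text)."""
        return unigram_entropy(words) - bigram_entropy_rate(words)

    # Hypothetical usage on a tokenized toy text
    text = "the cat sat on the mat and the dog sat on the rug".split()
    print(f"D ≈ {ordering_entropy_gap(text):.3f} bits/word")
    ```

    On real corpora, the paper's finding is that while the overall entropy varies between languages, the gap D attributable to word ordering stays nearly constant across linguistic families.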
    Original language: English
    Article number: e19875
    Journal: PLoS ONE
    Volume: 6
    Issue number: 5
    DOIs
    Publication status: Published - 2011
