A Study of Alternative Axiomatic Justifications for the Entropy Function, and of its Use in Uncertain Reasoning

E. Howarth

Research output: Thesis › Master's Thesis


Abstract

The nature of information is examined. The axiom of information theory, that the object of a measure of information should be a probability distribution, is discussed, as is the relationship of information to uncertainty. Several desiderata of an information measure from the literature are presented and discussed, together with justifications and criticisms of these. Various axiomatizations of Shannon's information measure are considered and compared, together with alternative justifications and characterizations of alternative measures. Arguments for extending the domain of an information measure to include partial (non-exhaustive) probability distributions are examined and found to be convincing.

Various justifications of the Maximum Entropy Principle (MEP) are similarly discussed and compared. These include the information-theoretic justification, which relies on the justification of H as a measure of information, as well as others which do not.

The Minimum Gain Inference Process is introduced in order to explore the consequences of allowing the assignment of zero probabilities (in contrast to the MEP). Its behaviour is explored through a simple case study. It is shown to satisfy certain principles of inductive reasoning, and is also shown not to satisfy certain others. It is found unlikely to be of great use, due to certain undesirable properties.
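As an illustrative sketch only (not taken from the thesis itself), the two central objects the abstract names can be shown in a few lines: Shannon's entropy H as a measure of the information in a probability distribution, and the fact that, absent constraints, the Maximum Entropy Principle selects the uniform distribution because it maximizes H:

```python
from math import log2

def shannon_entropy(p):
    """Shannon's measure H(p) = -sum_i p_i * log2(p_i), in bits.
    Terms with p_i = 0 contribute 0, by the convention lim x->0 x*log x = 0."""
    return -sum(x * log2(x) for x in p if x > 0)

# Among all distributions over four outcomes, the uniform distribution
# maximizes H; this is what the MEP selects when no constraints are given.
uniform = [0.25, 0.25, 0.25, 0.25]
skewed = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))  # 2.0 bits, the maximum for 4 outcomes
print(shannon_entropy(skewed))   # strictly less than 2.0 bits
```

Note that the guard `if x > 0` is also where an inference process that permits zero probabilities (as the abstract's Minimum Gain Inference Process does) differs operationally from the MEP, which over a finite language assigns strictly positive probabilities.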
Original language: English
Awarding Institution
  • University of Manchester
Place of Publication: Manchester, U.K.
Publisher
Publication status: Published - Oct 2007

Keywords

  • Foundations of Information Theory
  • Uncertain Reasoning

