Abstract
The nature of information is examined. The axiom of information theory, that the object of a measure of information should be a probability distribution, is discussed, as is the relationship of information to uncertainty. Several desiderata for an information measure, drawn from the literature, are presented and discussed, together with justifications and criticisms of these. Various axiomatizations of Shannon's information measure are considered and compared, together with justifications and characterizations of alternative measures. Arguments for extending the domain of an information measure to include partial (non-exhaustive) probability distributions are examined and found to be convincing.

Various justifications of the Maximum Entropy Principle (MEP) are similarly discussed and compared. These include the information-theoretic justification, which relies on the justification of H as a measure of information, as well as others which do not.

The Minimum Gain Inference Process is introduced in order to explore the consequences of allowing the assignment of zero probabilities (in contrast to the MEP). Its behaviour is explored through a simple case study. It is shown to satisfy certain principles of inductive reasoning, and shown not to satisfy certain others. Owing to certain undesirable properties, it is found unlikely to be of great use.
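For reference, the measure H discussed above is Shannon's entropy; in its standard form (standard notation assumed here, not drawn from the thesis itself), for a distribution $(p_1, \dots, p_n)$:

```latex
H(p_1, \dots, p_n) = -\sum_{i=1}^{n} p_i \log p_i
```

A partial (non-exhaustive) distribution is then naturally read as one satisfying $\sum_i p_i \le 1$, so extending the domain of H means admitting arguments whose probabilities need not sum to one; this is an interpretive gloss, not a definition taken from the thesis.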
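Likewise, a minimal sketch of the MEP in its usual constrained-optimization form (again assuming the standard formulation rather than the thesis's own statement): among all distributions consistent with the given constraints K, select the one maximizing H,

```latex
p^{*} = \arg\max_{p \in K} H(p),
\qquad
K = \Bigl\{\, p : p_i \ge 0,\ \sum_i p_i = 1,\ \sum_i p_i f_k(i) = F_k \,\Bigr\}
```

Under such constraints the maximizer typically assigns strictly positive probability to every outcome, which is the contrast drawn above with the Minimum Gain Inference Process and its allowance of zero probabilities.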
Original language | English
---|---
Awarding Institution |
Place of Publication | Manchester, U.K.
Publisher |
Publication status | Published - Oct 2007
Keywords
- Foundations of Information Theory
- Uncertain Reasoning