A Generalization of Shannon's Information Theory

Int. J. Of General Systems 28 (6):453-490 (1999)
Abstract
A generalized information theory is proposed as a natural extension of Shannon's information theory. It holds that information comes from forecasts: the more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform with objective facts, the generalized information measure reduces to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations of the new theory, the generalized communication model, information measures for semantic and sensory information, and the coding meanings of generalized entropy and generalized mutual information are introduced. Assessment and optimization of pattern recognition, prediction, and detection under the generalized information criterion are discussed. For the economization of communication, a revised version of rate-distortion theory is proposed: rate-of-keeping-precision theory, which is both a theory of data compression and a theory for matching an objective channel with the subjective understanding of information receivers. Applications include stock market forecasting and video image presentation.
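The abstract's core intuition, that a more unexpected forecast conveys more information, is the one behind Shannon's self-information, which the paper takes as its starting point. The sketch below illustrates only that baseline quantity; the generalized measure itself is defined in the paper and is not reproduced here:

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information in bits of an event with probability p.

    Rarer (more unexpected) events carry more information: as p falls,
    -log2(p) grows.
    """
    return -math.log2(p)

# A forecast of a common event carries little information;
# a forecast of a rare one carries much more.
print(self_information(0.5))   # 1.0 bit
print(self_information(0.01))  # ~6.64 bits
```

Shannon entropy is the expectation of this quantity over the event distribution; the paper's generalization additionally distinguishes the forecaster's subjective distribution from the objective one, recovering Shannon's measure when the two coincide.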
PhilPapers/Archive ID
LUAGO
Revision history
Archival date: 2016-02-25