A Generalization of Shannon's Information Theory

Int. J. Of General Systems 28 (6):453-490 (1999)
Abstract
A generalized information theory is proposed as a natural extension of Shannon's information theory. It holds that information comes from forecasts: the more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform to objective facts, the generalized information measure is equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The paper introduces the mathematical foundations of the new information theory, the generalized communication model, information measures for semantic and sensory information, and the coding meanings of generalized entropy and generalized mutual information. Assessment and optimization of pattern recognition, prediction, and detection under the generalized information criterion are discussed. For economization of communication, a revised version of rate-distortion theory is proposed: rate-of-keeping-precision theory, which is both a theory of data compression and a theory for matching an objective channel with the subjective understanding of information receivers. Applications include stock market forecasting and video image presentation.
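The abstract's core claims can be illustrated with a small sketch. The measure below is a hypothetical form chosen for illustration only (the paper's actual definitions are not given here): it scores a forecast by log2 Q(x|y) / P(x), where Q is the receiver's subjective forecast probability and P(x) is the objective prior. Under this assumed form, a more precise forecast (higher Q) and a more unexpected event (lower prior P) both yield more information, and when the subjective forecast coincides with the objective posterior P(x|y), the measure reduces to Shannon's pointwise mutual information.

```python
import math

def shannon_info(p_posterior: float, p_prior: float) -> float:
    """Shannon's pointwise mutual information: log2 P(x|y) / P(x)."""
    return math.log2(p_posterior / p_prior)

def generalized_info(q_forecast: float, p_prior: float) -> float:
    """Hypothetical generalized measure: log2 Q(x|y) / P(x).

    Q(x|y) is the subjective forecast probability assigned to the event x;
    P(x) is its objective prior. Higher precision (larger Q) and greater
    unexpectedness (smaller P) both increase the information conveyed.
    """
    return math.log2(q_forecast / p_prior)

# When the subjective forecast matches the objective posterior,
# the generalized measure coincides with Shannon's.
assert generalized_info(0.8, 0.1) == shannon_info(0.8, 0.1)
```

Note that an overconfident forecast (Q larger than the true posterior) would score higher here than it deserves; a full semantic information measure must penalize forecasts that do not conform to the facts, which is exactly the gap between this sketch and the theory the abstract describes.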
PhilPapers/Archive ID
LUAGO
Archival date: 2016-02-25