Abstract
Semantic information conveyed by everyday language has been researched for many years; yet we still lack a practical formula to measure the information of a simple sentence or prediction, such as "There will be heavy rain tomorrow". For practical purposes, this paper introduces a new formula, the Semantic Information Formula (SIF), which is based on L. A. Zadeh's fuzzy set theory and P. Z. Wang's random set falling shadow theory, and which carries forward the thought of C. E. Shannon and K. Popper. The probability of a fuzzy set, as defined by Zadeh, is treated as the logical probability sought by Popper, and the membership grade is treated as the truth-value of a proposition and also as the posterior logical probability. The classical relative information formula (Information = log(Posterior probability / Prior probability)) is revised into the SIF by replacing the posterior probability with the membership grade and the prior probability with the fuzzy set's probability. The SIF can be interpreted as "Information = Testing severity − Relative square deviation" and hence can serve as Popper's information criterion for testing scientific theories or propositions. The information measure defined by the SIF also denotes the saved codeword length, as does the classical information measure. This paper introduces the set-Bayes' formula, which establishes the relationship between statistical probability and logical probability; derives the Fuzzy Information Criterion (FIC) for the optimization of semantic channels; and discusses applications of the SIF and the FIC in areas such as linguistic communication, prediction, estimation, testing, GPS, translation, and fuzzy reasoning. In particular, a detailed reasoning example proves that we can improve a semantic channel with proper fuzziness so that the average semantic information increases toward its upper limit: the Shannon mutual information.
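For concreteness, the revision described above can be sketched as follows; the notation here (T(θ_j | x_i) for the membership grade and T(θ_j) for the fuzzy set's probability) is an assumed illustration, not necessarily the paper's own symbols:

% A minimal sketch of the SIF, assuming T(theta_j | x_i) denotes the
% membership grade (posterior logical probability) and T(theta_j) denotes
% the fuzzy set's probability (prior logical probability), the latter
% taken as Zadeh's probability of a fuzzy event.
\[
I(x_i;\theta_j) \;=\; \log\frac{T(\theta_j \mid x_i)}{T(\theta_j)},
\qquad
T(\theta_j) \;=\; \sum_i P(x_i)\, T(\theta_j \mid x_i).
\]

In this sketch, when the membership grades are crisp (0 or 1), T(θ_j) reduces to the probability of an ordinary set, and the SIF reduces to the classical relative information formula it revises.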