Chenguang Lu
Liaoning Technical University
  1. Information Reflection Theory Based on Information Theories, Analog Symbolism, and the Generalized Relativity Principle. Chenguang Lu - 2023 - Comput. Sci. Math. Forum 8 (1):45.
    Reflection Theory holds that our sensations reflect physical properties, whereas Empiricism believes that sense data, presentations, and phenomena are the ultimate existence. Lenin adhered to Reflection Theory and criticized Helmholtz’s sensory symbolism for denying the similarity between a sensation and a physical property. By using information and color vision theories, analyzing the ostensive definition with inverted qualia, and extending the relativity principle, this paper affirms the external world’s existence independent of personal sensations. Still, it denies the similarity between a sense (...)
  2. Channels’ Confirmation and Predictions’ Confirmation: From the Medical Test to the Raven Paradox. Chenguang Lu - 2020 - Entropy 22 (4):384.
    After long arguments between positivism and falsificationism, the verification of universal hypotheses was replaced with the confirmation of uncertain major premises. Unfortunately, Hempel proposed the Raven Paradox. Then, Carnap used the increment of logical probability as the confirmation measure. So far, many confirmation measures have been proposed. Among them, measure F, proposed by Kemeny and Oppenheim, possesses the symmetries and asymmetries proposed by Eells and Fitelson, the monotonicity proposed by Greco et al., and the normalizing property suggested by many researchers. Based on the (...)
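    The measure F named above has a standard closed form. A minimal sketch (not taken from the paper) that computes it for a medical-test reading, with H = "infected" and E = "positive result":

        # Kemeny-Oppenheim confirmation measure:
        #   F(H, E) = (P(E|H) - P(E|not H)) / (P(E|H) + P(E|not H))
        # F lies in [-1, 1]; F > 0 means E confirms H.
        def measure_F(p_e_given_h, p_e_given_not_h):
            denom = p_e_given_h + p_e_given_not_h
            if denom == 0.0:
                return 0.0  # E is impossible either way: no confirmation
            return (p_e_given_h - p_e_given_not_h) / denom

        # Example: a test with sensitivity 0.9 and false-positive rate 0.1.
        print(measure_F(0.9, 0.1))  # 0.8: a positive result strongly confirms H

    Note that F depends only on the two likelihoods, not on the prior P(H), which is part of what makes it attractive as a confirmation measure.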
  3. Semantic Information G Theory and Logical Bayesian Inference for Machine Learning. Chenguang Lu - 2019 - Information 10 (8):261.
    An important problem with machine learning is that, when the number of labels n > 2, it is very difficult to construct and optimize a group of learning functions, and we wish that the optimized learning functions remain useful when the prior distribution P(x) (where x is an instance) is changed. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists (...)
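    As a rough illustration of the CM idea, the sketch below matches a semantic channel of truth functions to a Shannon channel and then evaluates semantic information. It is a reading of the G theory, not code from the paper; in particular, the update rule T(theta_j|x) = P(y_j|x) / max_x P(y_j|x) and the measure I = log T(theta_j|x) / T(theta_j) are assumptions based on the author's related papers.

        import numpy as np

        P_x = np.array([0.2, 0.5, 0.3])          # prior P(x) over 3 instances
        P_y_given_x = np.array([[0.7, 0.3],      # Shannon channel P(y|x):
                                [0.4, 0.6],      # rows = instances x,
                                [0.1, 0.9]])     # columns = labels y

        # Matching step: truth function T(theta_j|x) = P(y_j|x) / max_x P(y_j|x)
        T = P_y_given_x / P_y_given_x.max(axis=0, keepdims=True)

        # Logical probability T(theta_j) = sum_x P(x) T(theta_j|x)
        T_theta = P_x @ T

        # Semantic information I(x; theta_j) = log2 T(theta_j|x) / T(theta_j)
        I = np.log2(T / T_theta)
        print(I)   # large where the label is true of x but rarely true overall

    Because the truth functions are built from P(y|x) alone, they do not change when P(x) changes; only the logical probabilities are recomputed, which matches the prior-robustness the abstract asks for.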
  4. Causal Confirmation Measures: From Simpson’s Paradox to COVID-19. Chenguang Lu - 2023 - Entropy 25 (1):143.
    When we compare the influences of two causes on an outcome, if the conclusion from every group contradicts the conclusion from the pooled data, we have Simpson’s Paradox. The Existing Causal Inference Theory (ECIT) can make the overall conclusion consistent with the grouping conclusion by removing the confounder’s influence, thereby eliminating the paradox. The ECIT uses the relative risk difference Pd = max(0, (R − 1)/R) (where R denotes the risk ratio) as the probability of causation. In contrast, the philosopher Fitelson uses (...)
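    The formula Pd = max(0, (R − 1)/R) is quoted above; the counts in the sketch below are invented purely to show how pooling over a confounder (age) can flip the verdict:

        # Probability of causation from the risk ratio R.
        def Pd(risk_exposed, risk_unexposed):
            R = risk_exposed / risk_unexposed
            return max(0.0, (R - 1.0) / R)

        # Hypothetical counts: (exposed sick, exposed n, unexposed sick, unexposed n)
        groups = {"young": (160, 800, 20, 200),
                  "old":   (120, 200, 320, 800)}

        for name, (es, en, us, un) in groups.items():
            print(name, Pd(es / en, us / un))    # positive in both groups

        # Pooled (confounded) counts point the other way:
        es, en = 160 + 120, 800 + 200
        us, un = 20 + 320, 200 + 800
        print("pooled", Pd(es / en, us / un))    # 0.0, since R < 1: the paradox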
  5. A Generalization of Shannon's Information Theory. Chenguang Lu - 1999 - Int. J. of General Systems 28 (6):453-490.
    A generalized information theory is proposed as a natural extension of Shannon's information theory. It holds that information comes from forecasts: the more precise and the more unexpected a forecast is, the more information it conveys. If subjective forecasts always conform with objective facts, the generalized information measure is equivalent to Shannon's information measure. The generalized communication model is consistent with K. R. Popper's model of knowledge evolution. The mathematical foundations of the new information theory, the generalized communication (...)
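    A hedged numeric check of the equivalence claim: using the semantic measure I = log T(theta|x) / T(theta) from the author's later papers (applying it to this 1999 formulation is an assumption), a forecaster whose crisp forecasts are always right recovers Shannon's measure exactly.

        import numpy as np

        P_x = np.array([0.5, 0.25, 0.25])  # objective distribution over events

        # Forecast theta_j is true exactly when x = x_j (crisp, always correct):
        T = np.eye(3)                      # truth functions T(theta_j|x)
        T_theta = P_x @ T                  # logical probabilities equal P(x)

        # Expected generalized information of the always-correct forecaster:
        I = sum(P_x[j] * np.log2(T[j, j] / T_theta[j]) for j in range(3))
        print(I)                           # 1.5 bits = Shannon entropy H(X)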
  6. A Semantic Information Formula Compatible with Shannon and Popper's Theories. Chenguang Lu - manuscript
    Semantic information conveyed by daily language has been researched for many years; yet, we still need a practical formula to measure the information of a simple sentence or prediction, such as “There will be heavy rain tomorrow”. For practical purposes, this paper introduces a new formula, the Semantic Information Formula (SIF), which is based on L. A. Zadeh’s fuzzy set theory and P. Z. Wang’s random set falling shadow theory. It carries forward C. E. Shannon’s and K. Popper’s thought. The fuzzy set’s (...)
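    In the same assumed form I = log T(theta|x) / T(theta), a small sketch for the example sentence; the membership curve for "heavy rain" and the rainfall prior are invented for illustration:

        import numpy as np

        rain_mm = np.arange(0, 101, 10)                  # tomorrow's rainfall x
        P_x = np.exp(-rain_mm / 20.0)                    # prior: light rain likely
        P_x = P_x / P_x.sum()

        # Fuzzy membership of x in "heavy rain" (a Zadeh-style S-curve):
        T_heavy = 1.0 / (1.0 + np.exp(-(rain_mm - 50.0) / 8.0))

        T_bar = P_x @ T_heavy                            # logical probability
        I = np.log2(T_heavy / T_bar)                     # info if x mm actually falls

        print(round(float(I[-1]), 2))   # big reward if 100 mm really falls
        print(round(float(I[0]), 2))    # large penalty if it stays dry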
  7. 美感奥妙和需求进化 (Mystery of Beauty Sense and Evolution of Needs). Chenguang Lu - 2007 - Hefei: China Science and Technology University Press.
    It proposes Need Aesthetics, using the needing relationship to explain the evolution of the sense of beauty in humans and birds, birds' colorful plumage, and sexual selection.
  8. Reviewing Evolution of Learning Functions and Semantic Information Measures for Understanding Deep Learning. [REVIEW] Chenguang Lu - 2023 - Entropy 25 (5).
    A new trend in deep learning, represented by Mutual Information Neural Estimation (MINE) and Information Noise-Contrastive Estimation (InfoNCE), is emerging. In this trend, similarity functions and Estimated Mutual Information (EMI) are used as learning and objective functions. Coincidentally, EMI is essentially the same as the Semantic Mutual Information (SeMI) proposed by the author 30 years ago. This paper first reviews the evolutionary histories of semantic information measures and learning functions. Then, it briefly introduces the author’s semantic information G theory with (...)
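    A minimal numpy sketch (not from the paper) of the InfoNCE objective the abstract names; at its optimum, log N minus the loss lower-bounds the mutual information that EMI estimates. The dot-product similarity stands in for a learned encoder.

        import numpy as np

        rng = np.random.default_rng(0)
        N, d = 8, 4
        x = rng.normal(size=(N, d))
        y = x + 0.1 * rng.normal(size=(N, d))    # y_i is a noisy view of x_i

        scores = x @ y.T                          # similarity f(x_i, y_k)
        shifted = scores - scores.max(axis=1, keepdims=True)   # stability
        log_softmax = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
        loss = -np.mean(np.diag(log_softmax))     # positives on the diagonal

        print(loss, np.log(N) - loss)             # second term estimates I(X; Y)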