27 Mar 2024 | Kai Niu, Member, IEEE, and Ping Zhang, Fellow, IEEE
This paper introduces a systematic framework for semantic information theory (SIT) that extends classic information theory (CIT). The authors define synonymous mapping as the core concept: a one-to-many mapping under which a single piece of semantic information corresponds to a set of synonymous syntactic representations. On this basis they introduce measures such as semantic entropy, semantic mutual information, semantic capacity, and the semantic rate-distortion function. The paper proves three coding theorems for SIT using random coding and typical decoding/encoding, and shows that synonymous mapping extends the performance limits of SIT beyond those of CIT. The continuous case of the semantic information measures is also discussed, including the semantic capacity of a Gaussian channel. The paper concludes by highlighting the potential of SIT to improve communication performance beyond the limits of CIT.
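To make the role of synonymous mapping concrete, the following is a minimal sketch, assuming the paper's definition of semantic entropy: a synonymous mapping partitions the syntactic alphabet into synonymous sets, and semantic entropy is the Shannon entropy of the induced distribution over those sets. The toy alphabet, partition, and function names below are illustrative assumptions, not taken from the paper. Because grouping only merges probability mass, the semantic entropy H_s never exceeds the syntactic entropy H, which is the sense in which synonymous mapping relaxes the limits of CIT.

```python
# Minimal sketch (illustrative, not the authors' code): semantic entropy
# under a synonymous partition of the syntactic alphabet.
from math import log2

def shannon_entropy(probs):
    """Classic (syntactic) entropy H(U) = -sum p(u) log2 p(u)."""
    return -sum(p * log2(p) for p in probs.values() if p > 0)

def semantic_entropy(probs, synonymous_sets):
    """Semantic entropy H_s: Shannon entropy of the distribution over
    synonymous sets, with p(U_i) = sum of p(u) for u in U_i."""
    set_probs = [sum(probs[u] for u in s) for s in synonymous_sets]
    return -sum(p * log2(p) for p in set_probs if p > 0)

# Toy source: four syntactic symbols carrying two distinct meanings.
probs = {"car": 0.4, "automobile": 0.2, "bike": 0.3, "bicycle": 0.1}
partition = [{"car", "automobile"}, {"bike", "bicycle"}]

H = shannon_entropy(probs)               # about 1.846 bits
Hs = semantic_entropy(probs, partition)  # about 0.971 bits
print(f"H = {H:.3f} bits, H_s = {Hs:.3f} bits")  # H_s <= H always
```

Run on this toy source, the sketch gives H_s below H, consistent with the claim that semantic-level measures sit at or beyond the classic limits (e.g., semantic compression below the Shannon entropy bound).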