27 Mar 2024 | Kai Niu, Member, IEEE, Ping Zhang, Fellow, IEEE
A Mathematical Theory of Semantic Communication
This paper proposes a systematic framework for semantic information theory (SIT), extending classical information theory (CIT) to address semantic communication. Semantic communication focuses on conveying meaning rather than syntax alone, and this work introduces corresponding measures: semantic entropy, semantic mutual information, semantic capacity, and the semantic rate-distortion function. The key concept is synonymous mapping, which links semantic and syntactic information. The paper defines semantic entropy $ H_s(\tilde{U}) $, semantic mutual information $ I^s(\tilde{X}; \tilde{Y}) $, semantic capacity $ C_s $, and the semantic rate-distortion function $ R_s(D) $, and proves three coding theorems: semantic source coding, semantic channel coding, and semantic rate-distortion coding. The semantic capacity $ C_s $ is shown to be greater than or equal to the classical channel capacity $ C $, and the semantic rate-distortion function $ R_s(D) $ is less than or equal to the classical rate-distortion function $ R(D) $. The paper also extends the semantic information measures to the continuous case, deriving a new channel capacity formula for band-limited Gaussian channels. The framework provides a foundation for semantic communication, extending CIT and offering potential for improved performance in future communication systems.
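The core mechanism behind these measures can be illustrated numerically. A synonymous mapping partitions the syntactic symbols into synonymous sets, and each set is treated as one semantic symbol whose probability is the sum over its members; the semantic entropy is then the entropy of this induced distribution. The sketch below (with a hypothetical four-word source and an illustrative partition, not taken from the paper) shows that merging synonyms can only reduce entropy, consistent with the framework's claim of improved efficiency:

```python
import math

def entropy(probs):
    """Shannon entropy in bits, ignoring zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical syntactic source: four words with assumed probabilities.
p_syntactic = {"car": 0.4, "automobile": 0.2, "bike": 0.3, "bicycle": 0.1}

# Illustrative synonymous mapping: partition the syntactic symbols into
# synonymous sets, each standing for one semantic symbol.
synonymous_sets = {
    "four-wheeled vehicle": ["car", "automobile"],
    "two-wheeled vehicle": ["bike", "bicycle"],
}

# Semantic symbol probability = sum of probabilities over its synonymous set.
p_semantic = {s: sum(p_syntactic[u] for u in members)
              for s, members in synonymous_sets.items()}

H = entropy(p_syntactic.values())   # classical (syntactic) entropy
Hs = entropy(p_semantic.values())   # semantic entropy over synonymous sets

print(f"H(U)       = {H:.3f} bits")   # ~1.846 bits
print(f"H_s(U~)    = {Hs:.3f} bits")  # ~0.971 bits
assert Hs <= H  # merging synonymous symbols cannot increase entropy
```

The inequality $ H_s(\tilde{U}) \le H(U) $ holds for any partition, since grouping outcomes never increases entropy; this is the source-side counterpart of the abstract's statements $ C_s \ge C $ and $ R_s(D) \le R(D) $.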