Computational Information Theory: Bridging Shannon's Information Theory and Computational Complexity
Description
We develop Computational Information Theory as a comprehensive bridge between Claude Shannon's classical information theory and computational complexity theory. Shannon's framework addresses communication, data compression, and channel capacity, while complexity theory focuses on computational resources and problem difficulty. Despite sharing an informational foundation, the two domains have remained largely separate until now.
We extend Shannon’s core principles to computational processes by introducing:
- Computational entropy: $H_C(S) = \log |S|$ for a state space $S$.
- Compression bounds: algorithms operating on a state space $S$ in time $T$ require an average information flow of at least $H_C(S)/T$ per computational step.
- Computational source coding theorems: fundamental limits on information-processing efficiency.
- Dynamic information flow: $H(\text{State}_t \mid \text{State}_{t-1})$, a temporal measure capturing how information evolves through computation.
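The definitions above can be sketched numerically. This is our own illustration, not code from the paper; the function names are hypothetical. It computes the computational entropy of a state space, the per-step flow bound $H_C(S)/T$, and an empirical estimate of $H(\text{State}_t \mid \text{State}_{t-1})$ from observed state transitions.

```python
import math
import random
from collections import Counter

def computational_entropy(num_states: int) -> float:
    """H_C(S) = log2 |S|, in bits."""
    return math.log2(num_states)

def min_flow_per_step(num_states: int, steps: int) -> float:
    """Lower bound H_C(S)/T on average information flow per step."""
    return computational_entropy(num_states) / steps

def conditional_entropy(pairs) -> float:
    """Empirical H(State_t | State_{t-1}) in bits, from (prev, curr) pairs."""
    joint = Counter(pairs)
    prev = Counter(p for p, _ in pairs)
    n = len(pairs)
    h = 0.0
    for (p, c), k in joint.items():
        h -= (k / n) * math.log2(k / prev[p])  # -sum p(x,y) log2 p(y|x)
    return h

# A state space of size 2^10 traversed in 64 steps:
print(computational_entropy(1024))   # 10.0 bits
print(min_flow_per_step(1024, 64))   # 0.15625 bits/step

# Deterministic dynamics carry zero conditional entropy per step...
det = [(i % 8, (i + 1) % 8) for i in range(1000)]
print(conditional_entropy(det))      # 0.0

# ...while a uniformly random next state over 8 states carries about 3 bits.
random.seed(0)
walk = [(random.randrange(8), random.randrange(8)) for _ in range(20000)]
print(conditional_entropy(walk))     # close to log2(8) = 3
```

The contrast between the deterministic and random transitions shows why $H(\text{State}_t \mid \text{State}_{t-1})$ behaves as a per-step measure of information movement.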
Our framework provides rigorous mathematical foundations with complete definitions, theorems, and proofs. We prove the Computational Compression Bound, showing that NP problems with certificate space $2^n$ require $n$ bits of total information processing. We establish Information Flow as a practical analytical tool for complexity lower bounds and demonstrate its application to the conditional proof of P ≠ NP and algorithm design principles.
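A classical instance of this lower-bound style of argument can illustrate how information flow constrains running time (our own sketch, not the paper's code): locating one element among $N$ requires extracting $\log_2 N$ bits, and each comparison step resolves at most one bit, so any comparison-based search needs on the order of $\log_2 N$ steps.

```python
import math

def binary_search(sorted_items, target):
    """Return (index, iterations); each iteration halves the candidate set,
    extracting at most one bit of information about the target's position."""
    lo, hi, iters = 0, len(sorted_items) - 1, 0
    while lo <= hi:
        iters += 1
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid, iters
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, iters

N = 1 << 16                       # |S| = 65536 candidate positions
entropy_bits = math.log2(N)       # 16 bits must be extracted to pin one down
idx, iters = binary_search(list(range(N)), 12345)
print(entropy_bits, idx, iters)   # iteration count stays near the 16-bit floor
```

Binary search is optimal in this sense precisely because each step delivers close to its one-bit budget; an algorithm whose steps move less information is forced to take more of them.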
This unified theory bridges information and computation, yielding new insights:
- Information-theoretic lower bounds on computational complexity.
- A unified understanding of time–space–information tradeoffs.
- A quantifiable link between entropy and computational difficulty.
- Practical tools for algorithm analysis and optimization.
We provide full mathematical formalism, fundamental theorems, comprehensive examples, and verified applications across algorithm design, complexity theory, and optimization. This work establishes Computational Information Theory as a rigorous and independent field with both theoretical depth and practical reach, opening new avenues for research at the intersection of information theory, complexity, communication, and computation.
Files
3_Computational_Information_Theory_FINAL.pdf
(28.1 kB, md5:5c4dbc580e245e02c51e4af07a3f70b3)