BERT-Based Semantic Entropy under Landauer's Principle: Quantifying Energy Cost of Meaning in NLP
Creators
PSBigBig (Independent Developer and Researcher)
Description
Version 1.0 – Initial Public Release
This work introduces a novel framework that extends Landauer’s principle—originally describing the energy cost of bit erasure—to the semantic level of natural language understanding. By analyzing attention distributions in transformer models like BERT, we define a new metric called semantic entropy, which captures the cognitive and computational effort involved in language processing.
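For readers who want a concrete starting point, the following is a minimal sketch (not the released code from the repository) of how an attention-based semantic entropy could be computed with BERT: it averages the Shannon entropy, in bits, of the attention distributions over layers, heads, and query positions. The function name and the per-token averaging scheme are illustrative assumptions, not the paper's exact definition.

```python
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

def semantic_entropy(text: str) -> float:
    """Average Shannon entropy (in bits) of the attention distributions
    produced for `text`, averaged over layers, heads, and query positions.
    Illustrative sketch only; the paper defines its own metric."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # outputs.attentions: one tensor per layer, shape (batch, heads, seq, seq)
    attentions = torch.stack(outputs.attentions)   # (layers, batch, heads, seq, seq)
    probs = attentions.clamp_min(1e-12)            # guard against log(0)
    entropy = -(probs * probs.log2()).sum(dim=-1)  # entropy of each query's attention
    return entropy.mean().item()

print(semantic_entropy("The cat sat on the mat."))
```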
We propose a normalized energy cost model based on semantic entropy and evaluate it across three large corpora: news headlines, literary texts, and dialogues. Our experiments show a strong correlation between semantic entropy and human-perceived complexity, significantly outperforming traditional baselines such as TF–IDF and random-attention models.
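For context, Landauer's principle puts a floor of k_B T ln 2 joules on erasing a single bit. The exact normalized cost model is defined in the paper; as a hedged illustration only, one plausible way to tie that bound to an attention-based semantic entropy H_sem is:

```latex
E_{\min} = k_B T \ln 2 \approx 2.87 \times 10^{-21}\ \mathrm{J}\ \text{per bit at } T = 300\ \mathrm{K},
\qquad
\hat{E}(x) = \frac{H_{\mathrm{sem}}(x)}{H_{\max}}\, k_B T \ln 2,
\quad H_{\max} = \log_2 L \ \text{for a sequence of length } L.
```

Here the normalization by H_max and the choice of H_max = log2 L are assumptions for illustration; the paper's own formulation should be consulted for the definitive model.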
The paper also explores hardware implications, including real energy measurements on GPUs and neuromorphic chips. Applications include energy-aware NLP systems, dynamic pricing of API calls based on processing cost, and potential brain-inspired computing benchmarks. We provide open-source code, detailed methodology, and a visual pipeline for reproducibility.
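As a purely hypothetical illustration of the dynamic-pricing application mentioned above, an API could charge a base fee plus a surcharge proportional to the request's semantic entropy as a proxy for processing cost. The rate constants below are made up for the example.

```python
def price_request(semantic_entropy_bits: float,
                  base_fee: float = 0.0001,
                  per_bit_rate: float = 0.00002) -> float:
    """Base fee plus a surcharge proportional to the semantic entropy
    (in bits) of the request text. Hypothetical pricing scheme only."""
    return base_fee + per_bit_rate * semantic_entropy_bits

print(price_request(4.2))  # e.g. a request whose text scored 4.2 bits
```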
Author: PSBigBig (Independent Developer and Researcher)
📧 hello@onestardao.com
🌐 https://onestardao.com/papers
💻 https://github.com/onestardao/WFGY
Files

Name | Size | Checksum
---|---|---
BERT-Based_Semantic_Entropy_under_Landauer’s_Principle_v1.0_PSBigBig_Public.pdf | 512.0 kB | md5:64d696331bafc032659851a9634dcd0e
Additional details
Related works
- Is supplement to: Preprint 10.5281/zenodo.15630969 (DOI)
Dates
- Accepted: 2025-06-15 (Version 1.0 – Initial Public Release)
Software
- Repository URL: https://doi.org/10.5281/zenodo.15624323