Published February 12, 2026 | Version v1 | Preprint | Open Access
Hardware Entropy Injection for Behavioral Divergence in LLM Inference: The PSE Framework on IBM POWER8
Description
We present a method for injecting hardware-sourced entropy into LLM inference, using the IBM POWER8 mftb (Move From Time Base) instruction as the entropy source, to produce provable behavioral divergence.
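On POWER8, mftb reads the Time Base register in a single instruction, and its low-order bits differ unpredictably between executions. As a rough, portable illustration of that entropy source (a sketch of ours, not the paper's code: `timebase_entropy` is a hypothetical helper, and `time.monotonic_ns()` stands in for the actual mftb read, which requires PowerPC inline assembly):

```python
import time

def timebase_entropy(bits: int = 16) -> int:
    """Return the low-order bits of a high-resolution counter.

    Stand-in for the POWER8 `mftb` instruction: on real hardware the
    Time Base register would be read directly via inline assembly;
    here the OS monotonic clock supplies similarly execution-dependent
    low-order bits.
    """
    return time.monotonic_ns() & ((1 << bits) - 1)
```

The mask width controls how many jitter bits are exposed to the sampler; on POWER8 itself the same value would come from masking the Time Base register read.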
Key results:
- Provable divergence: 3 runs with identical seeds produce 3 distinct MD5 hashes
- 0.2% overhead: burst strategy (every 4th token, top-512 only) is nearly free
- 8.81x combined speedup (16.74 to 147.54 t/s) with the full PSE stack
- Four behavioral metrics defined (NOI, DR, ACS, MCI) for entropy-mediated quality
Grounded in Hebbian learning theory and biological stochastic resonance. Part of the Proto-Sentient Emergence (PSE) framework.
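The burst strategy from the results above (entropy injected on every 4th token, restricted to the top-512 logits) can be sketched as a toy decoder. Everything here is illustrative and assumed, not the paper's implementation: the logits are random placeholders, `hw_entropy` stands in for an mftb read, and `generate` is a hypothetical helper that hashes the emitted token sequence so divergence can be checked the way the key results describe.

```python
import hashlib
import random
import time

TOP_K = 512      # burst strategy touches only the top-512 logits
BURST_EVERY = 4  # entropy is injected on every 4th token

def hw_entropy() -> int:
    # Stand-in for a POWER8 mftb read: low 16 bits of the monotonic clock.
    return time.monotonic_ns() & 0xFFFF

def generate(seed: int, entropy_source=hw_entropy,
             vocab: int = 1024, n_tokens: int = 64) -> str:
    """Toy decoder: placeholder logits come from a seeded RNG, so two
    runs with the same seed are identical except on burst steps, where
    the pick among the top-K logits is driven by hardware entropy.
    Returns the MD5 hex digest of the emitted token sequence."""
    rng = random.Random(seed)
    tokens = []
    for step in range(n_tokens):
        logits = [rng.random() for _ in range(vocab)]
        if step % BURST_EVERY == 0:
            # Burst step: sample among the top-512 candidates using an
            # RNG seeded from the hardware entropy source.
            top = sorted(range(vocab), key=logits.__getitem__)[-TOP_K:]
            tokens.append(random.Random(entropy_source() ^ step).choice(top))
        else:
            # Non-burst step: plain greedy argmax, fully seed-determined.
            tokens.append(max(range(vocab), key=logits.__getitem__))
    return hashlib.md5(" ".join(map(str, tokens)).encode()).hexdigest()

# Identical seeds, yet hardware entropy typically yields distinct hashes:
runs = {generate(42) for _ in range(3)}
```

Fixing the entropy source makes runs reproducible again, which is how the divergence claim can be exercised: `generate(42, lambda: 0)` always returns the same digest, while any change in the entropy stream changes it.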
Files

| Name | Size | MD5 |
|---|---|---|
| PSE_Hardware_Entropy.md | 16.5 kB | a072fff368e56ebce6234acff3d1dd38 |
Additional details
Related works
- Is supplemented by:
  - Software: https://github.com/Scottcjn/ram-coffers (URL)
- References:
  - Publication: 10.5281/zenodo.18321905 (DOI)
  - Publication: 10.5281/zenodo.18623594 (DOI)