Published February 6, 2026 | Version v1
Preprint · Open

Blindsight Transport: Lossless Hidden State Communication Between Language Model Agents

Description

Multi-agent systems built on large language models (LLMs) predominantly communicate
through natural language text. We demonstrate that this text-based communication acts as
a severe information bottleneck, analogous to the children’s game of Chinese Whispers, where
each retransmission introduces compounding distortion. We propose Blindsight Transport, a
method that replaces text-based inter-agent communication with direct hidden state transfer
through the transformer’s residual stream. By splitting a transformer into an early-layer
courier agent (layers 0–1) and a late-layer receiver agent (layers 2–N), information flows as
raw activations rather than generated text. We prove that this yields a KL divergence of
exactly zero from the baseline—a lossless channel—whereas standard text-based handoff
produces KL divergences of 2.4–9.3 across all test conditions. We present seven experiments using GPT-2
family models (82M–355M parameters): (1) a Chinese Whispers comparison showing 8/8
test wins for hidden state transport, (2) multi-agent chains of 2–10 agents where text-based
communication causes immediate catastrophic signal destruction while hidden state transport
maintains KL = 0 regardless of chain length, (3) lossless serialization of hidden states to disk,
(4) scaling verification across model sizes, (5) cross-model transfer between architecturally
distinct models achieving 29–46× improvement over text, (6) cross-dimensional transfer via
learned linear projection achieving 75× improvement over text, and (7) qualitative analysis.
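The core claim—that shipping the raw residual-stream activation between an early-layer courier and a late-layer receiver is exactly lossless, while decoding to a token and re-embedding is not—can be illustrated with a toy stand-in model. This is a minimal sketch, not the paper's GPT-2 implementation: the layer functions, dimensions, and random weights here are all hypothetical, chosen only so the arithmetic of the split is visible.

```python
import numpy as np

rng = np.random.default_rng(0)
D, V, N_LAYERS = 16, 50, 6           # hidden dim, vocab size, layer count (toy values)

# Toy stand-in for a transformer: each "layer" is a fixed nonlinear
# residual update. A real courier/receiver split would reuse the
# blocks of an actual model such as GPT-2.
Ws = [rng.normal(scale=0.3, size=(D, D)) for _ in range(N_LAYERS)]
W_out = rng.normal(size=(D, V))      # unembedding / LM head
E = rng.normal(size=(V, D))          # token embedding table

def layer(h, W):
    return h + np.tanh(h @ W)        # residual-stream update

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def run(h, lo, hi):
    # Apply layers lo..hi-1 to the hidden state h.
    for i in range(lo, hi):
        h = layer(h, Ws[i])
    return h

h0 = E[7]                            # embed some input token

# Baseline: one monolithic model, layers 0..N.
p_base = softmax(run(h0, 0, N_LAYERS) @ W_out)

# Hidden-state transport: courier runs layers 0-1 and ships the raw
# activation; receiver resumes at layer 2. Identical arithmetic, so
# the channel is lossless and KL is exactly zero.
h_courier = run(h0, 0, 2)
p_hidden = softmax(run(h_courier, 2, N_LAYERS) @ W_out)

# Text handoff: courier decodes to its argmax token, receiver
# re-embeds it, discarding everything else in the state.
tok = int(np.argmax(h_courier @ W_out))
p_text = softmax(run(E[tok], 2, N_LAYERS) @ W_out)

def kl(p, q):
    return float(np.sum(p * np.log(p / q)))

print("KL(base || hidden):", kl(p_base, p_hidden))   # exactly 0.0
print("KL(base || text):  ", kl(p_base, p_text))     # > 0
```

Because the receiver resumes from the very array the courier produced, the two distributions are bit-identical; the text path diverges because the argmax token is a lossy summary of the hidden state.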
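Experiment (6) relies on a learned linear projection to bridge models with different hidden dimensions. A common way to fit such a map is ordinary least squares on paired activations; the sketch below assumes that setup with simulated pairs (the dimensions, pair count, and the linear-plus-noise data model are all illustrative assumptions, not the paper's protocol).

```python
import numpy as np

rng = np.random.default_rng(1)
D_SRC, D_DST, N_PAIRS = 12, 20, 500   # hypothetical source/target dims, pair count

# Paired hidden states from two models run on the same inputs.
# Simulated here: target states are a fixed linear function of the
# source states plus small noise, standing in for corpus-collected pairs.
A_true = rng.normal(size=(D_SRC, D_DST))
H_src = rng.normal(size=(N_PAIRS, D_SRC))
H_dst = H_src @ A_true + 0.01 * rng.normal(size=(N_PAIRS, D_DST))

# Fit the projection by ordinary least squares: min ||H_src P - H_dst||^2.
P, *_ = np.linalg.lstsq(H_src, H_dst, rcond=None)

# A held-out source state maps close to its target-space counterpart.
h = rng.normal(size=D_SRC)
err = np.linalg.norm(h @ P - h @ A_true) / np.linalg.norm(h @ A_true)
print(f"relative projection error: {err:.4f}")
```

Unlike the same-model split, a learned projection is only approximately lossless, which is consistent with the abstract reporting a large improvement over text rather than KL = 0 for cross-dimensional transfer.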

Files

Blindsight_Transport.pdf (438.9 kB)
md5:0ed14605b7582a87e5bead5d15107bdc