Published July 14, 2025 | Version v8
Preprint | Open Access

Delta Compression: Towards Efficient Semantic Compression via Hypernetwork-Generated Parameter Deltas

Description

We propose Delta Compression, a novel approach to efficient semantic compression that represents information not as static data but as a parameter delta generated by a hypernetwork. When applied to a deterministic base model trained solely for information reconstruction, this delta enables lossless retrieval of complex information from a compact representation. Our architecture consists of a hypernetwork encoder that converts input information into a PEFT-style delta, extending recent advances in "meaning-to-delta" conversion, such as Drag-and-Drop LLMs (DnD), into the broader domain of semantic compression.
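The encode/apply pipeline described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the hypernetwork is reduced to a single linear map, and all dimensions (`d`, `rank`, `z_dim`) are hypothetical.

```python
import numpy as np

# Minimal sketch of the Delta Compression pipeline (all shapes hypothetical).
rng = np.random.default_rng(0)
d, rank, z_dim = 64, 1, 16

W_base = rng.standard_normal((d, d))            # frozen base-model weight
H = rng.standard_normal((z_dim, rank * 2 * d))  # "hypernetwork": one linear map here

def encode(z):
    """Hypernetwork encoder: input representation z -> rank-1 LoRA factors (A, B)."""
    flat = z @ H
    A = flat[: rank * d].reshape(rank, d)   # (rank, d_in)
    B = flat[rank * d :].reshape(d, rank)   # (d_out, rank)
    return A, B

def apply_delta(W, A, B):
    """Reconstruction model = frozen base weight plus the generated delta B @ A."""
    return W + B @ A

z = rng.standard_normal(z_dim)   # stands in for the encoded input information
A, B = encode(z)
W_adapted = apply_delta(W_base, A, B)
```

In a real system the linear map `H` would be a trained hypernetwork and the delta would be applied across many layers; the point here is only the data flow: information in, low-rank factors out, base weights perturbed for reconstruction.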

To evaluate this approach, we present a proof-of-concept experiment in which 70 Japanese-language texts, covering diverse topics and totaling approximately 70,000 tokens, were encoded into a minimal LoRA adapter (rank 1, float16, ~396KB). The original uncompressed text data occupied approximately 416KB, so the delta is roughly 5% smaller. The information was reconstructed from the delta with perfect fidelity. These results demonstrate the viability of hypernetwork-generated parameter deltas as a compact, prompt-addressable representation of semantic content.
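The ~396KB figure is consistent with simple back-of-the-envelope arithmetic for a rank-1 LoRA delta in float16: each adapted layer of shape `d_out x d_in` contributes `rank * (d_in + d_out)` parameters at 2 bytes each. The layer count and hidden size below are illustrative assumptions, not taken from the paper.

```python
# Size of a LoRA delta: factors A (rank x d_in) and B (d_out x rank), float16.
def lora_delta_bytes(layers, rank=1, bytes_per_param=2):
    """Total bytes of LoRA factors across layers given as (d_out, d_in) pairs."""
    return sum(rank * (d_in + d_out) * bytes_per_param for d_out, d_in in layers)

# Hypothetical: 48 square projection matrices of a transformer with hidden size 2048.
layers = [(2048, 2048)] * 48
size_kb = lora_delta_bytes(layers) / 1024
print(f"{size_kb:.0f} KB")  # -> 384 KB, the same order as the reported ~396KB
```

The exact byte count depends on which modules are adapted and on adapter metadata, so a real checkpoint lands slightly above the raw factor size.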

Files

Delta Compression 0714.pdf (101.4 kB, md5:9d046c4b3e497a9d960e281ae21fdc10)