Pattern Loss at Dimensional Boundaries: The 86% Scaling Law
Description
Information degrades predictably when crossing dimensional boundaries—from DNA’s 1D code building 3D proteins to neural networks transforming data across dimensional spaces—yet this fundamental cost has never been quantified. While the “curse of dimensionality” describes problems qualitatively and dimensionality reduction techniques project high-dimensional data to lower dimensions, no prior work has measured information loss during the embedding of discrete patterns from dimension N to dimension N + 1.
This study introduces the Φ metric (Φ = R · S + D), which decomposes pattern information into structural (spatial organization) and statistical (state distribution) components. Using middle-placement embedding in cellular automata grids as controlled computational environments, 1,500 random binary patterns were systematically embedded across five grid sizes through three dimensional transitions: 1D→2D, 2D→3D, and 3D→4D. For each pattern, information retention was measured using Φ before and after embedding.
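The paper does not reproduce the full definitions of R, S, and D here, but the measurement pipeline described above can be sketched in miniature. The sketch below uses assumed proxies: a structural term (R·S) approximated by the fraction of adjacent live-cell pairs along each grid axis, and a statistical term (D) taken as the Shannon entropy of the binary state distribution. The function names, the proxy components, and the small example pattern are all illustrative, not the authors' actual implementation.

```python
import numpy as np

def middle_embed_1d_to_2d(pattern, n):
    """Place a 1D binary pattern in the middle row of an n x n grid
    (middle-placement embedding, 1D -> 2D)."""
    grid = np.zeros((n, n), dtype=int)
    start = (n - len(pattern)) // 2
    grid[n // 2, start:start + len(pattern)] = pattern
    return grid

def phi(states):
    """Hypothetical Phi = R*S + D with assumed stand-in components."""
    flat = states.ravel()
    p1 = flat.mean()
    # D (assumed): Shannon entropy of the 0/1 state distribution.
    probs = np.array([1.0 - p1, p1])
    probs = probs[probs > 0]
    d = -(probs * np.log2(probs)).sum()
    # R*S (assumed): mean fraction of adjacent live-cell pairs per axis.
    rs = 0.0
    for axis in range(states.ndim):
        a = np.moveaxis(states, axis, 0)
        rs += np.mean(a[:-1] & a[1:])
    return rs + d

pattern = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1])
phi_1d = phi(pattern)                          # Phi before embedding
phi_2d = phi(middle_embed_1d_to_2d(pattern, 15))  # Phi after embedding
loss = 1.0 - phi_2d / phi_1d                   # fractional information loss
```

With these proxy components the sparse 15x15 grid yields a much lower Φ than the dense 1D pattern, so `loss` comes out large, qualitatively mirroring the reported effect; the exact percentage depends on the true component definitions.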
Robust information loss of 86.01% ± 2.39% is observed across all dimensional transitions, with a remarkably low coefficient of variation of 2.8% across 1,500 patterns. Component analysis reveals that structural information (R · S) collapses by 99.6% while statistical information (D) decreases by 82–83%, explaining the overall 86% loss through near-total destruction of spatial organization accompanied by partial preservation of state distributions. After initial embedding, Φ stabilizes at approximately 0.169, suggesting an information floor for sparse patterns in higher dimensions.
Robustness tests confirm the finding holds across grid sizes (15–25) with weak scale-dependence (+0.6% per unit increase in N), and is consistent across tested cellular automata rule variants (Conway’s Life B3/S23 and HighLife B36/S23 differ by only 0.64%). The effect represents a fundamental property of middle-placement dimensional embedding geometry for randomly generated binary patterns in cellular automata grids, rather than pattern-specific or rule-specific behavior within this framework.
These findings establish theoretical efficiency bounds for machine learning representations, quantify the curse of dimensionality in physics, and provide a scaling law for complexity science.
By delivering the first quantitative measurement of information dynamics at dimensional boundaries, this work reveals dimensional transitions as sites of predictable information transformation in discrete computational systems.
Keywords: dimensional boundaries, information loss, cellular automata, complexity science, dimensional embedding, information theory, curse of dimensionality, entropy, spatial information, scaling laws, information geometry, pattern analysis, complex systems, nonlinear dynamics
Notes
Files
Thornhill_2026_Dimensional_Boundary_Loss.pdf
(673.9 kB)
md5:aebe8d72557087cf4b59d35ff6f5d4d8
Additional details
Related works
- Is identical to
- Preprint: 10.2139/ssrn.6149167 (DOI)