Published April 19, 2026 | Version 5.0
Report | Open Access

The Causal Poset Transformer: A Reflection-Constraint Architecture for Sequence Modeling

Authors/Creators

  • Pleroma Philosophical Research Society

Description

We introduce the Causal Poset Transformer (CPT), a neural architecture derived from the Reflection-Constraint framework of Phenomenal Generative Mathematics (PGM). CPT replaces dense attention with Causal Graph Attention (CGA), employs complex-valued embeddings with phase encoding, and maintains a variational self-token that enforces coherence via a free-energy objective. The original CPT (v1) and an enhanced version (v2) failed to learn a long-range copy task, plateauing at 3.7% and 25.7% accuracy, respectively, while a parameter-matched Transformer achieved 86.9%. To address this, we developed CPT-H, a hybrid architecture with a differentiable external memory (Kanerva-style) and learned three-way gating. Surprisingly, CPT-H achieved 100% accuracy on the copy task even when its external memory was ablated, revealing that a properly stabilized self-token can perform lossless sequential compression up to at least 256 tokens. We then introduced a Random Access Memory (RAM) task, requiring arbitrary key-value retrieval, to test true associative recall. All models tested, including CPT-H with and without memory, a standard Transformer, and a new CPT-DND variant using Tensor Product Representations with hard writes, failed to exceed random guessing on 64-pair RAM. However, CPT-DND successfully learned small-scale RAM (15.2% accuracy on 10 pairs versus 3.1% random baseline), demonstrating that discrete binding works in principle but encounters a scaling bottleneck near 20 pairs. These results delineate a fundamental boundary: current architectures excel at sequential compression but lack the inductive bias for sample-efficient, content-addressable binding. We provide complete theoretical derivations, implementation details, and a rigorous empirical map, and identify the missing primitive needed for the next generation of associative memory models.
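
Below is a minimal, self-contained sketch of the tensor-product binding mechanism the abstract attributes to CPT-DND: key-value pairs are bound by hard writes of outer products into a single matrix, and retrieval unbinds with the query key. All names, dimensions, and the random-key scheme are illustrative assumptions, not the whitepaper's implementation.

```python
import numpy as np

# Sketch of Tensor Product Representation (TPR) binding with hard writes.
# Assumed setup: d-dimensional random unit keys and random values; these
# are illustrative choices, not taken from the whitepaper.

rng = np.random.default_rng(0)
d = 64          # embedding dimension (assumed)
n_pairs = 10    # the small-scale RAM regime where CPT-DND succeeds

# Near-orthogonal random unit keys make unbinding approximately clean.
keys = rng.standard_normal((n_pairs, d))
keys /= np.linalg.norm(keys, axis=1, keepdims=True)
values = rng.standard_normal((n_pairs, d))

# Hard write: superimpose the outer product key_i (x) value_i of every
# pair into one d x d memory matrix.
M = np.zeros((d, d))
for k, v in zip(keys, values):
    M += np.outer(k, v)

# Retrieval: unbind with the query key. The result is the target value
# plus cross-talk from every other pair; that noise grows with n_pairs.
query = keys[3]
retrieved = query @ M  # = values[3] + sum_{j != 3} (keys[j] @ query) * values[j]

cos = retrieved @ values[3] / (
    np.linalg.norm(retrieved) * np.linalg.norm(values[3])
)
print(f"cosine similarity to target value: {cos:.3f}")
```

Raising n_pairs toward 64 in this sketch degrades the retrieval cosine as the superimposed cross-terms accumulate, giving a concrete intuition for why hard-write binding can work at 10 pairs yet fail at 64, consistent with the scaling bottleneck the abstract reports near 20 pairs.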

Files

CPT Whitepaper v5.pdf (585.4 kB)
md5:9ea0eb8c234455c642830d3f3269cd3c

Additional details

Related works

Is derived from
Preprint: 10.5281/zenodo.19410407 (DOI)