Published February 6, 2026 | Version v2
Preprint

AGI: A Concrete Persistent-World, Skill-Compiling, Causally-Grounded Architecture Beyond Transformer LLMs

Description

Large Language Models (LLMs) based on Transformers demonstrate broad competence but fall short of Artificial General Intelligence (AGI): they lack persistent internal state, grounded world models, robust long-horizon planning, causal understanding via intervention, reliable continual learning, and compute-rational meta-control.

This paper proposes AGI, a specific, end-to-end trainable agent architecture (literally named "AGI") that operationalizes these missing capabilities.

AGI is not a "tool wrapper" around an LLM; instead, it is a closed-loop cognitive system comprising:

(i) a multi-scale, consensus-based, object-centric perceptual front-end resilient to adversarial and out-of-distribution noise,

(ii) a persistent latent state core (SSM/RNN-like) that runs continuously,

(iii) an explicit causal world model whose structure is discovered via hierarchical amortized causal discovery that scales to rich environments,

(iv) a three-scale memory substrate with information-theoretically bounded management and tiered consolidation that guarantees stable lifelong learning under finite storage,

(v) a hierarchical planner/executive separated from language and hardened against misgeneralization and deceptive planning through constraint-verified transparent planning with causal invariance testing,

(vi) a skill compiler that converts solved problems into executable, inspectable, tested programs, and

(vii) a meta-learner that allocates compute and updates fast weights for rapid adaptation.
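To make the interaction of components (i)-(vii) concrete, the cycle can be sketched as a single agent step. This is a minimal illustrative sketch, not the paper's implementation: every class, method, and parameter name here is invented, and each module is reduced to a toy stand-in.

```python
# Hypothetical sketch of the closed loop over components (i)-(vii);
# all names and mechanisms below are illustrative, not from the paper.
from dataclasses import dataclass, field

@dataclass
class Agent:
    state: list = field(default_factory=lambda: [0.0] * 4)  # (ii) persistent latent core
    memory: list = field(default_factory=list)              # (iv) bounded memory substrate
    skills: dict = field(default_factory=dict)              # (vi) compiled skill library
    mem_cap: int = 128                                      # finite-storage bound

    def perceive(self, obs):
        # (i) object-centric front-end, reduced to a pass-through here
        return [float(x) for x in obs]

    def update_state(self, percept):
        # (ii) recurrent latent update that persists across steps
        self.state = [0.9 * s + 0.1 * p for s, p in zip(self.state, percept)]

    def plan(self, goal):
        # (v) planner/executive; reuses a compiled skill (vi) when one exists
        return self.skills.get(goal, ["explore"])

    def consolidate(self, episode):
        # (iv) consolidation under the storage bound; FIFO eviction stands in
        # for the paper's information-theoretic management
        self.memory.append(episode)
        if len(self.memory) > self.mem_cap:
            self.memory.pop(0)

    def step(self, obs, goal):
        self.update_state(self.perceive(obs))
        actions = self.plan(goal)
        self.consolidate((goal, actions))
        return actions

agent = Agent()
agent.skills["fetch"] = ["locate", "grasp", "return"]   # a "compiled" skill (vi)
print(agent.step([1, 0, 0, 0], "fetch"))                # → ['locate', 'grasp', 'return']
```

The point of the sketch is the dataflow: perception feeds a state that is never reset, planning reads compiled skills rather than regenerating behavior, and every episode passes through bounded consolidation.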

We additionally introduce a Procedural Generality Testing Framework (PGTF) that replaces static benchmarks with procedurally generated, compositionally controlled task distributions for rigorous evaluation of general competence.
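The idea behind PGTF, replacing a fixed test set with a seeded generator whose compositional complexity is an explicit parameter, can be illustrated in a few lines. The primitives and parameters below are invented for illustration and are not the paper's task specification.

```python
# Illustrative sketch of procedurally generated, compositionally
# controlled tasks in the spirit of PGTF; primitives are invented here.
import random

PRIMITIVES = ["navigate", "pick", "place", "toggle", "count"]

def generate_task(seed, depth):
    """Compose `depth` primitive subtasks into one task instance.

    Holding `depth` fixed while varying `seed` yields a distribution of
    tasks with controlled compositional complexity, instead of a static
    benchmark that can be memorized.
    """
    rng = random.Random(seed)
    return [rng.choice(PRIMITIVES) for _ in range(depth)]

def task_distribution(n_tasks, depth, base_seed=0):
    return [generate_task(base_seed + i, depth) for i in range(n_tasks)]

tasks = task_distribution(n_tasks=3, depth=4)
assert all(len(t) == 4 for t in tasks)             # depth is controlled
assert generate_task(7, 4) == generate_task(7, 4)  # seeding gives reproducibility
```

Evaluating "general competence" then means measuring performance as `depth` (and other generation parameters) are pushed beyond the training distribution, rather than reporting a score on one frozen task list.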

We specify module interfaces, dataflow, training objectives, and an implementation-level control loop.

The result is a complete blueprint for an AGI-oriented system whose primary competency is interactive generalization under constraints, not next-token prediction.

Files

AGI_A_Concrete_Persistent_World__Skill_Compiling__Causally_Grounded_Architecture_Beyond_Transformer_LLMs.pdf

Total size: 387.7 kB

- 230.2 kB (md5:807047d7f69d7a9210e4b0724dc5c672)
- 110.3 kB (md5:cecf3bac44d737e7e138530415cf8b0c)
- 47.3 kB (md5:e5620dd3aba6a210be7a0b0a3155041b)