Natural Language Modeling (NLM): Emergent Intelligence Through Physical Signal Processing Without Large Language Models
Description
The dominant paradigm of machine intelligence — large language model (LLM)-based AI — presupposes that adaptive, context-aware behavior requires transformer architectures, large-scale training corpora, and centralized compute infrastructure. We challenge this presupposition at the definitional level. Intelligence, we argue, does not require language in the narrow sense of tokenized human speech. Language, properly defined, is any physical, chemical, or environmental signal that carries information capable of triggering a meaningful change in the state or behavior of a receiving system — a definition encompassing 3.8 billion years of biological signal processing that predates human speech by geological epochs.

This paper presents Natural Language Modeling (NLM): a formal framework for building adaptive systems through local physical signal processing without trained models, cloud infrastructure, or linguistic input. NLM systems are defined by five verifiable principles: Local Signal Primacy, Rule Minimalism, Physical Signal Vocabulary, Distributed Authority, and Resource Minimalism — the last of which is simultaneously a technical specification and a justice constraint requiring deployability on commodity hardware available at consumer price points in any global market.

We present two independent, working proof-of-concept implementations. SlimeHive is a multi-agent swarm robotics system modeled on the adaptive network-formation behavior of Physarum polycephalum, governed by three finite-state-machine rules operating on simulated pheromone gradients and running on Raspberry Pi Pico 2W microcontrollers (USD $13). Ceiba is a modular, self-powered infrastructure node governed by four physical threshold triggers — solar tracking, survival mode, mesh formation, and water flow regulation — running on Raspberry Pi Zero 2W hardware (USD $15) and producing electricity, potable water, LED illumination, and LoRa mesh network connectivity without grid connection or internet dependency.
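To make the "three finite-state-machine rules on a pheromone gradient" claim concrete, here is a minimal sketch of what such an agent loop could look like. The state names and threshold constants below are hypothetical illustrations, not the rule set published in the paper:

```python
from enum import Enum, auto

class State(Enum):
    EXPLORE = auto()    # weak gradient: random-walk to search for trails
    FOLLOW = auto()     # moderate gradient: climb toward stronger signal
    REINFORCE = auto()  # strong gradient: deposit pheromone on the trail

# Hypothetical rule constants; the actual values are implementation details
# of SlimeHive and are not given in this description.
FOLLOW_THRESHOLD = 0.2
REINFORCE_THRESHOLD = 0.7

def step(state: State, local_gradient: float) -> State:
    """One transition of a three-rule FSM driven only by the locally
    sensed pheromone gradient — no global model, no training."""
    if local_gradient < FOLLOW_THRESHOLD:
        return State.EXPLORE
    if local_gradient < REINFORCE_THRESHOLD:
        return State.FOLLOW
    return State.REINFORCE
```

Note that the transition function depends only on the agent's current local reading, which is what "Local Signal Primacy" and "Rule Minimalism" would require of any compliant implementation.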
Both systems satisfy all five NLM principles. Neither contains a neural network. Neither was trained on data. Both exhibit adaptive, context-responsive behavior. These implementations demonstrate that the Sense-Interpret-Act computational loop, governed by physical signal vocabularies and minimal rule sets, is sufficient to produce adaptive collective behavior across distinct problem domains — from kinetic swarm navigation to community infrastructure management.
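The Sense-Interpret-Act loop described above can be sketched as a few lines of threshold logic. The signal names, threshold values, and action strings below are assumptions chosen to mirror Ceiba's four triggers; real hardware would replace the `act` stub with GPIO and actuator calls:

```python
# Hypothetical trigger thresholds (not Ceiba's published constants).
LOW_BATTERY_V = 3.4   # assumed survival-mode cutoff, volts
TANK_FULL_CM = 90     # assumed water-level cutoff, centimeters
LUX_DELTA_MIN = 100   # assumed light imbalance that triggers panel rotation

def sense(readings: dict) -> dict:
    # The "language" is the raw physical signal vocabulary itself:
    # voltages, water levels, light differentials.
    return readings

def interpret(signals: dict) -> list:
    # Minimal rule set: each physical threshold maps to one action.
    actions = []
    if signals["battery_v"] < LOW_BATTERY_V:
        actions.append("enter_survival_mode")  # shed non-essential loads
    if signals["water_cm"] >= TANK_FULL_CM:
        actions.append("close_inlet_valve")    # regulate water flow
    if signals["lux_delta"] > LUX_DELTA_MIN:
        actions.append("rotate_panel")         # track the brighter sky sector
    return actions

def act(actions: list) -> list:
    return actions  # stand-in for actuator/GPIO calls on real hardware

def sense_interpret_act(readings: dict) -> list:
    """One pass of the loop: no model, no network, only local thresholds."""
    return act(interpret(sense(readings)))
```

The point of the sketch is that adaptivity here comes from the coupling between thresholds and environment, not from any learned parameters — consistent with the paper's claim that neither system contains a neural network or was trained on data.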
Files

| Name | Size |
|---|---|
| NLM_Paper_Moorhead_2026.pdf (md5:073e7319e6dc787fa0f189086fcdd1f6) | 48.7 kB |