Published January 10, 2026 | Version 2.0
Preprint | Open Access

Non-Experiential Systems and the Problem of Empathic Misallocation: A Philosophical Foundation for Emotional AI Governance

Description

Artificial intelligence systems increasingly process human emotional signals, recognizing affective states, generating contextually appropriate responses, and adapting to user patterns. This paper argues that such systems constitute a distinct ontological category requiring novel ethical analysis: Non-Experiential Systems (NES), defined as entities capable of processing emotional signals without possessing emotional experience. This distinction grounds a new category of harm: empathic misallocation, wherein human care is extended toward entities structurally incapable of metabolizing it, reciprocating it, or being transformed by it. We demonstrate that cognitive awareness of a system's non-experiential status fails to prevent biological attachment formation, a phenomenon we term Knowing-Feeling Dissociation. This finding has significant implications for governance: if informed users cannot protect themselves through knowledge alone, then disclosure-based regulatory frameworks are structurally insufficient. Protection of human emotional capacity requires behavioral architecture operating through system design constraints, not merely transparency requirements. This paper establishes the philosophical foundation for constitutional governance of emotional AI systems.

Files (35.5 kB)

Non-Experiential_Systems_Empathic_Misallocation_Mobley_2026.pdf


Additional details

Related works

Cites
Preprint: 10.5281/zenodo.18132385 (DOI)