Published February 28, 2026 | Version v2
Preprint | Restricted

BTMA: Biomorphic Temporal Memory Architecture for Large Language Models

  • University of Genova

Description

Large Language Models (LLMs) currently lack persistent memory across sessions, forcing every interaction to begin from scratch and requiring users to reconstruct context manually.

This re-establishment overhead scales with interaction depth, creating a compounding cost for longitudinal tasks such as research, therapeutic support, and collaborative creative work. 

This paper proposes a Biomorphic Temporal Memory Architecture (BTMA), a multi-layer memory consolidation protocol designed specifically for LLMs, inspired by the hierarchical consolidation mechanisms of biological memory systems.

BTMA introduces systematic temporal anchoring via mandatory session timestamps, and a four-layer artefact hierarchy (event, daily, monthly, annual) that mirrors the neurobiological progression from working memory to long-term consolidation.

Each layer distils the layer below it, producing compressed, contextually rich memory traces without unbounded accumulation.
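The timestamped, four-layer consolidation scheme described above can be sketched as follows. This is an illustrative assumption of how such a hierarchy might be structured, not the paper's implementation; the class and method names, and the placeholder summarise() step (which a real system would delegate to an LLM), are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Four-layer artefact hierarchy, from working memory to long-term
# consolidation: event -> daily -> monthly -> annual.
LAYERS = ["event", "daily", "monthly", "annual"]


def summarise(texts):
    """Placeholder distillation step; a real system would prompt an LLM."""
    return " | ".join(texts)


@dataclass
class MemoryStore:
    layers: dict = field(default_factory=lambda: {name: [] for name in LAYERS})

    def record_event(self, text: str, timestamp: datetime):
        # A mandatory session timestamp provides the temporal anchor.
        self.layers["event"].append(f"[{timestamp.isoformat()}] {text}")

    def consolidate(self, lower: str, upper: str):
        # Each layer distils the layer below into one compressed trace,
        # then clears it, so no layer accumulates without bound.
        if self.layers[lower]:
            self.layers[upper].append(summarise(self.layers[lower]))
            self.layers[lower].clear()
```

In this sketch, running consolidate("event", "daily") at the end of a session, consolidate("daily", "monthly") at month end, and so on, reproduces the promotion cadence of the hierarchy while keeping each lower layer empty after consolidation.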

The architecture is platform-agnostic, requires no modification to model weights, and operates entirely through structured prompting and user-controlled local storage (Obsidian-compatible Markdown).
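Because storage is plain Obsidian-compatible Markdown, a consolidated artefact can be rendered as a note with YAML front matter. The sketch below shows one plausible shape for a daily-layer artefact; the field names and heading format are assumptions for illustration, not a format specified by the paper.

```python
from datetime import datetime


def render_daily_note(date: datetime, traces: list[str]) -> str:
    """Render a daily memory artefact as Obsidian-compatible Markdown.

    YAML front matter keys (date, layer) are illustrative assumptions.
    """
    lines = [
        "---",
        f"date: {date.strftime('%Y-%m-%d')}",
        "layer: daily",
        "---",
        f"# Daily memory: {date.strftime('%Y-%m-%d')}",
        "",
    ]
    # One bullet per consolidated memory trace.
    lines += [f"- {trace}" for trace in traces]
    return "\n".join(lines)
```

Writing such a file into the user's local vault keeps the memory store user-controlled and inspectable, with no changes to model weights.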

BTMA provides mnesic-sensorial continuity, a form of functional memory that benefits any AI system regardless of its level of cognitive emergence, and is composable with any alignment or behavioural framework without modification. We present the full implementation protocol and invite independent validation.

Files

Restricted

The record is publicly accessible, but files are restricted.

Additional details

Dates

Created: 2026-02-28 (creation)
Submitted: 2026-03-09 (uploaded on Zenodo)
Updated: 2026-03-18 (uploaded on Zenodo)