Published February 1, 2026 | Version v1
Preprint | Open Access

Sleep-Wake Consolidation for Lifelong Conversational Memory in Local Language Models

Authors/Creators

  • Independent

Description

We introduce a sleep-wake architecture for lifelong conversational memory in local language models running on consumer hardware. During wake, the system extracts facts from conversation and stores them in the context window. During sleep, it consolidates these facts into model weights via LoRA fine-tuning on spaced-repetition-inspired training data. We validate on a 3B parameter model (Llama-3.2-3B-Instruct-4bit) running on an 8GB MacBook Air M3, demonstrating that sleep cycles produce measurable memory formation, that the viable learning rate window is narrow (~1e-4), and that repeated sleep cycles improve recall, a spaced-repetition effect. This establishes the basic feasibility of sleep-wake memory consolidation in local LLMs.
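To make the described loop concrete, here is a minimal Python sketch of the wake/sleep control flow as the abstract outlines it. Every name in it (Fact, extract_facts, build_rehearsal_data, finetune_lora, the copies-per-fact schedule) is hypothetical and assumed for illustration; it is not the paper's implementation, and the fine-tuning step is a stub standing in for an actual LoRA update.

```python
# Hypothetical sketch of the sleep-wake loop from the abstract.
# All names and the rehearsal schedule are illustrative assumptions,
# not the paper's actual code.
from dataclasses import dataclass, field


@dataclass
class Fact:
    text: str
    repetitions: int = 0  # how many sleep cycles have rehearsed this fact


@dataclass
class MemoryStore:
    context_facts: list[Fact] = field(default_factory=list)  # wake-time, in-context


def extract_facts(user_turn: str) -> list[Fact]:
    """Placeholder fact extractor (a real system might prompt the LLM itself)."""
    return [Fact(text=user_turn)]


def build_rehearsal_data(facts: list[Fact]) -> list[tuple[str, str]]:
    """Spaced-repetition-inspired training pairs: each fact becomes a
    prompt/target pair, with newer (less-consolidated) facts repeated more."""
    pairs = []
    for f in facts:
        copies = max(1, 3 - f.repetitions)  # assumed schedule, not the paper's
        pairs.extend([(f"Recall: {f.text}", f.text)] * copies)
    return pairs


def finetune_lora(pairs: list[tuple[str, str]], learning_rate: float = 1e-4) -> None:
    """Stub for the LoRA fine-tuning step; the abstract reports a narrow
    viable window around lr ~ 1e-4 on Llama-3.2-3B-Instruct-4bit."""
    print(f"sleeping: {len(pairs)} rehearsal pairs at lr={learning_rate}")


def sleep(store: MemoryStore) -> None:
    """One sleep cycle: rehearse context facts into the weights."""
    pairs = build_rehearsal_data(store.context_facts)
    finetune_lora(pairs)
    for f in store.context_facts:
        f.repetitions += 1


store = MemoryStore()
store.context_facts += extract_facts("My sister's name is Mira.")  # wake phase
sleep(store)  # first consolidation cycle
sleep(store)  # repeated cycles improve recall (the spaced-repetition effect)
```

The separation into an in-context store for wake and a weight-update step for sleep mirrors the two memory tiers the abstract describes; how facts are evicted from context once consolidated is not specified here.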

Notes

Part of the Sleeping LLM research series on sleep-wake memory consolidation for lifelong learning in language models.

Files

  • 1-Sleep-Wake-Consolidation.pdf (103.3 kB, md5:c2ba9bc18fe3615ca0909436256cafc2)

Additional details

Related works

  • Is continued by: Preprint 10.5281/zenodo.18778762 (DOI)