Published May 1, 2026 | Version v1
Preprint · Open Access

MemoryOS: A Neurologically-Inspired Episodic Memory Architecture for Large Language Models

Authors/Creators

  • Independent Researcher

Description

Large language models lack persistent memory across conversations. Retrieval-Augmented Generation (RAG) partially addresses this but treats all memories as equally weighted, applies no decay over time, and retrieves through flat similarity search rather than associative reasoning. We present MemoryOS, a neurologically-inspired episodic memory architecture that addresses these limitations through four components: (1) an importance-weighted episodic encoder using a neural network trained to score memories by surprise and emotional weight, (2) a persistent memory store with Ebbinghaus forgetting curve decay, (3) an associative memory graph where edges encode semantic, keyword, and temporal relationships between memories, and (4) a graph-walk retrieval engine that traverses associations rather than performing flat vector search. We evaluate MemoryOS against a standard RAG baseline across three simulated user profiles and 15 query types. MemoryOS achieves MRR of 0.7111 versus 0.6667 for RAG, a relative improvement of +6.7%, and wins 5 of 15 individual queries compared to 2 wins for RAG. Live demo: https://memory-os-tau.vercel.app
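The decay and retrieval mechanics described above can be sketched in a few lines. The Ebbinghaus forgetting curve gives retention R = exp(-t/S), where t is elapsed time and S is a per-memory stability; the sketch below combines that with an importance score and a hop-limited walk over association edges. All names here (MemoryGraph, the stability parameter, the hop/top-k settings) are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import math
from collections import defaultdict

def retention(elapsed_hours: float, strength: float) -> float:
    """Ebbinghaus forgetting curve: R = exp(-t/S), with S the memory's stability."""
    return math.exp(-elapsed_hours / strength)

class MemoryGraph:
    """Minimal associative memory store: nodes are memories, edges are associations.
    Hypothetical sketch -- field names and parameters are illustrative."""

    def __init__(self):
        self.importance = {}           # memory id -> importance score in [0, 1]
        self.age_hours = {}            # memory id -> hours since encoding
        self.strength = {}             # memory id -> decay stability S
        self.edges = defaultdict(set)  # undirected association edges

    def add(self, mid, importance, age_hours, strength):
        self.importance[mid] = importance
        self.age_hours[mid] = age_hours
        self.strength[mid] = strength

    def link(self, a, b):
        self.edges[a].add(b)
        self.edges[b].add(a)

    def weight(self, mid):
        """Retrieval weight = stored importance x Ebbinghaus retention."""
        return self.importance[mid] * retention(self.age_hours[mid], self.strength[mid])

    def graph_walk(self, seeds, hops=2, top_k=3):
        """Breadth-first walk from seed memories over association edges,
        ranking every visited memory by its decayed importance."""
        frontier, visited = set(seeds), set(seeds)
        for _ in range(hops):
            frontier = {n for m in frontier for n in self.edges[m]} - visited
            visited |= frontier
        return sorted(visited, key=self.weight, reverse=True)[:top_k]
```

In this sketch, flat similarity search would only supply the seed set; the walk then surfaces memories that are associated with the seeds even when they are not themselves similar to the query, which is the behavior the abstract attributes to the graph-walk retrieval engine.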

Files

MemoryOS.pdf (173.1 kB)
md5:b90f4363e93395568b64996ec84436d6