Published February 12, 2026 | Version 5.2
Preprint · Open Access

DeepDrift: Zero-Training Hidden-State Monitoring for Robustness in Vision, Language, and Generative Models

Authors/Creators

  • Independent Researcher

Description

We introduce DeepDrift, a unified internal monitoring framework for deep neural networks based on Semantic Velocity — the ℓ2‑norm of consecutive hidden‑state differences.
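To make the definition concrete, the signal can be sketched in a few lines: given a sequence of hidden-state vectors (one per token or step), Semantic Velocity is the ℓ2 norm of each consecutive difference. The function name below is illustrative, not the `deepdrift` package's actual API.

```python
import math

def semantic_velocity(hidden_states):
    """Return the l2 norm of each consecutive hidden-state difference.

    hidden_states: sequence of equal-length vectors (lists of floats),
    e.g. one hidden state per generated token. Name is illustrative.
    """
    velocities = []
    for prev, curr in zip(hidden_states, hidden_states[1:]):
        diff = [c - p for c, p in zip(curr, prev)]
        velocities.append(math.sqrt(sum(d * d for d in diff)))
    return velocities

# Example: three 2-D hidden states; a large jump followed by no movement.
states = [[0.0, 0.0], [3.0, 4.0], [3.0, 4.0]]
print(semantic_velocity(states))  # -> [5.0, 0.0]
```

A spike or sustained rise in this per-step signal is what the monitoring framework treats as evidence of drift.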

Key results:
• LLM hallucination detection: AUC 0.891, lead time 7.2 tokens
• RL failure prediction: AUC 0.985 (DQN), AUC 1.000 (PPO+noise), lead time 168 steps
• ViT semantic OOD (CIFAR‑100 → SVHN): AUROC 0.817 [0.788, 0.846]
• Diffusion memorization: flagged 3× earlier than validation-loss divergence
• External benchmark on CLIP, DINOv2, ConvNeXt confirms generalization

The method requires zero gradient‑based training, operates as a plug‑in PyTorch monitor with <1.5% overhead, and is available as open‑source software (pip install deepdrift). All experiments are reproducible with provided scripts.
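A plug-in monitor of this kind is typically attached via PyTorch forward hooks, which requires no gradients and no retraining. The sketch below shows that general pattern; the class and its interface are assumptions for illustration, not DeepDrift's actual API.

```python
import torch
import torch.nn as nn

class VelocityMonitor:
    """Attach a forward hook to a layer and record the l2 norm of
    consecutive hidden-state differences (illustrative sketch, not
    the deepdrift package API)."""

    def __init__(self, module: nn.Module):
        self._prev = None
        self.velocities = []
        self._handle = module.register_forward_hook(self._hook)

    def _hook(self, module, inputs, output):
        # Detach so monitoring never touches the computation graph.
        flat = output.detach().flatten()
        if self._prev is not None:
            self.velocities.append(torch.norm(flat - self._prev).item())
        self._prev = flat

    def remove(self):
        self._handle.remove()

# Usage: attach to any layer, then run the model as usual.
layer = nn.Linear(4, 4)
monitor = VelocityMonitor(layer)
for _ in range(3):
    layer(torch.randn(1, 4))
print(monitor.velocities)  # two non-negative velocity readings
monitor.remove()
```

Because the hook only reads activations that the forward pass already produced, the added cost is a detach, a subtraction, and a norm per monitored layer, which is consistent with the low-overhead claim above.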

GitHub: https://github.com/Eutonics/DeepDrift
PyPI: pip install deepdrift

Raw experiment metrics are included as JSON files:
- vit_ood_metrics.json: AUROC, velocity profiles, bootstrap CIs for ViT-B/16 (CIFAR-100 → SVHN)
- rl_cartpole_metrics.json: AUC, Cohen's d, lead time, phase portrait data for CartPole (PPO, DQN, PPO+noise)

Files (3.4 MB total)

• DeepDrift Zero-Training Hidden-State Monitoring.pdf (3.4 MB, md5: e62337df9196d8a78c38e551412cbd7b)
• metrics JSON (2.2 kB, md5: 716f8faea46d07fca65daa0f143dbf2f)
• metrics JSON (1.9 kB, md5: cfbaa4a1117a3ad9461bbd31c37b75f2)

Additional details

Dates

Available
2026-02-12

Software

Repository URL
https://github.com/Eutonics/DeepDrift
Programming language
Python