Confidence-Weighted Plasticity
Authors/Creators
Description
In modular learning systems, credit assignment across components with different maturities remains challenging. Staged training imposes arbitrary phase boundaries; end-to-end training risks co-adaptation pathologies. We propose \textit{reliability-weighted plasticity}, a mechanism in which each component's willingness to change is governed by its demonstrated predictive accuracy. Building on JEPA (Joint Embedding Predictive Architecture) principles—where each module maintains a predictor whose residual provides a native surprise signal—we derive confidence from normalised prediction error rather than sample exposure. Routing weights distribute gradients toward less reliable components, while a global plasticity term provides developmental slowdown as the system matures. The mechanism requires no external schedule: developmental phases emerge from differential reliability, and plasticity reopens automatically under distribution shift when predictions fail. We present the mathematical formulation and implementation considerations, and discuss the architectural requirement that each component maintain a local prediction task.
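The core loop described above—confidence from normalised prediction error, and routing weights that steer gradient magnitude toward less reliable components—can be sketched in a few lines. This is an illustrative sketch only, not the Bernard implementation: the function names, the EMA smoothing, and the error-clipping scheme are assumptions introduced here.

```python
import numpy as np

def update_reliability(reliability, pred_error, error_scale, ema=0.99):
    """EMA of normalised prediction error -> reliability in [0, 1].

    Hypothetical sketch: the abstract derives confidence from normalised
    prediction error; the exact normalisation used in the paper may differ.
    """
    norm_err = pred_error / (error_scale + 1e-8)   # native surprise signal
    surprise = min(norm_err, 1.0)                  # clip to [0, 1]
    return ema * reliability + (1.0 - ema) * (1.0 - surprise)

def plasticity_weights(reliabilities, global_plasticity=1.0):
    """Routing weights that send gradient mass to less reliable components.

    The global plasticity term scales all weights, modelling developmental
    slowdown as the system matures (assumed multiplicative here).
    """
    r = np.asarray(reliabilities, dtype=float)
    raw = 1.0 - r                                  # less reliable -> more plastic
    w = raw / (raw.sum() + 1e-8)                   # normalise across components
    return global_plasticity * w
```

Under this sketch, a component whose predictions fail (large normalised error) sees its reliability decay, so its routing weight—and hence its plasticity—rises again without any external schedule, matching the reopening-under-distribution-shift behaviour the abstract describes.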
Files
| Name | Size |
|---|---|
| Confidence_Weighted_Plasticity.pdf (md5:0175ff574bc294c581e92bdda59b6f3f) | 159.1 kB |
Additional details
Related works
- Is supplemented by
- Software: https://github.com/EridosAI/Bernard
Dates
- Issued: 2026-02-05
Software
- Repository URL
- https://github.com/EridosAI/Bernard
- Programming language
- Python
References
- LeCun, Y. (2022). A Path Towards Autonomous Machine Intelligence. OpenReview.
- Assran, M. et al. (2023). Self-Supervised Learning from Images with a Joint-Embedding Predictive Architecture. CVPR.
- Friston, K. (2010). The free-energy principle: a unified brain theory? Nature Reviews Neuroscience.
- Kirkpatrick, J. et al. (2017). Overcoming catastrophic forgetting in neural networks. PNAS.