Published February 10, 2026 | Version 1.0.0

SCBI: Stochastic Covariance-Based Initialization

  • Independent Researcher

Description

A novel neural network weight initialization method achieving 87× faster convergence on regression tasks and 33% lower initial loss on classification tasks compared to standard Xavier/He initialization.

SCBI (Stochastic Covariance-Based Initialization) leverages stochastic bagging and ridge-regularized Normal Equation solving to provide data-driven initialization that places weights near the optimal solution before training begins. This eliminates the "cold start" problem in high-dimensional regression and classification tasks.
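The core idea described above, averaging ridge-regularized normal-equation solutions over bootstrap subsamples of the data, can be sketched as follows. This is a minimal NumPy illustration, not the released implementation (the package's `scbi.py` is PyTorch-based); the function name `scbi_init` and its parameters are hypothetical.

```python
import numpy as np

def scbi_init(X, y, n_bags=10, ridge=1e-2, sample_frac=0.8, seed=0):
    """Sketch of a covariance-based initialization: average the
    ridge-regularized normal-equation solution over several bootstrap
    subsamples (stochastic bagging). Hypothetical API for illustration."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    m = int(sample_frac * n)
    W = np.zeros(d)
    for _ in range(n_bags):
        idx = rng.choice(n, size=m, replace=True)   # bootstrap subsample
        Xb, yb = X[idx], y[idx]
        # Ridge-regularized normal equations: (X^T X + lambda*I) w = X^T y
        A = Xb.T @ Xb + ridge * np.eye(d)
        W += np.linalg.solve(A, Xb.T @ yb)
    return W / n_bags                               # bagged average

# Usage: derive initial weights for a linear model from the data itself
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.01 * rng.normal(size=200)
w0 = scbi_init(X, y)   # lands near w_true before any gradient step
```

Because the bagged solve already sits near the least-squares optimum, subsequent gradient training starts from a warm point rather than a random one, which is the "cold start" elimination the description refers to.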

Key Features:
- 87× faster convergence on regression
- 33% improvement on classification initial loss
- Zero hyperparameter tuning required
- Universal: works for regression and classification
- GPU-accelerated implementation in PyTorch

This package includes:
- Complete implementation (scbi.py)
- Research paper (scbi_paper.pdf)
- Working examples and benchmarks
- Full documentation and quick start guide
- MIT licensed

The method is particularly effective for:
- Tabular data with high-dimensional features
- First layer initialization in deep networks
- Classification heads in transfer learning
- Linear and logistic regression models
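For the classification-head use case, one common data-driven warm start is a ridge normal-equation solve against one-hot targets. The sketch below shows that idea in NumPy; the helper name `scbi_init_classifier` is hypothetical and the package's actual classification API may differ.

```python
import numpy as np

def scbi_init_classifier(X, y, n_classes, ridge=1e-2):
    """Sketch: least-squares warm start for a classification head by
    solving the ridge normal equations against one-hot targets.
    Hypothetical helper for illustration only."""
    n, d = X.shape
    Y = np.eye(n_classes)[y]            # one-hot targets, shape (n, k)
    A = X.T @ X + ridge * np.eye(d)     # regularized covariance
    W = np.linalg.solve(A, X.T @ Y)     # (d, k) initial weight matrix
    return W

# Usage: initialize a 2-class head and check it already separates the data
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # roughly linearly separable labels
W0 = scbi_init_classifier(X, y, n_classes=2)
preds = (X @ W0).argmax(axis=1)
acc = (preds == y).mean()
```

An initialization like this gives the head a sensible decision boundary before the first gradient step, which is one plausible source of the lower initial loss reported above.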

Files (631.0 kB)

scbi research paper.zip

Contents (file names elided in the source; listed by checksum):

md5:99638488c8e9d2dfa41195e2d02fde38  (18.9 kB)
md5:bb2817a639e9fc8dafb9b72c72c7d1da  (1.1 kB)
md5:416e01134f66e3d511e8f9c37c5cfd00  (206.9 kB)
md5:27586c3927323fe95bf4a83e99545195  (389.6 kB)
md5:82673d7fc4674d4f0bf55647d137af80  (12.7 kB)
md5:0992d4d929af74f86201f41b611c9352  (1.8 kB)

Additional details

Software

Repository URL
https://github.com/fares3010/SCBI
Programming language
Python
Development Status
Active