chaobrain/braintools: Version 0.1.7
Published January 5, 2026 | Version v0.1.7

Description
Major Features
New Training Framework (braintools.trainer)
- PyTorch Lightning-like training API for JAX-based neural network training with comprehensive features:
  - `LightningModule`: Base class for defining training models with `training_step()`, `validation_step()`, and `configure_optimizers()` hooks
  - `Trainer`: Orchestration class for managing training loops, epochs, and device placement
  - `TrainOutput`/`EvalOutput`: Structured output types for training and evaluation results
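As a rough illustration of the hook-based pattern this API follows, here is a minimal self-contained sketch in plain Python. The class and trainer names are invented for the example; only the hook names (`training_step`, `configure_optimizers`) come from the release notes, and this is not the braintools implementation.

```python
# Sketch of the Lightning-style workflow: the user subclasses a module,
# fills in the hooks, and a trainer object drives the loop.

class SketchModule:
    """Stand-in for a LightningModule-style base class (hypothetical)."""
    def training_step(self, batch):
        raise NotImplementedError
    def configure_optimizers(self):
        raise NotImplementedError

class LinearFit(SketchModule):
    """Fit y = w * x by plain gradient descent on squared error."""
    def __init__(self):
        self.w = 0.0
    def training_step(self, batch):
        x, y = batch
        loss = (self.w * x - y) ** 2
        grad = 2 * (self.w * x - y) * x   # d(loss)/dw
        return loss, grad
    def configure_optimizers(self):
        return 0.1  # learning rate for this toy "optimizer"

class SketchTrainer:
    """Drives epochs over batches and applies the optimizer step."""
    def fit(self, module, data, epochs=50):
        lr = module.configure_optimizers()
        for _ in range(epochs):
            for batch in data:
                loss, grad = module.training_step(batch)
                module.w -= lr * grad
        return loss

data = [(1.0, 2.0), (2.0, 4.0)]   # samples from y = 2x
model = LinearFit()
SketchTrainer().fit(model, data)
print(round(model.w, 2))  # converges toward 2.0
```

The point of the pattern is the separation of concerns: the module owns the model-specific logic (loss, optimizer choice), while the trainer owns the generic loop machinery (epochs, batching, device placement).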
Callbacks System
- 10+ built-in callbacks for customizing training behavior:
  - `ModelCheckpoint`: Automatic model saving based on monitored metrics
  - `EarlyStopping`: Stop training when metrics plateau
  - `LearningRateMonitor`: Track and log learning rate changes
  - `GradientClipCallback`: Gradient clipping for training stability
  - `Timer`: Track training time
  - `RichProgressBar`/`TQDMProgressBar`: Visual progress indicators
  - `LambdaCallback`/`PrintCallback`: Custom callback utilities
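To show the kind of decision an `EarlyStopping`-style callback makes, here is a self-contained sketch of the patience/min-delta logic. The class name and fields are hypothetical; this is the common pattern, not the braintools source.

```python
# Sketch: stop training after `patience` epochs without improvement
# of at least `min_delta` in the monitored metric.

class EarlyStoppingSketch:
    def __init__(self, patience=3, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.wait = 0
        self.should_stop = False

    def on_epoch_end(self, val_loss):
        if val_loss < self.best - self.min_delta:
            self.best = val_loss       # metric improved: reset the counter
            self.wait = 0
        else:
            self.wait += 1             # no improvement this epoch
            if self.wait >= self.patience:
                self.should_stop = True

stopper = EarlyStoppingSketch(patience=2)
losses = [1.0, 0.8, 0.81, 0.82, 0.5]   # plateaus after epoch 1
stopped_at = None
for epoch, loss in enumerate(losses):
    stopper.on_epoch_end(loss)
    if stopper.should_stop:
        stopped_at = epoch
        break
print(stopped_at)  # 3: stopped after two epochs without improvement
```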
Logging Backends
- 6 pluggable logging backends:
  - `TensorBoardLogger`: TensorBoard integration
  - `WandBLogger`: Weights & Biases integration
  - `CSVLogger`: Simple CSV file logging
  - `NeptuneLogger`: Neptune.ai integration
  - `MLFlowLogger`: MLFlow integration
  - `CompositeLogger`: Combine multiple loggers
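The composite pattern that the `CompositeLogger` name suggests can be sketched in a few lines: one logger fans each metric record out to several backends. The class names below are invented for the example and do not reflect the braintools API surface.

```python
# Sketch of the composite-logger pattern: forward every call to all backends.

class ListLogger:
    """Toy backend that just records metrics in memory."""
    def __init__(self):
        self.records = []
    def log_metrics(self, metrics, step):
        self.records.append((step, dict(metrics)))

class CompositeLoggerSketch:
    """Forwards every log call to all wrapped backends."""
    def __init__(self, *loggers):
        self.loggers = loggers
    def log_metrics(self, metrics, step):
        for logger in self.loggers:
            logger.log_metrics(metrics, step)

a, b = ListLogger(), ListLogger()
combined = CompositeLoggerSketch(a, b)
combined.log_metrics({"loss": 0.42}, step=1)
print(a.records == b.records)  # True: both backends saw the same record
```

The design benefit is that training code talks to a single logger object while the set of real backends (TensorBoard, W&B, CSV, ...) stays a configuration detail.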
Data Loading Utilities
- JAX-compatible data loading with distributed support:
  - `DataLoader`/`DistributedDataLoader`: Efficient batch loading
  - `Dataset`, `ArrayDataset`, `DictDataset`, `IterableDataset`: Dataset abstractions
  - `Sampler`, `RandomSampler`, `SequentialSampler`, `BatchSampler`, `DistributedSampler`: Sampling strategies
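The division of labor between samplers and loaders can be sketched as follows: a sampler yields indices, and the loader groups them into batches. This is an illustrative reduction of the pattern, not the braintools classes themselves.

```python
# Sketch: sampler produces an index order, loader groups indices into batches.
import random

def sequential_sampler(n):
    """Indices in order, like a SequentialSampler."""
    return range(n)

def random_sampler(n, seed=0):
    """Shuffled indices, like a RandomSampler with a fixed seed."""
    indices = list(range(n))
    random.Random(seed).shuffle(indices)
    return indices

def batch_loader(data, sampler, batch_size, drop_last=False):
    """Yield lists of `batch_size` items, like a DataLoader/BatchSampler."""
    batch = []
    for i in sampler:
        batch.append(data[i])
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch and not drop_last:
        yield batch   # trailing partial batch

data = list(range(10))
batches = list(batch_loader(data, sequential_sampler(len(data)), batch_size=4))
print(batches)  # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```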
Distributed Training
- Multi-device and multi-host training strategies:
  - `SingleDeviceStrategy`: Single-device execution
  - `DataParallelStrategy`: Data parallelism across devices
  - `ShardedDataParallelStrategy`/`FullyShardedDataParallelStrategy`: Memory-efficient sharded training
  - `AutoStrategy`: Automatic strategy selection
  - `all_reduce`, `broadcast`: Distributed communication primitives
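The semantics of an `all_reduce` primitive are easy to state: every participant contributes a local value, and every participant receives the same reduced result. The sketch below simulates this over plain Python "devices"; the real primitive runs across JAX devices and hosts.

```python
# Sketch of all_reduce semantics, simulated in-process (illustrative only).

def all_reduce_sketch(local_values, op=sum):
    """Reduce the per-device values, then hand the result back to each device."""
    reduced = op(local_values)
    return [reduced for _ in local_values]

# e.g. per-device partial results from a data-parallel step
per_device = [1, 3, 2, 4]
print(all_reduce_sketch(per_device))  # [10, 10, 10, 10]: every device sees the sum
```

In data-parallel training this is what keeps replicas in sync: each device computes gradients on its shard, the gradients are all-reduced, and every replica applies the same averaged update.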
Checkpointing
- Comprehensive checkpoint management:
  - `CheckpointManager`: Manage multiple checkpoints with retention policies
  - `save_checkpoint`/`load_checkpoint`: Save and restore model states
  - `find_checkpoint`/`list_checkpoints`: Checkpoint discovery utilities
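A retention policy of the kind a `CheckpointManager` applies can be sketched as "keep the most recent k checkpoints, dropping the oldest as new ones arrive". The class below is a hypothetical illustration, not the braintools implementation.

```python
# Sketch of a keep-last-k retention policy (illustrative only).
from collections import deque

class RetentionSketch:
    def __init__(self, max_to_keep=3):
        self.max_to_keep = max_to_keep
        self.checkpoints = deque()

    def save(self, step):
        name = f"ckpt-{step}"
        self.checkpoints.append(name)
        while len(self.checkpoints) > self.max_to_keep:
            self.checkpoints.popleft()   # a real manager would delete the file
        return name

manager = RetentionSketch(max_to_keep=3)
for step in range(5):
    manager.save(step)
print(list(manager.checkpoints))  # ['ckpt-2', 'ckpt-3', 'ckpt-4']
```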
Progress Bar System
- Multiple progress bar implementations:
  - `SimpleProgressBar`: Basic text-based progress
  - `TQDMProgressBarWrapper`: TQDM-based progress
  - `RichProgressBarWrapper`: Rich library-based progress
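For a sense of what a basic text progress bar like `SimpleProgressBar` renders, here is a tiny sketch (the function is invented for illustration):

```python
# Sketch: render a fixed-width text progress bar for done/total steps.
def render_bar(done, total, width=20):
    filled = width * done // total
    return "[" + "#" * filled + "-" * (width - filled) + f"] {done}/{total}"

print(render_bar(5, 10))  # [##########----------] 5/10
```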
Improvements
API Documentation
- Enhanced module documentation: All public modules now include comprehensive docstrings with examples, parameter descriptions, and usage guidelines directly in `__init__.py` files
- Reorganized imports: Cleaner and more consistent import structure across all modules
Breaking Changes
Removed braintools.param Module
- The entire `braintools.param` module has been removed, including:
  - Data containers (`Data`)
  - Parameter wrappers (`Param`, `Const`)
  - State containers (`ArrayHidden`, `ArrayParam`)
  - Regularization classes (`GaussianReg`, `L1Reg`, `L2Reg`)
  - All transform classes (`SigmoidT`, `SoftplusT`, `AffineT`, etc.)
  - Utility functions (`get_param()`, `get_size()`)
- Users relying on these features should migrate to alternative implementations or pin to version 0.1.6
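For users who need the removed module, pinning the previous release is straightforward, e.g. in a requirements file:

```
# requirements.txt: pin to the last release that still ships braintools.param
braintools==0.1.6
```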
Files
- chaobrain/braintools-v0.1.7.zip (53.3 MB), md5:5afea2069c5e2fa640ed37e3aafa6774
Additional details
Related works
- Is supplement to: https://github.com/chaobrain/braintools/tree/v0.1.7 (Software)

Software
- Repository URL: https://github.com/chaobrain/braintools