Published July 7, 2021 | Version v0.8.0
Software | Open

GilesStrong/lumin: v0.8.0 - Mistake not...
Description
v0.8.0 - Mistake not...

Important changes
- GNN architectures generalised into feature extraction and graph collapse stages; see details below and the updated tutorial
Breaking

Additions

- `GravNet` GNN head and `GravNetLayer` sub-block (Qasim, Kieseler, Iiyama, & Pierini, 2019)
    - Includes optional self-attention
- `SelfAttention` and `OffsetSelfAttention` (a plain-PyTorch sketch of the idea follows this list)
- Batchnorm:
    - `LCBatchNorm1d` to run batchnorm over length x channel data (sketched after this list)
    - Additional `bn_class` arguments to blocks, allowing the user to choose different batchnorm implementations
    - 1, 2, & 3D running batchnorm layers from fastai (https://github.com/fastai/course-v3)
- `GNNHead` encapsulating head for feature extraction, using `AbsGraphFeatExtractor` classes, and graph collapsing, using `GraphCollapser` classes
- New callbacks:
    - `AbsWeightData` to weight folds of data based on their inputs or targets
    - `EpochSaver` to save the model to a new file at the end of every epoch
    - `CycleStep`, which combines OneCycle and step-decay of optimiser hyper-parameters (a rough analogue with stock schedulers follows this list)
- New CNN blocks (sketched after this list):
    - `AdaptiveAvgMaxConcatPool1d`, `AdaptiveAvgMaxConcatPool2d`, & `AdaptiveAvgMaxConcatPool3d` use average and maximum pooling to reduce data to a specified size per channel
    - `SEBlock1d`, `SEBlock2d`, & `SEBlock3d` apply squeeze-excitation to data channels
- `BackwardHook` for recording telemetric data during backwards passes (example after this list)
- New losses: `WeightedFractionalMSE`, `WeightedBinnedHuber`, & `WeightedFractionalBinnedHuber`
- Options for log x & y axes in `plot_feat`
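As a reference for what the self-attention blocks compute, here is a minimal plain-PyTorch sketch of scaled dot-product self-attention over per-vertex features. The class and argument names are illustrative only, not LUMIN's API; the offset variant is indicated in a comment.

```python
import torch
from torch import Tensor, nn

class SimpleSelfAttention(nn.Module):
    """Scaled dot-product self-attention over per-vertex features (illustrative)."""

    def __init__(self, n_fpv: int) -> None:
        super().__init__()
        self.q, self.k, self.v = (nn.Linear(n_fpv, n_fpv) for _ in range(3))
        self.scale = n_fpv ** -0.5

    def forward(self, x: Tensor) -> Tensor:
        # x: (batch, n_vertices, n_fpv)
        attn = torch.softmax((self.q(x) @ self.k(x).transpose(1, 2)) * self.scale, dim=-1)
        out = attn @ self.v(x)
        return out  # an "offset" variant would instead return x + out (a residual connection)
```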
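On the batchnorm addition: `nn.BatchNorm1d` expects (batch, channels, length) input, so a layer like `LCBatchNorm1d` plausibly just transposes length x channel data around a standard batchnorm. A hypothetical stand-in, not LUMIN's actual implementation, which also illustrates the idea behind the new `bn_class` arguments:

```python
from torch import Tensor, nn

class LengthChannelBN1d(nn.Module):
    """Hypothetical stand-in: batchnorm for data stored as (batch, length, channels)."""

    def __init__(self, nc: int, bn_class: type = nn.BatchNorm1d) -> None:
        super().__init__()
        self.bn = bn_class(nc)  # swap in a different batchnorm implementation here

    def forward(self, x: Tensor) -> Tensor:
        # Transpose to (batch, channels, length), normalise, and transpose back
        return self.bn(x.transpose(1, 2)).transpose(1, 2)
```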
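`CycleStep`'s combination of OneCycle followed by step decay can be roughly approximated with stock PyTorch schedulers. The numbers below are purely illustrative, and the real callback also manages other optimiser hyper-parameters such as momentum.

```python
import torch
from torch import nn
from torch.optim.lr_scheduler import OneCycleLR, SequentialLR, StepLR

model = nn.Linear(10, 1)
opt = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)

# One full cycle over the first 100 steps, then decay the LR by 10x every 50 steps
sched = SequentialLR(
    opt,
    schedulers=[OneCycleLR(opt, max_lr=1e-1, total_steps=100),
                StepLR(opt, step_size=50, gamma=0.1)],
    milestones=[100],
)

for _ in range(200):  # one sched.step() per optimiser step
    opt.step()
    sched.step()
```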
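The ideas behind the new CNN blocks are standard; minimal 1D sketches follow (class names and the reduction ratio are illustrative, not LUMIN's exact implementations):

```python
import torch
from torch import Tensor, nn

class AvgMaxConcatPool1d(nn.Module):
    """Concatenate adaptive average- and max-pooled features per channel."""

    def __init__(self, sz: int = 1) -> None:
        super().__init__()
        self.avg, self.max = nn.AdaptiveAvgPool1d(sz), nn.AdaptiveMaxPool1d(sz)

    def forward(self, x: Tensor) -> Tensor:
        # x: (batch, channels, length) -> (batch, 2 * channels, sz)
        return torch.cat([self.avg(x), self.max(x)], dim=1)

class SqueezeExcite1d(nn.Module):
    """Reweight channels by a learned, globally pooled gating vector."""

    def __init__(self, nc: int, r: int = 16) -> None:
        super().__init__()
        # assumes nc >= r so the bottleneck width nc // r is at least 1
        self.gate = nn.Sequential(nn.Linear(nc, nc // r), nn.ReLU(),
                                  nn.Linear(nc // r, nc), nn.Sigmoid())

    def forward(self, x: Tensor) -> Tensor:
        w = self.gate(x.mean(dim=-1))   # squeeze: (batch, channels)
        return x * w.unsqueeze(-1)      # excite: per-channel reweighting
```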
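`BackwardHook` mirrors `ForwardHook` (renamed below under Changes) for the backward pass; the underlying PyTorch mechanism is `register_full_backward_hook`. A minimal example of recording gradient telemetry with plain PyTorch:

```python
import torch
from torch import nn

grad_stats = []

def record_grads(module: nn.Module, grad_input, grad_output) -> None:
    # Record the mean absolute gradient flowing out of the module
    grad_stats.append(grad_output[0].abs().mean().item())

net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
handle = net[0].register_full_backward_hook(record_grads)

net(torch.randn(16, 4)).pow(2).mean().backward()
handle.remove()
print(grad_stats)  # one entry per backward pass
```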
Removals

- Scheduled removal of deprecated methods and functions from the old model and callback system:
    - `OldAbsCallback`
    - `OldCallback`
    - `OldAbsCyclicCallback`
    - `OldCycleLR`
    - `OldCycleMom`
    - `OldOneCycle`
    - `OldBinaryLabelSmooth`
    - `SequentialReweight`
    - `SequentialReweightClasses`
    - `OldBootstrapResample`
    - `OldParametrisedPrediction`
    - `OldGradClip`
    - `OldLsuvInit`
    - `OldAbsModelCallback`
    - `OldSWA`
    - `OldLRFinder`
    - `OldEnsemble`
    - `OldAMS`
    - `OldMultiAMS`
    - `OldBinaryAccuracy`
    - `OldRocAucScore`
    - `OldEvalMetric`
    - `OldRegPull`
    - `OldRegAsProxyPull`
    - `OldAbsModel`
    - `OldModel`
    - `fold_train_ensemble`
    - `OldMetricLogger`
    - `fold_lr_find`
    - `old_plot_train_history`
    - `_get_folds`
- Unnecessary `pred_cb` argument in `train_models`
Fixes
- Bug when trying to use batchnorm in `InteractionNet`
- Bug in `FoldFile.save_fold_pred` when predictions change shape and try to overwrite existing predictions
Changes
- `padding` argument in conv 1D blocks renamed to `pad`
- Graph nets: generalised into feature extraction for features per vertex, and graph collapsing down to flat data (with optional self-attention)
- Renamed `FowardHook` to `ForwardHook`
- Abstract classes no longer inherit from ABC, but rather set `metaclass=ABCMeta`, in order to be compatible with py>=3.7 (see the sketch after this list)
- Updated the example of binary classification of signal & background to use the model and training procedure resulting from https://iopscience.iop.org/article/10.1088/2632-2153/ab983a
- Also changed the multi-target regression example to use non-densely connected layers, and the multi-target classification example to use a cosine-annealed cyclical LR
- Updated the single-target regression example to use `WeightedBinnedHuber` as a loss
- Changed `from torch.tensor import Tensor` to `from torch import Tensor` for compatibility with the latest PyTorch
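The abstract-class change is the standard equivalence between inheriting from `ABC` and declaring the metaclass directly:

```python
from abc import ABC, ABCMeta, abstractmethod

class Before(ABC):               # old style: inherit from ABC
    @abstractmethod
    def run(self) -> None: ...

class After(metaclass=ABCMeta):  # new style: declare the metaclass directly
    @abstractmethod
    def run(self) -> None: ...

# Both behave identically: neither can be instantiated without overriding run()
```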
Deprecations

- `OldInteractionNet` replaced in favour of the `InteractionNet` feature extractor; will be removed in v0.9
Files

Name | Size
---|---
GilesStrong/lumin-v0.8.0.zip (md5:a4a7d06cd3b8e7a68ef10701099f7538) | 18.4 MB
Additional details
Related works
- Is supplement to: https://github.com/GilesStrong/lumin/tree/v0.8.0 (URL)