Published July 8, 2021 | Version 0.6.0
Software | Open

Project-MONAI/MONAI: 0.6.0
Creators
- Nic Ma1
- Wenqi Li2
- Richard Brown3
- Yiheng Wang
- Behrooz1
- Benjamin Gorman4
- Hans Johnson5
- Isaac Yang
- Eric Kerfoot3
- charliebudd
- Mohammad Adil1
- Yiwen Li
- Yuan-Ting Hsieh (謝沅廷)6
- Arpit Aggarwal7
- masadcv
- Cameron Trentz4
- adam aji8
- myron
- Mark Graham3
- Ben Murray
- Gagan Daroach
- Petru-Daniel Tudosiu9
- Matt McCormick10
- Ali Hatamizadeh1
- Ambros
- Balamurali11
- Christian Baker12
- Holger Roth1
- Jan Sellner
- 1. NVIDIA
- 2. @NVIDIA
- 3. King's College London
- 4. University of Iowa
- 5. The University of Iowa
- 6. Nvidia
- 7. Broad Institute of MIT and Harvard
- 8. @SonoVol
- 9. @AmigoLab
- 10. @Kitware
- 11. HTIC
- 12. @KCL-BMEIS
Description

### Added
- Overview document for feature highlights in v0.6
- 10 new transforms, a masked loss wrapper, and a `NetAdapter` for transfer learning
- APIs to load networks and pre-trained weights from Clara Train Medical Model ARchives (MMARs)
- Base metric and cumulative metric APIs, 4 new regression metrics
- Initial CSV dataset support
- Decollating mini-batch as the default first postprocessing step
- Initial backward compatibility support via `monai.utils.deprecated`
- Attention-based vision modules and `UNETR` for segmentation
- Generic module loaders and Gaussian mixture models using the PyTorch JIT compilation
- Inverse of image patch sampling transforms
- Network block utilities `get_[norm, act, dropout, pool]_layer`
- `unpack_items` mode for `apply_transform` and `Compose`
- New event `INNER_ITERATION_STARTED` in the deepgrow interactive workflow
- `set_data` API for cache-based datasets to dynamically update the dataset content
- Fully compatible with PyTorch 1.9
- `--disttests` and `--min` options for `runtests.sh`
- Initial support of pre-merge tests with Nvidia Blossom system

### Changed
- Base Docker image upgraded to `nvcr.io/nvidia/pytorch:21.06-py3` from `nvcr.io/nvidia/pytorch:21.04-py3`
- Optionally depend on PyTorch-Ignite v0.4.5 instead of v0.4.4
- Unified the demo, tutorial, and testing data to the project shared drive and `Project-MONAI/MONAI-extra-test-data`
- Unified the terms: `post_transform` is renamed to `postprocessing`, `pre_transform` is renamed to `preprocessing`
- Unified the postprocessing transforms and event handlers to accept the "channel-first" data format
- `evenly_divisible_all_gather` and `string_list_all_gather` moved to `monai.utils.dist`

### Removed

- Support of 'batched' input for postprocessing transforms and event handlers
- `TorchVisionFullyConvModel`
- `set_visible_devices` utility function
- `SegmentationSaver` and `TransformsInverter` handlers

### Fixed

- Issue of handling big-endian image headers
- Multi-thread issue for non-random transforms in the cache-based datasets
- Persistent dataset issue when multiple processes share a non-existent cache location
- Typing issue with Numpy 1.21.0
- Loading checkpoint with both `model` and `optimizer` using `CheckpointLoader` when `strict_shape=False`
- `SplitChannel` has different behaviour depending on numpy/torch inputs
- Transform pickling issue caused by the Lambda functions
- Issue of filtering by name in `generate_param_groups`
- Inconsistencies in the return value types of `class_activation_maps`
- Various docstring typos
- Various usability enhancements in `monai.transforms`
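One of the headline changes above is decollating the mini-batch as the default first postprocessing step: a collated batch (a dict of batched values) is split back into a list of per-sample dicts so that postprocessing transforms and inverse transforms can operate on each item independently. The following is a simplified pure-Python sketch of that idea, not MONAI's actual implementation (the real API is `monai.data.decollate_batch`, which also handles tensors, nested structures, and metadata):

```python
# Sketch of the "decollate" concept: split a collated mini-batch
# (dict of equal-length sequences) into a list of per-sample dicts.
# Simplified illustration only; MONAI's decollate_batch additionally
# handles torch tensors, nested containers, and metadata dicts.

def decollate_batch(batch):
    """Turn {"k": [v0, v1, ...]} into [{"k": v0}, {"k": v1}, ...]."""
    keys = list(batch)
    length = len(batch[keys[0]])
    return [{k: batch[k][i] for k in keys} for i in range(length)]

batch = {"image": [[0.1], [0.2]], "label": [0, 1]}
items = decollate_batch(batch)
# items == [{"image": [0.1], "label": 0}, {"image": [0.2], "label": 1}]
```

With per-sample dicts in hand, "channel-first" postprocessing transforms (also unified in this release) can be applied to each item in a plain loop or via `Compose`, which is why batched-input support for postprocessing transforms was removed.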
Files (12.2 MB)

Name | Size
---|---
Project-MONAI/MONAI-0.6.0.zip (md5:97b37d5d0f3d184b88c922da386a8f18) | 12.2 MB
Additional details
Related works

- Is supplement to: https://github.com/Project-MONAI/MONAI/tree/0.6.0 (URL)