Published December 29, 2022
Version v0.1.17
okunator/cellseg_models.pytorch: v0.1.17
Description
0.1.17 — 2022-12-29
Features
- Add transformer modules
- Add exact, slice, and memory-efficient (xformers) self-attention computations
- Add transformer modules to `Decoder` modules
- Add common transformer MLP activation functions: star-relu, geglu, approximate-gelu
- Add Linformer self-attention mechanism
- Add support for model initialization from a YAML file in `MultiTaskUnet`
- Add a new cross-attention long-skip module; works with `long_skip='cross-attn'`
- Add more verbose error messages for the abstract wrapper modules in `modules.base_modules`
- Add more verbose error catching for `xformers.ops.memory_efficient_attention`
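The MLP activations added in this release have simple closed forms. As a rough scalar sketch (not the library's actual tensor-based `torch` implementation; the default StarReLU constants below are an assumption taken from the MetaFormer paper), they can be written as:

```python
import math

def star_relu(x, s=0.8944, b=-0.4472):
    # StarReLU: s * relu(x)**2 + b. The default scale s and bias b are
    # assumed here from the MetaFormer paper; the library may learn them.
    return s * max(x, 0.0) ** 2 + b

def approximate_gelu(x):
    # tanh approximation of GELU, the common "approximate" variant.
    return 0.5 * x * (1.0 + math.tanh(
        math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))

def geglu(x, gate):
    # GEGLU: one half of a linear projection gates the other half
    # through GELU; here x and gate stand in for the two halves.
    return x * approximate_gelu(gate)
```

In the library these would act elementwise on tensors, with `geglu` splitting the MLP's hidden projection into two halves.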
Files
- okunator/cellseg_models.pytorch-v0.1.17.zip (8.6 MB, md5:6ff99ef066b66b5ae1ec17182e1e523e)
Additional details
Related works
- Is supplement to: https://github.com/okunator/cellseg_models.pytorch/tree/v0.1.17 (URL)