
Published December 29, 2022 | Version v0.1.17
Software | Open Access

okunator/cellseg_models.pytorch: v0.1.17

Authors/Creators

  • University of Helsinki

Description

0.1.17 — 2022-12-29

Features
  • Add transformer modules
  • Add exact, sliced, and memory-efficient (xformers) self-attention computations
  • Add transformer modules to Decoder modules
  • Add common transformer MLP activation functions: star-relu, geglu, approximate-gelu
  • Add Linformer self-attention mechanism
  • Add support for model initialization from a YAML file in MultiTaskUnet
  • Add a new cross-attention long-skip module, enabled with long_skip='cross-attn'
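The three new MLP activations follow well-known published definitions (StarReLU from the MetaFormer paper, GEGLU from GLU-variant work, and a sigmoid approximation of GELU). A minimal PyTorch sketch of those standard formulas, not the library's exact modules:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StarReLU(nn.Module):
    """StarReLU: s * relu(x)**2 + b, with learnable scale s and bias b."""

    def __init__(self, scale: float = 1.0, bias: float = 0.0):
        super().__init__()
        self.scale = nn.Parameter(torch.tensor(scale))
        self.bias = nn.Parameter(torch.tensor(bias))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.scale * F.relu(x) ** 2 + self.bias


class GEGLU(nn.Module):
    """GELU-gated linear unit: split the last dim in half,
    gate one half with the GELU of the other."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x, gate = x.chunk(2, dim=-1)
        return x * F.gelu(gate)


def approximate_gelu(x: torch.Tensor) -> torch.Tensor:
    """Sigmoid approximation of GELU: x * sigmoid(1.702 * x)."""
    return x * torch.sigmoid(1.702 * x)
```

Note that GEGLU halves the last dimension, so the preceding linear layer in an MLP block must project to twice the desired hidden width.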
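The distinction between exact and sliced self-attention can be sketched as follows. Sliced attention computes the same result but materializes the attention matrix for only a chunk of queries at a time, bounding peak memory; this is an illustrative reimplementation, not the library's code:

```python
import torch


def exact_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
    """Standard softmax(q @ k^T / sqrt(d)) @ v over the full sequence."""
    scale = q.shape[-1] ** -0.5
    return ((q @ k.transpose(-2, -1)) * scale).softmax(dim=-1) @ v


def sliced_attention(
    q: torch.Tensor, k: torch.Tensor, v: torch.Tensor, slice_size: int = 64
) -> torch.Tensor:
    """Same output as exact_attention, but the (queries x keys) attention
    matrix is built only slice_size query rows at a time."""
    scale = q.shape[-1] ** -0.5
    out = torch.empty_like(q)
    for i in range(0, q.shape[-2], slice_size):
        qs = q[..., i : i + slice_size, :]
        attn = ((qs @ k.transpose(-2, -1)) * scale).softmax(dim=-1)
        out[..., i : i + slice_size, :] = attn @ v
    return out
```

The xformers `memory_efficient_attention` backend goes further by fusing this chunking into a single kernel, which is why the release wraps it behind the same interface as the exact computation.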
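The cross-attention long-skip replaces the usual concatenation/addition of an encoder skip feature with attention: the decoder feature map queries the encoder skip tokens. A hypothetical sketch of the idea (class name, shapes, and the use of `nn.MultiheadAttention` are assumptions, not the library's implementation):

```python
import torch
import torch.nn as nn


class CrossAttnLongSkip(nn.Module):
    """Illustrative cross-attention skip connection: decoder tokens
    (queries) attend to encoder skip tokens (keys/values)."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x: torch.Tensor, skip: torch.Tensor) -> torch.Tensor:
        # x, skip: (B, C, H, W) feature maps with matching channels C == dim
        B, C, H, W = x.shape
        q = x.flatten(2).transpose(1, 2)      # (B, H*W, C) decoder queries
        kv = skip.flatten(2).transpose(1, 2)  # (B, H'*W', C) skip keys/values
        out, _ = self.attn(self.norm(q), kv, kv, need_weights=False)
        # Residual connection, then fold tokens back into a feature map.
        return (q + out).transpose(1, 2).reshape(B, C, H, W)
```

Because the skip is consumed through keys/values rather than concatenation, the decoder's channel count does not grow at the merge point.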
Refactor
  • Add more verbose error messages for the abstract wrapper modules in modules.base_modules
  • Add more verbose error catching for xformers.ops.memory_efficient_attention

Files

okunator/cellseg_models.pytorch-v0.1.17.zip (8.6 MB)
md5:6ff99ef066b66b5ae1ec17182e1e523e
