Published May 11, 2023 | Version v1
Other | Open

SleepTransformer model weights

Creators

  • Huy Phan (Queen Mary University of London)

Description

These are the weights of the SleepTransformer model trained on the SHHS dataset, as described in the paper:

Huy Phan, Kaare Mikkelsen, Oliver Y. Chén, Philipp Koch, Alfred Mertins, and Maarten De Vos. SleepTransformer: Automatic Sleep Staging With Interpretability and Uncertainty Quantification. IEEE Transactions on Biomedical Engineering (TBME), vol. 69, no. 8, pp. 2456-2467, 2022

Model configuration (see the sketch after this list):

  • sequence length: 21
  • number of Transformer encoder blocks in the epoch encoder: 4
  • number of Transformer encoder blocks in the sequence encoder: 4
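
For convenience, these settings can be collected into a small configuration object when driving the repository's scripts. The sketch below is purely illustrative; the key names are hypothetical rather than the repository's actual option names, and only the values are taken from this record.

    # Illustrative only: the key names below are hypothetical, not the
    # repository's actual flags; the values match the released checkpoint.
    shhs_config = {
        "seq_len": 21,             # sleep epochs per input sequence
        "num_blocks_epoch": 4,     # Transformer encoder blocks in the epoch encoder
        "num_blocks_sequence": 4,  # Transformer encoder blocks in the sequence encoder
    }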

This model should be used in combination with the code available here: https://github.com/pquochuy/SleepTransformer
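
As a quick sanity check after downloading, the archive can be unpacked and its contents listed before plugging the weights into the repository code. This is a minimal sketch that assumes the zip expands to a standard TensorFlow checkpoint (the repository is TensorFlow-based); the output directory name is a placeholder.

    # Minimal sketch, assuming the archive unpacks to a standard TensorFlow
    # checkpoint (the repository is TensorFlow-based). For inference, build
    # the SleepTransformer graph with the repository code and restore this
    # checkpoint into it; here we only unpack and list the stored variables.
    import zipfile
    import tensorflow as tf

    ARCHIVE = "sleeptransformer_shhs_model_weights.zip"  # file in this record
    OUT_DIR = "shhs_weights"                             # placeholder path

    with zipfile.ZipFile(ARCHIVE) as zf:
        zf.extractall(OUT_DIR)

    ckpt = tf.train.latest_checkpoint(OUT_DIR)  # None if no checkpoint file is found
    for name, shape in tf.train.list_variables(ckpt):
        print(name, shape)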

Files

  • sleeptransformer_shhs_model_weights.zip (39.7 MB)
    md5: cd912d7882eb951196ee503526725673