Published April 9, 2024 | Version v0.4

PyPOTS: A Python Toolbox for Data Mining on Partially-Observed Time Series

Description

  1. apply the SAITS embedding strategy to the models Crossformer, PatchTST, DLinear, ETSformer, FEDformer, Informer, and Autoformer so they can serve as imputation methods on POTS data;
  2. fix a bug in the USGAN loss function;
  3. gather several Transformer embedding methods into the package pypots.nn.modules.transformer.embedding;
  4. add the attribute best_epoch to NN models to record the best epoch number and log it after model training;
  5. make the self-attention operator replaceable in the class MultiHeadAttention for Transformer models;
  6. rename the argument d_inner of all models in previous versions to d_ffn, for unified argument naming and easier understanding;
  7. remove the deprecated functions save_model() and load_model() from all NN model classes; they are now replaced by save() and load();
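Several of these changes affect how the library is called. For context, PyPOTS imputation models consume partially-observed time series as a dict whose "X" entry is an array of shape (n_samples, n_steps, n_features), with NaNs marking missing values. A minimal NumPy sketch of preparing such input (the shapes and missing rate here are purely illustrative):

```python
import numpy as np

# Build a toy partially-observed time series (POTS) dataset:
# 8 samples, 24 time steps, 5 features, with roughly 20% of values missing.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 24, 5))
missing_mask = rng.random(X.shape) < 0.2
X[missing_mask] = np.nan  # NaN marks a missing observation

dataset = {"X": X}  # the dict format PyPOTS fit()/predict() expect
print(np.isnan(dataset["X"]).mean())  # observed missing rate
```

With PyPOTS installed, such a dataset can then be passed to a model's fit() and predict(); note that as of this release the feed-forward size argument is named d_ffn (formerly d_inner), and checkpoints are handled by save() and load() rather than the removed save_model() and load_model().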

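Making the self-attention operator replaceable (item 5 above) follows a common dependency-injection pattern: the attention computation is passed in as a callable rather than hard-coded. The sketch below illustrates that pattern with a toy NumPy multi-head attention; it is only an illustration, not the PyPOTS implementation, and its names and signatures are hypothetical:

```python
import numpy as np

def scaled_dot_product(q, k, v):
    # Default operator: softmax(QK^T / sqrt(d)) V
    d = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

class MultiHeadAttention:
    """Toy multi-head attention whose inner operator is injectable."""

    def __init__(self, n_heads, d_model, attn_op=scaled_dot_product):
        assert d_model % n_heads == 0
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.attn_op = attn_op  # swap in any (q, k, v) -> output callable

    def __call__(self, q, k, v):
        def split(x):  # (batch, seq, d_model) -> (batch, heads, seq, d_head)
            b, s, _ = x.shape
            return x.reshape(b, s, self.n_heads, self.d_head).swapaxes(1, 2)

        out = self.attn_op(split(q), split(k), split(v))
        b, h, s, dh = out.shape
        return out.swapaxes(1, 2).reshape(b, s, h * dh)

q = np.ones((1, 4, 8))
mha = MultiHeadAttention(n_heads=2, d_model=8)
print(mha(q, q, q).shape)  # (1, 4, 8)
```

Because attn_op is just a parameter, a variant operator (e.g. a sparse or probabilistic attention) can be dropped in without touching the multi-head plumbing, which is the point of the refactor described in the release notes.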
What's Changed

  • Removing deprecated functions by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/318
  • Add Autoformer as an imputation model by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/320
  • Removing deprecated save_model and load_model, adding the imputation model Autoformer by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/321
  • Simplify MultiHeadAttention by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/322
  • Add PatchTST as an imputation model by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/323
  • Renaming d_inner into d_ffn by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/325
  • Adding PatchTST, renaming d_innner into d_ffn, and refactoring Autofomer by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/326
  • Add DLinear as an imputation model by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/327
  • Add ETSformer as an imputation model by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/328
  • Add Crossformer as an imputation model by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/329
  • Add FEDformer as an imputation model by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/330
  • Add Crossformer, Autoformer, PatchTST, DLinear, ETSformer, FEDformer as imputation models by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/331
  • Refactor embedding package, remove the unused part in Autoformer, and update the docs by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/332
  • Make the self-attention operator replaceable in Transformer by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/334
  • Add informer as an imputation model by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/335
  • Speed up testing procedure by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/336
  • Add Informer, speed up CI testing, and make self-attention operator replaceable by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/337
  • debug USGAN by @AugustJW in https://github.com/WenjieDu/PyPOTS/pull/339
  • Fix USGAN loss function, and update the docs by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/340
  • Add the attribute best_epoch to record the best epoch num by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/342
  • Apply SAITS embedding strategy to new added models by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/343
  • Release v0.4, apply SAITS embedding strategy to the newly added models, and update README by @WenjieDu in https://github.com/WenjieDu/PyPOTS/pull/344

Full Changelog: https://github.com/WenjieDu/PyPOTS/compare/v0.3.2...v0.4

Notes

If you use PyPOTS, please cite it as below.

Files

WenjieDu/PyPOTS-v0.4.zip (437.2 kB)
md5:0b63bc522d2269bcf4b7858fd4935790

Additional details

Related works

Is supplement to
Software: https://github.com/WenjieDu/PyPOTS/tree/v0.4 (URL)