Software · Open Access

Transformers: State-of-the-Art Natural Language Processing

Wolf, Thomas; Debut, Lysandre; Sanh, Victor; Chaumond, Julien; Delangue, Clement; Moi, Anthony; Cistac, Perric; Ma, Clara; Jernite, Yacine; Plu, Julien; Xu, Canwen; Le Scao, Teven; Gugger, Sylvain; Drame, Mariama; Lhoest, Quentin; Rush, Alexander M.

v4.11.3: Patch release

This patch release fixes a few issues encountered since the release of v4.11.2:

  • [DPR] Correct init (#13796)
  • Fix warning situation: "UserWarning: max_length is ignored when padding=True" (#13829)
  • Bart: check if decoder_inputs_embeds is set (#13800)
  • include megatron_gpt2 in installed modules (#13834)
  • Fixing 1-length special tokens cut. (#13862)
  • Fixing empty prompts for text-generation when BOS exists. (#13859)
  • Fixing question-answering with long contexts (#13873)
  • Fixing GPU for token-classification in a better way. (#13856)
  • Fixing backward compatibility for zero-shot (#13855)
  • Fix hp search for non sigopt backends (#13897)
  • Fix trainer logging_nan_inf_filter in torch_xla mode #13896 (@ymwangg)
  • [Trainer] Fix nan-loss condition #13911 (@anton-l)
If you use this software, please cite it using these metadata.
Files (11.9 MB)

  • huggingface/transformers-v4.11.3.zip — 11.9 MB (md5:10944a2bb1ce2360bfe7f72fcf1b8a98)
                   All versions   This version
Views              30,769         883
Downloads          1,125          34
Data volume        8.0 GB         403.5 MB
Unique views       25,965         759
Unique downloads   525            31
