Published October 1, 2021 | Version v4.11.3 | Software | Open
Transformers: State-of-the-Art Natural Language Processing
Description
v4.11.3: Patch release
This patch release fixes a few issues encountered since the release of v4.11.2:
- [DPR] Correct init (#13796)
- Fix warning situation: "UserWarning: max_length is ignored when padding=True" (#13829); a usage sketch of this call pattern follows the list
- Bart: check if decoder_inputs_embeds is set (#13800)
- Include megatron_gpt2 in installed modules (#13834)
- Fix 1-length special tokens cut (#13862)
- Fix empty prompts for text-generation when BOS exists (#13859)
- Fix question-answering with long contexts (#13873)
- Fix GPU handling for token-classification in a better way (#13856)
- Fix backward compatibility for zero-shot (#13855)
- Fix hp search for non sigopt backends (#13897)
- Fix trainer logging_nan_inf_filter in torch_xla mode (#13896) (@ymwangg)
- [Trainer] Fix nan-loss condition (#13911) (@anton-l)
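To pick up these fixes, upgrade with `pip install --upgrade transformers==4.11.3`. As a minimal sketch of the call pattern behind the max_length/padding warning fix (#13829), the snippet below combines `padding`, `truncation`, and `max_length` in one tokenizer call; the checkpoint name is an arbitrary example, not something these notes prescribe, and the exact code path the fix touched is not spelled out here.

```python
# A minimal sketch, assuming transformers==4.11.3 is installed and the
# example checkpoint "bert-base-uncased" can be downloaded; neither is
# prescribed by these release notes.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

# Combining padding with truncation and max_length is a valid request;
# the #13829 fix adjusted when the "max_length is ignored when
# padding=True" UserWarning fires for calls in this family.
batch = tokenizer(
    ["a short sentence", "a slightly longer example sentence"],
    padding=True,      # pad to the longest sequence in the batch
    truncation=True,   # truncate anything beyond max_length
    max_length=32,
)
print(len(batch["input_ids"]), len(batch["input_ids"][0]))
```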
Files (11.9 MB)

Name | Size
---|---
huggingface/transformers-v4.11.3.zip (md5:10944a2bb1ce2360bfe7f72fcf1b8a98) | 11.9 MB
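To verify a downloaded copy of the archive against the md5 checksum listed above, a minimal sketch (the local filename is an assumption; point it at wherever the file was saved):

```python
# Compare the archive's md5 against the checksum from the file listing
# above. The path is an assumption about where the file was saved.
import hashlib

path = "transformers-v4.11.3.zip"
with open(path, "rb") as f:
    digest = hashlib.md5(f.read()).hexdigest()

# Prints True when the download matches the published checksum.
print(digest == "10944a2bb1ce2360bfe7f72fcf1b8a98")
```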
Additional details
Related works
- Is supplement to: https://github.com/huggingface/transformers/tree/v4.11.3