5553107
doi
10.5281/zenodo.5553107
oai:zenodo.org:5553107
Debut, Lysandre
Sanh, Victor
Chaumond, Julien
Delangue, Clement
Moi, Anthony
Cistac, Perric
Ma, Clara
Jernite, Yacine
Plu, Julien
Xu, Canwen
Le Scao, Teven
Gugger, Sylvain
Drame, Mariama
Lhoest, Quentin
Rush, Alexander M.
Wolf, Thomas
Transformers: State-of-the-Art Natural Language Processing
url:https://github.com/huggingface/transformers/tree/v4.11.3
info:eu-repo/semantics/openAccess
Other (Open)
v4.11.3: Patch release
<p>This patch release fixes a few issues encountered since the release of v4.11.2:</p>
<ul>
<li>[DPR] Correct init (#13796)</li>
<li>Fix warning situation: UserWarning: "max_length is ignored when padding=True" (#13829)</li>
<li>Bart: check if decoder_inputs_embeds is set (#13800)</li>
<li>include megatron_gpt2 in installed modules (#13834)</li>
<li>Fixing 1-length special tokens cut. (#13862)</li>
<li>Fixing empty prompts for text-generation when BOS exists. (#13859)</li>
<li>Fixing question-answering with long contexts (#13873)</li>
<li>Fixing GPU for token-classification in a better way. (#13856)</li>
<li>Fixing backward compatibility for zero-shot (#13855)</li>
<li>Fix hp search for non sigopt backends (#13897)</li>
<li>Fix trainer logging_nan_inf_filter in torch_xla mode #13896 (@ymwangg)</li>
<li>[Trainer] Fix nan-loss condition #13911 (@anton-l)</li>
</ul>
If you use this software, please cite it using these metadata.
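The metadata in this record can be assembled into a citation entry. The sketch below is illustrative, not an official citation file: the entry key and field selection are assumptions, while the author list, title, version, DOI, and URL are taken from the record above.

```bibtex
@software{wolf_2020_5553107,
  author    = {Wolf, Thomas and Debut, Lysandre and Sanh, Victor and
               Chaumond, Julien and Delangue, Clement and Moi, Anthony and
               Cistac, Perric and Ma, Clara and Jernite, Yacine and
               Plu, Julien and Xu, Canwen and Le Scao, Teven and
               Gugger, Sylvain and Drame, Mariama and Lhoest, Quentin and
               Rush, Alexander M.},
  title     = {Transformers: State-of-the-Art Natural Language Processing},
  year      = 2020,
  publisher = {Zenodo},
  version   = {v4.11.3},
  doi       = {10.5281/zenodo.5553107},
  url       = {https://github.com/huggingface/transformers/tree/v4.11.3}
}
```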
Zenodo
2020-10-01
info:eu-repo/semantics/other
3385997
v4.11.3
1669997035.270021
11868649
md5:10944a2bb1ce2360bfe7f72fcf1b8a98
https://zenodo.org/records/5553107/files/huggingface/transformers-v4.11.3.zip
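The record lists an md5 checksum for the archive above, which can be used to verify a downloaded copy. A minimal sketch follows; `md5_of_file` is an illustrative helper, not part of the record or of any Zenodo tooling:

```python
import hashlib

def md5_of_file(path, chunk_size=8192):
    """Compute the MD5 hex digest of a file, reading in streaming chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# After downloading transformers-v4.11.3.zip from the URL above, compare
# the result against the checksum in this record:
#   md5_of_file("transformers-v4.11.3.zip") == "10944a2bb1ce2360bfe7f72fcf1b8a98"
```

Streaming in fixed-size chunks keeps memory use constant regardless of archive size.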
public
https://github.com/huggingface/transformers/tree/v4.11.3
Is supplement to
url
10.5281/zenodo.3385997
isVersionOf
doi