Wolf, Thomas; Debut, Lysandre; Sanh, Victor; Chaumond, Julien; Delangue, Clement; Moi, Anthony; Cistac, Perric; Ma, Clara; Jernite, Yacine; Plu, Julien; Xu, Canwen; Le Scao, Teven; Gugger, Sylvain; Drame, Mariama; Lhoest, Quentin; Rush, Alexander M.
The Perceiver model was released in the previous version. Eight new models were released as part of the Perceiver implementation, among them PerceiverForMultimodalAutoencoding, in PyTorch.
The Perceiver IO model was proposed in Perceiver IO: A General Architecture for Structured Inputs & Outputs by Andrew Jaegle, Sebastian Borgeaud, Jean-Baptiste Alayrac, Carl Doersch, Catalin Ionescu, David Ding, Skanda Koppula, Daniel Zoran, Andrew Brock, Evan Shelhamer, Olivier Hénaff, Matthew M. Botvinick, Andrew Zisserman, Oriol Vinyals, João Carreira.
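As a hedged illustration of the architecture, a tiny PerceiverModel can be instantiated from a small PerceiverConfig; the dimensions below are arbitrary toy values, far smaller than the released checkpoints:

```python
import torch
from transformers import PerceiverConfig, PerceiverModel

# Toy configuration: these sizes are illustrative assumptions,
# not the dimensions of any released checkpoint.
config = PerceiverConfig(
    num_latents=16,              # size of the fixed latent array
    d_latents=64,                # latent channel dimension
    d_model=32,                  # input channel dimension
    num_blocks=1,
    num_self_attends_per_block=2,
)
model = PerceiverModel(config)

# Perceiver IO decouples input length from compute: inputs of arbitrary
# length are cross-attended into the fixed-size latent array.
inputs = torch.randn(1, 128, config.d_model)
outputs = model(inputs=inputs)
print(outputs.last_hidden_state.shape)  # shape of the latent array
```

Because the latent array is fixed, doubling the input length does not change the shape of `last_hidden_state`, only the cost of the initial cross-attention.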
- Add Perceiver IO by @NielsRogge in https://github.com/huggingface/transformers/pull/14487
Compatible checkpoints can be found on the hub: https://huggingface.co/models?other=perceiver
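As a minimal sketch, one of these checkpoints can be exercised through the fill-mask pipeline. The checkpoint name deepmind/language-perceiver is an assumption; any fill-mask-compatible Perceiver checkpoint from the link above would do. The helper defers the pipeline construction so the model weights are only downloaded when it is called:

```python
from transformers import pipeline

def perceiver_fill_mask(text, checkpoint="deepmind/language-perceiver"):
    """Run the fill-mask pipeline with a Perceiver checkpoint.

    The checkpoint name is an assumption; see the Hub link above for
    compatible checkpoints. Weights are downloaded on first call.
    """
    fill_mask = pipeline("fill-mask", model=checkpoint)
    return fill_mask(text)

# Example usage (downloads the checkpoint):
# predictions = perceiver_fill_mask("Paris is the capital of [MASK].")
```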
Version v4.14.0 adds support for Perceiver in multiple pipelines, including the fill-mask and sequence classification pipelines.

Keras model cards
The Keras push-to-Hub callback now generates model cards when pushing to the model hub. In addition to the callback, model cards are also generated by default by the model.push_to_hub() method.
- AutoTokenizer. by @Narsil in https://github.com/huggingface/transformers/pull/14711
- `run_clm_flax.py`: respect global seed by @bminixhofer in https://github.com/huggingface/transformers/pull/13410
- Simplify T5 docs by @xhlulu in https://github.com/huggingface/transformers/pull/14776
- Update Perceiver code examples by @NielsRogge in https://github.com/huggingface/transformers/pull/14783