Published November 1, 2019 | Version 1.0 | Dataset | Open Access
Transfer fine-tuned BERT models with phrasal paraphrases
Creators
- Yuki Arase (Osaka University)
- Junichi Tsujii (Artificial Intelligence Research Center (AIRC), AIST)
Description
BERT models transfer fine-tuned with phrasal paraphrases:
- transferFT_bert-base-uncased.pkl is based on the bert-base-uncased model
- transferFT_bert-large-uncased.pkl is based on the bert-large-uncased model
For usage instructions, please refer to our GitHub page:
https://github.com/yukiar/TransferFT
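As a quick sanity check after downloading, the minimal sketch below assumes the .pkl files are standard PyTorch serializations (torch.save); this is an assumption, and the GitHub page above is the authoritative reference for loading and usage.

```python
# Minimal inspection sketch. ASSUMPTION: the .pkl file was written with
# torch.save; consult the TransferFT GitHub page for the actual
# loading procedure.
import torch

obj = torch.load("transferFT_bert-base-uncased.pkl", map_location="cpu")

# The object may be a full model or a state_dict; report what was loaded.
if isinstance(obj, dict):
    print(f"state_dict with {len(obj)} entries")
else:
    print(type(obj))
```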
For details of these models, please refer to our paper:
Yuki Arase and Junichi Tsujii. 2019. Transfer Fine-Tuning: A BERT Case Study. In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2019).
Files (1.8 GB)

Name | Size | MD5 checksum
---|---|---
transferFT_bert-base-uncased.pkl | 438.0 MB | 5bb112fb21e6006c538074056336cfc0
transferFT_bert-large-uncased.pkl | 1.3 GB | e22a668fb03289c9c360f2448117e05e
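To verify that the downloads are intact, the MD5 checksums above can be checked with a short script such as the following (file names as listed in the table; adjust paths as needed):

```python
import hashlib

# Expected MD5 checksums, copied from the file listing above.
EXPECTED = {
    "transferFT_bert-base-uncased.pkl": "5bb112fb21e6006c538074056336cfc0",
    "transferFT_bert-large-uncased.pkl": "e22a668fb03289c9c360f2448117e05e",
}

def md5_of(path, chunk_size=1 << 20):
    """Compute the MD5 digest of a file, reading in 1 MiB chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

for name, expected in EXPECTED.items():
    status = "OK" if md5_of(name) == expected else "MISMATCH"
    print(f"{name}: {status}")
```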