Dataset (Open Access)

Quantity doesn't buy quality syntax with neural language models

van Schijndel, Marten; Mueller, Aaron; Linzen, Tal

This repository contains the 125 LSTM models analyzed in van Schijndel, Mueller, and Linzen (2019), "Quantity doesn't buy quality syntax with neural language models". Each archive contains 25 models trained on a specific number of training tokens. All models were trained using the vocabulary in vocab.txt.
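
The checkpoints are PyTorch files. Below is a minimal loading sketch, assuming the archives hold fully pickled models saved with torch.save (if they instead hold state_dicts, the model class must be instantiated first) and assuming vocab.txt lists one token per line; the checkpoint filename is hypothetical:

```python
import torch

# Assumption: vocab.txt lists one token per line.
with open("vocab.txt", encoding="utf-8") as f:
    vocab = [line.strip() for line in f]

# Hypothetical filename following the naming convention described below.
# If the .pt file holds a full pickled model, its defining class must be
# importable; if it holds a state_dict, build the model first and call
# load_state_dict instead.
model = torch.load("LSTM_400_40m_a_0-d0.2.pt", map_location="cpu")
model.eval()  # disable dropout for evaluation
```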

The naming convention for each model is:
LSTM_[Hidden Units]_[Training Tokens]_[Training Partition]_[Random Seed]-d[Dropout Rate].pt

Hidden Units: The number of hidden units per layer (there are two layers in each model) {100, 200, 400, 800, 1600}
Training Tokens: The number of tokens used to train each model {2m, 10m, 20m, 40m, 80m}
Training Partition: Five distinct training partitions were created for each amount of training data {a, b, c, d, e}
Random Seed: The random seed used to train each model*
Dropout Rate: All models used a dropout rate of 0.2

*A scripting bug led to a random seed of 0 for all models trained on fewer than 40 million tokens. This does not substantively affect the analyses, since each model still differs in its configuration or training data, so we opted not to retrain the models with unique random seeds, saving time and computational resources.
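
For scripting over all 125 checkpoints, the naming convention above can be split mechanically. A minimal parsing sketch; the field names are our own labels, and the example filename is illustrative:

```python
import re

# Pattern mirroring LSTM_[Hidden Units]_[Training Tokens]_[Partition]_[Seed]-d[Dropout].pt
PATTERN = re.compile(
    r"LSTM_(?P<hidden>\d+)_(?P<tokens>\d+m)_(?P<partition>[a-e])_(?P<seed>\d+)-d(?P<dropout>[\d.]+)\.pt"
)

def parse_model_name(name: str) -> dict:
    """Split a checkpoint filename into its configuration fields."""
    match = PATTERN.fullmatch(name)
    if match is None:
        raise ValueError(f"Unexpected filename: {name}")
    return match.groupdict()

# Illustrative example (seed 0 is consistent with the footnote above).
print(parse_model_name("LSTM_800_20m_c_0-d0.2.pt"))
# {'hidden': '800', 'tokens': '20m', 'partition': 'c', 'seed': '0', 'dropout': '0.2'}
```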

Files (13.3 GB)

Name             Size      MD5
LSTM_2m.tar.gz   2.6 GB    ec59aa70584f58e737507980772cb882
LSTM_10m.tar.gz  2.7 GB    114e1a60f9e4f62f3369f9451a40e075
LSTM_20m.tar.gz  2.7 GB    71105f301b90694faa68b2a80e3195c1
LSTM_40m.tar.gz  2.7 GB    bf23aaa6824fe7c2c387bea3a0b6edff
LSTM_80m.tar.gz  2.7 GB    3af698b7986b7aee77940127e71a8e76
vocab.txt        256.5 kB  1e19d25411fc98a8f7953a0cd3fdf82a
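
After downloading, each file can be checked against the MD5 sums above. A minimal verification sketch in Python, assuming the files sit in the current directory:

```python
import hashlib

def md5sum(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the MD5 digest of a file, reading in chunks to bound memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Expected sums copied from the file listing above.
EXPECTED = {
    "LSTM_2m.tar.gz": "ec59aa70584f58e737507980772cb882",
    "LSTM_10m.tar.gz": "114e1a60f9e4f62f3369f9451a40e075",
    "LSTM_20m.tar.gz": "71105f301b90694faa68b2a80e3195c1",
    "LSTM_40m.tar.gz": "bf23aaa6824fe7c2c387bea3a0b6edff",
    "LSTM_80m.tar.gz": "3af698b7986b7aee77940127e71a8e76",
    "vocab.txt": "1e19d25411fc98a8f7953a0cd3fdf82a",
}

for name, expected in EXPECTED.items():
    status = "OK" if md5sum(name) == expected else "MISMATCH"
    print(f"{name}: {status}")
```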
  • van Schijndel, M., Mueller, A., and Linzen, T. (2019). "Quantity doesn't buy quality syntax with neural language models." Proceedings of EMNLP-IJCNLP 2019. https://www.aclweb.org/anthology/D19-1592/
