Published March 19, 2021 | Version 1.0
Dataset | Open Access

Are Neural Language Models Good Plagiarists? A Benchmark for Neural Paraphrase Detection

  • Bergische Universität Wuppertal

Description

Title: Are Neural Language Models Good Plagiarists? A Benchmark for Neural Paraphrase Detection
Authors: Jan Philip Wahle, Terry Ruas, Norman Meuschke, and Bela Gipp
Contact email: wahle@uni-wuppertal.de; ruas@uni-wuppertal.de
Venue: JCDL
Year: 2021
================================================================
Dataset Description:

Training:
1,474,230 aligned paragraphs extracted from 4,012 English Wikipedia articles: 98,282 original paragraphs and 1,375,948 paraphrased paragraphs (each of the 3 models paraphrases the 98,282 original paragraphs under 5 hyperparameter configurations).

Testing:

BERT-large (cased):
    arXiv     - Original: 20,966; Paraphrased: 20,966
    Theses    - Original:  5,226; Paraphrased:  5,226
    Wikipedia - Original: 39,241; Paraphrased: 39,241

RoBERTa-large (cased):
    arXiv     - Original: 20,966; Paraphrased: 20,966
    Theses    - Original:  5,226; Paraphrased:  5,226
    Wikipedia - Original: 39,241; Paraphrased: 39,241

Longformer-large (uncased):
    arXiv     - Original: 20,966; Paraphrased: 20,966
    Theses    - Original:  5,226; Paraphrased:  5,226
    Wikipedia - Original: 39,241; Paraphrased: 39,241

================================================================

Dataset Structure:

[og] folder: contains the original documents, split by data source into the following folders:

  • [arxiv]
  • [thesis]
  • [wikipedia]
  • [wikipedia_train]

[`model_name`_mlm_prob_`probability`] (e.g., bert-large-cased_mlm_prob_0.15): contains all examples paraphrased with the model `model_name` using Masked Language Modeling probability `probability`.
Each paraphrase model/probability folder mirrors the structure of [og] and contains the corresponding paraphrased documents:

  • [arxiv]
  • [thesis]
  • [wikipedia]
  • [wikipedia_train]
  • hparams.yml: the hyperparameters needed to reconstruct the dataset using the official repository.
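As a sketch, the paraphrase folder names described above can be split back into model name and MLM probability. The naming scheme (`bert-large-cased_mlm_prob_0.15`) is taken from this record; the helper function itself is hypothetical:

```python
def parse_paraphrase_folder(folder_name):
    """Split a folder name like 'bert-large-cased_mlm_prob_0.15'
    into the model name and the Masked Language Modeling probability.
    (Hypothetical helper; only the naming scheme comes from this record.)"""
    model_name, prob = folder_name.split("_mlm_prob_")
    return model_name, float(prob)

model, prob = parse_paraphrase_folder("bert-large-cased_mlm_prob_0.15")
# model == "bert-large-cased", prob == 0.15
```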

================================================================

Files:
At the lowest folder level, each `.txt` file contains exactly one paragraph. The filename contains either "ORIG" for an original paragraph or "SPUN" for a paraphrased one.
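A minimal sketch of walking such a tree and bucketing files by the "ORIG"/"SPUN" filename markers. The directory layout and markers come from this record; the function name and any paths are hypothetical:

```python
import os

def collect_paragraphs(root):
    """Walk the dataset tree and separate .txt files into originals
    ('ORIG' in the filename) and paraphrases ('SPUN' in the filename)."""
    originals, paraphrased = [], []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".txt"):
                continue
            path = os.path.join(dirpath, name)
            if "ORIG" in name:
                originals.append(path)
            elif "SPUN" in name:
                paraphrased.append(path)
    return originals, paraphrased
```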

================================================================

Code:
To prevent misuse of the code for constructing machine-paraphrased plagiarism, access to our repository requires consenting to our Terms and Conditions (see TermsAndConditions.pdf) and sending the signed version by email to one of the contact addresses above.
Files (2.2 GB):

  • MPC.zip: 2.2 GB (md5: c1e5b6cd47b65dcd04286ab78aa1023f)
  • 201.9 kB file (md5: 04bf27dc75dedabbe8719ea31ba47279)
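The listed md5 checksums can be verified locally after downloading. This is a generic sketch using Python's standard hashlib; only the filename MPC.zip and its checksum come from this record:

```python
import hashlib

def md5_of_file(path, chunk_size=8192):
    """Compute the md5 hex digest of a file in chunks,
    so large archives like the 2.2 GB MPC.zip need not fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# e.g. md5_of_file("MPC.zip") should equal
# "c1e5b6cd47b65dcd04286ab78aa1023f" for an intact download.
```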

Additional details

Related works:

Is published in: conference paper, DOI 10.1007/978-3-030-96957-8_34