Published April 30, 2020 | Version v1
Poster | Open

(Re)Building Trust? Investigating the effects of badges on perceived trustworthiness in journal articles

  • 1. University of Tübingen
  • 2. Leibniz Institute for Psychology Information

Description

Perceived credibility of empirical findings depends on the ability of recipients to evaluate the research process (Lupia, 2018). Such evaluations depend, among other things, on the level of transparency the scientists engaged in. In line with this, the growing open science movement tries to incentivize open science practices (i.e., open data, open analysis scripts, open materials) that yield transparent and reproducible research (Munafò et al., 2017). However, initial investigations revealed no evidence of an effect of visible open science practices (OSP) on the trustworthiness of research at the discipline level (Wingen, Berkessel, & Englich, 2019). In our study, we want to shift the focus from abstract descriptions of OSP at the discipline level to concrete applications of OSP in journal article research reports, such as open practice badges. Badges are commonly placed prominently on the title page of a research report and signal which OSP the authors implemented. We thus investigate whether visible OSP foster perceived trustworthiness when readers encounter journal articles reporting empirical studies. Because epistemic beliefs influence the perception and processing of information, we will further investigate their role in this context.

Preregistered Hypotheses:
Confirmatory, H1: Visible OSP (vs. not visible vs. visibly non-OSP) influence perceived trustworthiness (subscale integrity) of the empirical study. Our assumption: the more openness, the more trustworthiness, with small to moderate effects: μ1 < μ2 < μ3
Confirmatory, H2: The higher the (topic-specific) multiplistic epistemic beliefs, the lower the perceived trustworthiness (subscale integrity), i.e., a negative correlation.
Additional exploratory analyses will be conducted.

The design includes three conditions:
Visible open science practices condition: Subjects receive the title page of an empirical study (Title, Abstract, Keywords, Introduction, ...) together with three open science badges. The badges are explained using hints in the style of speech bubbles.
Practices not visible condition: Subjects receive the same title page with no further information on OSP. For comparability, speech bubbles are used as well, giving information on the keywords, volume/issue, and abstract.
Visible non-open science practices condition: Subjects receive the same title page with open science badges and speech bubbles indicating that the authors did not engage in the OSP open data, open analysis script, and open materials.
Two of the three conditions are administered to each person in randomized order; a minimal assignment sketch follows below.
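The following is a minimal sketch of one way such a within-person rotation could be implemented. The condition labels, the cycling over condition pairs for rough balance, and the fixed seed are illustrative assumptions, not the study's actual randomization procedure.

```python
# Sketch: assign each participant 2 of the 3 conditions, presentation order randomized.
# Condition labels and the pair-cycling scheme are hypothetical illustrations.
import random
from itertools import combinations

CONDITIONS = ["visible_OSP", "OSP_not_visible", "visible_non_OSP"]  # hypothetical labels

def assign_conditions(n_participants, seed=42):
    """Return a list of (first_condition, second_condition) per participant."""
    rng = random.Random(seed)
    pairs = list(combinations(CONDITIONS, 2))   # the 3 possible condition pairs
    assignments = []
    for i in range(n_participants):
        pair = list(pairs[i % len(pairs)])      # cycle through pairs for rough balance
        rng.shuffle(pair)                       # randomize presentation order within person
        assignments.append(tuple(pair))
    return assignments

if __name__ == "__main__":
    for participant, conditions in enumerate(assign_conditions(6), start=1):
        print(participant, conditions)
```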
Measured variables: perceived trustworthiness, assessed with the Muenster Epistemic Trustworthiness Inventory (Hendriks, Kienhues, & Bromme, 2015); topic-specific multiplistic epistemic beliefs, assessed with a subscale from Merk et al. (2017); and a treatment check covering the perceived openness/transparency of the empirical study and the perception of the speech bubbles. An additional small set of demographic variables and covariates will be assessed.

As conditions are rotated (each participant receives two of the three conditions), we conducted a Bayes Factor Design Analysis for two t-tests. The required sample size for small to medium effects, with a stopping rule at a Bayes factor of 10 (or 1/10, respectively) and 80% power, is N = 220. Data will be collected from pre-service teachers using monetary incentives. For the data analyses, we plan to use an informative hypotheses approach with the `bain` package; a sketch of the underlying logic follows below.
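To illustrate the informative-hypothesis logic behind H1 (an order constraint on three condition means), the sketch below uses the well-known encompassing-prior idea: the Bayes factor of the ordered hypothesis against an unconstrained alternative is approximated as the posterior proportion of draws satisfying the constraint divided by its prior proportion (1/6 for three exchangeable means). The actual analysis uses the R package `bain`; the simulated data and the large-sample normal approximation of the posterior are illustrative assumptions only.

```python
# Sketch: Monte Carlo approximation of a Bayes factor for an order constraint
# on three means (encompassing-prior logic), NOT the study's actual `bain` analysis.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical trustworthiness ratings per condition (simulated for illustration).
groups = {
    "visible_non_OSP": rng.normal(3.8, 1.0, 110),
    "OSP_not_visible": rng.normal(4.0, 1.0, 110),
    "visible_OSP":     rng.normal(4.2, 1.0, 110),
}

# Approximate the posterior of each mean as Normal(sample mean, standard error).
draws = 100_000
post = np.column_stack([
    rng.normal(x.mean(), x.std(ddof=1) / np.sqrt(len(x)), draws)
    for x in groups.values()
])

# "Fit": posterior proportion of draws satisfying the ordering
# mu(visible_non_OSP) < mu(OSP_not_visible) < mu(visible_OSP).
fit = np.mean((post[:, 0] < post[:, 1]) & (post[:, 1] < post[:, 2]))

# "Complexity": under an exchangeable prior, the ordering holds in 1 of 3! = 6 cases.
complexity = 1 / 6

bf_ordered_vs_unconstrained = fit / complexity
print(f"Approximate BF (ordered vs. unconstrained): {bf_ordered_vs_unconstrained:.2f}")
```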

Data collection is underway; results will be presented at the conference.

Files

OSC2020_15-1_Poster.pdf
