
Report (Open Access)

Automating ERA Benchmarks: An on-demand pilot system for calculating ERA-like benchmarks using open data and transparent analysis

Tonti-Filippini, Julian; Napier, Kathryn; Neylon, Cameron

Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Tonti-Filippini, Julian</dc:creator>
  <dc:creator>Napier, Kathryn</dc:creator>
  <dc:creator>Neylon, Cameron</dc:creator>
  <dc:description>To enhance confidence in decision making, research administrators and funding agencies require insight into the performance of research-active institutions. Focusing on 42 Australian higher education providers and 236 fields of research, the Excellence in Research for Australia process (ERA) reports on research activity relative to local and global benchmarks. The ERA report is compiled for release every three to five years and uses a citation-focused methodology that depends on institutional self-reporting of research outputs. It is of interest to explore additional data sources and analysis methods to complement the ERA process. To facilitate this, the Curtin Open Knowledge Initiative (COKI) has constructed a pilot system, demonstrating the feasibility of conducting an on-demand, ERA-like analysis for research-active institutions (globally), using journal-level metadata from the ARC and article-level metadata from publicly available datasets.

Given a sufficiently comprehensive dataset, containing output-affiliation links, output citation data, and the journal assignment that was planned for ERA 2023, we show that the COKI pilot system is able to generate ERA-like benchmarks and indicators, aligned with ERA 2018 methodology and proposed ERA 2023 methodology. Analysis is conducted for ANZSRC fields of research between the years 2011-2021 and includes calculation of dynamic RCI boundaries, the proposed high-performance indicator, and citation centiles.

Determining the actual institutional scores, used to inform the citation-based ERA panels, is also feasible given a dataset comparable to that submitted by institutions for previous ERA rounds (containing outputs and FoR apportionments for each institution). We demonstrate this is possible in principle using open data on institutional affiliation of outputs (a ‘byline approach’), together with the ERA 2018 and ERA 2023 journal lists to assign outputs to FoRs. In a fully automated system this demonstration data would be replaced with either institutional submissions based on a census date, or an algorithmic FoR assignment process.</dc:description>
  <dc:subject>research evaluation</dc:subject>
  <dc:title>Automating ERA Benchmarks: An on-demand pilot system for calculating ERA-like benchmarks using open data and transparent analysis</dc:title>
</oai_dc:dc>
Record statistics (all versions / this version):

  Views:            1,967 / 1,877
  Downloads:        1,059 / 997
  Data volume:      2.6 GB / 2.5 GB
  Unique views:     1,699 / 1,641
  Unique downloads: 875 / 833
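The description mentions two of the citation indicators the pilot system computes: RCI (Relative Citation Impact, an output's citation count relative to the world average for its field-of-research and publication-year cohort) and citation centiles (an output's percentile position within that cohort). A minimal sketch of both calculations, using made-up records and illustrative class boundaries (not the official ERA boundaries or the COKI implementation), might look like:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical records: (output_id, FoR code, year, citation count).
outputs = [
    ("a", "3101", 2019, 12),
    ("b", "3101", 2019, 3),
    ("c", "3101", 2019, 0),
    ("d", "4602", 2020, 25),
    ("e", "4602", 2020, 5),
]

# Group citation counts into (FoR, year) cohorts and take the world
# benchmark as the cohort mean.
cohorts = defaultdict(list)
for _, forc, year, cites in outputs:
    cohorts[(forc, year)].append(cites)
world_avg = {key: mean(counts) for key, counts in cohorts.items()}

def rci(cites, forc, year):
    """Relative Citation Impact: citations over the cohort mean."""
    avg = world_avg[(forc, year)]
    return cites / avg if avg else 0.0

def centile(cites, forc, year):
    """Percentage of cohort outputs cited no more than this output."""
    cohort = cohorts[(forc, year)]
    return 100.0 * sum(c <= cites for c in cohort) / len(cohort)

# Illustrative RCI class boundaries (assumed for this sketch only).
BOUNDS = [0.01, 0.8, 1.2, 2.0, 4.0, 8.0]

def rci_class(value):
    """Index of the highest boundary the RCI value reaches."""
    return sum(value >= b for b in BOUNDS)
```

In a real system the cohort statistics would come from the article-level open datasets the abstract describes, and the boundaries would be the dynamic ones derived per ERA round.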

