The IRIS Replication Award and Collaboration in the Second Language Research Community

July 18th, 2017, David Mellor


[Image: IRIS entry for Rij et al. 2009]

The IRIS Digital Repository recently announced an award for replication studies. We caught up with members of the IRIS team, Sophie Thompson, Luke Plonsky, and Emma Marsden, to talk about the repository, their community, and why they value replications.

COS: Can you tell us a bit about IRIS: What is its purpose, who are its users, and what community does it serve?

IRIS is a repository of research materials and data for use in second language research. IRIS helps researchers share materials, increase the impact and exposure of their work, and replicate and build on the work of their colleagues. IRIS also aims to help the field overcome the shortage of replication studies in language learning and teaching research. The resource is completely free and currently contains more than 1,400 materials. IRIS is used by teachers, researchers, and students to investigate a wide range of topics and theoretical domains, and it is methodologically very inclusive.

COS: What impact is IRIS having in the community that it serves?

Since its launch, IRIS has had 18,278 downloads and 34,308 hits. We also have a Facebook page, which we use to share new material uploads, IRIS news, and news about open science in general. Nearly all of the top journals in the field also encourage their authors to share their materials on IRIS to increase their impact and availability. IRIS holds materials from 161 different journals and over 1,200 authors. These numbers suggest that IRIS is having a substantial impact on its target community of second language researchers, students, and teachers.

COS: What do you see as the vision for the future of IRIS? How would you like to see it evolve, and how will researchers interact with it?

Over the last year IRIS has developed special collections holding all available examples of a specific type of research instrument. Currently the collections held are self-paced reading instruments and grammaticality/acceptability judgement tests. We plan to develop more special collections and would encourage researchers to consider holding special collections on IRIS.

IRIS has also recently started to hold data. Whilst materials sharing is now more widespread in the field, data sharing is not currently common practice. We aim to change this, to increase transparency and to facilitate meta-analyses as well as re-analysis and other forms of secondary analysis. IRIS is also the only repository in our area that qualifies for journals to award COS badges to researchers. We will continue to encourage journals to ask their authors to share through IRIS and further the open science message with the help of COS.

[Image: IRIS entry for Philp & Iwashita 2013, https://www.iris-database.org/iris/app/home/detail?id=york%3A854602&ref=search]


COS: You recently announced the IRIS Replication Award. Can you briefly tell us about that and what you will be awarding and how a researcher would win one of these awards?

The IRIS Replication Award aims to encourage and reward researchers carrying out rigorous replication research in the field. We will award £300 annually to a published (or accepted) self-labelled replication using materials held on IRIS. The first round is now open and will close on 30th November 2018. Submissions will be welcome from any area of second language research. All methodological approaches and theoretical perspectives are welcome. To choose a winner we will be considering the strength of the justification for the replication, the soundness and transparency of methods and analysis, and how well the discussion and conclusions are supported by the methods, data and analysis. We will announce the first award by 1 January 2019.  

COS: Why do you value replications and why give an award to those who publish them?

Replication increases the external validity and reliability of research, and helps test theories, constructs, and measurements. Yet we've found that in the field of second language research the rate of self-labelled replications published in journals is only about 0.26 per 100 articles, roughly one in every four hundred. And even among these, the replications were largely conceptual, involving numerous changes between the original and the replication studies, which makes comparisons difficult. We'd like to encourage researchers to carry out closer replications, and to feel confident in self-labelling them as such. This helps the field track research agendas and their theoretical and methodological precedents. There has been some stigma attached to replication, even though it is so important for making progress in understanding the variables that affect language learning and the effects of instruction.

So the emphasis to date has been on carrying out one-shot studies, but this is beginning to shift. There is a growing realisation that we need more intensive and extensive research, so that we can arrive at studies that clarify the size, stability, and generalisability of previous findings. This is the change we seek to facilitate and encourage, and it is why we developed the IRIS Replication Award: to recognise and reward researchers who are carrying out rigorous replications and, in doing so, to move the field towards greater accuracy and generalisability of findings.


