Two weeks ago, I set out on a quest to increase the number of journals that publish reading-related research and offer the Registered Reports publication format (RRs: https://cos.io/rr/). I tweeted my vague idea of writing to journal editors, and Chris Chambers answered with a detailed description of 7 easy steps (reproduced in full below, because I can’t figure out how to embed tweets):
1) Make a list of journals that you want to see offer Registered Reports
2) Check osf.io/3wct2/wiki/Journal Responses/ to see if each journal has already been publicly approached
3) If it hasn’t, adapt template invitation letter here: osf.io/3wct2/wiki/Journal Requests/
4) Assemble colleagues (ECRs & faculty, as many as possible) & send a group letter to chief editor & senior editors. Feel free to include [Chris Chambers] as a signatory and include [him] and David Mellor (@EvoMellor) in CC.
5) Login with your OSF account (if you don’t have one then create one: https://osf.io/)
6) Once logged in via OSF, add the journal and status (e.g., “Under consideration [date]”) to osf.io/3wct2/wiki/Journal Responses/. Update status as applicable. Any OSF user can edit.
7) Repeat for every relevant journal & contact [Chris] or David Mellor if you have any questions.
So far, I have completed Steps 1-3, and I’m halfway through Step 4: I have a list of signatories and will start emailing editors this afternoon. I thought it might be helpful for some, and interesting for others, to read about my experience of following the above steps, so I decided to describe it in a series of blog posts. Here is my experience so far:
My motivation
I submitted my first RR a few months ago. It was a plan for a developmental study with Russian-speaking children, assessing which orthographic characteristics may pose problems during reading acquisition. I encountered two issues while writing this report: (1) for practical reasons, it is very difficult to recruit a sample large enough to make it a well-powered study, and (2) it is not particularly interesting to a general audience. This made it very difficult to make a good case for submitting it to any of the journals that currently offer the RR format. Still, I think it’s important that such studies are conducted, and that there is the possibility to pre-register them, in order to avoid publication bias and unintentional p-hacking (and to get peer review and conditional acceptance before you start data collection – this, I think, should be a very good ‘selfish’ reason for everyone to support RRs). So it would be good if more specialist journals started accepting the RR format for smaller studies that may be of interest only to a narrow audience.
Making lists
The 7-step process involves making two lists: one of journals where I’d like to see RRs, and another of signatories who agree that promoting RRs is a good thing. I created a Google spreadsheet, which anyone can modify to add journals to one sheet and their name to another, here. A similar list was created at around the same time by Tim Schoof for the area of hearing science, here.
Getting signatories
The next question was how to recruit people to modify the list and add their names as signatories. I didn’t want to spam anyone, but at the same time I wanted to get as many signatures as possible, and of course I was curious how many reading researchers would actively support RRs. I started off by posting on Twitter, which already got me around 15 signatures. Then I wrote to my current department and my previous department, and made a list of reading researchers who came to mind. Below is the email that I sent to them:
Dear fellow reading/dyslexia
researchers,
Many of you have probably
heard of a new format for journal articles: Registered Reports (RR). With RRs,
you write a study and analysis plan, and submit it to a journal before you
collect the data. You get feedback from the reviewers on the design of the
study. If the reviewers approve of the methods, you get conditional acceptance.
This means that the study will be published regardless of its outcome.
Exploratory analyses can still be reported, but they will be explicitly
distinguished from the confirmatory analyses relating to the original
hypotheses.
The RR format is good for
science, because it combats publication bias. It is also good for us as
individual researchers, because it helps us to avoid situations where we invest
time and resources into a study, only to find out in retrospect that we
overlooked some design flaw and/or that non-significant findings are
uninformative, rendering the study unpublishable. You can find more information
and answers to some FAQs about RRs here: https://cos.io/rr/.
There are some journals which offer the RR format, but not many of them are relevant to a specialised audience of reading researchers. Therefore, I'd like to contact the editors of some more specialised journals to suggest accepting the RR format. I started off by making a list of journals which publish reading-related research, which you can view and modify here: https://docs.google.com/spreadsheets/d/1Ewutk2pU6-58x5iSRr18JlgfRDmvqgh2tfDFv_-5TEU/edit?usp=sharing
To increase the probability of as many journals as possible deciding to offer RRs (alongside the traditional article formats) in the future, I would like to ask you three favours:
1) If there are any journals where you would like to see
RRs, please add them to the list (linked above) before the 15th of June.
2) If you would like to be a signatory on the emails to the editors, please let me know or add your name to the list of signatories in the second sheet (same link as the list). I will then add your name to the emails that I will send to the editors. Here is a template of the email: https://osf.io/3wct2/wiki/Journal%20Requests/
3) If you are part of any networks for which this could be relevant, please help to spread the information and my two requests above.
If you have any questions or
concerns about RRs, I would be very happy to discuss.
Thank you very much in
advance!
Kind regards,
Xenia.
The list of reading researchers that I could think of contained 62 names, including collaborators, friends, people I had talked to at conferences who seemed enthusiastic about open science, some big names, and people who have voiced scepticism about open science; in short, a mixture of early-career and senior researchers with varying degrees of support for open science practices. After I sent these emails, I gained about 15 more signatures.
After some consideration, I also wrote to the mailing list of the Scientific Studies of Reading. I hesitated because, again, I didn’t want to spam anyone. But in the end, I decided that there is value in disseminating the information to those who would like to do something to support RRs, even if it means annoying some other people. There were some other societies’ mailing lists that I would have liked to try, but I couldn’t find a public mailing list and did not get a response from the contact person.
After having done all this, I have
45 signatures, excluding my own. In response to my emails and tweets, I also
learned that two reading-related journals are already considering implementing
RRs: The Journal
of Research in Reading and Reading
Psychology.
Who supports RRs, anyway?
I would find it incredibly interesting
to answer some questions using the signature “data” that I have, including: At
what career stage are people likely to support RRs? Are there some countries
and universities which support RRs more than others? I will describe the
observed trends below. However, the data is rather anecdotal, so it should not
be taken any more seriously than a horoscope.
Out of the 46 signatories, I classified 17 as early career researchers (PhD students or post-docs) and 29 as senior researchers (either based on my knowledge of their position or through a quick Google search). This is in contrast to the conventional wisdom that young people strive for change while older people cling to the existing system. However, there are alternative explanations: for example, it could be that ECRs are more shy about adding their names to such a public list.
The signatories gave affiliations
from 11 different countries, namely UK (N = 10), US (N = 7), Canada (N = 6),
Australia (N = 5), Belgium (N = 4), Germany (N = 4), the Netherlands (N = 4),
Italy (N = 2), Norway (N = 2), Brazil (N = 1), and China (N = 1).
The 46 signatories came from 32 different affiliations. The most signatures came from Macquarie University, Australia (my alma mater, N = 5). Second place is shared between Dalhousie University, Canada, and the Université Libre de Bruxelles, Belgium (N = 3 each). The shared third place goes to Radboud University, the Netherlands; Royal Holloway, University of London, UK; the Scuola Internazionale Superiore di Studi Avanzati (SISSA), Italy; the University of Oslo, Norway; the University of York, UK; and the University of Oxford, UK (N = 2 each).
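These tallies are easy to reproduce from the signatory sheet. Here is a minimal Python sketch, assuming the countries are exported as a simple list; the example entries are placeholders, not the real data:

from collections import Counter

# Placeholder list standing in for the "country" column of the signatory sheet;
# these entries are purely illustrative, not the actual data.
countries = ["UK", "US", "UK", "Canada", "Australia", "UK"]

# Count and print the number of signatories per country, most frequent first.
for country, n in Counter(countries).most_common():
    print(f"{country}: N = {n}")

The same approach works for the affiliation column.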
All of these numbers are difficult to interpret, because I don’t know exactly how widely, and to which networks, my emails and tweets were distributed. However, I can check whether there are any clear trends in the response rate among the researchers I contacted directly, via my list of researchers I could think of. This list contained 62 names: 22 ECRs and 40 senior researchers. Of these, 5 ECRs and 6 senior researchers signed, which is indeed a higher response rate among ECRs (5/22, about 23%) than among senior researchers (6/40, or 15%); a small sketch of this calculation follows.
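To make the comparison concrete, here is a minimal Python sketch of the response-rate calculation. It uses only the counts reported above; the variable names are mine:

# Response-rate comparison for the researchers I contacted directly.
# The counts come from the text above; everything else is illustrative.
contacted = {"ECR": 22, "senior": 40}   # number of researchers emailed per group
signed = {"ECR": 5, "senior": 6}        # of those, how many signed

for group in contacted:
    rate = signed[group] / contacted[group]
    print(f"{group}: {signed[group]}/{contacted[group]} = {rate:.0%}")

# Prints roughly: ECR: 5/22 = 23%, senior: 6/40 = 15%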
Before sending the group email, I had tried to rate, based on my impression of each researcher, how likely they would be to sign in support of RRs. My original idea was to write only to those who I thought were likely to sign, but ultimately I decided to see if any of those whom I considered unlikely would positively surprise me. I rated the researchers on a scale of 1 (very unlikely) to 10 (I’m positive they will sign). If I didn’t know what to expect, I left this field blank. This way, I rated 26 out of the 62 researchers, with a mean rating of 6.1 (SD = 2.1, min = 2, max = 10). Splitting up by ECR and senior researchers, my average ratings were similar across the two groups (ECRs: mean = 5.9, SD = 1.5, min = 3, max = 7; seniors: mean = 6.2, SD = 2.4, min = 2, max = 10).
How accurate were my a priori
guesses? Of the 26 people I’d rated, 9 had signed. Their average rating was 7.3
(SD = 1.1, min = 6, max = 9). The other 17 researchers had an average rating of
5.4 (SD = 2.2, min = 2, max = 10). So it seems I was pretty accurate in my
guesses about who would be unlikely to sign (i.e., there were no ‘pleasant
surprises’). Some of the people I considered to be highly likely to sign did
not do so (‘unpleasant surprises’), though I like to think that there are
reasons other than a lack of support for the cause (e.g., they are on
sabbatical, forgot about my email, or they think there are better ways to
promote open science).
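For anyone who wants to run the same check on their own signature ‘data’, here is a small Python sketch of the group comparison. The ratings in the example are made-up placeholders, not my actual ratings:

from statistics import mean, stdev

# Each entry: (a priori rating from 1-10, whether the person ended up signing).
# These values are hypothetical, purely for illustration.
ratings = [(7, True), (9, True), (8, True), (6, False), (3, False), (2, False)]

signed = [r for r, s in ratings if s]
not_signed = [r for r, s in ratings if not s]

# Summarise the ratings separately for signers and non-signers.
for label, group in [("signed", signed), ("did not sign", not_signed)]:
    print(f"{label}: mean = {mean(group):.1f}, SD = {stdev(group):.1f}, "
          f"min = {min(group)}, max = {max(group)}")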
What now?
Now I will start sending out the
emails to the editors. I’m very curious how it will go, and I’m looking forward
to sharing my experiences and the reactions in an upcoming blogpost.