I followed my partner to a workshop in plasma physics. The
workshop was held in a mountain resort in Poland – getting there was an
adventure worthy, perhaps, of a separate blog post.
“I’m probably the only non-physicist in the room”, I say, apologetically,
at the welcome reception, when professors come up to me to introduce
themselves, upon seeing an unfamiliar face in the close-knit fusion
community.
Remembering this XKCD comic,
I ask my partner: “How long do you think I could pretend to be a physicist
for?” I clarify: “Let’s say, you are pretending to be a psychological
scientist. I ask you what you’re working on. What would you say?”
“I’d say: ‘I’m working on orthographic depth and how it affects
reading processes, and also on statistical learning and its relationship to
reading’.” Pretty good, that’s what I would say as well.
“So, if you’re pretending to be a physicist, what would you
say if I ask you what you’re working on?”, he asks me.
“I’m, uh, trying to implement… what’s it called… controlled
fusion in real time.”
The look on my partner’s face tells me that I would not do
very well as an undercover agent in the physics community.
The attendees are around 50 plasma physicists, mostly
greying; about three women among the senior scientists, perhaps five female
post-docs or PhD students. Halfway through the reception dinner, I am asked
about my work. In ten sentences, I try to describe what a cognitive
scientist/psycholinguist does, trying to make it sound as scientific and
non-trivial as possible. Several heads turn, curious to listen to my
explanation. I’m asked if I use neuroimaging techniques. No, I don’t, but a lot
of my colleagues and friends do. For the questions I’m interested in, anyway, I
think we know too little about the relationship between brain and mind to make
meaningful conclusions.
“It’s interesting”, says one physicist, “that you could
explain to us what you are doing in ten sentences. For us, it’s much more
difficult.” More people join in, admitting that they have given up trying to
explain to their families what it is they are doing.
“Ondra gave me a pretty good explanation of what he is
doing”, I tell them, pointing at my partner. I sense some scepticism.
Physics envy is a term coined by psychologists (who else?),
describing the inferiority complex associated with striving to be taken seriously
as a field in science. Physics is the prototypical hard science: they have long
formulae, exact measurements where even the fifth decimal places matter, shiny multi-billion-dollar
machines, and stereotypical crazy geniuses who would probably forget their own
head if it weren’t attached to them. Physicists don’t
always make it easy for their scientific siblings (or distant cousins),* but, admittedly, they do have a right to be smug towards psychological scientists, given the
replication crisis that we’re going through. The average physicist,
unsurprisingly, finds it easier to grasp concepts associated with mathematics
than the average psychologist. This means that physicists have, in general, a better
understanding of probability. When I tell physicists about some of the absurd
statements that some psychologists have made (“Including unpublished studies in
the meta-analysis erroneously biases an effect size estimate towards zero.”;
“Those replicators were just too incompetent to replicate our results. It’s
very difficult to create the exact conditions under which we get the effect:
even we had to try it ten times before we got this significant result!”),
physicists all but roll on the floor with laughter. “Why do you even
want to stay in this area of research?” I was asked once, after the physicist I
was talking to had wiped off the tears of laughter. The question sounded
neither rhetorical nor snarky, so I gave a genuine answer: “Because there are a
lot of interesting questions that can be answered, if we improve the
methodology and statistics we use.”
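To see why the first quoted claim gets it backwards, here is a minimal simulation (all the numbers – a true effect of d = 0.2, twenty participants per study, a one-sided .05 threshold – are my own illustrative assumptions, not figures from any real study). Selecting only the ‘significant’ studies inflates the meta-analytic effect size estimate; adding the unpublished studies back in moves the estimate towards the true value, not “erroneously towards zero”.

```python
import numpy as np

rng = np.random.default_rng(0)
true_d = 0.2       # true (small) standardized effect size -- illustrative
n = 20             # participants per study, a typical small sample
n_studies = 2000   # hypothetical literature, published and unpublished

effects, published = [], []
for _ in range(n_studies):
    sample = rng.normal(true_d, 1.0, n)            # one-sample design, SD = 1
    d_hat = sample.mean() / sample.std(ddof=1)     # observed effect size
    t = d_hat * np.sqrt(n)                         # one-sample t statistic
    effects.append(d_hat)
    published.append(t > 1.729)                    # one-sided p < .05, df = 19

effects = np.array(effects)
published = np.array(published)

print(f"true effect:              {true_d:.2f}")
print(f"mean of published only:   {effects[published].mean():.2f}")
print(f"mean of all studies:      {effects.mean():.2f}")
```

Running this, the published-only average lands well above the true effect, while averaging over the whole (hypothetical) file drawer recovers roughly the true value.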
In physics, I am told, no experiment is taken seriously
until it has been replicated by an independent lab. (Unless it requires some unique equipment, in which case it can't be replicated by an independent lab.) Negative results are still considered informative, unless they are due to experimental errors. Physicists still have issues with
researchers who make their results look better than they actually are by
cherry-picking the experimental results that fit best within one’s hypothesis
and with post-hoc parameter adjustments – after all, the publish-or-perish
system looms over all of academia. However, the importance of replicating
results is a lesson that physicists have learnt from their own replication crisis:
in the late 1980s, there was a
shitstorm about cold
fusion, set off by experimental results that were of immense
public interest, but theoretically implausible, difficult to replicate, and
later turned out to be due to sloppy research and/or scientific
misconduct. (Sound familiar?)
Physicists take their research very seriously, probably to a
large extent because it is often of great financial interest. There are those
physicists who work closely with industry. Even for those who don’t, their work
often involves very expensive experiments. In plasma physics, a shot on
ASDEX-Upgrade, the machine of the Max Planck Institute for Plasma Physics, costs several
thousand dollars. The number of shots required for an experiment depends on the research aims and on whether other data are available, but can go up to 50 or more. This gives very strong
motivation to make sure that one’s experiment is based on accurate calculations
and sound theories which are supported by replicable studies. Furthermore, as
there is only one machine – and only a handful of similar machines all over
Europe – it needs to be shared with all other internal and external projects.
In order to ensure that shots (and experimental time) are not wasted, any team
wishing to perform an experiment needs to submit an application; the call for
proposals opens only once a year. A representative of the team will also need
to give a talk in front of the committee, which consists of the world’s leading
experts in the area. The committee will decide whether the experiment is likely
to yield informative and important results. In short, it is not possible – as
in psychology – to spend one’s research career testing ideas one has on a whim,
with twenty participants, and publish only if it actually ‘works’. One would be
booed off the stage pretty quickly.
It’s easy to get into an us-and-them mentality and feelings
of superiority and inferiority. No doubt all sciences have something of
importance and of interest to offer to society in general. But it is also
important to understand how we can maximise the utility of the research that we
produce, and in this sense we can take a leaf out of physicists’ books. The
importance of replication should also be embraced in the psychological
literature: arguably, we should simply forget all theories that are based on
non-replicable experiments. Perhaps more importantly, though, we should start
taking our experiments more seriously. We need to increase our sample sizes;
this conclusion seems to be gradually emerging as a consensus in
psychological science. This means that our experiments, too, will become more
expensive, both in terms of money and time. By conducting sloppy studies, we
may still not lose thousands of dollars of taxpayers’ (or, even worse,
investors’) money for each botched experiment, but we will waste the time of
our participants, the time, nerves and resources of researchers who try to make
sense of or replicate our experiments, and we stall progress in our area of
research, which has strong implications for policy makers in areas ranging from
education, through social equality, prisoners’ rehabilitation, and political/financial
decision making, to mental health care.
--------------------------------------
* Seriously, though, I haven’t met a physicist who is as bad
as the linked comic suggests.
Acknowledgement: I'd like to thank Ondřej Kudláček, not only for his input into this blogpost and discussions about good science, but also for his unconditional support in my quest to learn about statistics.