This piece is republished with permission from Millennials Strike Back, the 56th edition of Griffith Review in which Generation Y writers address the issues that define and concern them.
The following is an extract from Modern Science, Modern Life.
Research underpinning fundamental scientific concepts or mechanisms of disease is referred to as “basic science”.
I detest the term.
It conjures up images of mundane, uninteresting, simple lab work, but this is rarely the case. No two days are the same.
And more importantly, basic science provides the crucial foundations for research pathways and is essential for identifying opportunities for innovation.
Perhaps it should be called discovery science? You can’t always see the potential applications for basic research; indeed, the applications may not even exist in our lifetime. Isaac Newton surely did not anticipate his universal law of gravitation being involved in the implementation of satellite technology.
Funding scientific research
Unfortunately, basic science remains one of the least attractive kinds of science to fund, especially in Australia. Our country is lagging behind as a result.
I wonder how much better we’d do if our National Health and Medical Research Council funded more than the current 18% of submitted research proposals. Of this funding, basic science receives proportionately little.
While investing in science that has more obvious and direct commercial outputs appears to make more economic sense than investing in basic science, you can’t take market logic and apply it to science. Some of its greatest achievements began with an accidental discovery or an unexpected result. This is the beauty of science.
For example, the discovery that stomach ulcers were caused by a bacterium called Helicobacter pylori was, in part, a beautiful accident. Australian Nobel laureates Barry Marshall and Robin Warren stumbled across the existence of this bacterium after their lab technician forgot to discard the experiment before the Easter holiday period.
Marshall and Warren wanted to confirm their observations that bacteria were present in the location of the stomach ulcer, so they had been collecting samples from people with diagnosed ulcers. The lab technician had seeded those samples onto a culture plate with a nutritious jelly and left them to grow for two days (as per standard bacterium-growing protocols). Nothing grew, and they didn’t find the evidence they were hoping for.
As it turns out, leaving them in the incubator for five days was key. It was the necessary step they didn’t know was missing.
The pressure to perform and publish also stifles the research landscape. A scientist’s worth is apparently quantifiable. We are judged on the volume and impact of our work.
The number of papers we write and the number of times those papers are cited are turned into a single number: an h-index. Technical skills, teaching and mentoring aptitude, passion, and experimental rigour don’t feature in the metrics. Some of the most brilliant scientists I have encountered exhibit all of these qualities, but do not have glowing h-indices to show for it.
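For readers unfamiliar with how the metric collapses a career into one number: a researcher has an h-index of h if h of their papers have each been cited at least h times. A minimal sketch of the calculation, using hypothetical citation counts:

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for position, count in enumerate(ranked, start=1):
        if count >= position:
            h = position  # this many papers clear the threshold so far
        else:
            break
    return h

# Hypothetical citation counts for six papers:
print(h_index([25, 8, 5, 4, 3, 0]))  # → 4: four papers cited at least 4 times
```

Note what the arithmetic ignores: the paper cited 25 times counts no more towards h than the one cited 4 times, and uncited or negative-result work counts for nothing at all.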
Scientists and funding bodies generally acknowledge that the h-index is imperfect; however, the score still carries considerable weight, and can be key in deciding funding success, fellowships, promotions and, ultimately, a person’s ability to continue being a scientist.
In my eyes, this definition of success is wrong. A scientist with a high h-index, but who performs poor-quality research, does not embody success. Another problem is that scientific journals have an aversion to publishing negative results or minor findings, which, in turn, impacts researchers’ h-indices. A scientist who has spent years on an experiment that fails to yield a positive result may not have the opportunity to publish their work because journals want a juicy story: a new pathway discovered, a paradigm shift, something done with flashy new technology, or a potential cure.
This can come at a huge cost when the perceived value of the headline trumps the quality of the data or its interpretation, and, after many failed attempts at replication, the data gets retracted. This pressure on scientists to report significant results, especially unusual or breakthrough findings, in turn exposes the research itself to bias.
Public views of science
The bias in scientific reporting also flows on to the public. Journalists trawl academic journals for articles they can turn into splashy headlines and too often report half-truths, premature assumptions, and exaggerated extrapolations of data.
According to the media, there’s a new “treatment” reported for Alzheimer’s disease every month. In reality, there is still no cure for Alzheimer’s disease in humans.
Furthermore, if an experiment doesn’t have a positive result and therefore is not published, others are likely to waste time, money and resources repeating that work in the future. However, scientists are nothing if not problem solvers, and they have pushed back against this tendency in recent years. For example, the journal PLOS ONE started a collection for all negative, null and inconclusive results, aptly titled The Missing Pieces.
It is refreshing to see that the requirement for significant results is no longer the only path to publishing research, but there’s still a long way to go.
A lonely road
A year ago, I was treading a very lonely path through science.
There were no funds for me to research full time in Hobart and I wasn’t able to move away for similar work elsewhere for family reasons. Instead, I was fortunate to be able to do another job that I love: I taught full-time at a university while caring for my parents, and spent almost two years doing neuroscience research for free. You could say that I made it hard for myself, but I was determined.
I received a small grant for the materials necessary to complete the work. A portion of the grant was intended as a stipend; however, with the increasing cost of materials, I forfeited this to buy what I needed to perform what I saw as essential research.
I was also the only scientist working on peripheral nerves – in this case, nerves in the skin – in the institute’s laboratory at the time. I couldn’t benefit from collective knowledge, nor could I share the workload. Most weeks I worked around 80 hours, and often more. I didn’t resent this because I thought it was what I needed to do to keep up in the industry, but I later discovered that my efforts had instead disadvantaged my research career: taking time off work to be a carer or to have a child would have been accounted for in my research profile as a “career break”.
Oblivious to this, I’d tried to do it all while the research clock kept ticking and my h-index was diluted. I could have easily dropped off the radar.
I had been fighting so hard for the career I love, but the seemingly endless setbacks left me heartbroken and demoralised. I lamented on social media:
If only we, as scientists, could be judged on our passion and enthusiasm, our zest for driving new lines of inquiry, on our ability to ask the challenging questions, and for our genuine scientific skills.
Science is supremely beautiful, but I know it can be brutal and unforgiving if you stray from the well-worn pathways. Many people struggle, not fortunate enough to secure a job, a grant or a mentor to keep their passion alive. The issues with research practice and publication can be infuriating, particularly when the path you want to follow hasn’t been paved yet.
I am one of the lucky ones. My supervisor Professor Howells is a true advocate for junior researchers and is both my hero and my mentor. Rather than beating a lone path through the challenges of research, we face them as a team.
And it made all the difference: I’ve secured considerable funds to keep my research work going for the next three years. It’s safe to say that my heart is filled with hope.