How Far Can We Trust Science?

Conor Matthews
12 min read · Jul 20, 2023


An Entry Into The Hubert Butler Essay Prize 2023

“If you only have a hammer, every problem looks like a nail.”

These words by psychologist Abraham Maslow summarise his Law of the Instrument: the overreliance on familiar methods for every issue, even when inappropriate. Often the irony is rich. Poverty is tackled by the same economic approaches that cause it. Scarcity is “solved” by concentrating resources. Likewise, scientific ethics are determined by those who benefit from them.

It’s easy to teach science as inherently unbiased and objective. The very principles of the scientific method, peer review, and the history of progression have painted an image of revision, standards, and truths. But science must take all variables into account; the slightest external factor can skew and misrepresent results. It’s disingenuous to act as though those who fund, present, and conduct the research are not prone to bias and deception. It’s important we address the myth of objectivity, the exploitation of science by authorities, and the fragility of nuance in the face of scientific illiteracy.

– — –

The myth of objectivity follows the tradition of asserting authority by continuity. In the same way countries portray themselves as inheriting the nobility of empires, authority uses the air of infallibility to protect its supremacy, even though the scientific community was once a victim of that same authority.

The archetype of the Wise Man has been respected throughout history and across cultures as the druid, the shaman, the alchemist; a single person entrusted with knowledge was always afforded respect and status. But these people are still products of their time. Admirable does not mean incorruptible. Thomas Edison killed animals during experiments. Nikola Tesla supported eugenics. Albert Einstein was a serial adulterer. It’s bad enough when the character doesn’t match the legend, but worse when expertise in one field is stretched by a halo effect into the assumption that you’re always right. You don’t need a degree or experience to have an opinion, but you can’t excuse a lack of evidential defence with “just trust me”.

A notable example is psychologist Jordan Peterson, who has pivoted into media punditry, touching on everything from self-help, culture, history, theology, and gender to “The Woke”. His field of study was clinical psychology, yet that hasn’t stopped him from claiming expertise on the topics listed above. When he is promoted as an expert, a large number of people are ready to accept it because of his title as a doctor (a title which, at the time of writing, is due to be revoked).

The Sciences have amazing appeal, especially when it comes to categorisation. It’s comforting to think everything can be labelled and filed away. We are pattern-recognition animals. We make sense of the world through structure. The issue is that it’s easy to use science as an all-powerful force to silence criticism and ambiguity. We want certainty at all costs. This leaves very little room for nuance and technicality. If I said the sky was blue, you’d say I am correct. But I’m not. The sky is not blue. In fact, there is no sky. The sky we see is sunlight being scattered by the gases in our atmosphere, which happens to look blue. Even that is an oversimplification. Try explaining gender without defaulting to the dumbed-down binary. Try explaining how our solar system looks nothing like it does in textbooks. We don’t even know what gravity is!

Due to the complexity, esoteric nature, and demands of any niche of science, we need to dumb things down and be selective about what we accept as common knowledge. In order to understand, we must be biased, favouring conformity over accuracy. In order to agree on what is right, we must ignore what we get wrong.

The irony is that through science’s constant development, revision, experimentation, and review, it is in a fluid state of change. New discoveries, theories, and experiments are made every day. It is this objectivity that causes the confusing, seemingly wishy-washy appearance. This is why you may hear conflicting studies within weeks of each other. They are responding to one another, fact-checking to solidify a consensus. Science is a process of nuance.

There is a phenomenon called “The Observer Effect”: a measurable effect the observer has on the outcome of events. Another name for this is “Observer Bias”, which is apt, because science doesn’t happen in a vacuum. Research is funded, permitted, and publicised… or it isn’t. Rarely is it wholly impartial boards and administrators making those decisions. It is for-profit companies funding treatment production, alternative fuels, renewable energy, and technological advancements. This means studies into the effects of cigarettes, pollutants, and tech addiction can be tailored when the companies causing these issues are footing the bill.

On the occasions where direct state funding has been involved, scientific endeavours have been used for war, suppressed for unflattering results, or acted on performatively. If the Sciences are anything like their cousins in the Arts, then they’ll be familiar with the depressing nature of grant applications. Because studies run over long, arduous periods, it’s possible to be cancelled or defunded because of a change in government. We vote in people to oversee scientific development they know little about and may even be opposed to. The extreme example of this was 1930s Germany, where, after a golden age of Gender Studies and Sexual Health, entire libraries and institutions were burned to the ground.

It’s difficult to be objective when you remember who’s in charge.

– — –

The appeal to science as a rhetorical argument, often masking pseudoscience in palatable dressing, has been used to justify the worst acts of humanity. Racism and slavery have been justified with phrenology. Homophobia and transphobia have been assisted with “conversion therapy”. And misogyny and sexual assault are framed as part of a biological hierarchy. Little excuse is needed to bolster fallacies, but what of the modern obsession with sounding scientific?

The modern idolatry of rationality can be traced back to the mid-2000s with the rise of New Atheism. The budding age of the Internet gave a platform to less mainstream and more aggressive voices, especially in the era of Evangelical Republicanism in the United States. New Atheism, the surge of online interest in non-religious identities with a heavy emphasis on attacking organised religion, had a counter-cultural flavour to it. It was also profitable. Richard Dawkins, Bill Maher, Christopher Hitchens, and George Carlin built careers on their outspoken beliefs. It didn’t take long for “Atheist Destroys Religion” videos to become a viral sensation, consisting of these supposedly worldly free thinkers insulting their opponents and constantly relying on an appeal to rationality. The language and performance were adopted, and now, ironically, it’s conspiracy theorists and religious conservatives appealing to rationality in arguments.

Just because you can make a point doesn’t mean you have a point. Yes, we live in a world governed by laws of physics, but we’re also governed by culture, relationships, desires, the weather, boredom, etc. Just because you can explain away love as chemicals and neurons doesn’t mean you’re stupid for feeling it. Just because you can prove someone is less intelligent, doesn’t make them less human. Being technical doesn’t make you moral.

The profit motive for the Sciences is unmatched. Science spans industries, academia, politics, and day-to-day life. Putting an exact value on it is like evaluating thinking, language, or digestion; it’s an absurdity. However much there is to make from R&D, it is corporations and businesses driving much of it.

Ideally, every scientist in the world would be working on cures for cancer or AIDS. We had a glimpse into this utopia when the first batches of Covid vaccine were synthesised within a year of the first outbreaks. Contrary to popular belief, most funded research happens in private industry, making breakthroughs in more important matters like shampoos, microwave meals, and video games. It is a misuse of talent and knowledge, and a privatisation of research that has overlapping applications. The Peak Oil Hypothesis, the point at which oil extraction would peak before going into decline, was researched by Shell and BP in the 1970s, unintentionally uncovering early indications that fossil fuels were harming the environment. This research would have given environmentalists much-needed data to warn the public about climate change decades in advance, but it couldn’t be touched because it was registered as intellectual property, despite being unused. The same thing happened with cancer research and cigarette producers, and with addiction research and social media, gaming, and gambling platforms. To bury the data is one thing, but to keep it from helping others is just inhumane.

When science isn’t contested, it’s utilised by the very people who stood against it. Evolution was deemed blasphemous for lowering man to the level of an animal. The trade-off for initial acceptance was its co-opting as the basis of racial pseudoscience, comparing people to apes. Space exploration was discredited as frivolous and indulgent public spending. Now, after decades of programmes with NASA and ESA, for-profit companies are taking those government contracts, turning space into a tourist spot. Even psychological and neurological studies on creativity, once seen as dead ends, have become a gold mine for generative AI text and imagery built on that research, stealing from creatives and displacing workers. It has become so prevalent so quickly that I feel the need to state that this essay was written without AI. That is the world we’re in now; I need to prove I am a human.

Adjacent to these private acts of negligence is the infotainment they generate. The interest in STEM careers has led to a leisurely pursuit of science as a hobby. TED Talks have become synonymous with thought-provoking content. Similar educational content has sprung up on YouTube and TikTok, and panel shows often discuss the latest headline-grabbing discoveries and open up for comments from viewers. Who could have imagined people would actively pursue learning from the comfort of their home? The problem here, however, comes in two forms.

First is the profit motive inherent in media. While there are altruistic examples, they are still competing against less well-meaning producers. TED Talks, while free online, cost thousands to attend in person, catering to the affluent. TED still makes editorial decisions on what does and doesn’t get presented, taking audience, brand, and publicity into account. This is why TED Talks are criticised for framing societal and environmental issues as individual failings to a largely corporate audience.

And second, a false equivalency is presented in discussions. In the digital era, there has been a noticeable shift from reporting the news to generating it. Constant discussion must be produced, but there are only so many scientists and communicators available. Most news stations and publications will have only one science or tech correspondent. The solution is to open the topic to pundits or viewers told to “have your say”. Why the hell does Angus from Aberdeen or Padma in Belfast get to weigh in on something they didn’t know about until five minutes ago? The very researchers who conduct these studies can’t be everywhere at once to refute every misconception of their work. The result is the illusion of comprehension.

– — –

In the latter half of the 2000s, psychologists Matt Motyl and Brian Nosek hypothesised that political beliefs could affect perceptions of the world, explaining rising political extremity. Their experiment had participants, ranging from the extremes to the moderate middle of the political spectrum, rank shades of grey on a scale from 1 to 10, black to white. Results showed those at the political extremes were unable to rank the shades accurately, while the moderates were the most accurate. People who viewed the world in black and white LITERALLY could only see black and white. They weren’t colour blind; they just couldn’t see the nuance in shades of grey. This was a psychological treasure trove; a headline-catching study promising funding, book deals, and fame as soon as they published.

But they didn’t. Instead, concerned by the rise of sensationalist scientific papers, they repeated the experiment. It was a fluke! There was no discernible link. They had disproved their own hypothesis. They recounted the episode in Perspectives on Psychological Science in 2012 (and I am quoting them directly here): “Why the fXXk did we do a direct replication?”

Their story stands not only as a model of integrity in the face of temptation, but as an example of the tightrope studies must walk: publish for funding, but know your work may be misrepresented. A study into possible links between chocolate consumption and rates of cancer can become, through sensationalism and misreporting, “Chocolate Causes Cancer!” It’s easy to imagine how Motyl and Nosek’s study could have become an inaccurate fluff piece, degrading further with each retelling. While researching this essay, I found it laborious to track down sources for the above anecdote. Part of the reason is that there’s a strong incentive to publish only confirmations of your own hypothesis. If Motyl and Nosek, understandably embarrassed, hadn’t shared the story, we never would have known the pressure many researchers are under to publish at all costs.

If history has taught us anything, it’s that in the face of insurmountable force, wealth, and power, the only viable way to level the playing field is the acquisition of knowledge. Whether it is the printing press, the Renaissance, the Enlightenment, or the Internet, the effects of sharing knowledge and education have contributed more to the advancement of everyone than any adherence to hierarchical rigidity ever has. The issue, however, is everyone’s stubborn need for familiarity and expediency in everyday life.

The average person is practical, but more short-sighted than pragmatic. It’s a mistake to take little interest in how our world works. Even our politicians are woefully ignorant of the STEM sectors they are in charge of regulating. And of the people who do make it a civic duty to be informed, many don’t land on the most accurate or even logical information; hence, we get conspiracy theories.

We’ve seen how a lack of scientific literacy can not only endanger us but also erode trust and respect in the Sciences, as was the case with the initial response to the 2020 coronavirus outbreak. Originally, masks were deemed unnecessary for combating the spread. This advice was later reversed, and masks were heavily promoted. This came across as flip-flopping and contradictory. People didn’t understand that at the time there wasn’t any evidence masks helped, because we had never had a comparable scenario to base claims on. The absence of evidence is not the evidence of absence. Though in hindsight it would have been right to recommend masks immediately, at the time the advice would have been completely baseless and could have led to further confusion. It didn’t help either that part of the decision was to ensure there wasn’t a surge in demand for medical supplies. While logical, people don’t think like that; they want directness. This is the one time in this whole essay where I do think the blame for mistrust can be placed on the scientific community.

So what can be done?

You may be familiar with the adage repeated by school students: “When am I going to use any of this?” Besides being depressingly telling, it is a fair question. While the education system exposes children to as many avenues for further education as possible, there are no obvious soft skills taught in Science. English has critique and expression. Maths has problem solving and abstract thought. Science has the squandered potential to teach fact-checking and scepticism. It behoves us to teach these mental tools, especially for a STEM-led future.

That’s all well and good, you may say, but not everyone is going to study science when there are bills to pay, especially if it goes over our heads. True, which is why, as with the school curriculum above, there should be a greater emphasis placed upon soft-skill professions acting as mediators between the public and the scientific community. We need a new generation of communicators.

The issue with the current batch has already been explored: they’re either specialists speaking outside their fields or they are beholden to patrons. Couple that with the fact that there is no professional certification for “communicators”, and you have the illusion of accuracy. Most countries in Europe have civic, legal, medical, and even consumer advice freely available, yet nowhere outside school can people be better informed on discoveries and inventions. I’m not advocating for an analogue Google, but rather a patient, empathetic, and non-judgemental profession with the skills needed to understand the nuances of science and convey them in easily understood ways.

One role model for such a position is the online personality Hank Green. An early adopter of the Internet, Hank Green is best known for debunking misinformation on TikTok, responding directly to misleading videos so that those who see them will also be shown his corrections. He not only fact-checks, but delivers it in a kind and light-hearted manner. He is also a co-founder of “Crash Course”, a free educational YouTube channel covering everything from history and literature to politics and science. Hank, along with many others, is acting as the next generation’s Carl Sagan and Bill Nye, proving that the biggest part of enlightening people is just treating them like people.

– — –

A hammer is the perfect metaphor for science. It can be seen throughout history, it transcends borders, and it can be used by anyone. It has also been used for ill. We don’t say hammers are tools of murder or torture if they are used as such. We understand it is not the tool that is at fault; it’s the user. We can’t just ban tools, but we can decide how they should be used. The same goes for the Sciences. By better understanding the Sciences, not only can we better engage with them, but we can also safeguard against the concentration of knowledge by those who hide behind the excuse to “trust the science”.

We do trust the science. We don’t trust you.

