How can scientists become more trustworthy? Photo credit: CDC via Unsplash
Across the Western world, social trust in certain institutions has plummeted in recent decades.
Americans have record-low trust in everything from organised labour to organised religion. A 2024 survey found that trust in institutions broadly (including government, corporations, media, and nonprofits) is lower in the UK than in any of the other 25 countries surveyed.
74% of people trust scientists to tell them about new innovations and technologies…
Nevertheless, trust in science is holding up reasonably well. Across the countries surveyed, 74% of people trust scientists to tell them about new innovations and technologies, much better than the figures for CEOs, journalists, or government leaders. Roughly three-quarters of Brits and Americans have at least some trust in scientists.
Three-quarters of the population trusting scientists is not always good enough. A rising number of UK and US children are going unvaccinated, and even small drops in vaccination coverage can create major public health risks.
The era of the social Internet has been particularly destructive for social trust. Scrolling through 10-second TikToks is an entirely different way of getting information from watching TV and reading the newspaper, and its incentives are aligned differently. The most successful creators exploit the algorithm and human psychology: anger wins people's attention better than information does.
…anger is better than information at winning people’s attention.
What can scientists do to try to get the public on their side and promote public health? During the COVID-19 pandemic, we saw first-hand how rapidly evolving science and new forms of media could interact. Most scientists did good work. Still, like politicians who partied while others socially distanced, some scientists betrayed the public’s trust.
'The proximal origin of SARS-CoV-2' was published in Nature in March 2020, just as the pandemic was leading to lockdowns across the world. The authors, top virologists in the US, UK, and Australia, wrote that they did 'not believe that any type of laboratory-based scenario [was] plausible'. The paper was used by government officials, scientists, and journalists to discredit the 'lab-leak' theory of Covid's origins as a baseless conspiracy theory.
Privately, the researchers weren’t quite so sure. In a Slack message a day after the paper’s preprint was published, lead author Kristian Andersen wrote to colleagues that, ‘we unfortunately just can’t rule out a potential accidental infection from the lab’ but worried about the paper ‘backfiring’ if the conclusions were left too open-ended. In April, after the full paper was published, Andersen wrote that, ‘I’m still not fully convinced that no culture [in a lab] was involved’.
Today, there is ongoing debate about the likelihood that Covid originated in a lab, but in 2020 the researchers worked with 'higher-ups' in the US and UK governments to manipulate journalists into treating a lab leak as a discredited conspiracy theory.
This is one example of scientists letting their political opinions lead their research, but it is not the only one. In recent years, some American researchers have spread misleading or debunked research on topics ranging from the US maternal mortality rate to the effects of gender therapy on minors.
This is not what “experts” are supposed to do. In theory, an apolitical expert is supposed to provide relevant and objective facts, and the public and politicians are supposed to consider those facts accordingly.
Clearly that model isn’t entirely working right now, especially in America. There, voters chose a Republican Party that refuses to act on climate change, and a president who is appointing vaccine sceptics to two top health positions.
In these circumstances, the temptation to spin information is understandable. If the old model of the expert has broken down, why not try something new? But science becoming more partisan cannot be part of the response. In the Edelman Trust Barometer, 67% of Americans said that science had become too politicised.
In 2020, Nature endorsed Joe Biden for president. In 2023, Nature Human Behaviour published a study finding that the endorsement didn’t change readers’ views of Biden or Trump. Reading the study did have an effect, though: it made Trump supporters significantly less trusting of Nature and less interested in its information on Covid-19 and vaccines.
…Nature published an editorial defending its endorsements, even if they were counterproductive.
Instead of responding to the evidence and acting differently, Nature published an editorial defending its endorsements even though they were counterproductive. It argued that Trump's record on climate change, the pandemic, and nuclear security was so dangerous that the endorsement was necessary, especially since Nature 'always offer[s] evidence to back [it] up'.
If readers were not convinced by the endorsement and it damaged the scientific community's credibility, what does it matter that it offered evidence? In October 2024, Nature endorsed Kamala Harris, arguing that Trump 'fosters mistrust of scientists'. Nature's stated goal was to promote trust in science, yet it issued an endorsement that, by its own scientific standards, was likely to decrease trust in science.
What should scientists do in our low-trust era? There are no easy answers. A small minority might never trust vaccines, even if every scientist became a perfectly non-partisan truth-teller and an adept public communicator. Unfortunately, it seems easier to win people's attention on the internet by provoking anger than by telling the truth. And most scientists will never be put in a position as fraught as that of public health officials: condensed matter physicists and materials scientists would struggle to produce a politically contentious study even if they tried.
There are no easy answers, but any solution must start with scientists being rigorously transparent and honest. Trust is a two-way street. If scientists want the public’s trust, they must trust the public with the whole truth.
**some ideas expressed in this article are opinion, and may not represent the opinion of The Oxford Scientist as a whole**