Evidence-based science communication

Science can be communicated in many ways. But what evidence do we have that any of our communication is effective in improving the understanding of the practice of science? I posed the same question many years ago, and the answer remains: very little. The pandemic has underscored the problem, as the messaging of the Government and the health experts has not always been aligned. When engaging students or the public in science, the approach has a scattergun quality: the unstated hope that some shots hit the right target.

Sless and Shrensky (2001) characterise the situation perfectly:

“…the evidence for the effectiveness of (science) communication is about as strong as the evidence linking rainmaking ceremonies to the occurrence of rain.”

Some years ago, a leading American science communicator, Dr Rick Borchelt, led a team of top science journalists and communicators to produce a best-practices roadmap for NASA’s Marshall Space Flight Center. A key finding of the 2001 report was the surprising insight that science, a data-driven enterprise, does not demand evidence of the effectiveness of the science communication that scientists are asked to undertake.

Why does so little evidence – beyond superficial evaluation – exist? This may be partly because the tools and questions tend to rely on the perceptions of students or general public audiences of a specific event. What is measured is what respondents say they do rather than what they actually do, and the measurement often relies on post-event surveys with response rates of 20–30%. The situation in informal science education, for both student and public audiences, has remained unchanged since this article first appeared in 2009.

A big change since the original article has been the dramatic increase in the use of social media and the appearance of ‘citizen journalists’. Despite the explosion of information driven by social media, the problem remains the same – a lack of evidence of the effectiveness of science communication in the public domain.

While we hope science communication (cause) is making a difference (effect), we do not know, and perhaps we never will for a single instance of cause and effect. Take smoking, for example: research and multiple communications eventually led to action by Governments and, over a long period, an effect among the public. Today fewer people smoke, and those who do, at least in Australia, cannot avoid seeing the effects of smoking on every packet of cigarettes. This tells us that while we cannot pin a single cause to an effect, we can look at streams of data over time that tell us about behavioural responses at various points, prompted by the players who provide scientific information: scientists, the medical profession and the government, the latter especially through regulation and taxation. Looking back, the collective effect is that fewer people smoke because they understand it is very unhealthy and can lead to chronic illness, including cancer.

There is another issue. In science communication, we are not short of data, but we may not have the right data. Social media, for example, provides huge amounts of data on how information is accessed, but does it tell us anything about the effects?

Underlying the effectiveness of science communication is an understanding of the way science fits into the worldviews of multiple audiences. The most interesting to me are the students becoming adult citizens in this new information-rich era. How do we affect (or fail to affect) their worldviews of science in the education pipeline? This is where there is an opportunity to change alternative conceptions of how science is practised, and it is why my research tends to centre on university students, even when considering the lack of data on the effectiveness of science communication among public audiences.

The university students I work with are the product of high school science and maths. It is possible to enrol in a first-year university science or maths course with only Year 10 science and elementary mathematics. This is far from ideal, and these students will inevitably struggle. Ironically, science is one of the easiest courses to get into. One research question I have had for many years – including in my doctorate – is whether students leave high school ignorant of the process of science.

In school science (to Year 10 at least), a key outcome of most science curricula, including the Australian Science Curriculum, is to produce scientifically literate adults. Of course, the term ‘scientifically literate’ has no internationally agreed definition – but there is a lot of commonality in how it is defined.

If we assume scientific literacy is achieved as high school science education intends, then we should see low levels of scientific illiteracy among Australians educated to at least Year 10 in science. However, that is not at all supported by the available data, such as it is. While surveys in Australia have been few, there is no evidence that the level of public scientific literacy here is any different from that found in the US and Europe, as measured by a common scientific literacy tool long used in those and other nations. That tool indicates that fewer than one in three US and European adults can be considered “scientifically literate”, as measured by ten multiple-choice questions about everyday science topics and an open-ended question: “what does it mean to study something scientifically?”

In 2005 and 2006, I applied the above adult measure of scientific literacy to 692 students in 11 Australian high schools and three Welsh high schools. The results showed that fewer than one in ten students in Years 9 and 10 would be considered scientifically literate by the adult measure. When the scores of the brightest science students in the cohort were separated out and collated, the scientific literacy rate rose to 20%, or one in five students. Among 150 non-science first-year university students, the rate on the same test was 15%.

The answers from the 692 respondents to the open-ended question “what does it mean to study something scientifically?” were somewhat concerning, irrespective of what “scientific literacy” actually means. Of those able to give at least a minimally acceptable response, only two respondents used the word “predict” or “predictive”, and only 4% used the word “discovery” or “explore” or any word or phrase that could be judged to mean either. However, the question itself is abstract, which perhaps explains the result; non-abstract versions of the question suggest this dismal result should be treated with some suspicion.

While the above statistics are breathtaking (if they can be believed), another finding eclipsed them, and it did not hinge on whether the open-ended question is abstract. It relates to students’ understanding of the process of science. ALL survey respondents in my doctoral study reflected a linear view of science practice. In other words, students cited a linear hypothesis–data collection–analysis–interpretation–conclusion model of the scientific method rather than the non-linear cycles that occur before any conclusions are reached – results that almost always lead to more questions. The Australian Bureau of Statistics has identified this key difference in approach between school science and science as practised. Perhaps it is one of the critical differences that opens the door to alternative conceptions about scientific endeavour.

The obvious question remains:

“How do we know an Australian science education has effectively prepared the student to be a scientifically literate adult?”

While the Government would point to Australia being near the top in international student testing conducted by the Organisation for Economic Co-operation and Development (OECD), there is a problem. Actually, three problems:

  1. There is a big difference in scores between Australia and the top-ranked country
  2. Australia’s score has been steadily dropping for the past decade
  3. The score is an average, obscuring the fact that the further a student lives from an Australian city, the lower the score. Students in the country can be up to two years behind their city peers in science education – in theory, some students can leave school with a Year 8 understanding of science

Qualitative data, in the form of 21 in-depth interviews with students randomly chosen from the 692 in my doctoral study, revealed that students generally viewed science as a static, boring subject requiring infinite detail. None saw science as the creative, probabilistic, predictive, empirical, consensus-driven human activity that it is, just as the survey results indicated. Again, this may be because school science does not reflect how science is done. Wong and Hodson (2009) demonstrated that science curricula and textbooks contrast starkly with the actual practice of science.

The above hints at the possibility that fundamental misunderstandings of science that develop in the education pipeline could be perpetuated in the public domain when the student becomes an adult. Public consumption of science communication may not occur as we imagine it should because of these persistent alternative conceptions about the way science is carried out. If this is true, then assumptions about the effectiveness of any science communication intervention drawn from numbers alone (e.g. the number of people attending Science Week in Australia) are suspect at best and probably on very rocky ground. Sless and Shrensky’s observation remains spot-on.

So does the student view of a linear approach to science, where all the facts are known or can be known, really translate into Australia’s adult public domain? Does the hypothesis that our students may be leaving school less scientifically literate than we think hold any water? We do not currently know. However, some case studies suggest a degree of misunderstanding of how science works among Australian public audiences. For example, a focus group of Australian-educated Melbourne adults discussing genetically modified foods came to the vehement conclusion that GM foods could potentially cause Thalidomide-type deformities in babies. A discredited study on the effects of vaccinating babies still causes some Australian parents to decline certain vaccinations. More broadly, the climate change debate is running a course similar to our slow realisation that cigarettes are dangerous to human health. When will we decide collectively that climate change appears to be very bad for planetary health and that measures must be taken?

All of the above points to real and continuing issues in the science education pipeline and in science communication in the public domain, and to how interrelated they may be. We have an ongoing problem. Understanding better than we currently do how audiences understand science (or what adult worldviews of science may be) could help the conversations between science and the public.

I wrote this in 2009, and it remains true today in 2023:

“Significant change needs to happen if we are going to do any more than what seems to be intuitively good to do – from high school science education to science communication in the public domain. We need to develop research tools to understand the issues and to apply them in ways that let us predict the outcomes of science curricula and science communication activities, based on evidence, and to improve the effectiveness of both.”

Any industrialised, high-tech nation must invest in the public understanding of science and in the science education pipeline from primary school to university graduation. In the Western world, there are increasing shortages of suitably qualified graduates in Science, Technology, Engineering, and Maths (STEM). In Australia, businesses of all sizes estimate that 75% of jobs will soon need at least some STEM expertise. The predictions are that the problem will eventually affect Australia’s economy and security unless the trend is slowed or reversed.

At the very least, all adults deserve to leave school able to assess the value and depth of available evidence, to make decisions based on that evidence, and to be prepared to reassess conclusions in the light of substantial new evidence. Our future and our children’s future may well depend on it in increasingly STEM-based economies.

References:

Sless, D. and Shrensky, R. (2001). In Stocklmayer, S., Gore, M.M. and Bryant, C.R. (Eds.), Science Communication in Theory and Practice. Kluwer Academic Publishers.

Borchelt, R.E. (2001). Communicating the future: Report of the research roadmap for public communication of science and technology in the twenty-first century. Science Communication, 23(2), 194–211.

Wong, S.L. and Hodson, D. (2009). From the horse’s mouth: What scientists say about scientific investigation and scientific knowledge. Science Education, 93(1), 109–130.