Science can be communicated in many ways. But what evidence do we have that any of our communication is effective in improving understanding of the practice of science? I posed the same question seven years ago and the answer remains the same – very little. When it comes to engaging students or the public in science, the approach has a scattergun quality: the unstated hope that some of the shot hits the right target.
Sless and Shrensky (2001) characterise the situation perfectly:
“…the evidence for the effectiveness of (science) communication is about as strong as the evidence linking rainmaking ceremonies to the occurrence of rain.”
Some years ago a leading American science communicator, Dr Rick Borchelt, led a team of top science journalists and communicators to produce a best-practices roadmap for NASA’s Marshall Space Flight Center. A key finding of the 2001 report was the surprising insight that science, despite being a data-driven enterprise, does not demand evidence of the effectiveness of the science communication that scientists are asked to undertake.
Why does so little evidence – beyond superficial evaluation – exist? This may be partly because the tools and questions tend to rely on the perceptions of students or general public audiences of a specific event. What is measured is what respondents say they do rather than what they actually do, and the measures often rely on post-event surveys with response rates of 20–30%. The situation in informal science education, for both student and public audiences, has remained largely unchanged in the years since this article first appeared in 2009.
A big change since the original article has been the dramatic increase in use of social media and the appearance of ‘citizen journalists’. In spite of the explosion of information driven by social media, the problem remains the same – a lack of evidence of the effectiveness of science communication in the public domain.
While we hope science communication (cause) is making a difference (effect) we don’t really know, and perhaps we never will for a single instance of cause and effect. Take smoking, for example – research and multiple communications eventually led to action by governments and an effect (over a long period of time) among the public. Today fewer people smoke, and those who do, at least in Australia, cannot avoid seeing the effects on every packet of cigarettes. What this tells us is that while we cannot pin a single cause to an effect, we can look at streams of data over time that tell us about behavioural responses at various points, prompted by the players in the provision of scientific information – scientists, the medical field and the government, the latter especially in the regulatory and tax fields. We can look back and see that the collective effect was fewer people smoking, because they understand it is very unhealthy and can lead to chronic conditions, including cancer.
There is another issue. In science communication we are not actually short of data, but we may not have the right data. Social media, for example, provides huge amounts of data on how information is accessed, but does it tell us anything about the effects?
Underlying the effectiveness of science communication is an understanding of the way science fits into the worldviews of multiple audiences. The most interesting to me are the students becoming adult citizens in this new information-rich era. How do we affect (or fail to affect) their worldviews of science in the education pipeline? This is where there is an opportunity to change alternative conceptions of how science is practised. It is why my research tends to centre on university students, including when thinking about the lack of data on the effectiveness of science communication among public audiences.
The university students I work with are the product of high school science and maths. It is possible to enrol in a first-year university science or maths course with only Year 10 science and elementary mathematics. This is far from ideal and these students will inevitably struggle. Ironically, science is one of the easiest courses to get into. One research question I have held for many years – including in my doctorate – is whether students leave high school ignorant of the process of science.
In school science (to Year 10 at least) a key outcome of most science curricula, including the Australian Science Curriculum, is to produce scientifically literate adults. Of course the term ‘scientifically literate’ has no internationally agreed definition – but there is a lot of commonality in how it is defined.
If we assume scientific literacy is achieved as per high school aims in science education, then we should see low levels of scientific illiteracy among Australians educated to at least Year 10 in science. However, that is not at all supported by the available data, such as it is. While surveys have been few and far between in Australia, there is no evidence that the level of public scientific literacy here is any different from the level found in the US and in Europe, as measured by a common scientific literacy tool long used by the US, Europe and other nations. This common tool indicates that fewer than one in three of the US and European adult public are considered “scientifically literate”, as measured by ten multiple-choice questions about everyday science topics and an open-ended question: “what does it mean to study something scientifically?”
In 2005 and 2006 I applied the above adult measure of scientific literacy to 692 students in 11 Australian high schools and three Welsh high schools. The results showed that fewer than one in ten of the students at Years 9 and 10 would be considered scientifically literate by the adult measure. When the results from the brightest science students in the test cohort were separated out and their scores collated, the scientific literacy rate rose to 20%, or one in five students. The same test among 150 non-science first-year university students yielded a rate of 15%.
The answers among the 692 respondents to the open-ended question “what does it mean to study something scientifically?” were somewhat concerning, irrespective of what “scientific literacy” actually means. Of those able to give at least a minimally acceptable response, only two respondents used the word “predict” or “predictive”, and only 4% used the word “discovery” or “explore” or any word or phrase that could be judged to mean either. However, the question itself is abstract, and that perhaps explains the result. Non-abstract versions of the question suggest that this dismal result is suspect to some degree.
While the above statistics are breathtaking (if they can be believed), another statistic eclipsed them, and it does not depend on whether the open-ended question is abstract. It relates to student understanding of the process of science. ALL survey respondents in my doctoral study reflected a linear view of the practice of science. In other words, students cite a linear hypothesis-data collection-analysis-interpretation-conclusion model of the scientific method rather than the non-linear cycles that actually occur before any conclusions are reached – results that almost always lead to more questions. The Australian Bureau of Statistics identified this key difference in approach between school science and science as it is practised. It is perhaps one of the critical differences that opens the door to alternative conceptions about the scientific endeavour.
The obvious question remains:
“How do we know an Australian science education has effectively prepared the student to be a scientifically literate adult?”
While the Government would point to Australia being near the top in international student testing by the Organisation for Economic Co-operation and Development (OECD), there is a problem. Actually, three problems:
- There is a big difference in scores between Australia and the top country
- The score Australia has attained has been steadily dropping for the past decade
- The score is an average, obscuring the fact that the further a student lives from an Australian city, the worse the score. Students in the country can be up to two years behind their city peers in their science education – in theory it is possible for some students to leave school with a Year 8 understanding of science
Qualitative data in the form of 21 in-depth interviews with students randomly chosen from the 692 in the doctoral study revealed that students generally viewed science as a static, boring subject that required infinite detail. None saw science as the creative, probabilistic, predictive, empirical, consensus-driven human activity that it is, just as the survey results indicated. Again, this may be because school science does not reflect how science is done. Wong and Hodson (2009) demonstrated that science curricula and textbooks stand in stark contrast to the actual practice of science.
The above hints at the possibility of fundamental issues in the understanding of science that evolve in the education pipeline and could then be perpetuated in the public domain when the student becomes an adult. It may be that public consumption of science communication does not occur as we imagine it should because of these persistent alternative conceptions about the way science is carried out. If this is true, then assumptions about the effectiveness of any one science communication intervention drawn from numbers alone (e.g. the number of people attending Science Week in Australia) are suspect at best, and probably on very rocky ground. Sless and Shrensky’s observation remains spot-on.
So does the student view of a linear approach to science, where all the facts are known or can be known, really translate into the adult public domain in Australia? Does the hypothesis that our students may be leaving school less scientifically literate than we think hold any water? We do not currently know. However, there are case studies suggesting that a lack of understanding of how science works does exist among Australian public audiences to some degree. For example, a focus group of Australian-educated Melbourne adults discussed genetically modified foods and came to the vehement conclusion that GM foods have the potential to cause Thalidomide-type birth deformities. A discredited study on the effects of vaccinating babies still leads some Australian parents to decline certain vaccinations. More broadly, the climate change debate is running a course that bears similarity to our slow realisation that cigarettes are in fact dangerous to human health. When will we decide collectively that climate change appears to be very bad for planetary health and that measures must be taken?
All of the above points to real and continuing issues, both in the science education pipeline and in science communication in the public domain, and to how interrelated they may be. We have an ongoing problem. Understanding better than we currently do how audiences understand science (or what adult worldviews of science may be) may help the conversations between science and the public.
I wrote this in 2009 and it remains true today in 2016:
“Significant change needs to happen if we are going to do any more than what seems to be intuitively good to do – from high school science education to science communication in the public domain. We need to develop research tools to understand the issues and to apply them in ways that let us predict the outcomes of science curricula and science communication activities, based on evidence, and to improve the effectiveness of both.”
Any industrialised, hi-tech nation needs to invest in the public understanding of science and in the science education pipeline from primary school through to university graduation. In the western world there are increasing shortages of suitably qualified graduates in Science, Technology, Engineering, and Maths (STEM). In Australia, businesses of all sizes estimate that 75% of jobs will soon need at least some STEM expertise. The prediction is that the problem will eventually affect Australia’s economy and security unless the trend is slowed or reversed.
At the very least, all adults deserve the right to leave school able to assess the value and depth of available evidence, to make decisions based on that evidence, and to be prepared to reassess conclusions in the light of new, substantial evidence. Our future, and our children’s future, may well depend on it in economies that are increasingly STEM-based.
June, 2017 Note to the above: Two of my doctoral students are working to improve on the above, one in science communication in formal and informal education, and the other in science communication in social media. You can find Isabelle Kingsley’s work here, and Yi-Ling Hwong’s work here.
Sless, D. and Shrensky, R. (2001) in Science Communication in Theory and Practice, Stocklmayer, S., Gore, M.M. and Bryant, C.R. (eds), Kluwer Academic Publishers
Borchelt, R.E. (2001) Communicating the future: Report of the research roadmap for public communication of science and technology in the twenty-first century, Science Communication, (23)2:194-211
Wong, S.L. and Hodson, D. (2009) From the horse’s mouth: What scientists say about scientific investigation and scientific knowledge, Science Education, Wiley Periodicals, (93)1:109-130