The QUEST quality indicators were designed to be useful for all types of science communication stakeholders. A recent meeting with science education professionals showed how the indicators can be adapted when needed.
Recently, we introduced 12 indicators for science communication quality. These were the result of a process involving a wide variety of science communication stakeholders, so that the indicators would reflect a shared understanding of quality in science communication. The quest to define such a set of indicators was ambitious enough, but we made our task even more difficult by deciding not to narrow the work down to specific types of science communication. That is, we did not seek to define excellent science journalism, or propose tips for a good Twitter post. No, we envisioned that the same principles would apply to any act of science communication, from museums to science comics, from Facebook to TV debates.
Our meetings with different stakeholder groups to introduce the indicators suggest that we are on the right track. The quality framework has received a favourable reception, and the feedback from the various science communication stakeholders – researchers, journalists, science communication professionals – has confirmed that they can relate to the indicators and find them useful.
Inspired by this, we decided to test how science educators would respond to the same quality framework. The opportunity was presented to us by the Estonian Research Council, which organizes an annual summer school for science communicators and science education professionals. In mid-August, the lovely seaside resort of Narva-Jõesuu hosted a couple of dozen professionals who work in museums, schools and non-formal education institutions, bringing science to young people.
Same indicators for different fields
The first question: are science communication and science education independent fields or rather two sides of the same coin? For example, MoRRI, the EU project that developed RRI indicators, defined them as separate. The discussions at the summer school concluded that, compared to science communication, science education tends to have a somewhat narrower focus by working with young people, but is able to provide a more systematic approach to developing critical thinking and science literacy. In general, however, both fields share objectives, concerns and many of the same methods.
Running an exercise in which the participants applied the indicators to science education activities confirmed that the quality framework works in this context as well. However, some elements of the framework need to be adapted for science education: not by replacing any of the indicators, but rather by adjusting each indicator to the context of education.
This idea resonates well with the overall approach of the framework. Despite our aim to design “universal” indicators, we acknowledge that each medium or science communication format has its specific characteristics that allow it to achieve some quality elements more easily than others. This means that different aspects of the same indicator might be more or less important in different contexts. A part of the QUEST project’s work on the quality indicators, presented in the forthcoming deliverable, has been to map those specifics for the three QUEST focus strands: journalism, social media and museums. It was good to learn that the same approach can be used for science education.
How to adapt the indicators
Let’s take the indicator ‘balanced’ as an example. The framework describes its essence as: “Comments by independent experts are provided to key claims. Voices of key stakeholders are represented.” In journalism, a balanced presentation is one of the core values, especially when covering controversial or societally impactful topics (while avoiding “false balance”). Since social media posts rarely have the same possibilities to present various viewpoints or voices as a news article, they can strive towards balance by actively inviting the relevant stakeholders to interact with the post. In museums, the balance of the curatorial team in terms of gender, age, and ethnicity might become the main determinant of this aspect of quality.
So, what is the role of balance in science education? It might seem to play a less relevant role in this context and, indeed, can hardly be applied in the same sense as in journalism. Who are the key stakeholders when teaching basic principles of mathematics, for example? Do we need to find an independent expert when wandering in nature and learning about species?
These are fair questions, but rather than dismissing ‘balance’ as a useful indicator, we encourage educators to take the opportunity to look at their activities with new eyes. Perhaps this reveals some hidden imbalances. Maybe the education activity favours boys over girls; maybe we present too rosy an image of science. After all, this is the aim of the QUEST quality indicators: to encourage professionals in science communication and science education to evaluate and reconsider their activities, and to seek ways to improve them.