
Public consultation

Deadline: 28 February 2022

Draft Principles of Good Practice for Public Communication Responses to Mis- and Disinformation

Introduction

Governments are operating in a rapidly changing media and information ecosystem1, which provides

unprecedented opportunities to engage with the public but also presents challenges regarding how people

consume and share information, affecting who and what they trust. In particular, social media platforms

have shown a tendency to facilitate the spread of emotional and polarising content including, critically, mis-

and disinformation (Smith, 2019[1]). While these phenomena predate the digital age, new communication

technologies have amplified their volume and reach (Lewandowsky et al., 2020[2]).

The pervasive and global spread of mis- and disinformation2, furthermore, poses acute and far-reaching

challenges to governments and societies. For instance, the COVID-19 crisis has highlighted how

misleading or false information can affect policy uptake and vaccine confidence in ways that can threaten

public health and people’s lives. At the same time, disagreements about basic facts can cause or aggravate

the erosion of public conversation, contributing to political paralysis and harming public engagement and

democracy (Kavanagh, 2018[3]).

While the range of policy interventions to combat this challenge continues to grow and to cut across disciplines,

those that seek to improve media and information ecosystems stand out for their potential contribution to

building societal resilience to false narratives online and offline. On the one hand, the practices analysed as

part of this work attest to how much governments rely on the public communication function to help prevent,

respond to and mitigate the spread of mis- and disinformation. More broadly, a focus on communication

goes hand-in-hand with a drive to support all partners in the ecosystem, to make it into a more viable space

for the diffusion of reliable news and information. Indeed, governments do not communicate in a vacuum

– traditional media and fact-checkers, technology companies, civil society, and citizens themselves are

1 This is understood as the combination of communication and media governance frameworks as well as principal

actors (i.e. governments, traditional and social media companies and citizen journalists).

2 Misinformation describes situations where false or misleading information is shared but no harm is intended; the

sharer may not even be aware the information is false. Disinformation is when false, manipulative and/or misleading

information is knowingly shared with the intention of causing harm or influencing the information environment.

Disinformation and information influence operations may be spread by foreign or domestic actors. Adapted from

Wardle, C. & Derakhshan, H. (2017) “Information Disorder: Toward an interdisciplinary framework for research and

policy making”, Council of Europe, Brussels: Council of Europe.


essential to generate and amplify trustworthy content. Governments are well-positioned to lead multi-

stakeholder efforts that strengthen the information environment by engaging and co-operating with a wide

range of actors. As governments experiment with new, diverse and more effective approaches to

counteract mis- and disinformation, the OECD has conducted a comparative analysis of good practices

and drawn from these a set of draft Principles of Good Practice for Public Communication Responses to

Mis- and Disinformation (hereafter “draft Good Practice Principles”).

To properly frame the intent and focus of these draft principles, it should be noted that there are

important policy and regulatory considerations, as well as practices, emerging in other fields of government

activity, such as those related to competition or to freedom of speech, that are relevant but beyond the

scope of this work. While not treated directly in this analysis, they inform the overall thinking behind the

Principles of Good Practice and are the subject of other streams of work of the OECD3.

In sum, these draft Principles of Good Practice provide policy-makers with guidance to address the spread

of mis- and disinformation content in a holistic manner, and in turn rebuild trust in the information

ecosystem and support democracy. They relate most directly to public communication interventions, but

are relevant and applicable to a broader range of responses against this phenomenon.

Specifically, the OECD will promote further work on:

• Stimulating a multi-disciplinary discussion on what has worked to address low public trust toward

information from official and mainstream sources;

• Providing practical guidance for international efforts to promote confidence in COVID-19 vaccines

as an essential step to overcoming the pandemic;

• Compiling a range of good practices on government interventions aiming to counter mis- and

disinformation and address underlying causes of mistrust in information;

• Helping guide government efforts to co-operate with media, civil society organisations, the private

sector and individuals to support media and information ecosystems that promote openness,

transparency and inclusion.

This document presents a brief overview of the challenges presented by the spread of mis- and

disinformation, as well as the role of governments in responding to these challenges through their public

communication functions. It then outlines the ten draft Principles of Good Practice, highlighting how they

relate to a holistic approach to counteract the spread and impact of false and misleading content. The

document then discusses good practices on which the principles are based, organised according to three

levels of intervention. The ongoing identification and mapping of practices will expand the knowledge

base and support countries’ attempts to put the principles into use.

The evolving context of mis- and disinformation and diminishing trust

The proliferation of mis- and disinformation concerning public policy, official messages and scientific data

can create significant challenges for governments in their efforts to ensure accurate information reaches

all groups in society. By casting evidence and facts into doubt, harmful and misleading content can work

against policy goals and sow distrust (for example, in the context of COVID-19 responses and overcoming

vaccine hesitancy, elections, or on topics such as migration and climate change) (OECD, 2020[4]). Indeed,

research from Europe and the U.S. suggests that internet and social media use affects levels of trust and

may further polarise pre-existing beliefs and opinions (Ceron, 2015[5]; Klein and Robison, 2019[6]). In this

3 For more information on OECD work on preventing and combating misinformation and disinformation:

https://www.oecd.org/governance/reinforcing-democracy/.


context, the spread of mis- and disinformation online may build on and aggravate a deeper-seated crisis of

trust toward governments and sources of news.

According to the Edelman Trust Barometer, only 35% of respondents claim to trust what they see on social

media. Confidence in traditional media is higher, at 53%, but is at a 10-year low and declined by eight

percentage points between 2020 and 2021 (Edelman, 2021[7]).4 Worryingly, research has also noted an

increase in the phenomenon of news avoidance, whereby citizens often deliberately turn away from

information, which can signal disengagement with policy and public issues (Fletcher et al., 2020[8]).

Trust is the foundation on which the legitimacy of public institutions is built. It is a multifaceted concept,

and its influence on the outcomes of public policies is significant and tangible, as is its role in supporting

social cohesion and, for example, policy implementation that requires behaviour change from the public.

Research has shown that mistrust is correlated with the spread of misinformation; as such, policy-makers

must consider not only the specific context under which populations are mistrustful, but also the degree of

perceived legitimacy and credibility of relevant institutions (Donovan, 2020[9]).

Effective public communication is instrumental to rebuilding confidence in governments and institutions.

This core government function therefore has both the potential and the responsibility to play a critical role.

It can promote trust by showing that public institutions are responsive, reliable, and value integrity,

openness, and fairness, which are the key drivers of trust identified by the OECD (OECD, 2017[10]). In

addition, public communication can enable and expand opportunities for public participation in policy-

making by providing an avenue for citizens to take part in democratic debates, which can further contribute

to shaping policy outcomes. As discussed further below, to realise its potential for supporting good

governance, the principles of open government, and democracy, public communication must be

transparent, respectful of the values of honesty, integrity and impartiality, and conceived as a means for

two-way engagement with citizens. In this way, it can serve to respond to the challenges of mis- and

disinformation, help build institutional trust and support democratic engagement (Hyland-Wood et al.,

2021[11]).

Beyond the public communication function, ensuring a healthy information ecosystem requires a systemic

and holistic approach (Matasick, Alfonsi and Bellantoni, 2020[12]). This requires looking at the spaces in

which communication is delivered and the actors that can contribute to the flow of trustworthy information.

For example, equipping citizens with the necessary skills and awareness to be responsible producers and

consumers of content and sustaining fact-based, rigorous, and independent journalism are key elements

of building sound media and information environments. Media diversity, particularly at the local level, has

suffered with the rise of social media platforms and the related decline in advertising revenues, whereas

clickbait and sensationalistic reporting have grown (Matasick, Alfonsi and Bellantoni, 2020[12]). This trend

aggravates the vulnerability of the information ecosystem and undermines an essential avenue for

trustworthy news and government accountability.

Interventions will need to support civil society, fact-checkers, and the research community, as well as

involve online platforms, as vital allies in the common effort to tackle information disorders. Governments

are unlikely to overcome the challenge of mis- and disinformation acting alone: co-operation and

engagement with a wide range of stakeholders is essential for an effective whole-of-society response.

Draft Principles of Good Practice

The ten draft Principles of Good Practice are drawn together from the experiences and insights of a wide

range of countries, to help define the factors that underpin diverse responses and make them more

4 Based on a survey of 15 525 respondents from 28 countries, including OECD members Australia, Canada, Colombia,

France, Germany, Ireland, Italy, Japan, Korea, Mexico, Netherlands, Spain, the United Kingdom and the United States.


effective. OECD work developing the principles was originally supported by the UK Government as part of

the global vaccine confidence initiative of its G7 presidency. The OECD has subsequently gathered

information on, and discussed and presented, the principles and the practices that inform them with Members

and partners via a number of high-level OECD and G7 events focused on mis- and disinformation, relevant

OECD Working Party and Expert Group meetings, and bilateral exchanges. These draft

Principles of Good Practice focus on public communication, as a primary function that enables direct

responses by governments, as well as a wider range of interventions to strengthen media and information

ecosystems. This is exemplified most prominently in efforts to tackle the urgent challenges related to the

COVID-19 response and recovery.

Ultimately, these draft Principles of Good Practice aim to help governments:

• Ensure policies and communications are informed by and promote open government principles,

and thus are conducted in a transparent and honest way, use inclusive messages and channels,

engender two-way dialogue via responsive communications that resonate with citizens’ needs,

and work with all relevant public and private stakeholders in the information ecosystem as part of

a whole-of-society approach to reinforce democracy.

• Establish mechanisms for effective communication through public interest-driven (understood as

de-politicised), institutionalised and evidence-based approaches.

• Focus on the challenges of countering the spread and effects of mis- and disinformation through

building capacity for timely, preventive and future-proofed efforts to respond to problematic

content.

The draft Principles are included in Box 1.

Box 1. The Draft Principles

1. Transparency

Governments strive to communicate in an honest, clear, and open manner, with institutions

comprehensively disclosing information, decisions, processes and data within the limitations of relevant

legislation and regulation. Transparency, including about assumptions and uncertainty, can reduce the

scope for rumours and falsehoods to take root, as well as enable public scrutiny of official information and

open government data.

2. Inclusiveness

Interventions are designed and diversified to reach all groups in society. Official information strives to be

relevant and easily understood, with messages tailored for diverse publics. Channels and messages are

appropriate for intended audiences, and communication initiatives are conducted with respect for cultural

and linguistic differences and with attention paid to reaching disengaged, vulnerable, underrepresented or

marginalised groups.

3. Responsiveness

Governments develop interventions and communications around the needs and concerns of citizens.

Adequate resources and efforts are dedicated to understanding and listening to their questions and

expectations to develop informed and tailored messages. Responsive approaches facilitate two-way

dialogue, including with vulnerable, underrepresented and marginalised groups, and enable an avenue for

public participation in policy decisions.


4. Whole-of-society collaboration

Government efforts to counteract information disorders are integrated within a whole-of-society approach,

in collaboration with relevant stakeholders, including the media, private sector, civil society, academia and

individuals. Governments broadly promote the public’s resilience to mis- and disinformation, as well as an

environment conducive to accessing, sharing and facilitating constructive public engagement around

information and data. Where relevant, public institutions co-ordinate and engage with non-governmental

partners with the aim of building trust across society and all parts of the country.

5. Public interest driven

Public communication should strive to be independent from politicisation in implementing interventions to

counteract mis- and disinformation. Public communication is conducted as separate and distinct from

partisan and electoral communication, with the introduction of measures to ensure clear authorship,

impartiality, accountability, and objectivity.

6. Institutionalisation

Governments consolidate interventions into coherent approaches guided by official communication and

data policies, standards and guidelines. Public communication offices benefit from adequate human and

financial resources, a well-co-ordinated cross-government approach at national and sub-national levels,

and dedicated, trained and professional staff.

7. Evidence-based

Government interventions are designed and informed by trustworthy and reliable data, testing and behavioural

insights, and build on the monitoring and evaluation of relevant activities. Research, analysis and lessons learned

are continuously gathered and feed into improved approaches and practices. Governments focus on

recognising emerging narratives, behaviours, and characteristics to understand the context in which they

are communicating and responding.

8. Timeliness

Public institutions develop mechanisms to act in a timely manner by identifying and responding to emerging

narratives, recognising the speed at which false information can travel. Communicators work to build

preparedness and rapid responses by establishing co-ordination and approval mechanisms to intervene

quickly with accurate, relevant and compelling content.

9. Prevention

Government interventions are designed to pre-empt rumours, falsehoods, and conspiracies to stop

potentially harmful information from gaining traction. A focus on prevention requires governments to

identify, monitor and track problematic content and its sources; recognise and proactively fill information

and data gaps to reduce susceptibility to speculation and rumours; understand and anticipate common

disinformation tactics, vulnerabilities and risks; and identify appropriate responses, such as “pre-bunking”.

10. Future-proof

Public institutions invest in innovative research and use strategic foresight to anticipate the evolution of

technology and information ecosystems and prepare for likely threats. Counter-misinformation

interventions are designed to be open, adaptable and matched with efforts to build civil servants’ capacity

to respond to evolving challenges.


Selected relevant good practices

The principles are based on the analysis and review of relevant emerging practices in the field of countering

mis- and disinformation and the factors that make them effective, as identified through bilateral discussions

with members and partners, desk research, and engagement with the OECD Working Party on Open

Government (WPOG) and its Experts Group on Public Communication (EGPC). This section and Annexes

together discuss a continually growing selection of good practices, structured across three interlinked

levels: the communication practices that directly aim to counteract mis- and disinformation and provide

trustworthy information; the identification of institutional frameworks, both material and normative, that

guide actions by all relevant government agencies, define their organisation and co-ordination within and

outside of government, and provide the necessary resources to operate; and the interventions to foster an

enabling ecosystem in which governments drive a whole-of-society effort to support the flow of

trustworthy information and data while mitigating the spread of problematic content (see Figure 1).

Together, these good practices illustrate a holistic approach to tackle information disorders and help

demonstrate how the principles can guide the implementation of such interventions.

Figure 1. Leveraging public communication through practices, institutions and an enabling ecosystem

Communication practices

First, public communication can provide essential opportunities to both react to and prevent the spread of

mis- and disinformation, as well as serve as a source of accurate, reliable and timely information and data.

Filling data vacuums, and “managing the creation and dissemination of trusted information so that it is not

excessive, overwhelming or confusing” (WHO, 2020[13]) are core pillars of trustworthy communication.

These anticipate and respond to citizens’ needs, while pre-empting the spread of rumours and falsehoods.

This pillar reflects how governments can use public communication as a tool to prevent, fight and

counteract mis- and disinformation. Effective public communication is an important tool for fostering a well-


informed society, strengthening trust, and reducing the risk for damaging narratives to gain ground. Several

OECD Members have noted that proactive, timely, and transparent communication has become in itself a

primary means of preventing the spread of mis- and disinformation, all while accomplishing other policy

objectives. The finding that doctors, scientists and community members tend to be trusted more when it

comes to vaccine information, for example, points to the essential role of pursuing an inclusive approach

(Privor-Dumm and King, 2020[14]).

Prevention is an area of key importance: stopping a falsehood from spreading in the first place is better than

correcting it once it has taken root. Indeed, studies have shown that if people are warned about efforts to

sow doubt, they are better able to resist being swayed by misinformation or disinformation. To that end, pre-bunking – or

attempting to “inoculate” the public to misleading messages – requires anticipating potential

misunderstandings or disinformation attacks, which means listening to and engaging with the audience to

understand their concerns (Blastland et al., 2020[15]).

Other practices to build information confidence include using appropriate channels and delivering clear

and tailored messages; identifying, tracking and monitoring problematic content and its sources;

communicating honestly and openly, including about uncertainty; and understanding audience needs and

enabling two-way dialogue. The identification of practices in this pillar will clarify how communication can

help minimise negative consequences of problematic content and strengthen information ecosystems (see

additional descriptions of examples of communication practices in Annex I).

Institutional frameworks

Well-designed policies and well-resourced institutions are essential for governments to implement effective

responses to mis- and disinformation. Addressing these complex challenges requires adequate

institutional capacity, which can result from establishing and adopting policy, legal and implementation

frameworks that set the mandate and scope for interventions. Building institutional capacity also entails

designated structures, diverse and specialised skillsets, and appropriate resources to ensure relevant

bodies can deliver on the above mandates. This is as true for the communication practices described

above as it is for the systemic responses discussed in the next section of the document.

Indeed, since the escalation of the mis- and disinformation challenge in the last decade, many governments

have designated new structures within their cybersecurity agencies, electoral oversight bodies,

communication units, and most recently in their public health agencies. The cross-cutting nature of

information disorders highlights the need for versatile and whole-of-government capacity, as well as for strategic

foresight to future-proof institutional responses. It is especially important that dedicated institutions

(whether units, agencies, or others) are adaptable and have built-in mechanisms for continuous learning,

evaluation, and training.

The institutional set-up and mandate similarly matter for the effectiveness of public communication

responses. Notably, they require mechanisms to insulate this function from political pressure, as politicisation

of information and messages can have repercussions on the degree of transparency and the perceived

reliability of government institutions as trustworthy sources (Fairbanks, Plowman and Rawlins, 2007[16];

Heise, 1985[17]). Communicators’ position within the government and internal and external co-ordination,

including with sub-national governments, are other core factors that can allow communication activities to

be conducted at scale and to a high standard.

Relevant policies and practices in this section highlight a range of approaches to build institutional

frameworks that can support the implementation of effective and holistic interventions against mis- and

disinformation (see additional descriptions of examples of relevant institutional frameworks in Annex II).


Enabling ecosystem

Beyond communication practices and institutional frameworks, the context in which information is shared

and spread is of key importance in the fight against mis- and disinformation. This media and information

ecosystem features the tools and actors that help spread information (and mis- and disinformation), and

influences how individuals receive and respond to content. Given the speed of changes to the information

ecosystem in which people, organisations and institutions communicate, responses require engagement

between relevant actors at the local, national and international levels. A whole-of-society effort – including

collaborating with, supporting, and benefiting from civil society, academics, the private sector and

individuals – is required to support the timely and effective sharing of information and data and to foster

democratic dialogue.

The threats posed by mis- and disinformation to policy and governance – including repercussions on public

confidence in vaccines, perceptions of migration and security, or understanding of climate change – are at

least in part a symptom of larger societal issues. Understanding these underlying factors is essential to

designing appropriate responses, as is understanding the role each stakeholder can play in delivering

them.

Good practices that foster an enabling ecosystem for communication and dialogue therefore include a wide

range of public interventions aimed at steering a whole-of-society response and building resilience. These

include building transparent and constructive relationships with platforms and fact-checkers; expanding

media and information literacy efforts and enhancing the ability of individuals to recognise and dismiss

false and misleading information; supporting fact-based journalism and a diverse and accessible media

landscape; creating avenues for the public to deliberate on complex issues; and furthering research and

knowledge of the challenges and solutions to sustain a healthy information ecosystem (see additional

descriptions of examples of practices that support an enabling ecosystem to build resilience to the spread

of mis- and disinformation in Annex III).

Next steps

The practices and interventions discussed here are not exhaustive. Legislative and regulatory measures

on the diversity and independence of the media; the strength of a country’s civic space; considerations

concerning content moderation, transparency requirements and business models of social media

platforms; and addressing undue influence are additional factors. While these issues also contribute to an

environment that is increasingly resilient to harmful and misleading content, the practices covered

here focus more narrowly on those that directly involve constructive engagement between a wide range of

actors.

Over the coming months, the OECD will continue to expand on the principles, as well as continue to collect

practices that inform their application. The OECD will also continue to engage in multi-disciplinary

discussions on what works and expand the mapping of good practices, to be presented in a report

compiling relevant cases and designed to help governments implement the principles.

Annex I – Communication practices

Good practice | Description

Identifying, tracking and monitoring problematic content and its sources

Monitoring open and public communication channels to spot problematic content and emerging narratives is a key feature of developing effective communication responses to mis- and disinformation. Monitoring, especially for sensitive issues, should be conducted routinely and frequently to ensure the timeliness of possible responses. It should be comprehensive in focus, covering as many open platforms as possible, within the limits of data


privacy (for more on data management, see the OECD Good Practice Principles for Data Ethics in the Public Sector). Effective monitoring may also require capacity to understand languages and cultural features to offer an inclusive analysis.

The use of a wide range of both free and licensed analytics tools facilitates the work of communicators. More sophisticated approaches are needed for activities such as identifying co-ordinated inauthentic behaviour.5 The growing prominence of audio and video content similarly calls for innovation to process misinformation across these mediums. Nevertheless, monitoring cannot be fully automated – instead it requires critical analysis, fact-checking and human oversight.
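
As a purely illustrative Python sketch of how a first automated pass over monitored content might look, the snippet below filters a generic list of posts against a watchlist of narratives and an engagement threshold before handing them to human reviewers; the data structure, the flag_for_review function, the example narratives and the threshold are all hypothetical rather than drawn from any specific tool mentioned in this paper.

from dataclasses import dataclass
from typing import List

@dataclass
class Post:
    author: str
    text: str
    shares: int

# Hypothetical watchlist of narratives and an engagement threshold; in practice these
# would come from ongoing analysis and be recalibrated regularly.
WATCHLIST = ["miracle cure", "vaccine microchip", "stolen election"]
SHARE_THRESHOLD = 100

def flag_for_review(posts: List[Post]) -> List[Post]:
    """Return posts that mention a watched narrative and are gaining traction."""
    flagged = []
    for post in posts:
        text = post.text.lower()
        if any(term in text for term in WATCHLIST) and post.shares >= SHARE_THRESHOLD:
            flagged.append(post)
    return flagged

# The automated pass only narrows the volume of content: human analysts still assess
# context, intent and accuracy, as noted above.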

Filling information voids and anticipating questions on issues vulnerable to misinformation

Public communication is an important prevention tool when it is used proactively to shape the information landscape on sensitive subjects. Comprehensive, transparent, and timely dissemination of useful information and data, based on a careful assessment of gaps and listening exercises (see below), is necessary to understand and respond to the public.

The need to provide proactive, transparent and timely communication6 was confirmed at the beginning of the COVID-19 pandemic, where information voids on causes and treatments for the disease were linked to the spread of false health advice.7 This indicates that authorities were not adequately anticipating and responding to demands for information, data, or the needs of society.8

Similarly, gaps in available data can be exploited to manipulate or misrepresent facts. Here too filling voids is important, chiefly through the timely sharing of open government data. Data dashboards and visualisation tools can facilitate the work of communicators, journalists and civil society actors alike in using and re-using data, and translating it for consumption by wider publics.

By providing easily accessible, clear and timely information and data, governments can more effectively prevent the spread of falsehoods and build credibility.9 It is important to note that making information and data available on government websites or portals only goes part of the way in filling information voids. It is necessary to amplify the reach of these facts, by disseminating them across multiple channels and encouraging media and stakeholders to refer to them, for example.

Conducting advanced audience and behavioural insights

Gathering and analysing data on audience groups is especially important for devising effective communication, both pre-emptive and reactive, against mis- and disinformation.

Not all groups in society are equally at risk of being exposed to false content, and not all are equally vulnerable to believing it.10 Indeed, research indicates that age, ethnicity, and

5 Specialist platforms with advanced analytical capabilities and AI-based tools are emerging in this space. For example

see Graphika, https://graphika.com/how-it-works.

6 OECD Survey “Understanding Public Communication” (2020); written submission of the Government of Spain to the

OECD Secretariat (2020).

7 Brennen et al. (2020) “Types, sources, and claims of COVID-19 misinformation”, Reuters Institute

(https://reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation).

8 Lovari et al. (2020), “Re-connecting Voices. The (New) Strategic Role of Public Sector Communication After Covid-

19 Crisis” Partecipazione e Conflitto, Vol 13, N. 2, pp. 970-989.

9 Southwell, B., Thorson, E. A. & Sheble, L. (2017), Misinformation and Mass Audiences. Austin, TX: University of

Texas Press; and Gigerenzer, G., Gaissmaier, W., Kurz-Milcke., E., Schwartz L.M., & Woloshin S. (2007). Helping

Doctors and Patients Make Sense of Health Statistics. Psychological Science in the Public Interest, 8,53-96.

10 Nielsen et al. (2020), “Communications in the coronavirus crisis: lessons for the second wave”, Reuters Institute, 27

October 2020 (https://reutersinstitute.politics.ox.ac.uk/communications-coronavirus-crisis-lessons-second-wave)


political orientation are among the predictors for engaging with misinformation.11 Factors such as news avoidance and low trust can also impact specific segments of society12, requiring novel approaches.

Behavioural insights (or BI) are particularly useful to understand the mechanics of how people interact with content and how misleading information can stick, as well as the potential effectiveness of different communication approaches. Cognitive and psychological factors such as information overload, confirmation biases and a tendency to believe information that is repeated13 all impact the design of communications.

Behavioural experiments are an increasingly common way of gaining insights on the types of content and format that may be most effective in changing attitudes and nudging behaviours (for instance discouraging the careless sharing of content by social media users).

Aggregate-level insights can be gathered through online platforms’ data and analytics, as well as more traditional tools such as polling and focus groups. It is important to have frequent and up-to-date insights on patterns in information consumption, perceptions, and demographic factors that can support more precise targeting of responses. Such efforts can assist with targeting content, especially paid posts, to specific publics based on a nuanced understanding of their perceptions and needs. In turn, this can allow for a more citizen-centric and effective approach to communication.
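
As a purely illustrative Python sketch of how such an experiment might be analysed, the snippet below compares engagement rates between two hypothetical message variants using a standard two-proportion z-test; the variant framing and the counts are invented for the example.

from math import sqrt, erfc

def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int):
    """Two-sided z-test for the difference between two engagement rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value under the normal approximation
    return p_a, p_b, z, p_value

# Invented example: 5 000 users saw each variant; variant B paired facts with a personal story
rate_a, rate_b, z, p = two_proportion_z(420, 5000, 510, 5000)
print(f"variant A: {rate_a:.1%}  variant B: {rate_b:.1%}  z = {z:.2f}  p = {p:.4f}")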

Debunking false content

Debunking, or rebutting, a false claim requires speed, accuracy and careful consideration of the context. One factor in deciding whether debunking a falsehood is appropriate is its level of engagement (i.e. evidence that it is gaining traction with users and being shared), rather than simple counts of views.

Debunking should be applied strategically and guided by the prevention of harm. Considering where in the “life-cycle”14 of media manipulation the misinformation falls will be important for targeting responses appropriately. Establishing and regularly recalibrating guiding thresholds for debunking can facilitate decision-making in this respect.

A growing body of behavioural science and cognitive psychology literature indicates that content to debunk a falsehood should include the following elements: stating the accurate fact upfront, noting what was incorrect and explaining why, reinforcing the fact.15 This can

11 See for example, Grinberg et al. (2019) “Fake news on Twitter during the 2016 U.S. presidential election”, Science,

Vol. 363, Issue 6425: 374-378.

12 Nielsen et al. (2020), “Communications in the coronavirus crisis: lessons for the second wave”, Reuters Institute,

27 October 2020 (https://reutersinstitute.politics.ox.ac.uk/communications-coronavirus-crisis-lessons-second-wave).

13 Related literature and evidence is summarised in Shane, T. (2020) “The psychology of misinformation: Why we’re

vulnerable”, First Draft, 30 June 2020 (https://firstdraftnews.org/latest/the-psychology-of-misinformation-why-were-

vulnerable/).

14 Donovan, Joan (2020) “The Life Cycle of Media Manipulation,” The Verification Handbook 3, 2020.

https://datajournalism.com/read/handbook/verification-3/investigating-disinformation-and-media-manipulation/the-

lifecycle-of-media-manipulation and Donovan, Joan; Friedberg, Brian; Lim, Gabrielle; Leaver, Nicole; Nilsen, Jennifer;

and Dreyfuss, Emily, (2021), Mitigating Medical Misinformation: A Whole-of-Society Approach to Countering Spam,

Scams, and Hoaxes, https://mediamanipulation.org/sites/default/files/2021-03/Mitigating-Medical-Misinformation-

March-29-2021_0.pdf.

15 For more evidence on constructing effective debunking messaging see Chan et al. (2017), “Debunking: A Meta-

Analysis of the Psychological Efficacy of Messages Countering Misinformation” Psychol Sci. 28(11): 1531–1546

doi: 10.1177/0956797617714579.


be more effective when integrated into existing communication strategies and messaging on relevant topics.
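
As a purely illustrative Python sketch of that structure, the snippet below assembles a message that leads with the fact, names the falsehood once, explains why it is wrong and then reinforces the fact; the wording and example claims are placeholders, not official guidance or messaging.

def build_debunk(fact: str, falsehood: str, explanation: str) -> str:
    """Assemble a fact -> falsehood -> explanation -> fact message, as described above."""
    return (
        f"{fact}.\n"                                          # lead with the accurate fact
        f"A false claim is circulating that {falsehood}.\n"   # name the falsehood once
        f"This is incorrect because {explanation}.\n"         # explain why it is wrong
        f"Remember: {fact}."                                  # reinforce the fact
    )

# Hypothetical example content, not official messaging
print(build_debunk(
    fact="approved vaccines completed full clinical trials",
    falsehood="trial phases were skipped",
    explanation="phases ran in parallel without omitting any safety step",
))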

Pre-bunking

Pre-bunking is an approach that consists of inoculating the public against potential mis- and disinformation. At its core, pre-bunking is about “warning people of the possibility of being exposed to manipulative misinformation, combined with training them in advance on how to counter-argue if they did encounter it,” with the idea that such activities will reduce susceptibility to misinformation.16

Pre-bunking has been applied in the context of dedicated games and is now being adopted by social media platforms to warn users about false claims they might encounter on posts relating to topics vulnerable to misinformation.17

The same techniques can be integrated in proactive communication, by noting common deceptive tactics (without re-stating any rumours) and pre-emptively stating facts and arguments, similarly to the guidance on debunking. Pre-bunking activities are strengthened if implemented in tandem with filling information voids (see above).

Inclusive-minded content design and delivery

Specific efforts are necessary to ensure communication responses reach everyone, and are particularly tailored to groups that are less likely to be exposed to or trust official information.

Drawing on advanced audience insights (see above), communicators can devise and test communication approaches specific to hard-to-reach groups and underserved communities. In the case of the COVID-19 response, this has often involved bringing in credible and trusted messengers, whether members of a given community, scientists and doctors, or influencers.

The use of alternative channels is also an important practice to support greater inclusion. Beyond leveraging digital channels such as social media, and especially their targeted advertising functions, communicators can also consider placing their messages in non-traditional and even “hostile” outlets.

As COVID-19 demonstrated, overlooking some groups can have grave downsides in efforts to contain the pandemic. Working to ensure everyone is reached with verified information and advice is essential to mitigating the negative effects of viral mis- and disinformation while fostering a more cohesive society.

Leveraging trusted messengers from across society

Government communication channels, on their own, have their limitations in terms of public reach and trust, and are often not the most effective or persuasive messengers. Their perceived association with a given political figure or party can change how trusted institutions are, particularly amid a climate of increasing polarisation of policy debates (see below).

In the context of immunizations, for example, research shows that emphasizing medical consensus about vaccine safety is “likely to be an effective pro-vaccine message.” To that end, clinicians and public health officials can play a critical role in highlighting and communicating the “high degree of medical consensus” on vaccine safety18 to expand

16 Jon Roozenbeek and Sander van der Linden (2021), “Don’t Just Debunk, Prebunk: Inoculate Yourself Against Digital

Misinformation”, Character & Context Blog, Society for Personality and Social Psychology (https://www.spsp.org/news-

center/blog/roozenbeek-van-der-linden-resisting-digital-misinformation) .

17 Ingram, David (2020), “Twitter launches 'pre-bunks' to get ahead of voting misinformation”, NBC News, 26 October

2020 (https://www.nbcnews.com/tech/tech-news/twitter-launches-pre-bunks-get-ahead-voting-misinformation-

n1244777)

18 van der Linden, S.L., Clarke, C.E. & Maibach, E.W. Highlighting consensus among medical scientists increases

public support for vaccines: evidence from a randomized experiment. BMC Public Health 15, 1207 (2015).

https://doi.org/10.1186/s12889-015-2541-4.


beyond traditional messengers and help ensure the public has confidence in the source of information.

Through audience insights, communicators can gather a better understanding of which types of spokespeople or messengers are trusted by different segments of the population. By partnering with them in a transparent and open way, and empowering them to communicate shared messages, governments can amplify their reach even among disengaged publics.

Increasing visibility of verified information

In competing for audiences’ finite attention span, public communicators can work to ensure official and verifiable content is more visible on online platforms. More specifically, this means ensuring the prominent placement of such content on screens and in social media feeds.

This includes both improvements to the ranking of official webpages through Search Engine Optimisation (SEO), and the paid promotion of official social media pages and posts to target publics, powered by advanced analytics.

Similarly, social media platforms, traditional news outlets and government websites alike have been including banners or other highly visible sections of their pages linking to dedicated areas where accurate and up-to-date information on current issues subject to rumours – whether the COVID-19 pandemic or elections – is provided.

Communicating openly about uncertainty

Uncertainty about what is true, or about how accurate estimates are with regard to a policy issue (e.g. COVID-19 or climate change), can often be a reason for withholding information. Conversely, fears that presenting information as uncertain could undermine people’s trust in it have encouraged officials and experts to project certainty about unknown facts.

Research has increasingly dismissed the notion that presenting evidence as uncertain or being open about what is not yet known leads to a loss in trust.19 Moreover, withholding information, however tentative, on less well-known subjects opens the door to rumours and speculation, as documented widely during the pandemic and other crises. When it comes to scientific matters, the problem is especially acute.

Communicating uncertainty is therefore important and necessary to prevent and curb misinformation. In this respect, communicators ought to be transparent about the degree of certainty about any claims, levels of risk, and margins of error in any data presented, including open government data.

Communicating effectively under uncertainty is likewise essential to maintain trust. Indeed, openness, honesty and good intentions, alongside expertise, are drivers of trustworthiness identified in the literature.20 Conversely, both omissions and projections of false certainty tend to be perceived as malicious.

Public officials and communicators should therefore strive to be open about motivations, conflicts and limitations, not only with regards to uncertain facts or events, but in all communications.

Balancing data and facts with emotional and real-life storytelling

Behavioural science research has highlighted that people can differ considerably in terms of the content they find compelling or persuasive.

While policy and scientific discussions tend to rely on argumentation through reasoning, statistics and facts, these can often fail to resonate with some publics. In such

19 Anne Marthe van der Bles, Sander van der Linden, Alexandra L. J. Freeman, David J. Spiegelhalter (2020), “The

effects of communicating uncertainty on public trust in facts and numbers”, Proceedings of the National Academy of

Sciences , 117 (14) 7672-7683; https://doi.org/10.1073/pnas.1913678117.

20 White, M. P. & Eiser, J. R. in Trust in Risk Management: Uncertainty and Scepticism in the Public Mind. Ch. 4, 95–

117 (Taylor & Francis, 2010).


cases, simple facts are often not sufficient to persuade and motivate desired behaviour changes.

For this reason, communication practitioners have long learned to combine reason with emotion, and facts with storytelling. Notably, approaching the subject of vaccinations in terms of the positive social norm and the protection of one’s community can be a powerful addition to highlighting their safety. Personal accounts and imagery of others being vaccinated can similarly resonate more than statistics.21

Indeed, emotions and storytelling are key factors exploited in disinformation campaigns, which often draw on unverifiable or distorted anecdotes to sow fear and distrust, whether towards vaccines, migrants, or other issues.

Annex II – Institutional frameworks

Good practice | Description

Adopting relevant definitions

Defining the problem to solve is an essential start to developing targeted and consistent responses. In the context of information disorders22 there are multiple key terms to describe diverse challenges, with the most commonly used terms being misinformation and disinformation.

Adopting official definitions and using them consistently across official materials can help institutionalise approaches and ensure they accurately address the multiple causes and manifestations of the problem. However, as of 2020 just over half (54%) of Centres of Government surveyed by the OECD had adopted a definition of at least one of the terms

“disinformation”, “misinformation”, or “malinformation”.23

In its work, the OECD has drawn on the following definitions used by Wardle and

Derakhshan:24

Misinformation: “when false information is shared, but no harm is meant”. This consists typically of rumour or misleading content shared unknowingly by individuals.

Disinformation: “when false information is knowingly shared to cause harm”. Disinformation can be commonly traced back to actors or movements with malicious motives and can be part of concerted large-scale campaigns.

Malinformation: “when genuine information is shared to cause harm, often by moving what was designed to stay private into the public sphere”. This occurs for instance, in the case of leaks.

21 Lewandowsky, S., Cook, J., Schmid, P., Holford, D. L., Finn, A., Leask, J., Thomson, A., Lombardi, D., Al-Rawi, A.

K., Amazeen, M. A., Anderson, E. C., Armaos, K. D., Betsch, C., Bruns, H. H. B., Ecker, U. K. H., Gavaruzzi, T., Hahn,

U., Herzog, S., Juanchich, M., Kendeou, P., Newman, E. J., Pennycook, G., Rapp, D. N., Sah, S., Sinatra, G. M.,

Tapper, K., Vraga, E. K (2021). The COVID-19 Vaccine Communication Handbook. A practical guide for improving

vaccine communication and fighting misinformation. Available at: https://sks.to/c19vax.

22 See Wardle, C. (2020) “Understanding Information disorder”, First Draft, 22 September 2020, available at

https://firstdraftnews.org/long-form-article/understanding-information-disorder/

23 OECD (forthcoming 2021), Public Communication: the Global Context and the Way Forward, OECD Public

Governance Reviews, OECD Publishing, Paris.

24 Wardle, C. & Derakhshan, H. (2017), Information Disorder: Toward an interdisciplinary framework for research and

policy making, Council of Europe report, DGI (2017) 09.


Adoption of dedicated policy frameworks, strategies, toolkits to guide responses

Building and consolidating responses to mis- and disinformation requires defining the approach to – and scope of – the interventions to implement in a way that gives clear guidance on measurable objectives, roles and responsibilities, and specific actions to take in the short, medium and long-term.

In practice, this can take the form of a framework, a strategy or programme, or policy and legislation. Although many countries are moving in this direction, OECD survey data from 2020 revealed that only 38% of Centres of Government that responded had developed such

a document, and these were often specific to a single agency.25

Due to the cross-cutting and multi-disciplinary nature of this challenge, there is an important case for formal guiding documents that encompass all dimensions of the response across platform governance, communication, media and information literacy, and beyond (see below). Besides the framework proposed in the OECD’s 2020 working paper on the

subject,26 actors including the European Union have adopted integrated policy approaches

that span multiple key areas of intervention.

Formalising policies and approaches offers the benefit of shifting from an ad hoc and fragmented approach to counteracting mis- and disinformation to a structured one with a longer-term horizon and a mandate to allocate resources to this endeavour. It provides clarity of purpose and can set concrete metrics for measuring impact (see below).

Establishing dedicated structures to support whole-of-government capacity

Structures, staff and resources ought to be in place to ensure all government departments at all levels are equipped to respond rapidly and effectively, and that counter-misinformation efforts are mainstreamed across government.

Presently, responsibilities for tackling this challenge can often be dispersed and fall on departments and teams in addition to their existing function. In some cases, specialised units exist within a specific ministry or agency but do not serve the rest of the government.

Building capacity horizontally across departments can be demanding and difficult to attain, at least in the near-term. For this reason, some countries have centralised counter-disinformation capacity under dedicated structures that are designed to provide support to all government institutions.

A degree of centralisation of resources, in the form of a dedicated unit or agency, can help build the right expertise and capacity efficiently and put it at the service of all other parts of government.

Effective structures may combine some specialised central structures with a measure of horizontal co-ordination to ensure a concerted effort that draws on both technical expertise on disinformation and issue-specific knowledge (e.g. in public health), and avoids gaps in implementation.

Introducing diverse and specialised skillsets and providing training

The rapidly changing technologies and challenges facing civil servants tasked with tackling misinformation require a focus on introducing and maintaining specialised skillsets. Through simulations, training and joint exercises, civil servants can gain the flexibility to adapt to evolving needs and help share expertise across the government.

Advanced competencies in data science are increasingly central to establishing sophisticated methods that make tracking and analysing content, or spotting co-ordinated inauthentic behaviour and illicit adverts more manageable at scale. They are also key for developing digital tools that support continuous monitoring and specific actions.

Specialised communication competencies for pre-bunking, filling information voids, or debunking similarly require dedicated training, as well as the application of behavioural science-based interventions and related expertise.

25 OECD (forthcoming), Public Communication: the Global Context and the Way Forward, OECD Publishing, Paris.

26 Matasick, Alfonsi and Bellantoni (2020).


Preparedness for election-related mis- and disinformation is likewise a growing priority, requiring training for a wide range of relevant actors tasked with ensuring the integrity of electoral processes.

Measuring and evaluating the impact and reach of responses

As noted in the forthcoming OECD International Report on Public Communication, there is a need for improved evaluation mechanisms to understand the impact of communication initiatives, for example on vaccine confidence, to inform future courses of action, reflect on lessons learnt, and build resilience for future crises.

Systematically generating and collecting data to measure public communication initiatives, including through media monitoring, reviews of social media impressions and other online analytics, is a useful practice. Considerations include how to measure both the reach of messages and their effect on policy goals (for example, vaccine uptake, social distancing or mask wearing). Evaluation should use a range of such data to assess results and enable the incorporation of lessons learned into the decision-making process.
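
As a purely illustrative Python sketch, the snippet below consolidates basic reach and engagement indicators across channels into a simple summary that could feed into such an evaluation; the channel names and figures are invented, and real evaluations would combine outputs of this kind with survey and behavioural data.

from typing import Dict

# Hypothetical channels and figures for illustration only
campaign_metrics: Dict[str, Dict[str, int]] = {
    "press_briefing":  {"impressions": 120_000, "engagements": 3_400},
    "social_media":    {"impressions": 850_000, "engagements": 27_000},
    "partner_outlets": {"impressions": 300_000, "engagements": 9_500},
}

def summarise(metrics: Dict[str, Dict[str, int]]) -> None:
    """Print each channel's share of total reach and its engagement rate."""
    total_reach = sum(m["impressions"] for m in metrics.values())
    for channel, m in metrics.items():
        share = m["impressions"] / total_reach
        rate = m["engagements"] / m["impressions"]
        print(f"{channel:16s} reach share {share:6.1%}   engagement rate {rate:6.2%}")

summarise(campaign_metrics)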

Distancing political and institutional communication and mitigating risks of interference

Politicians, political actors and parties do not always command the same level of trust across all groups in society, and can alienate segments of the public that have diverging political views.27 Institutions and civil servants, on the other hand, tend to be perceived as more impartial (see good practice in “Institutional frameworks” below).

It is therefore important to make explicit distinctions between content of a political versus institutional nature, put in place mechanisms to insulate public communication institutions from political pressure, and to clarify the separation between the two functions in relevant documents and structures. Setting clear standards, directives, guidance or procedures on managing the distinction between the political and institutional realms of communications is a useful and necessary practice to ensure this key function is conducted in the interest and service of the public.

Keeping institutional platforms and social media handles clear of political or potentially politicised content, developing clear guidelines for appropriate use and management, and establishing clear authorship and branding are other mechanisms to facilitate this distinction.

Ultimately, it is likely impossible to completely separate politics from communications on institutional platforms. However, actions that limit association of these channels with specific political ideologies can make them more valuable assets in combating misinformation amid increasingly polarised information ecosystems.

Establishing cross-government co-ordination mechanisms

Establishing effective co-ordination mechanisms within governments is an important tool to help combat false information. Governments should therefore identify mechanisms and structures to build relationships across government.

Creating multi-faceted public communication teams would bring different perspectives to the design and implementation of communication initiatives, as well as help facilitate rapid responses to crises and misleading or incorrect information concerning specific topics, such as health, elections, national and cyber security, etc.

Building channels for international co-operation and intelligence-sharing

The lack of clarity on problems and solutions, combined with the complex, global and rapidly evolving nature of the problems faced, calls for a more conscious effort to facilitate collaboration between various actors. As noted by the European Commission, “the best responses are likely to be those driven by multi-stakeholder collaborations”.28

Co-ordinated and comprehensive approaches will support efforts to make society more resilient to disinformation. Because these challenges most often require responses that involve a mix of governments, CSOs, the public, and private companies, initiatives that bring together a wide range of actors can play a critical role in raising awareness, sharing knowledge and information, and collecting data. Such efforts can also promote agile and effective collaboration across policy and communications departments.

27 Edelman (2021), Edelman Trust Barometer 2021, https://www.edelman.com/sites/g/files/aatuss191/files/2021-03/2021%20Edelman%20Trust%20Barometer.pdf.

28 European Commission (2018a), A multi-dimensional approach to disinformation, High-level Group on fake news and online disinformation, Directorate-General for Communication Networks, Content and Technology, March 2018.



Annex III – Enabling ecosystems

Good practice: Investing to enhance media and digital literacy across society

Ultimately, individuals need to be able to differentiate between accurate and false or misleading information. Promoting media and digital literacy is, therefore, a key policy response to tackling the challenges posed by disinformation and taking advantage of the potential benefits of online and social platforms.

Media literacy refers to citizens’ abilities to analyse, evaluate and create content. Media literacy efforts seek to empower the public to become critical consumers of news to help ensure the media can fulfil its role to improve democratic governance.29 Digital literacy, on the other hand, refers to individuals’ abilities to engage with content in digital formats, as well as a basic understanding of common digital technologies.

To that end, long-term efforts to strengthen media and digital literacy will be useful in building resilient societies that can fight back against mis- and disinformation.

Good practice: Providing support to traditional, public service, local and community media

Supporting the operation of high-quality, public service, local, community and independent media can help support a media and information ecosystem conducive to democratic engagement. Despite ever increasing opportunities for the public to engage with news and information online, many traditional media markets are shrinking rapidly. The challenges faced by many media outlets are a particular concern given the role media pluralism plays in supporting well-functioning democracy, good governance and reduced corruption.

Counteracting the threats to trusted local, independent and/or public service news sources may include identifying independent, non-partisan and effective measures to support local media. Governments could also support initiatives, both domestically and via international development support, that provide training to citizen journalists and to traditional outlets on how to manage public engagement to foster participation in news production through citizen and community journalism.

Clear and independent governance structures can maintain impartiality and help ensure a proactive and effective response.

Good practice: Building constructive and transparent relationships with civil society and private sector actors and leveraging external tools, expertise and solutions

The lack of clarity on problems and solutions, as well as the global and rapidly evolving nature of the problems faced, calls for a more conscious effort to facilitate collaboration between various actors. The interlinked nature of responses to mis- and disinformation, furthermore, highlights the need to include expertise from across multiple sectors in discussing potential solutions.

This could include developing constructive and transparent relationships with social media platforms to share information rapidly and effectively. Such co-operation and two-way reporting can help flag and debunk disinformation, remove illegal speech, and triangulate responses across platforms.

Good practice: Funding academic research on mis- and disinformation and facilitating co-operation between stakeholders

To develop effective policies to counteract the threat posed by mis- and disinformation, governments must continue to develop their understanding of the primary challenges and opportunities. For example, developing insights into how mis- and disinformation is shared, its impact on the relationship between citizens and governments, how levels of trust in the key institutions of public life are changing, and the effectiveness of various responses will all serve to inform policy responses.

29 Mcloughlin, Claire and Scott, Zoe (2010), Topic Guide on Communications and Governance, Communication for Governance and Accountability Program (CommGAP), International Development Department, University of Birmingham, May 2010.



In addition to the value of increased direct funding for research, governments can also work to ensure academia, regulatory bodies and other relevant agencies are engaged in conversations about research needs, as well as ensure research findings are incorporated into policy development.

To facilitate public-private access to and sharing of information and data on mis- and disinformation threats and risks, governments could promote the establishment of partnerships and of shared data access and sharing mechanisms. Co-ordinating the sharing of threat information between platforms and governments could be done through, for example, an information sharing and analysis organisation (ISAO) or an information sharing and analysis centre (ISAC).30 In addition, partnerships may include open data initiatives related to mis- and disinformation, as well as closed or secure data access and sharing arrangements between governments, technology companies and independent researchers.
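As a purely illustrative sketch of what a shared record for such exchanges might contain, the Python example below defines a minimal threat-report structure and serialises it to JSON. It is not based on any existing ISAO/ISAC schema; all field names and values are hypothetical.

    from dataclasses import dataclass, asdict, field
    from datetime import datetime, timezone
    import json

    @dataclass
    class DisinfoThreatReport:
        """Hypothetical record for exchanging flagged content between partners."""
        report_id: str
        source_url: str
        narrative: str                  # short description of the false claim
        first_observed: str             # ISO 8601 timestamp
        platforms: list = field(default_factory=list)
        confidence: str = "medium"      # reporting party's own assessment: low / medium / high

    report = DisinfoThreatReport(
        report_id="2021-0042",
        source_url="https://example.com/post/123",
        narrative="False claim that ballots were pre-filled",
        first_observed=datetime(2021, 11, 2, 14, 30, tzinfo=timezone.utc).isoformat(),
        platforms=["platform_a", "platform_b"],
    )

    # Serialise to JSON so the record can be exchanged through an agreed channel.
    print(json.dumps(asdict(report), indent=2))

A common, documented format of this kind is what allows partners to automate ingestion and triage on both sides of the exchange.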

Good practice: Supporting the development of open source tools and use of open government data

Open source tools and open government data can be used to help process and understand global information sources and to obtain insights from the resulting data.

The identification, tracking and monitoring of misinformation sources can help reduce risk of individuals and governments alike using unreliable and inaccurate data for decision-making processes. Such efforts complement good data governance and management for AI systems.

Defining shared information and data standards, including around open government data, can also help build a common public communication narrative, supported by trustworthy and coherent information and data by different public actors.
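By way of illustration, the short Python sketch below checks whether a data record conforms to a hypothetical shared field standard of the kind described above; the required fields and the sample record are invented for the example.

    # A minimal sketch of validating open-data records against a shared field standard.
    REQUIRED_FIELDS = {
        "indicator": str,   # e.g. "weekly_vaccination_rate"
        "region": str,
        "period": str,      # e.g. an ISO week such as "2021-W32"
        "value": float,
        "source": str,      # publishing institution
    }

    def validate_record(record: dict) -> list:
        """Return a list of problems; an empty list means the record meets the standard."""
        problems = []
        for name, expected_type in REQUIRED_FIELDS.items():
            if name not in record:
                problems.append(f"missing field: {name}")
            elif not isinstance(record[name], expected_type):
                problems.append(f"field '{name}' should be of type {expected_type.__name__}")
        return problems

    record = {"indicator": "weekly_vaccination_rate", "region": "Region A",
              "period": "2021-W32", "value": 63.5, "source": "Ministry of Health"}
    print(validate_record(record))   # [] means the record conforms to the shared standard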

Good practice: Conducting civic dialogues and deliberative processes on core issues of concern

Participatory and community-led strategies can be used to help ensure that diverse local voices are heard, that local concerns are understood, and that citizens co-design policies and programmes. Deliberative democracy initiatives, such as citizen juries and assemblies, have the potential to encourage calmer, more evidence-based discussion,31 as well as to provide a pool of informed citizens who can effectively advocate and represent informed positions on divisive topics. With regard to vaccine uptake specifically, such processes aim to build vaccine acceptability and confidence and to overcome cultural, socioeconomic and political barriers that lead to mistrust.32

30 DiResta, Renee (2021), “Dancing in the Dark: Disinformation Researchers Need More Robust Data and Partnerships”, in Jackson, Dean (2021), COVID-19 and the Information Space: Boosting the Democratic Response, National Endowment for Democracy, Global Insights series, Washington, DC; Jaikumar Vijayan, “What is an ISAC or ISAO? How these cyber threat information sharing organizations improve security”, CSO, 9 July 2019, www.csoonline.com/article/3406505/what-is-an-isac-or-isao-how-these-cyber-threat-information-sharing-organizations-improve-security.

31 Suiter, J. (2018), “Deliberation in Action – Ireland’s Abortion Referendum”, Political Insight, Vol. 9/3, pp. 30-32, 1 September 2018, https://doi.org/10.1177/2041905818796576.

32 Burgess, Rochelle Ann; Osborne, Richard H; Yongabi, Kenneth A; Greenhalgh, Trisha; Gurdasani, Deepti; Kang, Gagandeep et al. (2020), “The COVID-19 vaccines rush: participatory community engagement matters more than ever”, The Lancet, 10 December 2020, https://doi.org/10.1016/S0140-6736(20)32642-8.

References


Blastland, M. et al. (2020), “Five rules for evidence communication”, Nature, Vol. 587/7834, pp. 362-364, http://dx.doi.org/10.1038/d41586-020-03189-1. [15]

Ceron, A. (2015), “Internet, News, and Political Trust: The Difference Between Social Media and Online Media Outlets”, Journal of Computer-Mediated Communication, Vol. 20/5, pp. 487-503, http://dx.doi.org/10.1111/jcc4.12129. [5]

Donovan, J. (2020), “The Lifecycle of Media Manipulation”, in DataJournalism.com (ed.), The Verification Handbook 3, https://datajournalism.com/read/handbook/verification-3/investigating-disinformation-and-media-manipulation/the-lifecycle-of-media-manipulation (accessed on 11 June 2021). [9]

Edelman (2021), Edelman Trust Barometer 2021, https://www.edelman.com/sites/g/files/aatuss191/files/2021-03/2021%20Edelman%20Trust%20Barometer.pdf. [7]

Fairbanks, J., K. Plowman and B. Rawlins (2007), “Transparency in Government Communication”, Journal of Public Affairs, Vol. 7/1, pp. 23-37, http://dx.doi.org/10.1002/pa.245. [16]

Fletcher, R. et al. (2020), Information inequality in the UK coronavirus communications crisis, Reuters Institute for the Study of Journalism, https://reutersinstitute.politics.ox.ac.uk/information-inequality-uk-coronavirus-communications-crisis#sub5 (accessed on 11 June 2021). [8]

Heise, J. (1985), “Toward closing the confidence gap: an alternative approach to communication between public and government”, Public Administration Quarterly, Vol. 9/2, pp. 196-217, https://www.jstor.org/stable/pdf/40861057 (accessed on 11 June 2021). [17]

Hyland-Wood, B. et al. (2021), “Toward effective government communication strategies in the era of COVID-19”, Humanities and Social Sciences Communications, Vol. 8/1, http://dx.doi.org/10.1057/s41599-020-00701-w. [11]

Kavanagh, J. (2018), Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life, RAND Corporation, https://doi.org/10.7249/RR2314. [3]

Klein, E. and J. Robison (2019), “Like, Post, and Distrust? How Social Media Use Affects Trust in Government”, Political Communication, Vol. 37/1, pp. 46-64, http://dx.doi.org/10.1080/10584609.2019.1661891. [6]

Lewandosky, S. et al. (2020), The Debunking Handbook 2020, http://dx.doi.org/10.17910/b7.1182. [2]

Matasick, C., C. Alfonsi and A. Bellantoni (2020), “Governance responses to disinformation: How open government principles can inform policy options”, OECD Working Papers on Public Governance, No. 39, OECD Publishing, Paris, https://dx.doi.org/10.1787/d6237c85-en. [12]

OECD (2020), Transparency, communication and trust: The role of public communication in responding to the wave of disinformation about the new coronavirus, https://read.oecd-ilibrary.org/view/?ref=135_135220-cvba4lq3ru&title=Transparency-communication-and-trust-The-role-of-public-communication-in-responding-to-the-wave-of-disinformation-about-the-new-coronavirus (accessed on 11 December 2020). [4]

OECD (2017), Trust and Public Policy: How Better Governance Can Help Rebuild Public Trust, OECD Public Governance Reviews, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264268920-en. [10]

Privor-Dumm, L. and T. King (2020), “Community-based Strategies to Engage Pastors Can Help Address Vaccine Hesitancy and Health Disparities in Black Communities”, Journal of Health Communication, Vol. 25/10, pp. 827-830, http://dx.doi.org/10.1080/10810730.2021.1873463. [14]

Smith, R. (2019), Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All, Bloomsbury, London, https://www.bloomsbury.com/uk/rage-inside-the-machine-9781472963888/ (accessed on 11 June 2021). [1]

WHO (2020), Managing the COVID-19 infodemic, World Health Organization, Geneva. [13]