Glossary

Key terms and concepts used throughout EMOD modules.

Platform & Governance (3)

Algorithmic Amplification

When platform algorithms increase the visibility and reach of content. Engagement-optimising algorithms can inadvertently amplify disinformation because it often generates strong reactions.
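
As a rough sketch of the mechanism (the scoring weights and field names below are invented for illustration, not any platform's actual formula), ranking purely by engagement pushes reaction-heavy content to the top regardless of accuracy:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> int:
    # Illustrative weights: shares and comments signal "strong reactions"
    # and count for more than passive likes.
    return post.likes + 3 * post.comments + 5 * post.shares

posts = [
    Post("Measured policy analysis", likes=120, shares=4, comments=10),
    Post("Outrage-bait false claim", likes=40, shares=60, comments=90),
]

# Sorting purely by engagement puts the reaction-heavy post first,
# regardless of its accuracy.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(engagement_score(post), post.text)
```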

Content Moderation

The practice of monitoring and applying rules to user-generated content on platforms. Includes removing violating content, labelling misleading information, and reducing distribution.

Platform Governance

The rules, policies, and practices platforms use to manage content and user behaviour. Includes terms of service, content policies, and enforcement mechanisms.

Actors & Tactics (6)

Astroturfing

Creating the false impression of grassroots support for a position. Named after AstroTurf (a brand of artificial grass), the term describes organised campaigns that disguise their true origin and funding.

Related: CIB, Troll Farm

Bot

An automated account that posts or interacts on social media without human intervention. Can be used to artificially amplify content, create false impressions of popularity, or harass users.

Related: CIB, Troll Farm

Coordinated Inauthentic Behaviour (CIB)

When groups of accounts work together to mislead others about who they are or what they're doing. Often involves fake accounts, bots, or real people being paid to pose as grassroots supporters.
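
A common starting heuristic for spotting coordination, sketched below with toy data and arbitrary thresholds, is to flag identical text posted by several distinct accounts within a short time window:

```python
from collections import defaultdict

# (account, text, timestamp in seconds) -- toy data for illustration
posts = [
    ("acct_a", "Candidate X is a fraud!", 1000),
    ("acct_b", "Candidate X is a fraud!", 1004),
    ("acct_c", "Candidate X is a fraud!", 1007),
    ("acct_d", "Lovely weather today", 1010),
]

def find_coordinated_groups(posts, window=30, min_accounts=3):
    """Group posts by identical text; flag texts posted by many
    distinct accounts within a short time window."""
    by_text = defaultdict(list)
    for account, text, ts in posts:
        by_text[text].append((account, ts))
    flagged = {}
    for text, entries in by_text.items():
        times = [ts for _, ts in entries]
        accounts = {acct for acct, _ in entries}
        if len(accounts) >= min_accounts and max(times) - min(times) <= window:
            flagged[text] = sorted(accounts)
    return flagged

print(find_coordinated_groups(posts))
# {'Candidate X is a fraud!': ['acct_a', 'acct_b', 'acct_c']}
```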

Influence Operation

A coordinated effort to affect the opinions, attitudes, or behaviour of a target audience. Can include disinformation, propaganda, and other manipulation techniques.

Related: FIMI, CIB, Propaganda

Propaganda

Information, especially of a biased or misleading nature, used to promote a political cause or point of view. Unlike disinformation, propaganda may contain true information, but it is presented to serve a specific agenda.

Troll Farm

An organised group that creates and operates fake social media accounts to spread disinformation or influence public opinion. Often state-sponsored or commercially operated.

Related: CIB, Bot, Astroturfing

Detection & Analysis (4)

Attribution

The process of identifying who is responsible for an information operation. Attribution is difficult and requires strong evidence; overclaiming it undermines credibility.

Fact-checking

The process of verifying claims and statements to determine their accuracy. Professional fact-checkers follow standardised methodologies and transparency practices.

Open Source Intelligence (OSINT)

Intelligence gathered from publicly available sources such as social media, websites, public records, and media. A key method for investigating disinformation campaigns.
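
As a minimal illustration of OSINT tooling, the sketch below uses only Python's standard library to fetch a public page and read its Open Graph metadata, such as a claimed publication date. The URL is a placeholder, and real investigations combine many such signals:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaExtractor(HTMLParser):
    """Collect <meta property=... content=...> tags, e.g. Open Graph
    fields such as article:published_time."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            key = attrs.get("property") or attrs.get("name")
            if key and attrs.get("content") is not None:
                self.meta[key] = attrs["content"]

url = "https://example.com/"  # placeholder page, not a real target
html = urlopen(url).read().decode("utf-8", errors="replace")

parser = MetaExtractor()
parser.feed(html)
print(parser.meta.get("article:published_time", "no publication date found"))
```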

Verification

The process of confirming whether information, images, or videos are authentic and accurately represent what they claim. A core practice for journalists and researchers.

Psychology (4)

Backfire Effect

When correcting misinformation causes people to believe it more strongly. Research suggests this is rarer than commonly claimed, but it can occur with identity-linked beliefs.

Confirmation Bias

The tendency to search for, interpret, and recall information in a way that confirms one's pre-existing beliefs. Makes people vulnerable to disinformation that aligns with their worldview.

Echo Chamber

An environment where people only encounter information and opinions that reinforce their existing beliefs. Often involves active exclusion of contrary views.

Filter Bubble

The intellectual isolation that can occur when algorithms selectively present information based on user preferences. Distinct from echo chambers, which rely on social reinforcement rather than algorithmic selection.
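
A minimal sketch of the filtering mechanism (topics and items are invented for illustration): the feed only surfaces topics the user has engaged with before, so contrary material never appears:

```python
# The user's past clicks, by topic (toy data).
user_history = ["climate-skeptic", "climate-skeptic", "sports"]

# Candidate items as (topic, headline) pairs.
candidate_items = [
    ("climate-skeptic", "Another reason to doubt the data"),
    ("climate-science", "What the measurements actually show"),
    ("sports", "Match report"),
]

# Only topics already in the user's history make it into the feed.
seen_topics = set(user_history)
personalised_feed = [item for item in candidate_items if item[0] in seen_topics]

print(personalised_feed)
# The "climate-science" item is filtered out: the isolation comes from
# the algorithm, not from the user's social circle.
```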

Frameworks (5)

Debunking

Correcting false information after it has spread. Also called 'reactive fact-checking'. Effective against misinformation but often insufficient against disinformation, where the intent is to deceive.

DIM

DROG Intervention Menu. A systematic framework for categorising and selecting responses to information manipulation. Includes Gen 2 (debunking), Gen 3 (prebunking), Gen 4 (moderation), and Gen 5 (interaction design).

Inoculation Theory

The psychological principle that exposure to weakened forms of persuasive attacks can build resistance to future manipulation, similar to how vaccines work. The basis for prebunking approaches.

Related: Prebunking, DIM

Prebunking

Proactively warning people about manipulation techniques before they encounter them. Based on inoculation theory: exposing people to weakened forms of manipulation builds resistance.

TTF

Transaction Table Framework. A method for mapping information ecosystems by identifying who gives what to get what, and how. Used to understand the economic incentives behind information manipulation.

Related: Disinfonomics, DIM
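
One way to picture a transaction table is as a list of simple records. The rows below are invented examples of the 'who gives what to get what, and how' framing, not taken from the framework's own materials:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    actor: str   # who
    gives: str   # what they provide
    gets: str    # what they receive in return
    how: str     # the mechanism of exchange

# Illustrative rows for a hypothetical clickbait disinformation site.
table = [
    Transaction("Clickbait site", gives="sensational false stories",
                gets="ad revenue", how="programmatic advertising"),
    Transaction("Ad network", gives="ad payments",
                gets="audience attention", how="automated ad placement"),
    Transaction("Reader", gives="attention and shares",
                gets="emotional engagement", how="social media feeds"),
]

for t in table:
    print(f"{t.actor}: gives {t.gives} -> gets {t.gets} (via {t.how})")
```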

AI & Technology (3)

Deepfake

Synthetic media where a person's likeness is replaced with someone else's using AI. Can create convincing fake videos of people saying or doing things they never did.

Hybrid Threats

Threats that combine conventional and unconventional methods, including disinformation, cyberattacks, economic pressure, and military intimidation. Often employed by state actors.

Synthetic Media

Media (images, video, audio, text) that is artificially generated or manipulated using AI. Includes deepfakes, AI-generated images, and text produced by large language models.

Core Concepts (5)

Disinformation

False information deliberately created or spread with the intent to deceive. The key distinction from misinformation is the intentional nature of the deception.

FIMI

Foreign Information Manipulation and Interference. A pattern of behaviour that threatens or negatively impacts democratic processes, typically involving foreign state or non-state actors attempting to influence public opinion or political decisions.

Information Disorder

An umbrella term covering the spectrum of mis-, dis-, and malinformation. Coined by Claire Wardle and Hossein Derakhshan to describe the polluted information environment.

Malinformation

True information shared with the intent to cause harm. This includes private information exposed to damage someone, or accurate facts presented out of context to mislead.

Misinformation

False information shared without intent to deceive. The person spreading it believes it to be true. While unintentional, misinformation can still cause harm.
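
The three terms above can be read as a rough two-axis taxonomy of falsity and intent. The sketch below collapses 'intent to deceive' and 'intent to harm' into a single axis, which is a simplification of the definitions given here:

```python
def classify(is_false: bool, harmful_intent: bool) -> str:
    """Toy classifier for the mis-/dis-/malinformation distinction:
    falsity and intent are the two axes used in the definitions above."""
    if is_false and harmful_intent:
        return "disinformation"   # false + deliberate deception
    if is_false:
        return "misinformation"   # false, but shared in good faith
    if harmful_intent:
        return "malinformation"   # true, but weaponised to cause harm
    return "information"

print(classify(is_false=True, harmful_intent=False))   # misinformation
print(classify(is_false=False, harmful_intent=True))   # malinformation
```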
