Active Measures campaigns are far from spontaneous. These state-sponsored operations are meticulously planned and follow a sequence of pre-determined steps designed to destabilize societies, manipulate perceptions, and erode trust. However, their structured nature provides opportunities for defenders to intervene and dismantle them.
This understanding underpins the Active Measures Kill Chain—a strategic framework for countering information warfare. By mapping the attacker’s methodology, defenders can pinpoint vulnerabilities, disrupt operations, and neutralize the threat before it takes root. Central to this effort is the OODA loop (Observe, Orient, Decide, Act), a dynamic decision-making model that aligns naturally with the kill chain:
- Observe: Gather intelligence to identify the campaign’s origins, objectives, and methods. For example, detect coordinated social media disinformation campaigns or monitor the spread of false narratives.
- Orient: Analyze the gathered data to understand the intent, targets, and potential impact of the campaign. This step includes discerning the psychological and cultural levers being exploited by adversaries, such as polarizing issues or societal fears.
- Decide: Develop a strategy to counter the attack. This could involve amplifying credible sources, deploying counter-messaging, or discrediting the operation by exposing its origins.
- Act: Implement the chosen strategy to disrupt the campaign at its weakest points. Actions might include removing fake accounts, alerting the public to the disinformation effort, or leveraging legal and diplomatic measures to hold perpetrators accountable.
Integrating the OODA loop into the kill chain allows defenders to adapt rapidly and stay ahead of the attacker. It transforms defense into a fluid and iterative process, enabling faster recognition of threats and decisive actions that cut through the noise of disinformation.
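To make this iterative character concrete, below is a minimal Python sketch of a single OODA pass, with each stage injected as a function so that collectors, analytic models, and response playbooks can be swapped independently. The Assessment fields and the stage callables are hypothetical placeholders for illustration, not an actual platform or agency API.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Assessment:
    campaign_id: str
    confidence: float  # hypothetical 0-1 score: likelihood of coordination
    severity: float    # hypothetical estimate of reach and impact

def ooda_cycle(observe: Callable[[], list],
               orient: Callable[[list], Assessment],
               decide: Callable[[Assessment], str],
               act: Callable[[str], None]) -> None:
    signals = observe()            # Observe: gather raw platform signals
    assessment = orient(signals)   # Orient: score intent, targets, impact
    playbook = decide(assessment)  # Decide: pick a named counter-strategy
    act(playbook)                  # Act: execute; results feed the next pass
```

Running this cycle repeatedly, with each Act feeding the next Observe, is what turns defense into the fluid, adaptive process described above.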
The Active Measures Kill Chain, augmented by the OODA loop, doesn’t just provide a roadmap for defense—it is a blueprint for taking the fight to the attacker. Whether it’s exposing false narratives, destroying the infrastructure of disinformation, or countering with truth, each stage offers an opportunity to disrupt and kill the campaign before it achieves its objectives.
Active Measures 101
At the heart of an Active Measure lies Dezinformatsiya, the Russian term for state-sponsored disinformation. Its goal is to reshape perceptions of reality, often using the following tactics:
- Dismissing official investigations: Undermining public trust in credible inquiries by labeling them biased or fabricated. For example, Russian operatives cast doubt on international investigations into the downing of Malaysia Airlines Flight MH17 despite overwhelming evidence pointing to a Russian missile system.
- Distorting events to serve political interests: Twisting narratives to align with specific agendas. A notable case is the promotion of the conspiracy theory that “9/11 was an inside job,” designed to erode trust in the U.S. government.
- Creating distrust in institutions: Russian-backed trolls actively undermined trust in democratic systems, law enforcement, and the media. By amplifying false domestic narratives, such as the “Hammer and Scorecard” conspiracy theory, they promoted misinformation about the 2020 U.S. presidential election, further deepening divisions within American society.
- Fostering dismay through conspiracy theories: Russian-backed disinformation campaigns leveraged conspiracy theories to sow confusion and distrust. These efforts often suggested shadowy explanations for events and implied that the truth was intentionally hidden. For example, during the COVID-19 pandemic, they spread false theories claiming vaccines were part of a sinister plot by Western governments to use 5G technology and chemtrails to infect and mind-control the population. Additionally, Russian operatives co-opted the domestic Active Measure Q-Anon, which initially aimed to subvert the MAGA movement, and repurposed it to promote a wide range of conspiracies. This included claims such as the supposed return of John F. Kennedy Jr. and the existence of a global Jewish cabal controlling all world events. By amplifying these narratives, they further fractured American society and fueled distrust in institutions and fellow citizens.
The following list reflects the contemporary tactics employed in Active Measures:
- Traditional or Cyber Espionage: State-sponsored actors engage in espionage to steal sensitive information and use it in modified form (e.g., forged documents).
- Social Media Manipulation: Governments use social media platforms to spread disinformation, manipulate public opinion, and polarize societies.
- Deepfake Technology: Creating and disseminating AI-generated images, videos, and audio to spread false information or discredit individuals and organizations.
- Troll Farms and Bot Networks: Organized groups deploy automated accounts and fake personas to amplify propaganda and influence online discussions.
- Weaponization of Leaked Data: State actors leak sensitive information via hack-and-leak operations (HALO) to influence elections, damage reputations, or destabilize foreign governments.
- Narrative Shaping through State-Controlled Media: State-owned media such as RT and Sputnik and, in the West, propaganda outlets (e.g., NYT and CNN) generate large volumes of false narratives to shape public opinion.
- Cultural and Academic Exchanges as Soft Power: Governments use cultural and academic exchanges to promote their ideologies and gain influence abroad.
- Covert Financial Support: Funding political parties, NGOs, or media outlets domestically and in foreign countries to influence policies and narratives.
- Disinformation Campaigns in Crisis Situations: Spreading false information during crises (e.g., pandemics, natural disasters, social conflicts, and mass shootings) to create chaos and undermine trust in institutions.
- Lobbying, Censorship, and Influence Campaigns: Governments and affiliated groups (such as social media companies) engage in lobbying, censorship, and influence activities to sway policy decisions and public opinion in their favor.
These examples demonstrate the varied tactics employed in Active Measures, showcasing the evolving terrain of geopolitical influence and digital warfare. Amid this evolution, certain fundamental components have persisted and can be categorized into the following steps (a minimal tracking sketch in code follows the list):
- Identify societal fractures: Focus on social, demographic, economic, and ethnic divisions.
- Liquefy reality: In the 1980s, this was a single “big lie”; today, it involves multiple contradictory alternative truths, creating a “firehose of falsehood” that distorts day-to-day, social, and political reality.
- Anchor in truth: Wrap these narratives around kernels of truth, as a core of fact helps the falsehoods spread.
- Build audiences: Gain access to, or control of, platforms (like X and TikTok), or cultivate relationships with domestic influencers receptive to the narratives.
- Conceal involvement: Make it seem as if the stories originated elsewhere.
- Use “useful idiots”: Encourage people who believe the narratives, or who can be bought, to amplify them and adopt even more extreme positions.
- Deny involvement: Deny any connection to the disinformation, even when the truth is obvious.
- Play the long game: Aim for long-term impact rather than immediate effects.
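As referenced above, one way to operationalize these recurring steps is to track how far a suspected campaign has progressed through them. The following Python sketch is illustrative only; the stage names mirror the list above, but the CampaignTracker structure is an assumption, not an established schema.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Stage(Enum):
    IDENTIFY_FRACTURES = auto()
    LIQUEFY_REALITY = auto()
    ANCHOR_IN_TRUTH = auto()
    BUILD_AUDIENCES = auto()
    CONCEAL_INVOLVEMENT = auto()
    USE_USEFUL_IDIOTS = auto()
    DENY_INVOLVEMENT = auto()
    PLAY_LONG_GAME = auto()

@dataclass
class CampaignTracker:
    """Records evidence per kill-chain stage for a suspected campaign."""
    name: str
    evidence: dict[Stage, list[str]] = field(default_factory=dict)

    def record(self, stage: Stage, item: str) -> None:
        self.evidence.setdefault(stage, []).append(item)

    def stages_reached(self) -> list[Stage]:
        # Stages with at least one piece of recorded evidence
        return [s for s in Stage if s in self.evidence]
```

Knowing which stages a campaign has already reached tells defenders which countermeasures are still timely and which opportunities have passed.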
One of the most notorious examples of an Active Measure is Operation Denver (also known as Operation INFEKTION), a Soviet disinformation campaign in the 1980s that falsely claimed the U.S. had created the HIV/AIDS virus as a biological weapon. By exploiting vulnerabilities in public health narratives and leveraging Cold War anxieties, the Soviets crafted a false but persuasive narrative. This campaign could have been countered at several stages:
1. Exposing the Initial Newspaper Publication:
The campaign began in 1983 with a seemingly independent article published in the Indian newspaper Patriot, claiming that AIDS was a product of U.S. biological warfare research. This article provided the foundation for further dissemination.
Countermeasure: Intelligence agencies and independent journalists could have investigated and exposed the Patriot as an outlet controlled by the KGB and the Communist Party of India. Highlighting its ties to Soviet operatives could have undermined the article’s credibility from the outset, preventing its global spread.
2. Debunking the Pseudo-Scientific Papers:
Three years later, in 1986, to bolster the Patriot narrative, the KGB and Stasi orchestrated the distribution of a fabricated 47-page pamphlet titled “AIDS—Its Nature and Origin,” authored by KGB agent Jakob Segal, his wife Lilli Segal, and Ronald Dehmlow. The document was disseminated at the Eighth Summit of the Non-Aligned Movement in Harare, Zimbabwe. Designed to appear as credible scientific research, the pamphlet falsely claimed that HIV/AIDS was a biological weapon developed by the U.S. military. Additionally, the KGB planted similar reports in poorly regulated academic and media outlets to give the narrative wider reach and apparent legitimacy among uninformed audiences.
Countermeasure: Western scientific institutions, health organizations, and media outlets could have systematically reviewed and debunked these papers. Clear communication of the scientific consensus—backed by expert testimony—could have invalidated the campaign’s pseudo-scientific foundation and demonstrated data falsification.
3. Exposing Financial Ties to Media and Advocacy Groups:
The campaign gained traction as certain U.S.-based media outlets and “civil rights” organizations began echoing the claims, inadvertently legitimizing them. These entities often had covert or indirect financial and ideological ties to Soviet influence campaigns.
Countermeasure: Investigative journalism could have exposed these financial connections, revealing the Soviets’ indirect support for organizations that amplified the disinformation. Publicizing these ties would have discredited the narrative’s domestic advocates and created accountability for media outlets that failed to vet their sources.
4. Countering Public Sentiment with Transparent Communication:
The campaign thrived on Cold War fears and societal distrust in government institutions, especially regarding transparency about the HIV/AIDS epidemic.
Countermeasure: U.S. authorities such as the CDC could have addressed public fears by proactively releasing detailed, transparent research on the origins of HIV/AIDS, coupled with clear evidence of the Soviet campaign’s falsehoods. Public outreach through trusted figures and organizations could have mitigated the campaign’s impact.
Operation Denver exemplifies how disinformation campaigns manipulate public perception by leveraging fabricated evidence, credible-seeming channels, and societal vulnerabilities. Exposing the origins of planted stories, debunking false claims, and uncovering covert financial and operational ties could have disrupted the campaign at multiple stages.
This case underscores the critical need for vigilance, transparency, and investigative accountability in countering Active Measures. Modern digital spaces continue to see similar tactics, but the lesson from Operation Denver remains clear: identifying and exposing deception is vital to dismantling such campaigns before they gain traction.
The Active Measures Kill Chain
Active Measures are particularly effective because their victims often remain unaware of how these campaigns are developed, deployed, and distributed. Understanding these processes enables the creation of targeted countermeasures to disrupt information operations. Let’s examine each step of the information operations kill chain and identify strategies to counteract them effectively.
Step 1: Find the cracks
Open disagreements are inevitable in a democratic society, but one defense is to strengthen the institutions that uphold that society. In previous writings, I have discussed the importance of “common political knowledge” for the functioning of democracies. We need to reinforce this shared knowledge, making it harder to exploit societal divisions. It should be unacceptable—or at least costly—for domestic actors to employ disinformation techniques in their rhetoric and political maneuvering. We should highlight and encourage genuine cooperation when politicians work across party lines. Additionally, we must become suspicious of information that incites anger toward our fellow citizens. While we cannot entirely eliminate societal cracks, as they stem from the diversity that empowers democracies, we can make them more difficult to exploit.
Step 2: Liquefy reality
“Liquefy reality” is a metaphorical expression used in discussions of disinformation, propaganda, and psychological manipulation. It refers to the deliberate distortion of information and facts so that truth becomes fluid and difficult to discern: reality is metaphorically “melted” by spreading falsehoods, creating confusion, and eroding trust in established truths and authorities. The defense is to promote better digital literacy and source verification. This will not entirely solve the problem, since much of the sharing of fake news is driven by the fake fact-checking industry, social signaling, and the desire to express core beliefs rather than a concern for truth, but it remains an essential part of the solution.
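As a small illustration of source verification, the sketch below checks an article’s host against a watchlist of state-controlled outlets. The watchlist contents and the function itself are assumptions for illustration; a real deployment would rely on a vetted, regularly updated dataset rather than a hard-coded set.

```python
from urllib.parse import urlparse

# Hypothetical watchlist; RT and Sputnik domains are used as examples.
STATE_CONTROLLED_DOMAINS = {"rt.com", "sputniknews.com"}

def source_warning(article_url: str) -> str | None:
    """Return a provenance warning if the URL's host is a watchlisted
    domain or one of its subdomains; otherwise return None."""
    host = urlparse(article_url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    for domain in STATE_CONTROLLED_DOMAINS:
        if host == domain or host.endswith("." + domain):
            return f"'{host}' appears on a state-controlled-media watchlist"
    return None
```

Such a check is only a first-pass signal: laundered narratives typically reach audiences through seemingly independent intermediary outlets, which is why the later steps of the kill chain matter.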
Step 3: Wrap the false narratives around a grain of truth
Defending against this involves exposing the untruths and distortions, though this is challenging in practice. Psychologists have shown that debunking fake news can inadvertently amplify its message. Therefore, it is crucial to replace the fake news with accurate stories that counter the propaganda. The kernel of truth in the fake news should be integrated into a broader true narrative. We must ensure that this true narrative is legitimized and widely promoted.
Step 4: Build audiences
Social media companies have significantly impacted this step by enabling like-minded individuals to connect and communicate, providing propagandists easy access to receptive audiences. Defenses should focus on reducing the effectiveness of disinformation campaigns. Social media platforms must detect and remove accounts, bots, and groups operated by propagandists.
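One widely used detection heuristic is to look for identical text posted by many distinct accounts within a short window. The sketch below assumes each post is a dict with 'account', 'text', and 'time' (datetime) keys; the thresholds are illustrative, not values any platform has published.

```python
from collections import defaultdict
from datetime import timedelta

def find_coordinated_texts(posts: list[dict],
                           min_accounts: int = 10,
                           window: timedelta = timedelta(minutes=5)) -> list[str]:
    """Flag texts posted verbatim by at least `min_accounts` distinct
    accounts within `window` -- a simple heuristic for coordinated
    inauthentic behavior."""
    by_text = defaultdict(list)
    for post in posts:
        by_text[post["text"]].append(post)

    flagged = []
    for text, group in by_text.items():
        accounts = {p["account"] for p in group}
        times = sorted(p["time"] for p in group)
        if len(accounts) >= min_accounts and times[-1] - times[0] <= window:
            flagged.append(text)
    return flagged
```

Real platform systems combine many such signals (account age, posting cadence, shared infrastructure), but verbatim-burst detection captures the core idea.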
Step 5: Conceal your hand
The key here is attribution—repeatedly and swiftly. The faster we publicly assign the information operations to an entity, the more effectively we can defend against them. Achieving this demands concerted efforts from social media platforms and the intelligence community not only to detect and expose these operations but also to attribute the attacks accurately. Social media companies must enhance transparency regarding their algorithms and make the sources of online content more visible. Even modest steps, such as the Honest Ads Act requiring transparency in online political advertisements, can contribute significantly. In cases where companies lack the business incentives to act, regulation becomes imperative.
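The core idea behind measures like the Honest Ads Act is that provenance information travels with the ad and is publicly queryable. The record below sketches what such a disclosure might contain; it is illustrative, not the Act’s actual data format or any platform’s schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AdDisclosure:
    """Illustrative transparency record for an online political ad."""
    ad_id: str
    sponsor_name: str                    # who paid for the ad
    sponsor_country: str                 # where the sponsor is registered
    amount_usd: float                    # total spend behind the ad
    first_shown: date                    # start of the ad's run
    targeting_criteria: tuple[str, ...]  # e.g., ("age 35-54", "swing states")
```

Making records like this public raises the cost of concealment: a campaign that cannot hide its sponsors is far easier to attribute.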
Step 6: Cultivate useful idiots
We can reduce the impact of individuals who unwittingly amplify harmful information, even if they are unaware they are spreading deliberate propaganda. This does not necessitate government regulation of speech; corporate platforms already employ various mechanisms to amplify or suppress specific speakers and messages. Moreover, countering the influence of those who unknowingly propagate propaganda requires other influential figures to respond with factual information — as one report puts it, we need to “amplify the truth.” Naturally, there will always be committed believers whom no amount of fact-checking or counter-speech can sway; efforts should focus instead on persuading those open to persuasion.
Step 7: Admit nothing and deny everything
When attack attribution relies on classified evidence, attackers can easily deny involvement. Public attribution of information attacks must therefore be backed by compelling evidence. This is challenging when the underlying intelligence is classified, but there is no alternative: bare government assurances, unsupported by evidence, are insufficient. Governments must disclose credible evidence to establish accountability.
Step 8: Play the long game
Counteractions can disrupt attackers’ ability to sustain information operations, as demonstrated by efforts to identify and remove Russian bots from social media platforms. For example, during the lead-up to the 2016 and 2020 U.S. elections, platforms like Twitter and Facebook collaborated with cybersecurity researchers to detect coordinated inauthentic behavior. By identifying fake accounts linked to Russian interference campaigns, such as those operated by the Internet Research Agency (IRA), and removing them from their networks, these platforms curtailed the spread of disinformation. Defenders must adopt a long-term perspective, fostering a mindset that looks beyond immediate crises and prioritizes strategic thinking and resilience over time.
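Playing the defensive long game also means monitoring narratives over months, not days. The sketch below flags a statistical spike in daily mentions of a tracked narrative against its trailing baseline; the threshold and input shape are assumptions for illustration.

```python
from statistics import mean, stdev

def narrative_spike(daily_counts: list[int], z_threshold: float = 3.0) -> bool:
    """Return True if the most recent day's mention count is a
    z-score outlier against the trailing baseline, a simple
    long-horizon signal that a narrative is being amplified."""
    if len(daily_counts) < 8:
        return False  # not enough history to form a baseline
    *baseline, today = daily_counts
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return today > mu
    return (today - mu) / sigma > z_threshold
```

A spike alone proves nothing, but it tells analysts where to apply the earlier steps of the kill chain: who is posting, whether the accounts are coordinated, and where the narrative originated.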
Conclusion
A critical element in countering Active Measures and hostile information operations is deterrence and punishment. Deterrence strategies must evolve to address the realities of the digital information age, where the tools for executing information operations are widely accessible and the time from production to consumption is mere seconds. Key to this effort is reducing the effectiveness of these operations through clear attribution, transparency, and accountability. The exact names, funding sources, organizational structures, and communications of actors involved in operations like Q-Anon must be made public. Public exposure of such networks will dismantle their ability to operate covertly and weaken their influence. Publicly attributing attacks and responding through diplomatic, law enforcement, and operational measures are necessary steps to deter both domestic and foreign actors. The domestic and foreign interference in the 2020 and 2024 presidential elections demonstrated the feasibility and surprisingly low cost of these tactics. As these methods become accessible to a broader range of actors, including domestic groups, addressing them demands a fundamental shift in our defensive and offensive strategies.
No single defensive or offensive measure is sufficient on its own. Unlike the traditional cybersecurity kill chain, which defends against a sequential series of steps targeting a single entity like a network or organization, the information operations kill chain operates differently. Its steps are interconnected, overlapping, and often occur out of sequence. This complex framework spans multiple social media platforms and news outlets, requiring us to view society and its institutions as an integrated information system. Disrupting an information operation involves more than targeting isolated steps or actors—it requires a holistic, systemic approach.
Inventory of some of the Russian Active Measures in circulation