An Apocryphal Tale
The Ukraine war has been accompanied by an unprecedented surge of propaganda and disinformation (much, but not all, of it of Russian origin). Before 2022, however, Covid-19 and the Trump presidency had confirmed disinformation as a general problem of our era. Whilst as a practice it has ancient origins, the term has an intriguing modern myth of origin which shapes how it is now understood. This myth serves as the departure point for a major new research project, led by Stephen Hutchings and Vera Tolz. The project challenges current approaches to disinformation whilst contributing to more dynamic counter-disinformation practices, whose limitations have been exposed by exponential increases in the Kremlin's propaganda output. It will do so not via the big data-led models that this volume of activity seems to require, but through qualitative, humanities research methods.
Widely cited, the myth in question attributes the word 'disinformation' to a mid-20th-century English translation of the Russian term dezinformatsiya, mendaciously coined by Stalin with a French-sounding etymology designating a non-Soviet practice which the Soviet Union was duly 'obliged' to emulate. In fact, the story, attributable to a Romanian defector to the West, is spurious. English usage of the term 'disinformation' is traceable to late 19th-century allegations involving American press outlets. Instances of the word can also be found in early 20th-century Hansard records of British parliamentary debates. The first Soviet usages, meanwhile, predate the Stalin period, and reflect German, not French, sources.
This apocryphal tale encapsulates the three key interlinked issues we address: (1) the continuing impact of Cold War subtexts on discourses around disinformation; (2) the need to recognise its status as a form of dynamic inter-cultural, translingual communication; (3) the close connections between disinformation practices, allegations of disinformation and counter-disinformation measures. Scholars generally reduce disinformation to false/misleading information spread deliberately by deceitful adversaries or, in the unwitting form of 'misinformation', by gullible co-citizens. They assume that what constitutes disinformation is universally definable, albeit through increasingly complex typologies differentiating the various kinds of information distortion now present in our 'toxic' online world. But to fully understand disinformation, it must also be treated in terms of both the polemical function of the allegation by which it is named in specific instances, and the variable meanings that the material it designates acquires as it is re-forged translingually within and across discrete discursive contexts.

From the Factual and the False to the Translingual and the Discursive: Coming in From the Cold
Stressing the role of cross-cultural mediations, including translation, in the meaning-making process, we will work with policy makers to account for changing notions of disinformation and their implications for those who produce and analyse it. We will explore how definitions change across time and geopolitical settings; what Soviet activity and reactions to it reveal about current disinformation and counter-disinformation techniques; and where, then and now, their presumed deceit is located, and with what implications: in their 'truth claims'; their narrative frames; or their perpetrators' self-representation (it is worth in this context comparing The Guardian's annual April Fool's story, the UK tabloid press's misleading, hyper-partisan news coverage, and the 'camouflaging' effect of the bland, fact-based reporting characterising some of the output of the Kremlin broadcaster RT). We will track disinformation's journeys across multiple contexts and audiences, noting how translingual collusion, like that between far-right anglophone and Russian-language outlets, informs it.
The myth reflects a logic wherein the habitual attribution of 'disinformation' to a hostile 'Other' is linked inextricably to the identity of a collective 'Self'. This logic explains why non-Latin script in social media output can be over-interpreted as evidence of enemy state activity; why we differentiate domestic 'misinformation' (unintended falsehood) from foreign 'disinformation' (intentional falsification) in a digital world where disguising agency is, however, routine; and why, as the term's myth of origins illustrates, disinformation allegations are a potent tool in power struggles. Disinformation's defining criteria are neither fixed nor universal: the discourses constituting it interact with the 'false narratives' identified by those discourses in an entangled conception-practice dimension that reflects its Cold War legacy.
The focus on Russia is far from anachronistic, for conceptual as well as empirical reasons. Whilst the Covid 'infodemic' revealed abundant 'homegrown' disinformation, and China now rivals Russia as its main external source, Cold War dichotomies pitting democratic 'truth-tellers' against totalitarian 'dissemblers' were revived by Russia's interference in the 2016 US elections and the Ukraine war. Their reductive force – 'if we face a uniformly malign foe, who cares about hair-splitting definitional issues?' – creates a medley of poorly differentiated terms: disinformation, misinformation, mal-information, fake news, post-truth, conspiracy theories, state propaganda, trolling and astroturfing, to name a few. It explains the mushrooming of monitors, fact-checkers, literacy initiatives, and legislative oversight bodies whose lack of reflexivity underpins Joseph Bernstein's take-down of a self-servingly alarmist, epistemologically naïve 'Big Disinfo' industry. We reject the more acerbic elements of Bernstein's critique, seeking to improve counter-disinformation rather than undermine it – to bring Disinformation Studies 'in from the Cold (War)'.

Framing (and Contextualizing) Deceit
Our methodologies prioritise not mute data judged true or false, but human subjects with socially situated voices, cognition of which, as the great Russian philosopher Mikhail Bakhtin argues, 'can only be dialogic'. We reject all forms of relativism. We accept that authoritarian states regularly mislead their own and other publics, recognising that the contingency of narratives does not imply their factual equivalence. However, by prioritising their modes of meaning generation (across temporal, geopolitical and linguacultural boundaries) we will challenge current models which prioritise either the identification of false/misleading 'content', or the tracking of that content's toxic 'spread' across 'disinformation ecosystems'. Since to allege 'disinformation' is a contextual act, we will consider how truth status shifts as information crosses these boundaries, and how such shifts complicate questions relating to what, precisely, is being tracked, and to drawing uniform distinctions between 'mainstream', legitimate forms of news distortion and those designated as illicit. Drawing on Bakhtin and on translation theory, we will reconstruct the dynamics propelling (counter)disinformation strategies, redressing their monolingual, ahistorical bias.
Digitisation facilitates both industrial volumes of disinformation and data-driven methods of detection. These tools reflect 'Big Disinfo' emphases on universally verifiable falsehoods. Moreover, perceptions of disinformation's new digital habitat perpetuate Cold War views of English as democracy's lingua franca, encouraging disinformation producers to prioritise anglophone audiences whilst disincentivizing platforms from tracking how non-English disinformation circulates. Facebook failed to identify 91% of Russia's Ukraine war propaganda. What for anglophones are lurid conspiracy theories may strike Arabic speakers as credible accounts of residual imperial projects – a perception exploited in Soviet/Russian disinformation. When the EU banned RT, widening its definition of disinformation, RT's Arabic ratings rose, and the Kremlin's Ukraine narrative infiltrated Western 'deep state' conspiracy theories. The role of anglophone Kremlin proxy sites like News Front in linking Hispanic to Russian Ukraine war disinformation needs more attention than it currently receives.
We must not, however, over-prioritise digital media flows. Traditional media outlets still play a central role in amplifying disinformation's transcultural meaning shifts. They form part of the multiple feedback loops facilitating the amplification process. Our own research shows that pre-digital, mainstream Western mediations of conspiratorial AIDS myths of origin helped legitimate the USSR's largest disinformation campaign. In our work with counter-disinformation practitioners, therefore, we will strive to reconstitute the historical underpinnings of their own terminological apparatuses, and the local contexts of specific disinformation narratives.
We want to shift focus from universalist notions of disinformation to how context-contingent discourses shape manipulation techniques, corresponding counter-measures, and the interplay between them; for example, authoritarian state appropriations of the terms 'fake', 'disinformation' and 'propaganda' saturated their Ukraine war lexicon, showing that fact/falsehood distinctions obscure disinformation's discursive aspects. Targeting the Russian 'node' in a translingual network, our case studies will centre on Soviet/Russian output identified as disinformation by counter-disinformation units. We will trace the trajectories of this output across regions strategically important to Russia, prioritising Russian, English, Arabic, German, French, Spanish and Serbian material.

Throughout, we will track the complex interplay between content marked as disinformation, the strategies employed to identify it, and the counterstrategies of its assumed perpetrators. Synchronically, we view disinformation allegations as utterances in which 'truths' are contingent on narrating selves impugning others' 'falsifying' practices which, in turn, pre-empt notions of disinformation prevailing in target contexts. Diachronically, we treat disinformation allegations and practices as an intercultural dynamic unfolding over time. This framework generates a toolset to be applied to a set of disinformation campaigns assigned Russian/Soviet provenance. These case studies are book-ended by historical and socio-cultural contextualization of the discourses in which the disinformation was practiced, and by study of its social media remediation, the meanings various audiences give it, and policy responses to it.

Methods: Capturing the Disinformation (Life) Cycle
The context analysis targets English and Russian material of the 1960s-1980s. Our pilot diachronic study of uses of the concept 'disinformation' will expand to include 'misinformation', 'fake news', 'state propaganda' and 'psychological warfare'. A second dataset will capture the operational principles of the EU's East StratCom Task Force, which has the largest disinformation database; NATO's Digital Forensic Research Lab; and EUDisinfoLab, our project partner. For Russia, we will examine Russian counter-disinformation manuals, and the prefaces and polemical commentaries accompanying Russian translations of key English texts, starting with the 1929 Soviet translation of Harold Lasswell's seminal book Propaganda Technique in the World War. For non-Western contexts, we will analyse Misbar.com (Middle East) and Defensoria de Publico (Latin America), pinpointing how they differentiate 'truth-seeking' selves and 'deceptive' others.
The case studies target media output identified as 'false narratives' or 'disinformation campaigns'. The historical cases are: Soviet influence campaigns around European neo-Nazi extremism (a precursor of current Kremlin propaganda); and Soviet campaigns on the origins of AIDS. For individual stories, we will identify linguacultural variation, comparing narrative structures and truth claim attributions. The contemporary material includes news reports, talk-shows, and social media posts. Based on the US State Department's '5 top Kremlin narratives', it covers those specific to Russia and those of a global scope, enabling us to pinpoint Russia's role as a disinformation node. They are: 'deep state' and 'global elite' conspiracies; anti-vaxxer rhetoric; 'Russophobia'; and the 'demise of Western civilization'. All five acquired new valences during the Ukraine war. We will consider monitors' attention to their linguacultural context, attribution of deceit, and lineage, evaluating their assigned disinformation status. Contemporary datasets will begin with examples from the East StratCom Task Force database to ensure that material is pre-defined as 'disinformation'. We will scrutinise multilingual sources mapped to 'false narratives' and supplied with 'disproofs'.
To address translingual collusion and disinformation’s interactions with media narratives, we will examine tweets relating to each case study, mapping trajectories, identifying nodes of platform-to-platform remediation and linguistic journeys, especially across Russian, Serbian, and Arabic. This will inform the selection of a subset of tweets for analysis of user profiles and individual posts. We will survey meanings that narrative fragments acquire as they cross linguistic environments, are reappropriated by new knowledge networks, and redisclosed within alternative umbrella narratives.
Our audience analysis will explore how material labelled as disinformation is consumed by Russian minorities in Estonia and Serbia, and by UK Arabic-speakers. Focus groups will discuss specific narratives, using relevant prompts (e.g., social media posts circulating in local media), as well as participants' media use and information sources. Data will be analysed via content analysis software (to gain a broad sense of how audience groups appropriate material) and more targeted thematic analysis (to link narratives to particular news sources).
To complete our account of the disinformation life cycle, our Chatham House partners will use simulation workshop exercises designed to test policy responses to the key narratives we diagnose. These exercises will centre on immersive planning for two scenarios: the possibility of a new pandemic, and a further episode of Russian expansionism. The workshops will challenge participants to generate culture-specific resilience measures through hands-on exercises, and by gaming the relationship between disinformation's legislative definitions and dominant narratives in local contexts.

Outcomes and Outputs: Towards a New (Critical) Disinformation Studies
The outputs that we plan will develop the agenda for a still-incipient Critical Disinformation Studies. However, our interrogation of the status quo serves a constructive purpose. Working with our policy partners, we will remain committed to supporting democratic integrity, information resilience and good governance; to improving their appreciation of the lingua-cultural and historical contexts in which disinformation is produced and consumed; their tools for detecting manipulated information; and their understanding of the relationship between counter-disinformation theory and practice. The most efficacious forms of analysis are always self-aware and reflexive. For, as Timothy Garton-Ash argues, 'Self-criticism is [liberalism's] traditional path to renewal'.