Definition and Examples of Propaganda

Propaganda is a form of psychological warfare that involves the spreading of information and ideas to advance a cause or discredit an opposing cause. 

In their book Propaganda and Persuasion (2011), Garth S. Jowett and Victoria O'Donnell define propaganda as "the deliberate and systematic attempt to shape perceptions, manipulate cognitions, and direct behavior to achieve a response that furthers the desired intent of the propagandist."

Pronunciation: prop-eh-GAN-da

Etymology: from the Latin, "to propagate"

Examples and Observations

  • "Every day we are bombarded with one persuasive communication after another. These appeals persuade not through the give-and-take of argument and debate but through the manipulation of symbols and of our most basic human emotions. For better or worse, ours is an age of propaganda." (Anthony Pratkanis and Elliot Aronson, Age of Propaganda: The Everyday Use and Abuse of Persuasion, rev. ed. Owl Books, 2002)

Rhetoric and Propaganda

  • "Rhetoric and propaganda, both in popular and academic commentary, are widely viewed as interchangeable forms of communication; and historical treatments of propaganda often include classical rhetoric (and sophistry) as early forms or antecedents of modern propaganda (e.g., Jowett and O'Donnell, 1992, pp. 27-31)." (Stanley B. Cunningham, The Idea of Propaganda: A Reconstruction. Praeger, 2002)
  • "Throughout the history of rhetoric, . . . critics have deliberately drawn distinctions between rhetoric and propaganda. On the other hand, evidence of the conflation of rhetoric and propaganda, under the general notion of persuasion, has become increasingly obvious, especially in the classroom, where students seem incapable of differentiating among the suasory forms of communication pervasive now in our heavily mediated society. . . .
  • "In a society where the system of government is based, at least in part, on the full, robust, give-and-take of persuasion in the context of debate, this conflation is deeply troubling. To the extent that all persuasive activity was lumped together with 'propaganda' and given the 'evil connotation' (Hummel & Huntress, 1949, p. 1) the label carried, persuasive speech (i.e. rhetoric) would never hold the central place in education or democratic civic life it was designed to." (Beth S. Bennett and Sean Patrick O'Rourke, "A Prolegomenon to the Future Study of Rhetoric and Propaganda." Readings in Propaganda and Persuasion: New and Classic Essays, ed. by Garth S. Jowett and Victoria O'Donnell. Sage, 2006)

Examples of Propaganda

  • "A massive propaganda campaign by the South Korean military drew an ominous warning from North Korea on Sunday, with Pyongyang saying that it would fire across the border at anyone sending helium balloons carrying anti-North Korean messages into the country. "A statement carried by the North’s official news agency said the balloon-and-leaflet campaign 'by the puppet military in the frontline area is a treacherous deed and a wanton challenge' to peace on the Korean Peninsula." (Mark McDonald, "N. Korea Threatens South on Balloon Propaganda." The New York Times, Feb. 27, 2011)
  • "The US military is developing software that will let it secretly manipulate social media sites by using fake online personas to influence internet conversations and spread pro-American propaganda.
  • "A Californian corporation has been awarded a contract with United States Central Command (Centcom), which oversees US armed operations in the Middle East and Central Asia, to develop what is described as an 'online persona management service' that will allow one US serviceman or woman to control up to 10 separate identities based all over the world." (Nick Fielding and Ian Cobain, "Revealed: US Spy Operation That Manipulates Social Media." The Guardian, March 17, 2011)

ISIS Propaganda

  • "Former US public diplomacy officials fear the sophisticated, social media-borne propaganda of the Islamic State militant group (Isis) is outmatching American efforts at countering it.
  • "Isis propaganda runs the gamut from the gruesome video-recorded beheadings of journalists James Foley and Steven Sotloff to Instagram photographs of cats with AK-47s, indicating a comfort Isis has with internet culture. A common theme, shown in euphoric images uploaded to YouTube of jihadi fighters parading in armored US-made vehicles captured from the Iraqi military, is Isis’s potency and success. . . .
  • "Online, the most visible US attempt to counter Isis comes from a social media campaign called Think Again Turn Away, run by a State Department office called the Center for Strategic Counterterrorism Communications." (Spencer Ackerman, "Isis's Online Propaganda Outpacing US Counter-Efforts." The Guardian, September 22, 2014)

The Aims of Propaganda

  • "The characteristic that propaganda is a form of mass media argumentation should not, in itself, be regarded as sufficient for drawing the conclusion that all propaganda is irrational or illogical or that any argument used in propaganda is for that reason alone fallacious. . . .
  • "[T]he aim of propaganda is not just to secure a respondent's assent to a proposition by persuading him that it is true or that it is supported by propositions he is already committed to. The aim of propaganda is to get the respondent to act, to adopt a certain course of action, or to go along with and assist in a particular policy. Merely securing assent or commitment to a proposition is not enough to make propaganda successful in securing its aim." (Douglas N. Walton, Media Argumentation: Dialectic, Persuasion, and Rhetoric. Cambridge University Press, 2007)

Recognizing Propaganda

  • "The only truly serious attitude . . . is to show people the extreme effectiveness of the weapon used against them, to rouse them to defend themselves by making them aware of their frailty and their vulnerability instead of soothing them with the worst illusion, that of a security that neither man's nature nor the techniques of propaganda permit him to possess. It is merely convenient to realize that the side of freedom and truth for man has not yet lost, but that it may well lose--and that in this game, propaganda is undoubtedly the most formidable power, acting in only one direction (toward the destruction of truth and freedom), no matter what the good intentions or the goodwill may be of those who manipulate it." (Jacques Ellul, Propaganda: The Formation of Men's Attitudes. Vintage Books, 1973)

The Oxford Handbook of Propaganda Studies

Edited by Jonathan Auerbach, University of Maryland, and Russ Castronovo, University of Wisconsin-Madison.

Russ Castronovo is Dorothy Draheim Professor of English at the University of Wisconsin-Madison. He is author of three books: Fathering the Nation: American Genealogies of Slavery and Freedom; Necro Citizenship: Death, Eroticism, and the Public Sphere in the Nineteenth-Century United States; and Beautiful Democracy: Aesthetics and Anarchy in a Global Era. He is also editor of Materializing Democracy: Toward a Revitalized Cultural Politics (with Dana Nelson) and States of Emergency: The Object of American Studies (with Susan Gillman).

This handbook includes 23 essays by leading scholars from a variety of disciplines, divided into three sections: (1) Histories and Nationalities, (2) Institutions and Practices, and (3) Theories and Methodologies. In addition to dealing with the thorny question of definition, the handbook takes up an expansive set of assumptions and a full range of approaches that move propaganda beyond political campaigns and warfare to examine a wide array of cultural contexts and practices.


  • Journal of Advanced Military Studies

  • Volume 12, Number 1, 2021
  • Marine Corps University Press

Political Warfare and Propaganda: An Introduction

James J. F. Forest, PhD

The digital age has greatly expanded the terrain and opportunities for a range of foreign influence efforts. A growing number of countries have invested significantly in their capabilities to disseminate online propaganda and disinformation worldwide, while simultaneously establishing information dominance at home. This introductory essay provides a brief examination of terms, concepts, and examples of these efforts and concludes by reviewing how the articles of this issue of the Journal of Advanced Military Studies contribute to our understanding of political warfare and propaganda.

Keywords: information operations, digital influence, political warfare, psychological warfare

In 1970, Canadian media theorist Marshall McLuhan predicted that World War III would involve "a guerrilla information war with no division between military and civilian participation." 1 More than 30 years later, in their groundbreaking 2001 book Networks and Netwars: The Future of Terror, Crime, and Militancy, John Arquilla and David Ronfeldt described how

the conduct and outcome of conflicts increasingly depend on information and communications. More than ever before, conflicts revolve around "knowledge" and the use of "soft power." Adversaries are learning to emphasize "information operations" and "perception management"—that is, media-oriented measures that aim to attract or disorient rather than coerce, and that affect how secure a society, a military, or other actor feels about its knowledge of itself and of its adversaries. Psychological disruption may become as important a goal as physical destruction. 2

How prescient these observations seem today, particularly given how malicious actors—both foreign and domestic—are now weaponizing information for the purpose of influencing political, economic, social, and other kinds of behavior.

This issue of the Journal of Advanced Military Studies addresses the intersection of political warfare and the digital ecosystem. To frame the contributions that follow, this introduction to the issue reviews the broad landscape of terms and concepts that refer to the weaponization of information, and then provides a small handful of historical and modern examples that reflect the goals and objectives pursued through influence efforts. The discussion then turns to describe how the articles in this issue contribute to our understanding of political warfare and propaganda in the digital age, before concluding with some thoughts about the need for research-based strategies and policies that can improve our ability to defend against foreign influence efforts and mitigate their consequences.

A Diverse Landscape of Terms and Concepts

The past several centuries have largely been defined by physical security threats, requiring a nation's military to respond physically with whatever means it has available. But as explained by Isaiah Wilson III—president of Joint Special Operations University—today we face "compound security threats," which include physical security threats as well as "communication and information operations that scale with the speed of a social media post that goes viral, as well as cyber warfare, hacking and theft by our adversaries, both state and non-state actors." 3 These compound security threats can exploit cybersecurity vulnerabilities as well as psychological and emotional vulnerabilities of targets, using modern internet platforms to reach targets worldwide.

Terms like information operations or information warfare have been frequently used in military doctrine to describe computer network attacks (often by highly trained military units) like hacking into databases to observe or steal information, disrupting and degrading a target's technological capabilities, weakening military readiness, extorting financial ransoms, and much more. These terms have also referred to operations intended to protect our own data from these attacks by adversaries. Computer network attacks like these can also be used to send a message (e.g., about a target's vulnerabilities and the attacker's capabilities), and in that way could be a means of influencing others. Cyberattacks are seen as compound security threats because they can have implications for multiple dimensions of a nation's well-being, including politics, economics, technology, information security, relations with other countries, and much more.

Today's digital influence attacks also have implications for these same multiple dimensions and are likewise seen as compound security threats. The goals of digital influence attacks can include disrupting and degrading a target's societal cohesion, undermining confidence in political systems and institutions (i.e., democratic elections), fracturing international alliances, and much more. Tactics used in such attacks include various forms of deception and provocation, from deepfake videos and fake social media accounts to gaslighting, doxing, trolling, and many others. Through social media and other internet technologies, attackers can incentivize and manipulate interactions directly with citizens of a foreign population, bypassing government efforts to insulate their citizens from an onslaught of disinformation. 4 These types of attacks exploit human vulnerabilities more than technological attacks and capitalize on psychological and emotional dimensions like fear, uncertainty, cognitive biases, and others.

A variety of terms are used to describe these attacks, sometimes leading to confusion rather than clarity. The term political warfare was used by the legendary diplomat George Kennan in 1948 to describe "the employment of all the means at a nation's command, short of war, to achieve its national objectives. Such operations are both overt and covert and can include various kinds of propaganda as well as covert operations that provide clandestine support to underground resistance in hostile states." 5 Paul A. Smith describes political warfare as "the use of political means to compel an opponent to do one's will" and "its chief aspect is the use of words, images, and ideas, commonly known, according to context, as propaganda and psychological warfare." 6 Carnes Lord notes a "tendency to use the terms psychological warfare and political warfare interchangeably" along with "a variety of similar terms—ideological warfare, the war of ideas, political communication and more." 7 And the U.S. Department of Defense has used the term military information support operations to describe efforts to "convey selected information and indicators to foreign audiences to influence their emotions, motives, objective reasoning, and ultimately the behavior of foreign governments, organizations, groups, and individuals in a manner favorable to the originator's objectives." 8

In a 2019 research report published by Princeton University, Diego A. Martin and Jacob N. Shapiro illustrate how "foreign actors have used social media to influence politics in a range of countries by promoting propaganda, advocating controversial viewpoints, and spreading disinformation." 9 The researchers define foreign-influence efforts as: 1) coordinated campaigns by one state to impact one or more specific aspects of politics in another state, 2) through media channels, including social media, and by 3) producing content designed to appear indigenous to the target state. 10 The objectives of such campaigns can be quite broad and to date have included influencing political decisions by shaping election outcomes at various levels, shifting the political agenda on topics ranging from health to security, and encouraging political polarization. 11 Similarly, research by Philip N. Howard describes "countries with dedicated teams meddling in the affairs of their neighbors through social media misinformation." 12 And social media platforms—most notably Facebook—are now using the term information operations when referring to deliberate and systematic attempts to steer public opinion using inauthentic accounts and inaccurate information. 13

A recent book by Carl Miller describes how "digital warfare has broken out between states struggling for control over what people see and believe." 14 Other terms used in the literature include "new generation warfare," "ambiguous warfare," "full-spectrum warfare," and "non-linear war." 15 Scholars have also described these security challenges as forms of hybrid warfare, encompassing a combination of political warfare, psychological operations, and information operations (including propaganda). Similar terms in this broad landscape include public diplomacy and strategic communications . Further, some states are portrayed as pursuing "information dominance" over the populations of other states through a combination of computer network operations, deception, public affairs, public diplomacy, perception management, psychological operations, electronic countermeasures, jamming, and defense suppression. 16

Whatever we want to call it, there are clear examples of aggression, attackers, targets, defenders, tactics, strategies, goals, winners, losers, and innocent victims. And this is not something that only states do to other states: nonstate actors are increasingly engaged in these kinds of activities as well. 17 The author's own work has used the term influence warfare to describe the kinds of activities in which the focus is not on the information itself but on the purposes of that information. 18 This conceptual approach views the implicit goal of spreading propaganda, misinformation, disinformation, and so forth as shaping perceptions and influencing behavior of a specific target (or set of targets). Further, influence warfare strategies and tactics—particularly as we have seen online—also involve more than just manipulation of information; they can include behavior signaling (e.g., swarming or bandwagoning), trolling, gaslighting, and other means by which the target is provoked into having an emotional response that typically overpowers any rational thought or behavior. 19 Clickbait, memes, and ragebait (for example) are not really seen as forms of information operations as traditionally conceived, but they are certainly ways of influencing others via the internet. This leads us to the term digital influence warfare, which will be used variably throughout this introduction as a catchall phrase representing the broadly diverse terrain of political and psychological warfare in the digital age. 20

Strategic Goals and Tactics of Influence Warfare

The "weaponization of information" in order to obtain power and influence is of course not new. The principles of influence warfare are based on an ancient and much-repeated maxim, attributed to the Chinese general and military theorist Sun Tzu, paraphrased as "to win one hundred victories in one hundred battles is not the highest skill. To subdue the enemy without fighting is the highest skill." 21 When the thirteenth-century Mongols were rolling across Eurasia, they deliberately spread news of the atrocities they perpetrated on cities that did not surrender, the obvious goal being what Sun Tzu argued was the ultimate victory: to defeat the enemy before a single shot has been fired. As Marc Galeotti explains, fear is a powerful emotion, and in this instance it was used to coerce the behavior of cities the Mongols had in their sights, preferring that they surrender instead of having to spend valuable resources conquering them through force. 22 Mongol hordes would also drag branches behind their horses to raise dust clouds suggesting their armies were far larger than reality—an early and effective form of deception and disinformation.

The previous century saw a wide variety of efforts involving the weaponization of information for strategic purposes. During the Chinese Civil War (1945–49), both the Communist and Nationalist (Kuomintang, or KMT) armies spread false information to sow discord in enemy-controlled areas, spreading rumors about defections, falsifying enemy attack plans, and stirring up unrest in an effort to misdirect enemy planning. After the Nationalist government relocated to Taiwan in 1949, the influence efforts continued as the two sides flooded propaganda and disinformation into enemy-controlled territories to affect public opinion and troop morale. 23 Various forms of influence warfare also played a major role in both World Wars. For example, the Committee on Public Information was created during World War I by U.S. president Woodrow Wilson to facilitate communications and serve as a worldwide propaganda organization on behalf of the United States. 24

Influence warfare was increasingly prominent throughout World War II, especially the massive amounts of propaganda disseminated by Joseph Goebbels and the Nazi regime. In response, U.S. president Franklin D. Roosevelt established the Office of War Information in 1942, responsible for (among other things) undermining the enemy's morale—often through various psychological and information operations—as well as for providing moral support and strengthening the resolve of resistance movements in enemy territories. The Voice of America (VOA) was also established in 1942 as the foreign radio and television broadcasting service of the U.S. government, broadcasting in English, French, and Italian. Years later, the United States Information Agency (USIA) was created in 1953 as a primary conduit for enhancing our nation's strategic influence during the Cold War. 25 The director of USIA reported to the president through the National Security Council and coordinated closely with the secretary of state on foreign policy matters.

Meanwhile, when Radio Moscow began broadcasting in 1922, it was initially available only in Moscow and its surrounding areas, but by 1929, the Soviets were able to broadcast into Europe, North and South America, Japan, and the Middle East using a variety of languages. 26 By 1941, the Union of Soviet Socialist Republics (USSR) was able to broadcast in 21 languages and, 10 years later, had a program schedule of 2,094 hours. 27 But radio and television broadcasting were just the visible tip of the iceberg for what became a multidimensional influence effort during the Cold War involving an array of covert influence tactics, particularly through the spread of disinformation. As Thomas Rid notes, "Entire bureaucracies were created in the Eastern bloc during the 1960s for the purpose of bending the facts." 28 The Soviets used disinformation "to exacerbate tensions and contradictions within the adversary's body politic, by leveraging facts, fakes, and ideally a disorienting mix of both." 29

In the first academic study of the Soviet-era active measures program, Richard H. Shultz and Roy Godson explain how the Soviets cultivated several different types of so-called "agents of influence … including the unwitting but manipulated individual, the 'trusted contact,' and the controlled covert agent." 30 As they explain,

The agent of influence may be a journalist, a government official, a labor leader, an academic, an opinion leader, an artist, or involved in a number of other professions. The main objective of an influence operation is the use of the agent's position—be it in government, politics, labor, journalism or some other field—to support and promote political conditions desired by the sponsoring foreign power. 31

Forged documents—including faked photographs—have also been a part of influence warfare for more than a century. For example, during the 1920s the Soviet Cheka (secret police) used elaborate forgeries to lure anti-Bolsheviks out of hiding, and many were captured and killed as a result. 32 During the Cold War, as Shultz and Godson note, many "authentic-looking but false U.S. government documents and communiqués" could be categorized mainly as either "altered or distorted versions of actual US documents that the Soviets obtained (usually through espionage)" or "documents that [were] entirely fabricated." 33 Examples include falsified U.S. State Department documents ordering diplomatic missions to sabotage peace negotiations or other endeavors, fake documents outlining U.S. plans to manipulate the leaders of Third World countries, or even forged cables from an American embassy outlining a proposed plan to overthrow a country's leader. 34

In one case, an authentic, unclassified U.S. government map was misrepresented as showing nuclear missiles targeting Austrian cities. A fabricated letter ostensibly written by the U.S. defense attaché in Rome contained language denying "rumors suggesting the death of children in Naples could be due to chemical or biological substances stored at American bases near Naples," while no such substances were stored at those bases. 35 Even a fake U.S. Army Field Manual was distributed, purportedly encouraging Army intelligence personnel to interfere in the affairs of host countries and subvert foreign government officials and military officers. 36 Through these and other types of information operations, the Soviets tried to influence a range of audiences, and the lessons to be learned from this history—both successes and failures—can inform the influence warfare efforts of many countries today.

Influence Opportunities in the Digital Age

While the primary strategies and goals of influence warfare have remained fairly constant, the operational environment in which these efforts take place has changed significantly during the past two decades. The rise of the internet and social media companies, whose profit model is based on an attention economy, has been a game changer. Within the attention economy, the most valued content is that which is most likely to attract attention and provoke engagement, with no regard to whether it is beneficial or harmful, true or untrue. New tools have emerged for creating and spreading information (and disinformation) on a global scale. Connectivity in the digital realm is now much easier, and yet the emergence of hyperpartisan echo chambers has sequestered many online users into separate communities who reject the credibility and merits of each other's ideas, beliefs, and narratives.

Unlike conventional cyberattacks, the goal of a digital influence warfare campaign is not to degrade the functional integrity of a computer system. Rather, it is to use those computer systems against the target in whatever ways might benefit the attacker's objectives. Often, those objectives include a basic divide-and-conquer strategy: a disunited society will fight among itself over many things instead of uniting in the face of a threat that only some of its members believe exists. Many influence activities are meant to shape the perceptions, choices, and behaviors of a society—and in some cases, the goal may in fact be making the target dysfunctional as a society. This is not simply propaganda, fake news, or perception manipulation. It is a battle over what people believe is reality and the decisions that each individual makes based on those beliefs. The victors in this battle are the attackers who have convinced scores of victims to make decisions that directly benefit the attackers.

Digital influence warfare involves the use of persuasion tactics, information and disinformation, provocation, identity deception, computer network hacking, altered videos and images, cyberbullying, and many other types of activity explored in this issue of the Journal of Advanced Military Studies . The attacker (or "influencer") seeks to weaponize information against a target in order to gain the power needed to achieve the goals articulated in their strategic influence plan. Some goals may involve changing the target's beliefs and behaviors, prompting the targets to question their beliefs in the hopes that once those beliefs have been undermined, the targets may change their minds. Other goals may include manufacturing uncertainty to convince the target that nothing may be true and anything may be possible. 37 In other instances, the goals of an influence strategy could include strengthening the target's certainty, even their commitment to believing in things that are actually untrue.

The central goal of influence attacks is—according to a recent report by RAND—"to cause the target to behave in a manner favorable to the influencer." 38 The influencer may seek to disrupt the target's information environment—for example, interrupting the flow of information between sources and intended recipients of an organization, or on a broader level, between the target's government and its citizens. Similarly, the influencer may also seek to degrade the quality, efficiency, and effectiveness of the target's communication capabilities, which may involve flooding channels of communication with misinformation and disinformation. The overall goal here involves undermining the perceived credibility and reliability of information shared among the adversary's organizational members (government or corporate) or between the target's government and its citizens. 39 Attackers in the digital influence domain can organize swarms of automated social media accounts ("bots") alongside real accounts, coordinated to amplify a particular narrative or attack a specific target. Government (or corporate) leaders can hire technically skilled mercenaries and contractors (from large so-called social media influence corporations to lone hackers) to do the dirty work for them. 40
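The bot-swarm amplification pattern described above leaves a detectable statistical fingerprint: many distinct accounts pushing identical content within a narrow time window. The following is a minimal, illustrative sketch of that detection idea (the data, field names, and thresholds are hypothetical, and real coordination detection relies on far richer behavioral signals):

```python
from collections import defaultdict

def flag_possible_swarms(posts, window_secs=60, min_accounts=5):
    """Flag text that many distinct accounts posted within a short window.

    posts: list of (timestamp_secs, account_id, text) tuples.
    Returns the set of texts that look like coordinated amplification.
    Purely a heuristic sketch, not a production detector.
    """
    by_text = defaultdict(list)
    for ts, account, text in posts:
        by_text[text].append((ts, account))

    flagged = set()
    for text, events in by_text.items():
        events.sort()  # order by timestamp
        # Slide a window across the timeline, counting distinct accounts.
        for start_ts, _ in events:
            accounts = {acct for ts, acct in events
                        if start_ts <= ts <= start_ts + window_secs}
            if len(accounts) >= min_accounts:
                flagged.add(text)
                break
    return flagged

# Hypothetical example: six accounts push the same slogan within 30 seconds,
# while one ordinary account posts something unrelated.
posts = [(t, f"bot_{t}", "Narrative X is a hoax!") for t in range(0, 30, 5)]
posts.append((1000, "real_user", "Lunch was great today."))
print(flag_possible_swarms(posts))  # -> {'Narrative X is a hoax!'}
```

The design choice here mirrors the text: the signal is not the content of the message but the coordination of its propagation, which is why the sketch keys on distinct-account counts per time window rather than on what the text says.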

Based on whatever goals the attacker wants to achieve, they will need to identify the targets they want to influence. When conducting research on their targets, the attackers will seek to answer specific questions like: What do they already believe about their world and/or their place within it? What do they think they know, and what are they uncertain about? What assumptions, suspicions, prejudices, and biases might they have? What challenges and grievances (economic, sociopolitical, security, identity, etc.) seem to provoke the most emotional reactions among them? Throughout the history of influence warfare, this information has been relatively easy to identify in open liberal democracies of the West. In more closed or oppressed societies, an additional step may be needed to determine how the target audience's perceptions compare to the discourse in the public domain—for example, what the news media (often owned and controlled by the government) identify as important topics and acceptable views within that society may not fully reflect the reality.

Influence efforts should always be guided by data on potential targets. An attacker should never waste their resources on target audiences that are already well-armed to repel the influence efforts; better instead to identify vulnerable targets to exploit. For example, if the goal is to sow division and increase political polarization within a society, the United States offers a prime target for achieving that goal. Research by the Oxford Internet Institute in 2019 found that people in the United States share more junk news (i.e., completely fabricated information disguised to look like authentic news) than people in other advanced democracies such as France, Germany, and the United Kingdom. 41 A study by the Pew Research Center in 2017 found that 67 percent of U.S. adults received news through social media sites like Twitter and Facebook. 42 Further, analysis of Russian influence efforts by the Atlantic Council's Digital Forensic Research Lab in 2018 found that Americans were vulnerable to a distinct type of troll account that used "carefully crafted personalities" to infiltrate activist communities and post hyperpartisan messages in order to "make their audiences ever more radical." 43

These research studies reflect another important dimension of influence efforts: after gathering enough quality information about the target, the attacker will then seek to establish a foothold in the information environment preferred by that target. They must establish a credible presence among an audience of like-minded social media users before attempting to influence or polarize that audience. A common approach involves initially posting some messages that the target audience is likely to agree with. The convention of "like" or "share" facilitated by social media platforms can draw the target toward recognition of an acceptable persona (the "like-minded, fellow traveler"). 44 Once established within the target's digital ecosystem, the persona can then begin to shape perceptions and behavior in ways that will benefit their influence strategy.

Perhaps the most well-known example of this in the public arena today is called disinformation or fake news. Essentially, these are forms of information deception, and there are several variations to consider. According to researcher Claire Wardle, some of the most "problematic content within our information ecosystem" includes:

• False connection: when headlines, visuals, or captions do not support the substance or content of the story itself;

• Misleading content: misleading use of information to frame an issue or individual;

• False context: when genuine content is shared with false contextual information;

• Imposter content: when genuine sources are impersonated;

• Manipulated content: when genuine information or imagery is manipulated to deceive (altered videos and images, including deepfakes, are the most prevalent examples of this); and

• Fabricated content: new content is 100 percent false and designed to deceive and do harm. 45
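For readers who work with labeled datasets of online content, Wardle's six categories map naturally onto a simple controlled vocabulary. A minimal sketch follows (the enum and the labeling scheme are illustrative conveniences, not an established schema from Wardle's work):

```python
from enum import Enum

class ProblematicContent(Enum):
    """Claire Wardle's categories of problematic content, as listed above."""
    FALSE_CONNECTION = "headline, visual, or caption does not support the story"
    MISLEADING_CONTENT = "misleading framing of an issue or individual"
    FALSE_CONTEXT = "genuine content shared with false contextual information"
    IMPOSTER_CONTENT = "genuine sources impersonated"
    MANIPULATED_CONTENT = "genuine information or imagery altered to deceive"
    FABRICATED_CONTENT = "wholly false content designed to deceive and do harm"

# A labeled item might pair a (hypothetical) post ID with one or more categories,
# since a single piece of content can exhibit several kinds of deception at once.
labels = {"post_123": {ProblematicContent.FALSE_CONTEXT,
                       ProblematicContent.MANIPULATED_CONTENT}}

print(len(ProblematicContent))  # -> 6
```

Using a fixed vocabulary like this keeps annotations consistent across coders, which matters because, as the next paragraph notes, effective disinformation typically blends several of these forms with accurate details.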

Each of these forms of "problematic content" has a role to play in achieving an influence warfare strategy. Further, in many cases the most effective means of using these types of information (or disinformation) involves a careful integration between fake details and accurate details that the target already accepts as true. In the field of education, teachers often refer to the concept of scaffolding as a strategy to foster learning by introducing material that builds on what the student already understands or believes. For the purposes of an influence strategy, as Thomas Rid explains, for disinformation to be successful it must "at least partially respond to reality, or at least accepted views." 46

Additional examples of deceptive digital influence tactics include identity deception (e.g., using fake or hijacked social media accounts) and information source deception (e.g., rerouting internet traffic to different sources of information that seem legitimate but relay false information to viewers). As with the other forms of deception, a primary intent of these tactics is for the influencer to make the target believe what is not true. Similarly, the influencer may also spread disinformation through the target's trusted communication channels to degrade the integrity of their decision making and even their perception of reality.

Of course, deception is only one of several digital influence strategies. Another, which we have seen in use frequently in recent years, is to encourage engagement—especially by provoking emotional responses—using information that may in fact be all or partially accurate. Unlike disinformation and deception, the primary focus here is less on the message than on provoking people to propagate the message. Effective targets for this approach are those who have higher uncertainty about what is true or not but are willing to share and retransmit information without knowing whether it is untrue (and often because they want it to be true). And it is widely understood that fear is an exceptionally powerful emotion that can lead people to make a wide variety of (often unwise) decisions.

There are many kinds of influence goals that can be achieved by intentionally provoking emotional responses, usually in reference to something that the target already favors or opposes. The tactic of provoking outrage can be particularly effective here against a target audience—as Sun Tzu wrote, "Use anger to throw them into disarray." 47 With the right sort of targeting, message format, and content, the influencer can use provocation tactics to produce whatever kinds of behavior they want by the target (e.g., angrily lashing out at members of an opposing political party or questioning the scientific evidence behind an inconvenient truth). And an additional type of influence warfare involves attacking the target directly—threatening or bullying them, calling them derogatory names, spreading embarrassing photos and videos of them, and so forth.

One of the most well-known earlier forms of digital influence warfare was North Korea's attack against Sony. In the summer of 2014, Sony Pictures had planned to release a comedy, The Interview , featuring a plot in which two bumbling, incompetent journalists score an interview with Kim Jong-un, but before they leave they are recruited by the Central Intelligence Agency (CIA) to blow him up. 48 An angered North Korea responded by hacking into Sony's computer networks, destroying some key systems and stealing tons of confidential emails that they later released publicly in small, increasingly embarrassing quantities. Details about contracts with Hollywood stars, medical records, salaries, and Social Security numbers were also released. But unlike other well-reported cyberattacks of that era, this was—in the words of David E. Sanger—"intended as a weapon of political coercion." 49 As with many other examples of this hack and release tactic, the strategic goals are fairly straightforward: for example, to weaken an adversary by undermining its perceived credibility. This same script was followed by Russia during the 2016 U.S. presidential election, when they hacked into John Podesta's email account and released (via WikiLeaks) a stream of embarrassing messages (as detailed in the investigation report by former Federal Bureau of Investigation [FBI] director Robert S. Mueller III). 50

Today, states are engaged in these kinds of digital influence activities with increasing regularity and sophistication. As a July 2020 report by the Stanford Internet Observatory explains:

Well-resourced countries have demonstrated sophisticated abilities to carry out influence operations in both traditional and social media ecosystems simultaneously. Russia, China, Iran, and a variety of other nation-states control media properties with significant audiences, often with reach far beyond their borders. They have also been implicated in social media company takedowns of accounts and pages that are manipulative either by virtue of the fake accounts and suspicious domains involved, or by way of coordinated distribution tactics to drive attention to certain content or to create the perception that a particular narrative is extremely popular. 51

China in particular has significantly ramped up its digital foreign-influence efforts, to include disrupting Twitter conversations about the conflict in Tibet and meddling in Taiwanese politics. 52 In fact, public opinion warfare and psychological warfare are closely intertwined in Chinese military doctrine. According to a recent Pentagon report, China's approach to psychological warfare "seeks to influence and/or disrupt an opponent's decision-making capability, to create doubts, foment anti-leadership sentiments, to deceive opponents and to attempt to diminish the will to fight among opponents." 53 A primary objective, as Laura Jackson explains, is "to demoralize both military personnel and civilian populations, and thus, over time, to diminish their will to act … to undermine international institutions, change borders, and subvert global media, all without firing a shot." 54

China's "Three Warfares" doctrine is focused on: (1) public opinion (media) warfare (yulun zhan); (2) psychological warfare (xinli zhan); and (3) legal warfare (falu zhan). 55 In China's conception of public opinion warfare, the goal is to influence both domestic and international public opinion in ways that build support for China's own military operations, while undermining any justification for an adversary who is taking actions counter to China's interests. 56 But this effort goes well beyond what Steven Collins refers to in a 2003 NATO Review article as "perception management," in which a nation or organization provides (or withholds) certain kinds of information to influence foreign public opinion, leaders, intelligence agencies, and the policies and behaviors that result from their interpretation of this information. 57 According to the Pentagon report, China "leverages all instruments that inform and influence public opinion … and is directed against domestic populations in target countries." 58 As Laura Jackson explains, "China's extensive global media network, most notably the Xinhua News Agency and China Central Television (CCTV), also plays a key role, broadcasting in foreign languages and providing programming to stations throughout Africa, Central Asia, Europe, and Latin America." 59 In turn, Western media outlets then repeat and amplify the spread of messages to a broader international audience, lending a perception of legitimacy to what is in fact Chinese state-directed propaganda. 60

Similarly, Russia has also engaged in a broad, multifaceted influence warfare campaign involving the older tools and tactics of its active measures program along with a flurry of new technological approaches. Media outlets like Sputnik and RT (formerly Russia Today) view themselves—according to Margarita Simonyan, chief editor of RT—as equal in importance to the Defense Ministry, using "information as a weapon." 61 And like many other authoritarian regimes, Russia has invested heavily in online troll farms, armies of automated bot accounts, cyber hacking units, and other means by which it can pursue its foreign influence goals using the most modern tools available. 62 While the "agent of influence" of the Cold War may have been a journalist, a government official, a labor leader, or an academic (among many other examples), today the agent is more likely to be a social media user with enough followers to be considered a potential "influencer." 63

According to a report by the Stanford Internet Observatory, both China and Russia have "full-spectrum propaganda capabilities," including prominent Facebook pages and YouTube channels targeting regionalized audiences. 64 Both have military units dedicated to influencing foreign targets and also encourage and incentivize citizen involvement in those efforts. 65 They gather extensive information about their targets and manage an array of fake Facebook pages and Twitter personas that are used to erode the international perception and domestic social cohesion of their rivals. 66 And as detailed in many reports by congressional committees, think tanks, and academics, Russia has been particularly aggressive during this past decade in its online efforts to influence democratic elections in the United States, Europe, Africa, and elsewhere, as well as to sow confusion and encourage widespread societal polarization and animosity. 67

Meanwhile, other countries are also increasingly engaging in their own forms of digital influence warfare. In October 2019, Facebook announced the deletion of 93 Facebook accounts, 17 Facebook pages, and 4 Instagram accounts "for violating our policy against coordinated inauthentic behavior. This activity originated in Iran and focused primarily on the US, and some on French-speaking audiences in North Africa." 68 According to the announcement, "the individuals behind this activity used compromised and fake accounts—some of which had already been disabled by our automated systems—to masquerade as locals, manage their Pages, join Groups and drive people to off-platform domains connected to our previous investigation into the Iran-linked 'Liberty Front Press' and its removal in August 2018." 69 Facebook also removed 38 Facebook accounts, 6 pages, 4 groups, and 10 Instagram accounts that originated in Iran and focused on countries in Latin America, including Venezuela, Brazil, Argentina, Bolivia, Peru, Ecuador, and Mexico. The page administrators and account owners typically represented themselves as locals, used fake accounts to post in groups, managed pages posing as news organizations, and directed traffic to other websites. 70 And that same month, Microsoft announced that hackers linked to the Iranian government targeted an undisclosed U.S. presidential campaign, as well as government officials, media outlets, and prominent expatriate Iranians. 71

In short, older strategies, tactics, and tools of influence warfare have evolved to encompass a new and very powerful digital dimension. By using massive amounts of internet user data, including profiles and patterns of online behavior, microtargeting strategies have become a very effective means of influencing people from many backgrounds. The strategies, tactics, and tools of digital influence warfare will increasingly be used by foreign and domestic actors to manipulate our perceptions in ways that will negatively affect us. According to a 2018 United Nations Educational, Scientific and Cultural Organization (UNESCO) report, the danger we face in the future is "the development of an 'arms race' of national and international disinformation spread through partisan 'news' organizations and social media channels, polluting the information environment for all sides." 72

Tomorrow's disinformation and perceptions manipulation will be much worse than what we are dealing with now, in part because the tactics and tools are becoming more innovative and sophisticated. As a 2019 report by Rand notes, "Increasingly, hostile social manipulation will be able to target the information foundations of digitized societies: the databases, algorithms, networked devices, and artificial intelligence programs that will dominate the day-to-day operation of the society." 73 The future evolution of digital influence tools—including augmented reality, virtual reality, and artificial intelligence (AI)—promises to bring further confusion and challenges to an already chaotic situation, offering a new frontier for disinformation and perceptions manipulation. 74 For example, in the not-too-distant future we will see a flood of fake audio, images, messages, and video created through AI that will appear so real it will be increasingly difficult to convince people they are fakes. 75 Technology already exists that can manipulate an audio recording to delete words from a speech and stitch the rest together seamlessly, or add new words using software that replicates the voice of the speaker with uncanny accuracy. 76 Imagine the harm that can be done when digital influencers have the ability to clone any voice, use it to say anything they want, and then use that audio recording to persuade others. 77

Creating deepfake images and video is also becoming easier, and the results are increasingly convincing. One particularly sophisticated AI-related approach involves a tool known as generative adversarial networks (GANs). A GAN builds a competitive function into the software: one network seeks to generate an item, such as an image or video, while the other network judges the item to determine whether it looks real. As the first network adapts to fool the adversarial network, the software learns to create ever more realistic images or videos. 78 Over time, according to Michael Mazarr and his colleagues at Rand, "As technology improves the quality of this production, it will likely become more difficult to discern real events from doctored or artificial ones, particularly if combined with the advancements in audio software." 79 If the target of such deepfake disinformation holds true to the old adage of "hearing and seeing is believing," the long-term harmful effects of this technology are quite obvious. Technological advances will make it increasingly difficult to distinguish real people from computer-generated ones, and even more difficult to convince people that they are being deceived by someone they believe is real.
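The generator-versus-judge dynamic described above can be sketched in miniature. The toy example below is purely illustrative (it is not drawn from any of the cited reports, and all parameter names and values are assumptions): a one-parameter linear "generator" learns to imitate samples from a target distribution while a logistic "discriminator" is simultaneously trained to tell real samples from forgeries. The only training signal the generator receives is the judge's verdict.

```python
import numpy as np

# Toy 1-D GAN sketch: generator g(z) = a*z + b tries to mimic N(4, 1);
# discriminator d(x) = sigmoid(w*x + c) tries to score real vs. fake.
rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

a, b = 1.0, 0.0   # generator parameters (initially produces N(0, 1))
w, c = 0.1, 0.0   # discriminator parameters
lr, batch = 0.05, 64

for step in range(2000):
    real = rng.normal(4.0, 1.0, batch)   # samples the generator must imitate
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b                     # the generator's current forgeries

    # Discriminator ascent: maximize log d(real) + log(1 - d(fake)).
    p_real, p_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * (np.mean((1 - p_real) * real) - np.mean(p_fake * fake))
    c += lr * (np.mean(1 - p_real) - np.mean(p_fake))

    # Generator ascent: maximize log d(fake), i.e., fool the judge.
    p_fake = sigmoid(w * (a * z + b) + c)
    a += lr * np.mean((1 - p_fake) * w * z)
    b += lr * np.mean((1 - p_fake) * w)

# After training, the generator's output mean should sit near the real mean.
fake_mean = float(np.mean(a * rng.normal(0.0, 1.0, 10000) + b))
print(f"generated mean ~= {fake_mean:.2f} (real data mean is 4.0)")
```

Real deepfake systems replace both players with deep neural networks and train on images or audio frames, but the adversarial logic is the same: each improvement in the judge forces the forger to produce more realistic output.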

And, of course, we can fully expect that digital influence warfare attacks against democratic elections will continue and will likely involve new and innovative tactics. For example, there are concerns that in the future malicious hackers could use ransomware to snatch and hold hostage databases of local voter registrations or cause power disruptions at polling centers on election day. Further, as one expert noted, "with Americans so mistrustful of one another, and of the political process, the fear of hacking could be as dangerous as an actual cyberattack—especially if the election is close." 80 As Laura Rosenberger observes, "You don't actually have to breach an election system in order to create the public impression that you have." 81 The future will likely bring darker influence silos that no light of truth can penetrate, resulting in heightened uncertainty and distrust, deeper animosity, more extremism and violence, and widespread belief in things that simply are not true. This is the future that the enemies of America's peace and prosperity want to engineer. The United States must find ways to prevent them from succeeding. The research and analysis provided in this issue contributes to that important goal.

This Issue of JAMS on Political Warfare and Propaganda

Each of the contributions to this issue addresses the central theme of influencing perceptions and behavior. First, Daniel de Wit draws lessons from a historical analysis of the Office of Strategic Services (OSS), America's intelligence and special operations organization in World War II. In addition to its efforts to collect intelligence on the Axis powers and to arm and train resistance groups behind enemy lines, the OSS also served as America's primary psychological warfare agency, using a variety of "black propaganda" methods to sow dissension and confusion in enemy ranks. 82 As noted earlier, psychological warfare plays a significant role in the conduct of today's military operations, so de Wit's research offers important historical lessons for contemporary campaign planners.

Next, Kyleanne Hunter and Emma Jouenne examine the uniquely troubling effects of spreading misogynistic views online. Their analysis of three diverse case studies—the U.S. military, the incel movement, and ISIS—reveals how unchecked online misogyny can result in physical behavior that can threaten human and national security. Glen Segell then explores how perceptions about cybersecurity operations can have positive or negative impacts on civil-military relations, drawing on a case study of the Israeli experience. Lev Topor and Alexander Tabachnik follow with a study of how Russia uses the strategies and tactics of digital influence warfare against other countries, while continually seeking to strengthen its information dominance over Russian citizens. And Donald M. Bishop reveals how other countries do this as well, including China, North Korea, Iran, Cuba, and Venezuela. Each is engaged in these same kinds of efforts to control the information that circulates within their respective societies, while using various forms of propaganda against other countries to strengthen their influence and national power.

Phil Zeman's contribution to this issue looks at how China and Russia are trying to fracture American and Western societies through information, disinformation, economic coercion, and the creation of economic dependencies—in many cases capitalizing on specific attributes and vulnerabilities of a target nation to achieve their strategic objectives. Through these efforts, he concludes, China and Russia hope to degrade the will and ability of the United States and other Western states to respond to an aggressive act. Next, Michael Cserkits explains how a society's perceptions about armed forces can be influenced by cinematic productions and anime, drawing on a case study comparison of Japan and the United States. And finally, Anthony Patrick examines how social media penetration and internet connectivity could impact the likelihood that parties within a conventional intrastate conflict will enter negotiations.

As a collection, these articles make a significant contribution to the scholarly research literature on political warfare and propaganda. The authors shed light on the need for research-based strategies and policies that can improve our ability to identify, defend against, and mitigate the consequences of influence efforts. However, when reflecting on the compound security threats described at the beginning of this introduction—involving both cyberattacks and influence attacks—a startling contrast is revealed: we have committed serious resources toward cybersecurity but not toward addressing the influence issues examined in this issue. We routinely install firewalls and other security measures around our computer network systems, track potential intrusion attempts, test and report network vulnerabilities, hold training seminars for new employees, and take many other measures to try to mitigate cybersecurity threats. In contrast, there are no firewalls or intrusion detection efforts defending us against digital influence attacks of either foreign or domestic origin. Government sanctions and social media deplatforming efforts respond to influence attackers once they have been identified as such, but these efforts take place only after attacks have already occurred, sometimes over the course of several years.

The articles of this issue reflect an array of efforts to influence the perceptions, emotions, and behavior of human beings at both individual and societal levels. In the absence of comprehensive strategies to more effectively defend against these efforts, the United States risks losing much more than military advantage; we are placing at risk the perceived legitimacy of our systems and institutions of governance, as well as our economic security, our ability to resolve social disagreements peacefully, and much more. 83 Further, many other nations are also facing the challenges of defending against foreign influence efforts. As such, the transnational nature of influence opportunities and capabilities in the digital age may require a multinational, coordinated response. In the years ahead, further research will be needed to uncover strategies for responding to the threat of digital influence warfare with greater sophistication and success.

James J. F. Forest is a professor at the School of Criminology & Justice Studies, University of Massachusetts Lowell and a visiting professor at the Fletcher School, Tufts University. He has published more than 20 books in the field of international security studies, most recently Digital Influence Warfare in the Age of Social Media (2021) and Digital Influence Mercenaries (2021).

1. Marshall McLuhan, Culture Is Our Business (Eugene, OR: Wipf and Stock Publishers, 1970), 66.

2. John Arquilla and David Ronfeldt, "The Advent of Netwar (Revisited)," in Networks and Netwars: The Future of Terror, Crime, and Militancy, ed. John Arquilla and David Ronfeldt (Santa Monica, CA: Rand, 2001), 1, https://doi.org/10.7249/MR1382.

3. Isaiah Wilson III, "What Is Compound Security?: With Dr. Isaiah 'Ike' Wilson III (Part 2 of 4)," YouTube, 26 February 2021, 16:48; and Isaiah Wilson III and Scott A. Smitson, "The Compound Security Dilemma: Threats at the Nexus of War and Peace," Parameters 50, no. 2 (Summer 2020): 1–17.

4. Wilson, "What Is Compound Security?"; and Wilson and Smitson, "The Compound Security Dilemma."

5. Max Boot and Michael Doran, "Political Warfare," Council on Foreign Relations, 28 June 2013.

6. Paul A. Smith, On Political War (Washington, DC: National Defense University Press, 1989), 3.

7. Carnes Lord, "The Psychological Dimension in National Strategy," in Political Warfare and Psychological Operations: Rethinking the US Approach, ed. Carnes Lord and Frank R. Barnett (Washington, DC: National Defense University Press, 1989), 16.

8. Military Information Support Operations, Joint Publication 3-13.2 (Washington, DC: Joint Chiefs of Staff, 2014).

9. Diego A. Martin and Jacob N. Shapiro, Trends in Online Foreign Influence Efforts (Princeton, NJ: Woodrow Wilson School of Public and International Affairs, Princeton University, 2019), 3.

10. Martin and Shapiro, Trends in Online Foreign Influence Efforts.

11. Martin and Shapiro, Trends in Online Foreign Influence Efforts.

12. Philip N. Howard, Lie Machines: How to Save Democracy from Troll Armies, Deceitful Robots, Junk News Operations and Political Operatives (New Haven, CT: Yale University Press, 2020), 75.

13. Caroline Jack, Lexicon of Lies: Terms for Problematic Information (New York: Data & Society Research Institute, 2017), 6.

14. Carl Miller, The Death of the Gods: The New Global Power Grab (London: Windmill Books, 2018), xvi.

15. Mark Galeotti, Russian Political War: Moving Beyond the Hybrid (Abingdon, UK: Routledge, 2019), 11.

16. Michael V. Hayden, The Assault on Intelligence: American National Security in an Age of Lies (New York: Penguin Press, 2018), 191.

17. In addition to terrorists and insurgents using these tools of digital influence for political purposes, we also see various kinds of individuals and marketing firms engaged in profit-seeking activities as described in James J. F. Forest, Digital Influence Mercenaries: Profit and Power Through Information Warfare (Annapolis, MD: Naval Institute Press, 2021).

18. James J. F. Forest, ed., Influence Warfare: How Terrorists and Governments Fight to Shape Perceptions in a War of Ideas (Westport, CT: Praeger Security International, 2009).

19. While Arquilla and Ronfeldt initially defined swarming as a "deliberately structured, coordinated, strategic way to strike from all directions," in this context the term is used to describe a collection of social media accounts that converges on a single target like a swarm of bees. See John Arquilla and David Ronfeldt, Swarming and the Future of Conflict (Santa Monica, CA: Rand, 2000); and Ali Fisher, "Swarmcast: How Jihadist Networks Maintain a Persistent Online Presence," Perspectives on Terrorism 9, no. 3 (June 2015): 3–20. Bandwagoning is a term from social psychology used to describe a type of cognitive bias and collective identity signaling that leads people to adopt the behaviors or attitudes of others. This can be observed in political campaigns, support for a winning sports team, fashion trends, adoption of new consumer electronics, and many other arenas of daily life.

20. James J. F. Forest, Digital Influence Warfare in the Age of Social Media (Santa Barbara, CA: ABC-CLIO/Praeger Security International, 2021).

21. Specifically, chapter 3, "Attack by Stratagem," reads: "Supreme excellence consists in breaking the enemy's resistance without fighting." Sun Tzu, The Art of War (New York: Fall River Press, 2015), 54.

22. Galeotti, Russian Political War, 10.

23. Russell Hsiao, "CCP Propaganda against Taiwan Enters the Social Age," China Brief 18, no. 7 (April 2018).

24. W. Phillips Davison, "Some Trends in International Propaganda," Annals of the American Academy of Political and Social Science 398, no. 1 (November 1971): 1–13, https://doi.org/10.1177/000271627139800102.

25. Daniel Baracskay, "U.S. Strategic Communication Efforts during the Cold War," in Influence Warfare, 253–74.

26. James Woods, History of International Broadcasting, vol. 2 (London: IET, 1992), 110.

27. Woods, History of International Broadcasting, 110–11.

28. Thomas Rid, Active Measures: The Secret History of Disinformation and Political Warfare (New York: Farrar, Straus and Giroux, 2020), 4.

29. Rid, Active Measures, 7.

30. Richard H. Shultz and Roy Godson, Dezinformatsia: Active Measures in Soviet Strategy (New York: Pergamon-Brassey's, 1984), 133.

31. Shultz and Godson, Dezinformatsia, 133.

32. Shultz and Godson, Dezinformatsia, 149.

33. Shultz and Godson, Dezinformatsia, 150–51.

34. Shultz and Godson, Dezinformatsia, 152–53.

35. Shultz and Godson, Dezinformatsia, 155.

36. Shultz and Godson, Dezinformatsia, 157.

37. This is a cornerstone of Russia's digital influence warfare program and the title of an important book. See Peter Pomerantsev, Nothing Is True and Everything Is Possible: The Surreal Heart of the New Russia (New York: Public Affairs, 2014).

38. This section of the discussion significantly amplifies and paraphrases a report by Eric V. Larson et al., Understanding Commanders' Information Needs for Influence Operations (Santa Monica, CA: Rand, 2009), Appendix B: Task List Analysis, 71–73, which cites several Department of the Army documents and 1st Information Operations Command (Land), Field Support Division, "Terminology for IO Effects," in Tactics, Techniques and Procedures for Operational and Tactical Information Operations Planning (Washington, DC: Department of the Army, 2004), 23.

39. Larson et al., Understanding Commanders' Information Needs for Influence Operations, 71–73.

40. For details, see Forest, Digital Influence Mercenaries.

41. Howard, Lie Machines, 99–100. Junk news was defined by the Oxford Internet Institute as articles from outlets that publish "deliberately misleading, deceptive or incorrect information." See Ryan Browne, "'Junk News' Gets Massive Engagement on Facebook Ahead of EU Elections, Study Finds," CNBC, 21 May 2019.

42. Elisa Shearer and Jeffrey Gottfried, "News Use Across Social Media Platforms 2017," Pew Research Center, 7 September 2017.

43. Ben Nimmo, Graham Brookie, and Kanishk Karan, "#TrollTracker: Twitter Troll Farm Archives, Part One—Seven Key Takeaways from a Comprehensive Archive of Known Russian and Iranian Troll Operations," Atlantic Council's Digital Forensic Research Lab, 17 October 2018.

44. For the purpose of this discussion, a "like-minded fellow traveler" is described as someone who sees the world in much the same way you do and is moving intellectually and emotionally in a direction that you approve of.

45. Claire Wardle, "Fake News. It's Complicated," First Draft, 16 February 2017.

46. Rid, Active Measures, 5, with a direct quote from famous Soviet defector Ladislav Bittman, author of the 1972 book The Deception Game (Syracuse, NY: Syracuse University Research Corp, 1972).

47. Various interpretations of this classic work use different phrasing. For example, "If your opponent is of choleric temper, seek to irritate him." Sun Tzu, The Art of War, 49 (passage 1.22); and "When their military leadership is obstreperous, you should irritate them to make them angry—then they will become impetuous and ignore their original strategy." Sun Tzu, The Art of War, trans. Thomas Cleary (Boston, MA: Shambhala Pocket Classics, 1991), 15 (passage 1.12).

48. For a detailed examination of this event, see David E. Sanger, The Perfect Weapon: Sabotage and Fear in the Cyber Age (New York: Crown Publishing, 2018), 124–43.

49. Sanger, The Perfect Weapon, 143.

50. Robert S. Mueller III, Report on the Investigation into Russian Interference in the 2016 Presidential Election, vol. 1 (Washington, DC: Department of Justice, 2019).

51. Renee DiResta et al., Telling China's Story: The Chinese Communist Party's Campaign to Shape Global Narratives (Stanford, CA: Stanford Internet Observatory and Hoover Institution, Stanford University, 2020), 3.

52. Howard, Lie Machines, 77; Jonathan Kaiman, "Free Tibet Exposes Fake Twitter Accounts by China Propagandists," Guardian, 22 July 2014; and Nicholas J. Monaco, "Taiwan: Digital Democracy Meets Automated Autocracy," in Computational Propaganda: Political Parties, Politicians and Political Manipulation on Social Media, ed. Samuel C. Woolley and Philip N. Howard (New York: Oxford University Press, 2018), 104–27, https://doi.org/10.1093/oso/9780190931407.003.0006.

53. Stefan Halper, China: The Three Warfares (Washington, DC: Office of the Secretary of Defense, 2013), 12.

54. Halper, China.

55. Larry M. Wortzel, The Chinese People's Liberation Army and Information Warfare (Carlisle Barracks, PA: United States Army War College Press, 2014), 29–30. Note: according to Wortzel, a direct translation of yulun is "public opinion"; thus, in many English translations, the term "public opinion warfare" is used. In some People's Liberation Army translations of book titles and articles, however, it is called "media warfare."

56. Wortzel, The Chinese People's Liberation Army and Information Warfare.

57. Steven Collins, "Mind Games," NATO Review (Summer 2003).

58. Halper, China, 12–13.

59. Laura Jackson, "Revisions of Reality: The Three Warfares—China's New Way of War," in Information at War: From China's Three Warfares to NATO's Narratives (London: Legatum Institute, 2015), 5–6.

60. Jackson, "Revisions of Reality."

61. Ben Nimmo, "Question That: RT's Military Mission," Atlantic Council's Digital Forensic Research Lab, 8 January 2018.

62. Statement Prepared for the U.S. Senate Select Committee on Intelligence Hearing, 115th Cong. (30 March 2017) (statement of Clint Watts on "Disinformation: A Primer in Russian Active Measures and Influence Campaigns"), hereafter Watts statement.

63. Watts statement.

64. Watts statement.

65. For details on the efforts of both China and Russia, see Ross Babbage, Winning Without Fighting: Chinese and Russian Political Warfare Campaigns and How the West Can Prevail, vol. 1 (Washington, DC: Center for Strategic and Budgetary Assessments, 2019); Esther Chan and Rachel Blundy, "'Bulletproof' China-backed Site Attacks HK Democracy Activists," Yahoo News, 1 November 2019; John Costello and Joe McReynolds, China's Strategic Support Force: A Force for a New Era, China Strategic Perspectives 13 (Washington, DC: National Defense University Press, 2018); Joanne Patti Munisteri, "Controlling Cognitive Domains," Small Wars Journal, 24 August 2019; Austin Doehler, "How China Challenges the EU in the Western Balkans," Diplomat, 25 September 2019; Keoni Everington, "China's 'Troll Factory' Targeting Taiwan with Disinformation Prior to Election," Taiwan News, 5 November 2018; "Hong Kong Protests: YouTube Shuts Accounts over Disinformation," BBC News, 22 August 2019; Paul Mozur and Alexandra Stevenson, "Chinese Cyberattack Hits Telegram, App Used by Hong Kong Protesters," New York Times, 13 June 2019; and Tom Uren, Elise Thomas, and Jacob Wallis, Tweeting through the Great Firewall: Preliminary Analysis of PRC-linked Information Operations on the Hong Kong Protests (Canberra: Australian Strategic Policy Institute, 2019).

66. DiResta et al., Telling China's Story.

67. Background to "Assessing Russian Activities and Intentions in Recent U.S. Elections": The Analytic Process and Cyber Incident Attribution (Washington, DC: Office of the Director of National Intelligence, 2017); Ellen Nakashima, "Senate Committee Unanimously Endorses Spy Agencies' Finding that Russia Interfered in 2016 Presidential Race in Bid to Help Trump," Washington Post, 21 April 2020; Jane Mayer, "How Russia Helped Swing the Election for Trump," New Yorker, 24 September 2018; Philip N. Howard et al., The IRA, Social Media and Political Polarization in the United States, 2012–2018 (Oxford, UK: Programme on Democracy & Technology, 2018); and Nike Aleksejeva et al., Operation Secondary Infektion: A Suspected Russian Intelligence Operation Targeting Europe and the United States (Washington, DC: Atlantic Council Digital Forensic Research Lab, 2019).

68. Nathaniel Gleicher, "Removing More Coordinated Inauthentic Behavior from Iran and Russia," Facebook Newsroom, 21 October 2019.

69. Gleicher, "Removing More Coordinated Inauthentic Behavior from Iran and Russia."

70. Gleicher, "Removing More Coordinated Inauthentic Behavior from Iran and Russia."

71. "Hacking Group Linked to Iran Targeted a U.S. Presidential Campaign, Microsoft Says," Los Angeles (CA) Times, 4 October 2019.

72. Cherilyn Ireton and Julie Posetti, Journalism, "Fake News" and Disinformation (Paris: UNESCO, 2018), 18.

73. Michael J. Mazarr et al., The Emerging Risk of Virtual Societal Warfare: Social Manipulation in a Changing Information Environment (Santa Monica, CA: Rand, 2019), 65–66, https://doi.org/10.7249/RR2714.

74. For instance, see Rob Price, "AI and CGI Will Transform Information Warfare, Boost Hoaxes, and Escalate Revenge Porn," Business Insider, 12 August 2017; and Mazarr et al., The Emerging Risk of Virtual Societal Warfare, 87.

75. Will Knight, "Fake America Great Again: Inside the Race to Catch the Worryingly Real Fakes that Can Be Made Using Artificial Intelligence," MIT Technology Review, 17 August 2018; for some examples of realistic Instagram memes created by powerful computer graphics equipment combined with AI, see "the_fakening," Instagram, accessed 6 April 2021.

76. Avi Selk, "This Audio Clip of a Robot as Trump May Prelude a Future of Fake Human Voices," Washington Post, 3 May 2017; Bahar Gholipour, "New AI Tech Can Mimic Any Voice," Scientific American, 2 May 2017; and Mazarr et al., The Emerging Risk of Virtual Societal Warfare, 85–86.

77. "Imitating People's Speech Patterns Precisely Could Bring Trouble," Economist, 20 April 2017; and Mazarr et al., The Emerging Risk of Virtual Societal Warfare, 86.

78. "Fake News: You Ain't Seen Nothing Yet," Economist, 1 July 2017; Faizan Shaikh, "Introductory Guide to Generative Adversarial Networks (GANs) and Their Promise!," Analytics Vidhya, 15 June 2017; and Mazarr et al., The Emerging Risk of Virtual Societal Warfare, 88.

79. Mazarr et al., The Emerging Risk of Virtual Societal Warfare, 91.

80. Matthew Rosenberg, Nicole Perlroth, and David E. Sanger, "'Chaos Is the Point': Russian Hackers and Trolls Grow Stealthier in 2020," New York Times, 10 January 2020.

81. Rosenberg, Perlroth, and Sanger, "'Chaos Is the Point'."

82. Howard Becker, "The Nature and Consequences of Black Propaganda," American Sociological Review 14, no. 2 (April 1949): 221, https://doi.org/10.2307/2086855 . "'Black' propaganda is that variety which is presented by the propagandizer as coming from a source inside the propagandized."

83. For a discussion of strategies to counter foreign influence threats from Chinese and Russian malign influence efforts, see Thomas G. Mahnken, Ross Babbage, and Toshi Yoshihara, Countering Comprehensive Coercion: Competitive Strategies Against Authoritarian Political Warfare (Washington, DC: Center for Strategic and Budgetary Assessments, 2018).



By Eric Brahm

August 2006  

The term propaganda has a nearly universally negative connotation. Walter Lippmann described it as inherently "deceptive" and therefore evil.[1] Propaganda is an exercise in deception more than in persuasion. Partisans often use the label to dismiss any claims made by their opponents while professing never to employ propaganda themselves. It is akin to advertising and public relations, but with a political purpose. Although propaganda has been utilized for centuries, the term itself dates to 1622, when Pope Gregory XV established the Sacra Congregatio de Propaganda Fide to counter the growing Protestant threat and "to reconquer by spiritual arms" those areas "lost to the Church in the debacle of the sixteenth century."[2] Propaganda has since become a common element of politics and war. As new communications technologies have developed, propagandists have devised new methods for reaching ever larger audiences in order to shape their views. The shift from targeting elite publics to targeting mass audiences has been called "new propaganda" by some.[3] This essay provides a brief overview of the concept of propaganda, various propaganda techniques, and related topics.

In a nutshell, propaganda is designed to manipulate others' beliefs and to induce action in the interest of the propagator by drilling the message into listeners' heads. It uses images, slogans, and symbols to play on prejudices and emotions. Its ultimate goal is to lead recipients to 'voluntarily' accept the propagandist's position as if it were their own. Propaganda may be aimed at one's own people or at members of other groups, and it can be designed to agitate the population or to pacify it. Although we often think of propaganda as false information, it is frequently meant to reassure those who already believe. Believing what is false can create cognitive dissonance, which people are eager to eliminate; propaganda is therefore often directed at those already sympathetic to the message in order to help overcome this discomfort. On the one hand, then, propaganda generally aims to construct the self as a noble, strong persona with which individuals in the domestic population can feel connected. At the same time, propaganda often attempts to rally the domestic public to action by creating fear, confusion, and hatred, portraying the antagonist as an abominable figure.[4] Typically, the Other is demonized or dehumanized,[5] with stereotyping and scapegoating common tactics in this regard. At its most extreme, propaganda is intended to overcome the reluctance to kill. In its modern usage, propaganda also tends to be characterized by some degree of institutionalization, mass distribution, and repetition of the message.[6]

Propagandists often conceal their purpose, and even their identity, in order to mislead the public. White propaganda, for instance, comes from a correctly identified source and is not intentionally deceptive. Black propaganda, by contrast, is purposefully deceptive, giving the false impression that it comes from a friendly source.[7] Finally, the term gray propaganda has been used to describe propaganda that falls somewhere in between.

Although the range of propaganda techniques is seemingly limitless, space permits only an abbreviated discussion.[8] One common technique is bandwagoning: appealing to people's desire to belong, especially to the winning side, rather than to the rightness of the position. Doublespeak involves language deliberately constructed to disguise or distort its actual meaning; examples might include downsizing, extraordinary rendition, or the coalition of the willing. These may take the form of euphemisms, which make something sound better than it is, such as the term collateral damage. Another strategy is the appeal to authority. For instance, the World War II-era radio series This is War! emphasized how FDR's leadership qualities resembled those of greats like George Washington and Abraham Lincoln.[9] At other times, testimonials may be effective. Propaganda is also often heavily laced with rationalization and oversimplification. On the latter point, glittering generalities are words that, while they may carry different positive meanings for different individuals, are linked to concepts highly valued by the group. When such words are invoked, they demand approval without thought, simply because an important concept is involved. For example, a person asked to do something in 'defense of democracy' is more likely to agree, because the request is linked to a concept they value. Propagandists sometimes use simple name-calling to draw a vague equivalence between a concept and a person, group, or idea. At other times, they may use "plain folks rhetoric" in order to convince the audience that they, and their ideas, are "of the people." Finally, propaganda often tries, at least implicitly, to gain the approval of respected and revered social institutions such as church or nation in order to transfer their authority and prestige to the propagandist's program.

Overall, many have pointed out that the most effective propaganda campaigns rely heavily on selective truth-telling, the confusion of means and ends, and the presentation of a simple, idyllic vision that glosses over uncomfortable realities.[10] Psychologists Pratkanis and Aronson recommend four strategies for a successful propaganda campaign.[11] First is pre-persuasion: the propagandist should attempt to create a climate in which the message is more likely to be believed. Second is the credibility of the source, who should be a likable or authoritative communicator. Third, the message should be focused on simple, achievable goals. Finally, the message should arouse the emotions of the recipient and direct them toward a specific, targeted response.

It is unclear whether technological developments are making propaganda efforts easier or harder. On the one hand, advances in communications technologies may be reducing government control over information.[12] Through the internet and satellite television, people need no longer rely solely on their governments for information. On the other hand, technology may make propaganda more effective. For example, it can make the experience of war more superficial and distort the lessons of prior conflicts.[13] In addition, the sheer volume of information on the internet can be overwhelming, making it difficult to determine whether a particular source is credible. What is more, there appears to be significant 'virtual Balkanization,' in which like-minded individuals form closed communities where other viewpoints are not sought out.

Whether for scholars or the average person, Jowett and O'Donnell offer a 10-point checklist for analyzing propaganda:[14]

  • The ideology and purpose of the propaganda campaign,
  • The context in which the campaign occurs (for example, history or the ideological and social milieu),
  • Identification of the propagandist,
  • The structure of the propaganda organization (for example, identifying the leadership, organizational goals, and the form of media utilized),
  • The target audience,
  • Media utilization techniques,
  • Special techniques to maximize effect (which include creating resonance with the audience, establishing the credibility of the source, using opinion leaders, using face-to-face contact, drawing upon group norms, using rewards and punishment, employing visual symbols of power, language usage, music usage, and arousing emotions),
  • Audience reaction to various techniques,
  • Counterpropaganda (if present),
  • Effects and evaluation.

Psychological Operations (PSYOPs)

PSYOPs are a military tactic that also involves the use of propaganda. Rather than building support amongst one's own citizenry, the goal is to demoralize and confuse one's opponent. Since World War II, most wars have seen the creation of radio stations that broadcast music and news meant to hurt the morale of the opposition. Dropping leaflets over enemy lines, and even amongst the opponent's civilian population, is also common. These techniques are designed to promote dissension and defections from enemy combat units as well as to embolden dissident groups within the country. PSYOPs can also provide cover and deception for one's own operations. Finally, PSYOPs may have the added benefit of boosting the morale of one's own troops as well as of resistance groups behind enemy lines.

Public Diplomacy

More generally, public diplomacy involves the attempt to influence foreign publics without the use of force. The now-defunct U.S. Information Agency defined public diplomacy as "promoting the national interest and the national security of the United States through understanding, informing, and influencing foreign publics and broadening dialogue between American citizens and institutions and their counterparts abroad."[15] The areas of public diplomacy used to influence foreign target audiences are media diplomacy, public information, international broadcasting, education and cultural programs, and political action. The idea of public diplomacy emerged from the Office of War Information, which existed during World War II. During the early part of the Cold War, a succession of offices within the U.S. Department of State had responsibility for the dissemination of information abroad, until an independent agency was created for the purpose during the Eisenhower Administration. That agency was abolished by President Carter and its functions folded into the newly created International Communication Agency (ICA) in 1978 (redesignated the U.S. Information Agency, or USIA, in 1982 during the Reagan Administration). In 1999, USIA was folded back into the State Department, while U.S. international broadcasting, including the Voice of America (VOA), was placed under the independent Broadcasting Board of Governors. Most recently, the White House established its own Office of Global Communications in 2001 to formulate and coordinate messages to foreign audiences. Other significant agencies include the International Broadcasting Bureau and the National Endowment for Democracy.

One observer has suggested a list of best practices in the conduct of public diplomacy, at least from the perspective of the United States.[16]

  • First, the primary goal is policy advocacy, in other words, to ensure that foreign publics understand US policies and motivations. As such, public diplomacy must be incorporated into foreign policy and it should involve coordination amongst a number of government agencies.
  • Second, public diplomacy must be rooted in American culture and values.
  • Third, the messages conveyed need to be consistent, truthful, and credible.
  • Fourth, it is important to tailor messages to a particular audience.
  • Fifth, a strategy needs to reach not only opinion leaders, but also the mass public through national and global media outlets.
  • Sixth, a number of nonstate actors, such as MNCs, the expatriate community, and humanitarian organizations, can serve as partners to help deliver the message accurately.
  • Finally, the US needs to recognize public diplomacy is a dialogue and to also listen to sentiment in other countries.

The Internet has become a major tool for information dissemination and for interactive communication between the US government and its target populations, as well as for developing links with civil society actors around the world. Arquilla and Ronfeldt have described this strategy as 'noopolitik,' as opposed to state-centered realpolitik: the former involves the use of soft power to shape ideas, values, norms, laws, and ethics.[17]

Cultural and educational programs, such as the Fulbright program, seek to provide a deeper understanding of a country's society, values, and institutions, and of the motives behind the positions it takes. While funding of arts and cultural exchange was a prominent part of the ideological battle between the US and USSR, support has declined since the end of the Cold War.[18]

Propaganda and the War on Terror

The United States' War on Terror is only the most recent iteration of the use of propaganda in conflict. Since 9/11, the Bush administration has employed a fundamentalist discourse dominated by the binaries of good versus evil and security versus peril, appealed to a missionary obligation to spread freedom, and brooked no dissent.[19] This has had some resonance with segments of the American population. In this era of globalization, however, bad news from Iraq has obstructed the message, which has also been received very differently abroad. The US military has also utilized the practice of embedding journalists, which the British first learned during the Falklands War could be an effective government strategy because it creates sympathy for the troops on the part of the journalist.[20]

Despite gaffes such as referring to the War on Terror as a crusade, the administration quickly recognized the importance of shoring up its image around the world, and in the Middle East in particular. Within a month of 9/11, Charlotte Beers, a pioneer of branding strategies who had previously led Ogilvy & Mather and J. Walter Thompson, two of the largest advertising firms in the world, was named Under Secretary for Public Diplomacy and Public Affairs. (Beers was later replaced by Karen Hughes.) Upon Beers' appointment, Secretary of State Colin Powell described her role in these terms: "We are selling a product. There is nothing wrong with getting somebody who knows how to sell something. We need someone who can rebrand American policy."[21] The administration did just that, undertaking a "brand America" campaign in the Middle East. Amongst Beers' initiatives were a glossy brochure depicting the carnage of 9/11 and the "Shared Values" campaign, a series of short videos in which successful Muslims described their lives in the US. The videos portrayed American culture as egalitarian, cast the US as a wronged victim, and sought to enhance their authenticity by showing Muslims doing 'traditional' things. The US made a particularly concerted effort to reach young Arabs, and many argue that public diplomacy can be an important tool for offering desperate youth, particularly in the Arab world, a compelling ideological alternative to extremism.[22] To date, however, the American propaganda campaign in Iraq has failed on all four of Pratkanis and Aronson's counts.[23] To be effective, some argue, policymakers and politicians must better recognize that public diplomacy is a long-term effort. In addition, some have called for a strengthened agency with independent reporting, an increased budget, and greater training,[24] as well as for better organization and a clearer articulation of an overarching strategy in the conduct of public diplomacy.[25]

Political Communication

Propaganda itself is a subcategory of political communication, which encompasses a wide range of communicative behaviors with political ends. One element is the conduct of an effective election campaign: disseminating the candidate's message and countering the messages of opponents. Governments, too, employ various techniques, including, as we have seen, propaganda, to build support for policies and stifle dissent. Chomsky and Herman's propaganda model of the media[26] "depicts the media system as having a series of five successive filters through which the 'raw material of news' must pass, leaving a 'cleansed residue' of what 'news is fit to print,' marginaliz[ing] dissent, and allow[ing] the government and dominant private interests to get their messages across to the public." In brief paraphrase, these filters are (a) a focus on profitability by an increasingly concentrated industry that has close ties to the government and is in a position, by sheer volume, to overwhelm dissenting media voices; (b) the dependence of these media organizations on funding through advertising, leading them to favor content likely to appeal to the affluent and to make concessions to commercial sponsors; (c) the dependence of journalists who work for the media on information from sources that collectively constitute a powerful and prestigious establishment; (d) commercial interests that make the media vulnerable to "flak" from groups and institutions with the power to generate criticism and protest, to which the media respond with caution; and, finally, (e) "anticommunism" (or some ideological equivalent) that those who produce content have internalized, enjoining them to frame the news in a dichotomous fashion, applying one standard to those on "our" side and a quite different one to "enemies." Most recently, the "war against terrorism" has served as a non-ideological substitute. The propaganda model assigns the media system just one major function, to which everything else is subordinate: the "manufacture of consent" for government policies that advance the goals of corporations and preserve the capitalist system.[27]

Some argue that evolving communications technologies and advertising and marketing techniques are damaging democratic practice by replacing thoughtful discussion with simplistic soundbites and manipulative messages.[28] Campaigns play on our deepest fears and most irrational hopes, with the result that we have a skewed view of the world. That said, media effects on politics are not uniform around the world; rather, they are the product of the types of media technologies, the structure of the media market, the legal and regulatory framework, the nature of political institutions, and the characteristics of individual citizens.[29] Others argue, by contrast, that "blaming the messenger" overlooks deep-rooted flaws in the systems of representative democracy that are responsible for the sorry condition of political discussion.[30] There is also much discussion about whether the internet is good for American democracy.[31] With respect to often delicate peace processes, the role of the media in the Rwandan genocide has given the news media a tarnished reputation; in some instances, however, the news media has played a constructive role in sustaining peace efforts.[32]

[1] Lippmann, W. A Preface to Morals . New York: Macmillan, 1929. 281.

[2] Guilday, Peter. "The Sacred Congregation De Propaganda Fide." Catholic Historical Review 6. 480. See also: Jowett, Garth S. and Victoria O'Donnell. Propaganda and Persuasion . 3rd ed. Thousand Oaks, CA: Sage Publications, 1999. 72-73.

[3] Combs, J.E. and D. Nimmo. The New Propaganda: The Dictatorship of Palaver in Contemporary Politics . New York: Longman, 1993.

[4] Kimble, James J. "Whither Propaganda? Agonism and 'The Engineering of Consent'." Quarterly Journal of Speech 91.2 (May 2005).

[5] Link, Jurgen. "Fanatics, Fundamentalists, Lunatics, and Drug Traffickers: The New Southern Enemy Image." Cultural Critique 19 (Fall 1991): 33-53.

[6] Kimble, 203.

[7] Jowett, Garth S. and Victoria O'Donnell. Propaganda and Persuasion . 4th ed. Thousand Oaks, CA: Sage Publications, 2006.

[8] For further discussion, see: Center for Media and Democracy. "Propaganda Techniques." < http://www.sourcewatch.org/index.php?title=Propaganda-techniques> .

[9] Horten, Gerd. Radio Goes to War: The Cultural Politics of Propaganda During World War II . Berkeley, CA: University of California Press, 2002.

[10] Cunningham, S.B. The Idea of Propaganda: A Reconstruction . Westport, CT: Praeger, 2002.; Ellul, J. "The Ethics of Propaganda: Propaganda, Innocence and Amorality." Communication 6 (1981): 159-175.; Plaisance, Patrick Lee. 2005. "The Propaganda War on Terrorism: An Analysis of the United States' 'Shared Values' Public-Diplomacy Campaign After September 11, 2001." Journal of Mass Media Ethics 20.4 (2005): 250-268.

[11] Pratkanis, Anthony and Elliot Aronson. Age of Propaganda: The Everyday Use and Abuse of Persuasion . Owl Books, 2001.

[12] Deibert, R. "International Plug 'n' Play: Citizen Activism, the Internet and Global Public Policy." International Studies Perspectives 1.3 (2000): 255-272.; Rothkopf, D. "The Disinformation Age." Foreign Policy 114 (1999): 82-96.; Volkmer, I. News in the Global Sphere . Luton: University of Luton Press, 1999.

[13] Hoskins, Andrew. Televising War: From Vietnam to Iraq . London and New York: Continuum, 2004.

[14] Jowett and O'Donnell (2006), 270.

[15] U.S. Information Agency Alumni Association. "What is Public Diplomacy?" 1 Sep 2002. 2 Apr 2003. < http://www.publicdiplomacy.org/1.htm> .

[16] Ross, Christopher. "Pillars of Public Diplomacy." Harvard Review Aug 2003. Available at: < http://www.iwar.org.uk/news-archive/2003/08-21-3.htm> .

[17] Arquilla, J. and D. Ronfeldt. The Emergence of Noopolitik: Toward an American Information Strategy . Santa Monica, CA: Rand, 1999. < http://www.rand.org/publications/MR/MR 1033/ MR1033.pdf/MR1033.chap3.pdf>.

[18] Smith, Pamela. "What Is Public Diplomacy?" Address before the Mediterranean Academy of Diplomacy, Malta, 2000. < http://diplo.diplomacy.edu/Books/mdiplomacy-book/smith/p.h.%20smith.htm> .

[19] Domke, David. God Willing? Political Fundamentalism In The White House, The War On Terror And The Echoing Press . London: Pluto Press, 2004.

[20] Knightley, Philip. The First Casualty: The War Correspondent as Hero, Propagandist and Myth-maker from the Crimea to Iraq . London: André Deutsch, 2003.; Miller, David (ed.) Tell Me Lies: Propaganda and Media Distortion in the Attack on Iraq . London and Sterling, VA: Pluto Press, 2004.

[21] Klein, N. "The Problem is the U.S. Product." Seattle Post-Intelligencer 28 Jan 2003: B5.

[22] Finn, Helena K. "The Case for Cultural Diplomacy: Engaging Foreign Audiences." Foreign Affairs 82.6 (Nov-Dec 2003): 15.

[23] McKay, Floyd. "Propaganda: America's Psychological Warriors." The Seattle Times , 19 Feb 2006. < http://www.commondreams.org/views06/0219-24.htm> .

[24] Johnson, Stephen and Helle Dale. "How to Reinvigorate U.S. Public Diplomacy." The Heritage Foundation Backgrounder 1645 (23 Apr 2003). < http://www.heritage.org/Research/NationalSecurity/loader.cfm?url=/common... .

[25] GAO Report on Public Diplomacy. 2003. < http://www.gao.gov/new.items/d03951.pdf> .

[26] Herman, Edward S. and Noam Chomsky. Manufacturing Consent: The Political Economy of the Mass Media . New York: Pantheon, 2002. Excerpts of a previous edition available at < http://www.thirdworldtraveler.com/Herman%20/Manufac-Consent-Prop-Model.h... .

[27] Lang, Kurt and Gladys Engel Lang. "Noam Chomsky and the Manufacture of Consent for American Foreign Policy." Political Communication 21.93 (2004): 94.

[28] Bennett, W. Lance and Robert Entman (eds.) 2000. Mediated Politics: Communication in the Future of Democracy . Cambridge University Press, 2000.; Pratkanis, Anthony and Elliot Aronson. Age of Propaganda: The Everyday Use and Abuse of Persuasion . Owl Books, 2001.

[29] Gunther, Richard and Anthony Mughan (eds.) Democracy and the Media . Cambridge University Press, 2000.; Hallin, Daniel C. and Paolo Mancini. Comparing Media Systems: Three Models of Media and Politics . Cambridge University Press, 2004.

[30] Norris, Pippa. A Virtuous Circle: Political Communications in Post-Industrial Democracies . Cambridge University Press, 2000.

[31] Bimber, Bruce. Information and American Democracy: Technology in the Evolution of Political Power . Cambridge University Press, 2003.

[32] Wolfsfeld, Gadi. Media and the Path to Peace . Cambridge University Press, 2004.


The Effects of Participatory Propaganda: From Socialization to Internalization of Conflicts

A look at how propaganda has been rewired for the digital age and how this new, “participatory propaganda” mediates conflict, manipulates relationships and creates isolation, both online and offline.

In this essay, Gregory Asmolov, a Leverhulme Early Career Fellow at King’s College London and a scholar noted for his work understanding the Russian Internet (Runet), examines a new set of propaganda strategies emerging on social networks in Ukraine and Russia. He takes us on a conceptual journey from understanding how traditional propaganda has been “rewired” for the digital age to examining its methodologies and impact today. This new phenomenon of "participatory propaganda" seeks not only to persuade users to interpret events through a particular lens, but also to manipulate relationships, dividing friends, breaking alliances and leaving individuals isolated and tractable, online and offline.

— Ethan Zuckerman, Editor

Propaganda is no longer just a tool for changing your opinion. Now, in our digitally mediated world, propaganda is a pathway to instantaneous participation in political conflicts from the safety and comfort of your living room chair. It is also, ironically, now a tool for instantaneously breaking connections between friends and relatives whose opinions differ. Participatory propaganda helps to socialize conflicts and make them part of everyday life. This increasing scope of engagement can also lead to an internalization of conflict, which means that instead of encouraging you to filter alternative sources of information, participatory propaganda aims to reshape your cognitive filters as well as the relationship between you and your environment. 1

Introduction: Back in the USSR

It is October of 1986. I am one of 25 children in the pre-school group of a kindergarten in the Leninsky district of Moscow. It is the “quiet hour” in the middle of the day when children are supposed to nap, but I cannot sleep. I am very worried. Every evening, after the “Spokoynoy Nochi, Malyshi” (Goodnight, Kids) children’s show on television at 8:45 pm, I watch the evening news program “Vremya” (Time). Last evening I heard that our leader, Mikhail Gorbachev, is going to meet the American leader, Ronald Reagan, in Reykjavík. I cannot understand why Gorbachev is going there. I am sure the Americans are going to kill him. I am also sleepless because I am afraid of nuclear war. At this time “Star Wars” for me is not a movie, but a plan for American military aggression against the people of the Soviet Union.

So my parents offered me a new game. They gave me an old radio and taught me how to search for short-wave radio stations. Unlike our TV, which had only six buttons for six channels, the radio offered a range of voices in different languages. The purpose of the game was to scan the short waves and find Russian-speaking stations broadcasting from beyond the borders of the USSR; the so-called “Vrazheskie golosa” (Enemy Voices). It was quite tricky, since the tiniest movements of my fingers would sweep past these stations, and their wavelengths sometimes changed in order to avoid being jammed by the Soviet government.

I learned very quickly how to recognize Radio Freedom, the Voice of Israel, the Voice of America and the BBC. (Who could imagine that 30 years later I would have an office in Bush House, where the BBC Russian Service was broadcasting from at that time, and is now part of King’s College London!) I really enjoyed my parents’ new game. For the first time in my life, I was actively involved in searching for news. I also started to sleep better during the “quiet hour” at kindergarten. Through that radio game I learned that the same events can be described in very different ways. Although I wasn’t able to understand many things, it highlighted the polyphony of voices and framings. I was lucky to have this experience just then, in 1986. Only a year later, “glasnost,” a new policy of media openness, began to influence Soviet TV, and the “enemy voices” lost their unique value as a window onto an alternative reality.

The image of me as a child sitting in my bedroom in front of the radio and searching for “enemy voices” comes back as I think about how the Internet has changed propaganda. In 1986 that old short-wave radio was a physical mediator between me as a user and the global environment. It brought new meanings directly into my bedroom. I didn’t know that what I heard was called “anti-Soviet propaganda.” Similarly, I hadn’t known that the news I watched on TV was propaganda, either. What really mattered was the range of voices brought to me by these various tools of mediation.

Today, with the Internet, it is much easier to find alternative sources. The quality of information is often good, and there is no need for tiny movements of the fingers, although instead of the Soviet-style “glushilki” (jammers), we have new technologies like packet filtering and state-sponsored censorship. These days, however, it seems that even a huge diversity of voices still does not help to challenge propaganda or increase critical thinking. One could suggest that, in order to address this puzzle, we need to focus not on the content of propaganda, but on its delivery, and to ask how the new technological tools used for the proliferation of propaganda change the relationship between users and their environment.

The Affordances of “Rewired Propaganda”: A Mediational Perspective

The comparison between a television set picking up six broadcast channels and a short-wave radio picking up hundreds highlights the difference between closed and open artifacts mediating the relationship between subjects and their environment. “Closed” artifacts transfer only limited streams of information, both in terms of the number of channels and of the scope of sources that can be covered by those channels. “Open” artifacts offer a window onto a limitless world of sources and an unrestricted number of channels. Propaganda has always been more at home in an isolated environment, where it need not compete with alternative sources and where it has a monopoly over shaping the perception of the audience. Counter-propaganda, in its turn, has tried to break this monopoly and find a way through the “curtain” of isolation, either by distributing printed matter (for example, dropping leaflets from the air) or by using radio, whose signal waves are notoriously unimpressed by national borders.

The emergence of the Internet, however, challenged the capacity of state actors to isolate any environment from external information. Some countries, such as North Korea and Turkmenistan, disconnected their local Internet from the global infrastructure in order to maintain that isolation. Others introduced advanced modes of filtering, such as what we now know as the “Great Firewall of China.” Russia chose a different path. From the outset, the Russian Internet, also known as Runet, developed as an independent space. Its development has been driven by imaginaries of alternative cultural, social, and political environments beyond the control of traditional political institutions (Asmolov & Kolozaridi, 2017). “The online sphere challenges how the Russian state has traditionally dominated the information heights via television” (Oates, 2016). This, however, sparked a new type of propaganda that would be effective despite the lack of state control over the information environment. This new type of propaganda, described by Sarah Oates as “rewired propaganda,” seeks to neutralize the Internet’s capacity to undermine authority and challenge the narratives of the state. In Oates’s formulation, “a commitment to disinformation and manipulation, when coupled with the affordances of the new digital age, gives particular advantages to a repressive regime that can pro-actively shape the media narrative.”

Rewired propaganda uses some traditional tools of Internet control, like filtering and censorship. But its novelty lies in its preference for more innovative models of propaganda, including sophisticated manipulation of information and computational propaganda (Woolley & Howard, 2016; Sanovich, 2017). Computational propaganda, in particular, relies on affordances that allow fake identities to be created by mutually reinforcing human and non-human agents, including disinformation agents and bots. For example, a believable inauthentic voice is created by an individual, then amplified by bots. These actors not only distribute content but also increase the visibility of information. They may also change the structure of discourse and increase its emotional sentiment. Human image synthesis technologies, which rely on AI and machine learning, provide the means to fabricate evidence, including “deep fakes,” where the line between what appears to be genuine and what is not has been eliminated (Edwards & Livingston, 2018). Less sophisticated tools for content editing allow these actors to create “shallow fakes” in which an image is recontextualized, or simply misrepresented (Johnson, 2019).
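The amplification mechanism described above can be sketched as a toy simulation. All numbers and names here are invented for illustration, not drawn from any real campaign: the point is only that because ranking algorithms treat share counts as a popularity signal, a small number of bots can make an inauthentic voice look organically popular.

```python
import random

random.seed(42)  # fixed seed so the toy run is reproducible

def simulate_amplification(n_users=1000, n_bots=50, organic_share_prob=0.02):
    """Toy model of bot amplification.

    Humans share a post with a small probability; bots share it
    deterministically, inflating the popularity signal that
    feed-ranking algorithms rely on.
    """
    organic = sum(1 for _ in range(n_users)
                  if random.random() < organic_share_prob)
    amplified = organic + n_bots  # every bot reshares the same post
    return organic, amplified

organic, amplified = simulate_amplification()
```

In this sketch a post shared by roughly two percent of a human audience more than triples its apparent popularity once a modest bot network reshares it, which is the sense in which bots "increase the visibility of information" without generating any of it.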

Defining Propaganda

Before we go deeper into our discussion of propaganda, we must first define what it is. One of the classical definitions of propaganda is “the management of collective attitudes by the manipulation of significant symbols” (Lasswell, 1927). A more detailed definition states that “Propaganda is the expression of opinions or actions carried out deliberately by individuals or groups with a view to influencing the opinions or actions of other individuals or groups for predetermined ends and through psychological manipulations” (the Institute for Propaganda Analysis, 1937/1972). Yet another describes propaganda as “Communication designed to manipulate a target population by affecting its beliefs, attitudes, or preferences in order to obtain behaviour compliant with political goals of the propagandist” (Benkler et al., 2018). One may also argue that propaganda often incorporates the voice of the state and is driven by the interests of institutional hegemonic actors.

Three elements are central to these definitions: Propaganda is intentional; it relies on manipulation, specifically through the use of misleading information; and its purpose is to support political goals by drawing out and managing behavior. The challenge, however, is to define what elements within propagandist messaging are misleading or manipulative. Addressing these questions is particularly challenging in the context of conflicts. What is considered propaganda by one side of the conflict would be treated by the other side as the legitimate “presentation of a case in such a way that others may be influenced” (Stuart, 1920) and dissemination of information for a justified cause. It is considerably challenging to “coherently distinguish ‘propaganda’ from a variety of other terms that refer to communication to a population that has a similar desired outcome: persuasion, marketing, public relations, and education” (Benkler et al., 2018).

In order to address some of these challenges, I focus my attention in this essay on one particular aspect of propaganda: its role in the mobilization of individuals and groups. Gustave Le Bon in 1903 was among the first to consider propaganda as a way to shape the opinions and beliefs of crowds in order to move those crowds towards specific goals. By 1965 Jacques Ellul was also focused on the link between propaganda and action, while considering propaganda “A set of methods employed by an organized group that wants to bring about the active or passive participation in its actions of a mass of individuals….” More specific models for the interrelationship between propaganda and desired action had already been mapped by George Bruntz (1938). For example, leaflets dropped from the air onto enemy soldiers can be viewed as a “propaganda of despair” intended to “break down the morale of the enemy,” and at the same time as a “propaganda of hope” intended to present to the enemy army and civilians a picture of a promised land they can enter if they will only lay down their arms.

Understanding propaganda as a way to drive a specific mode of action among a target audience highlights the dual role of propaganda. On the one hand, it seeks to shape a particular world view and offer a specific interpretation of something happening in the environment surrounding the subject. On the other hand, by relying on the symbolic dissemination of meanings, it also seeks to support or provoke an action by this subject that will impact and potentially change the environment in a specific way. This duality can be captured and conceptualized if we approach propaganda from a mediational perspective (Kaptelinin, 2014), in other words, as something that shapes the relationship between a subject and their environment. Relying on that approach, I offer a definition of propaganda that relies on a notion of mediation:

Propaganda is an intentional effort to shape the relationship between an individual target of information (the subject) and their environment (the object) by relying on the dissemination of symbolic meaning in order to support a particular course of the subject’s activity in relation to specific objects of activity.

In a nutshell, digital propaganda changes the relationship between users (subjects) and conflict (objects of users’ activity in their environment).

The relationship between subject and object has two directions. The first direction, from the world towards the subject, relies on the mediation of meaning. The second direction, from the subject towards the world, relies on the mediation of activity. Propaganda aims either to support or change an existing relationship to an object, or to construct a new object that requires the subject’s activity. The intentional construction of subject-object relationships may rely on manipulative psychological techniques, as well as on the dissemination of disinformation. The mediational perspective suggests that the discussion of digital affordances should focus on how new digital means of production and the proliferation of propaganda change the relationship between a subject and their environment. The relationship between digital users and conflicts is an example of the subject-object relationship. In that case, the mediational perspective explores not only how propaganda offers new frames and interpretations of conflict-related events, but also the range of activities offered to users relying on digital tools in conflict situations.

I’ll note, however, that propaganda does not necessarily aim to construct an active relationship between subject and object. As pointed out by Ellul, one mode of activity is passivity, which is sometimes the mode that the propagandist desires. This often happens in cases where propaganda seeks to induce disorientation, a situation “in which the target population simply loses the ability to tell truth from falsehood or where to go for help in distinguishing between the two” (Benkler et al., 2018).

To sum up, propaganda is not only a way to change a person’s perception of the environment via symbolic means, but also a way to change the behavior of a target audience in order to change the environment. In this sense, mediation always acts in two directions: One, it aims to change the perceptions of the recipient/ target audience (a group of subjects). Two, it aims to shape the activity of the target audience in relation to the environment (or lack of action, should the activity need to be neutralized). In the past, these two processes were distinguishable from each other. First, a subject received a message via an artifact, either in public spaces (e.g. posters, cinema, newsstands or loudspeakers) or in private spaces (TV or radio receivers). The subject then chose to act in accordance with the message they received.

Digital affordances have now changed the structure of relationships between messaging that tinkers with the subject’s perception of the environment and the subject’s activity in relation to that environment. Digital platforms allow Internet users to not only consume information, but to also choose from a broad range of potential follow-on activities in relation to the objects whose perception is shaped by propaganda. In order to understand the effects of “rewired propaganda,” we need to look specifically at how the design of our digital information environment allows for new kinds of links between how subjects receive information and their activity after they receive it.

The Participatory Affordances of Propaganda

Over the last century, propaganda has gradually moved from open squares and public places to our homes. This process can be associated with the domestication of technologies, where the device that mediates meanings, particularly the TV, has continuously occupied domestic spaces (Silverstone, 1994). The boundaries of spaces in which we consume media have expanded further with the rise of mobile technologies including laptops and handheld devices. Maren Hartmann (2013) describes this trend as a shift from domestication to “mediated mobilism.”

As a consequence, propaganda infiltrates our most intimate spaces, where users interact with their laptops and mobile devices. The location of technological interaction is not simply the household, but the bed or sofa — spaces commonly associated with relaxation and entertainment. Propaganda moves from the living room to the bedroom, follows people as they travel to work on crowded public transport, and remains with them during office hours. We can wake up and fall asleep with propaganda in our hands. It finds us at the university, in the bathroom or on the beach.

Propaganda is also reshaped by the design of the spaces in which content is encountered and shared. Traditional media relied on physical artifacts such as newspapers or TV, so content consumption was mostly a solitary activity rather than a social one. Even when news consumption happened in a public place, for example, people listening to the radio outside in the square or friends or family watching TV news together, the media space and the social interaction space were separate. In contrast, the interactive nature of digital media removes the gap between the space where content is generated and distributed and the space where content is consumed and discussed. Social networking platforms combine news consumption with social interaction, turning social interaction into a mechanism of content proliferation and selective amplification (Zuckerman, 2018).

The integration of content generation/sharing and content discussion creates an immersive effect whereby users are unable to separate content consumption (and its impact on their lives) from their personal communication. In online environments, the consumption of propaganda is deeply embedded in the structure of social relations, which allows the propaganda to further infiltrate our everyday lives. More important are the ways social media and the spread of online content create opportunities for immediate action: spreading propaganda further, or taking other actions directly suggested by the propaganda.

Propaganda has often been linked to a desired mode of action, such as surrender or contributing one’s resources to a specific cause. Historically, however, the means of propaganda distribution and the means of action were separate and distinct. The target (or subject) of propaganda was first exposed to a message (via leaflet, poster, newspaper article, or broadcast message), which they subsequently acted upon. Due to the participatory nature of digital technologies, propaganda distribution, consumption, and participation often share the same platform and are mediated by the same digital devices (such as mobile phones or laptops). The person exposed to propaganda is also offered a selection of actions to carry out (often instantly) in the same virtual environment.

The consequences of these new participatory affordances are particularly visible in the context of conflicts. In his book iSpy: Surveillance and Power in the Interactive Era, Mark Andrejevic points out that “in a disturbing twist to the interactive promise of the Internet, once-passive spectators are urged to become active participants.” In this way, Andrejevic says, Internet users become citizen-soldiers when “we are invited to participate in the war on terrorism from the privacy of our homes and from our offices, or wherever we might happen to be.” David Patrikarakos analyzes a number of cases of digitally mediated citizen involvement in war and comes to the same conclusion: “In the world of Homo Digitalis, anyone with an Internet connection can become an actor in a war” (Patrikarakos, 2017).

Social Media and Propaganda

At least three novel aspects in the relationship between social media and propaganda are worth considering:

1. Digitally mediated participation in the creation and proliferation of propaganda, and in various online content-related activities, including forms of engagement with content (from commenting to complaining).

2. Digitally mediated participation in online and offline action triggered by propaganda, going beyond content-related activities and relying on various forms of crowdsourcing.

3. The action of disconnection: using digital means to effect the immediate cutting of social ties, including unfollowing, unfriending, or blocking.

The participatory nature of propaganda, particularly where propaganda is linked to a call to take part in propaganda efforts, has been well-documented. “Peer-to-peer propaganda” is a situation where “Ordinary people experience the propaganda posts as something shared by their own trusted friends, perhaps with comments or angry reactions, shaping their own opinions and assumptions” (Haigh et al., 2017). The same researchers argue that “States can rely on citizens’ do-it-yourself disinformation campaigns to maintain the status quo.” Mejias and Vokuev (2017) point out that “…social media can also give ordinary citizens the power to generate false and inaccurate information,” while “propaganda is co-produced by regimes and citizens.” Finally, Khaldarova and Pantti (2016) explore participatory verification of data, where an online initiative such as the StopFake platform “mobilizes ordinary Internet users to engage in detecting and revealing fabricated stories and images on the Ukraine crisis,” and describe this as “Crowdsourced Information Warfare.”

It is important to differentiate between open and transparent calls to participate in the generation, proliferation, and verification of content in order to support your state, and various forms of clandestine or camouflaged online manipulation designed to trigger user participation. An illustration of an open call can be seen in the case of the Ukrainian I-army project launched by the Ukrainian Ministry of Information:

“In one year, we created a powerful army that defends us in the Donbas area. Now, it’s time to resist Russian invaders on the information front. Every Ukrainian who has access to the Internet can contribute to the struggle. Every message is a bullet in the enemy’s mind.”

A similar type of initiative could be seen on the Russian side. A website, “Internet Militia,” called on Internet users to take part in the defense of the Motherland:

“Even in five minutes you can do a lot. ‘Internet Militia’ is a news feed where links are accompanied by suggestions for direct action. For example, follow the link and leave a comment. What could be easier? Today, the participation of everyone who loves his Motherland is important.”

In many cases, however, user participation is driven not by open, direct calls, but by various forms of psychological manipulation. We can see forms of propaganda that support user engagement via the sharing of emotional, imaginary, and so-called “fake news” content, and through the activity of state-sponsored trolls and computational propaganda. We can also differentiate between volunteer and paid forms of user participation. These paid forms of participation (as in the case of the Chinese “50 Cent Party”) limit the scope of participants and usually operate in secret. In other cases, user-generated propaganda transforms from crowd participation to targeted engagement of selected users who develop specific skills, as in the case of the Russian troll factory in Ol’gino, where “More than 1,000 paid bloggers and commenters reportedly worked only in a single building at Savushkina Street in 2015.” This illustrates the shift from crowdsourcing to outsourcing of propaganda.

Crowdsourcing Conflict

The notion of crowdsourcing is particularly useful when analyzing participatory propaganda, as mobile devices are not only good tools for recirculating content, but also for mobilizing resources. When combined with crowdsourcing, propaganda offers a double effect. It not only builds awareness of the propaganda messaging, but also allows users to respond to propaganda issues at the same time and through the same channel. The range of user resources that can be mobilized by relying on digital mediation of propaganda is astounding and includes: sensor resources (for data collection); analytical resources (for data classification); intellectual resources (to build knowledge and skills); social resources (to engage more people around a specific goal); financial resources (also known as crowdfunding), and physical resources. Crowdsourcing also allows us to highlight how propaganda creates an emotional condition in the user, which in turn supports the mobilization of resources and reshapes the user’s priorities for future resource allocation.

Content-related activities, such as sharing, liking, commenting and complaining, can also be viewed as a form of crowdsourcing since the generation and proliferation of content also relies on the mobilization of user activity. Crowdsourcing as a concept is particularly helpful in showing how propaganda-driven digitally mediated activity goes beyond the usual content-related actions that take place online. The Russia-Georgia and Russia-Ukraine conflicts illustrate the range of potential activities in this context (Deibert et al., 2012; Hunter, 2018). This includes data-gathering for intelligence purposes, diverse forms of open-source intelligence analysis (OSINT), various forms of hacktivism, logistical support for different sides of a conflict, including the purchasing of military equipment through crowdfunding, and various forms of offline volunteering.

Some forms of participation are afforded by the increasing role of big data. For example, modern conflicts take place in an environment where all sides of the conflict, as well as the local population in the areas of conflict, generate conflict-related data. These data create new opportunities for gathering valuable intelligence, both for informational as well as ground warfare. In that way, users have an opportunity to participate in data generation, collection, and analysis. Some users develop skills for open-source intelligence (OSINT) and create online data analytics communities. Examples include groups like the Ukrainian Inform-Napalm, the Russian Conflict Intelligence Team (CIT) and UK-based Bellingcat (Toler, 2018). Members of these communities also teach others how to analyze conflict-related data. These community groups played a major role in confirming the presence of Russian soldiers in Ukraine, despite denials by Russian leaders, exposing the scale of casualties among Russian soldiers, as well as investigating the downing of Malaysia Airlines Flight 17.

Some conflict-related data are not available in open sources, but are still obtained by hackers on both sides of the conflict. Various forms of hacker activities include accessing restricted data or attacking websites that are considered enemy targets. Most aspects of hacktivism require some degree of advanced skill, though a broad range of Internet users can carry out hacking-related tasks using standard computing resources and tools that simplify participation. Members of “the crowd” successfully helped analyze hacker-obtained email and other types of internal communication from the rival side of the conflict. That analysis fed into propaganda and counter-propaganda efforts by both sides of the conflict, while also providing valuable intelligence.

Various crowdfunding initiatives sprang up on both sides of the conflict, and relied on social networks and blogs as well as dedicated websites. These crowdfunding efforts supported both traditional military units (particularly on the Ukrainian side) as well as volunteer units, with most of the funds collected being used to purchase military equipment and ammunition. Other crowdfunding efforts enabled offline engagement of Internet users. For example, by using the funds to purchase drones, some Ukrainian users were able to self-organize and establish volunteer groups for air reconnaissance (drone-based surveillance) in order to gather real-time intelligence.

Digital platforms also played a major role in the engagement and coordination of various types of warfare-related offline activities. A variety of Ukrainian groups relied on social networks, messengers, and crowdsourcing platforms to coordinate logistical support for volunteer battalions and military units. On the Russian side, dedicated Vkontakte groups as well as the website Dobrovolec.org (volunteers) helped coordinate opportunities for volunteers to join pro-Russian paramilitary units in eastern Ukraine. And social media on both sides of the conflict allowed users to provide humanitarian assistance to people displaced by the conflict.

These examples of digitally mediated user resource mobilization illustrate the increasing scope of users’ participation in conflict. These forms of participation were shaped by the perception of the conflict as it was communicated via digital media on both the Russian and Ukrainian sides. I’ll also note that the scope of participation on the Ukrainian side was broader due to a shared understanding that the state was under threat of Russian aggression, and because of the limited capabilities of the Ukrainian traditional military to provide an adequate response during the initial phases of the conflict. To some extent, Ukrainian users formed a digitally mediated ecosystem of participation where various forms of conflict-related activity supported one another.

The Ukrainian case demonstrates that digital platforms were effective in supporting users’ participation in conflict, not only due to the connection between the calls to action and the affordance of participation, but also because digital networks exposed the inability of traditional institutions to offer an adequate response to such an external threat. Therefore, one may argue that users’ participation was driven not only by the state’s propaganda but also by narratives related to the absence of the state. One may also argue that propaganda, as a strategy to shape the relationship between people and conflict, aims not only to support people’s engagement, but also to control the scope of participation. On the Russian side of the conflict, the scope of users’ participation was mostly limited to online content-related activities (such as commenting, liking, sharing, etc.) and crowdfunding, while on the Ukrainian side, the scope of participation was substantially broader and went beyond the state’s control.

While participatory propaganda and crowdsourced participation leverage the non-geographic nature of digital content to place production and action in the same channel — a channel that pervades all physical and social spaces in human life — they are not the truly disruptive faces of this phenomenon. More disturbing is propaganda that seeks disconnection. Bruntz (1938) argued that one type of propaganda is “particularist propaganda,” which seeks to divide the members of a target audience. Christopher Wylie, a whistleblower who revealed information about Cambridge Analytica’s operations, points out that disconnection is one of the main elements of the Breitbart doctrine that was shaped by Steve Bannon. Wylie says, “If you want to fundamentally change society, you first have to break it. And it’s only when you break it is when you can remold the pieces into your vision of a new society” (source: the documentary “The Great Hack”).

Digital Disconnection

Disconnection shapes the boundaries of social networks and consequently their social structure. It is easy to forget that before the digital age, disconnection from a friend required either face-to-face action, such as a refusal to shake hands, or time-consuming mediated action such as sending a letter. Online social networking sites (SNSs) offer not only easier ways to make friends, but also easier ways to unmake them. The affordance of disconnection depends on the particular design of a social networking site. For example, on Twitter a user can be unfollowed, muted, blocked and/or reported. On Facebook, one of the most common acts of disconnection is unfriending.

It is very easy to cut social ties online, as most of us know by now. And, like other types of digitally mediated activities, the disconnection takes place in the same domain in which the messages are distributed. Because of this, when political messages, including propaganda, are pushed out, they can be followed by an immediate act of disconnection, particularly since other users take an active role in the generation and proliferation of the content. So propaganda can not only influence users’ perception of a situation and trigger activity around it, but also shape how we perceive other users within the situation. When we receive propaganda via social networks, we are forced to decide whether the sender should remain part of our social network.

Facebook’s design offers a fruitful environment for disconnection since it enables the “sharing [of] the same conversations with highly different audiences” (Schwarz and Shani, 2016). And because people are exposed to the political opinions of their Facebook friends, as well as other bits of information they may not have been privy to otherwise, propaganda becomes an effective tool for disconnection and polarization. Nicholas John and Shira Dvir-Gvirsman (2015) argue that Facebook unfriending can be considered “a mechanism of disconnectivity that contributes to the formation of homogeneous networks.” The constant production of categories used to divide social groups into “us” and “them,” as well as disconnection between members of these groups, can be viewed as a long-term impact of propaganda. That is, the impact of messages can be seen in changes to social structure and goes beyond the specific context of the situation that triggers unfriending. In the case of the Russia-Ukraine conflict, considerable evidence suggests that the conflict had a robustly destructive impact on strong ties, including those between relatives, close friends and classmates. It mainly affected relationships that had been developed long before the conflict (Asmolov, 2018).
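The homogenization dynamic that John and Dvir-Gvirsman describe can be sketched as a toy simulation. The network size, opinion split, and unfriending probability below are all invented for illustration: the point is that severing cross-opinion ties necessarily raises the share of like-minded connections among those that remain.

```python
import random

random.seed(7)  # fixed seed for a reproducible toy run

def unfriend_round(opinions, edges, cut_prob=0.6):
    """Cut each cross-opinion tie with probability cut_prob;
    same-opinion ties always survive."""
    kept = []
    for u, v in edges:
        if opinions[u] != opinions[v] and random.random() < cut_prob:
            continue  # disconnection: the tie is severed
        kept.append((u, v))
    return kept

def homophily(opinions, edges):
    """Fraction of remaining ties that connect like-minded users."""
    if not edges:
        return 1.0
    same = sum(1 for u, v in edges if opinions[u] == opinions[v])
    return same / len(edges)

# A random network of 100 users holding one of two opposed opinions.
opinions = {i: random.choice([-1, 1]) for i in range(100)}
edges = [(i, j) for i in range(100) for j in range(i + 1, 100)
         if random.random() < 0.05]

before = homophily(opinions, edges)
after = homophily(opinions, unfriend_round(opinions, edges))
```

Because unfriending removes only cross-opinion ties, the homophily score can only rise after each round: the network drifts toward the homogeneous clusters that the disconnectivity literature describes, regardless of the exact parameters chosen.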

The type of social relationship most affected by disconnective practice was between former classmates. Many platforms and groups support relationships between classmates, including Facebook and a social network called Odnoklassniki (“classmates”) that is popular among users aged 40 and over. Through these platforms, many people who shared the same classroom decades ago found themselves on different sides of the Russia-Ukraine conflict. One Facebook user reported that she unfriended two of her classmates because of their position on the situation in Crimea. Another user from Ukraine described on Facebook an experience of chatting with classmates from Russia on WhatsApp. When his classmates discovered that he lives in Ukraine, they began discussing the conflict and eventually tried to ban him from the chat. A Ukrainian user, Irina Anilovskaya, published a book in 2014 describing the experience of conflict-driven disconnection between people who were once close friends. In the book, Irina describes a two-day exchange of online messages between herself and her classmate Alexander, who lives in Russia. The story begins with a friendly discussion of the events and ends when Alexander and Irina accuse each other of being “zombies” and “people who are afraid of the truth.” They bid each other a mutual farewell forever.

The Effects of Participatory Propaganda

What do these new digital affordances actually do to us as individuals? And what are the effects of participatory propaganda on our individual and collective psyches? Propaganda that relies on the participatory design of digital networks is best explained by looking at the link between two interrelated processes: the socialization of political conflicts and the internalization of political conflicts.

The notion of “conflict socialization” was introduced by E. E. Schattschneider (1975), who argues that “the outcome of all conflict is determined by the scope of its contagion,” while “the number of people involved in any conflict determines what happens.... every increase or reduction in the number of participants, affects the result.” The notion of the scope of contagion highlights the role of the crowd in the context of political conflicts. Schattschneider notes that, “Nothing attracts a crowd as quickly as a fight. Nothing is so contagious.” Schattschneider and others (Coser, 1956) also highlighted many years ago how political actors can control and manipulate a conflict for their own purposes.

Today the digital public sphere offers a new set of tools for the manipulation and control of citizen engagement in conflicts. The socialization of conflict is now driven by the content proliferated through social networks, as well as through the digital affordances of online platforms that offer a range of responses to conflict. The role of content in the socialization of conflicts relies on the distinctive nature of social networking platforms, which combine the consumption of news with social interaction and make social interaction a mechanism of content proliferation. New information technologies — social networks and crowdsourcing practices — also enable the geographically unrestricted “socialization of conflict.” They provide an option not only of “watching together” but also of “acting together.” In other words, users can participate in a conflict a continent away without ever leaving the safety and comfort of their bedrooms.

As a result, one may argue that propaganda has become less interested in changing people’s opinion about a specific object or in convincing people that it is either true or false. The main purpose of 21st century propaganda is to increase the scope of participation in relation to the object of propaganda. In a digital environment relying on user participation, propaganda is a technology of power that drives the socialization of conflicts and a tool for increasing the scope of contagion. While participation in political debates is often considered an important feature of democracy, propaganda allows those who generate it to define the structure and form of participation in a way that serves only their interests, while minimizing the constructive outcomes of participation. In addition, the focus on propaganda as a driver of participation could be considered a meeting point between political and commercial interests, since increasing engagement with a given object of content is a path towards more pageviews and more surrender of personal data. In that sense, propaganda serves not only political actors, but also platform owners.

Increased participation in political conflicts also has effects on both the individual and the collective psyche. This is highlighted by the notion of internalization, developed originally as part of developmental psychology (Vygotsky, 1978). “Internalization of mediated external processes results in mediated internal processes. Externally mediated functions become internally mediated” (Kaptelinin and Nardi, 2006). Through internalization, external cultural artifacts are integrated into the cognitive process and help to define our human relationship with reality. For example, using maps gradually transforms the way we think about our environment and how we navigate it. In other cases, “likes” and emojis have been internalized and have become ingrained in our attitude toward a specific object when we think about it. The way we see things translates into our activities in relation to our environment, but the reverse is also true: Our relationship with our environment is shaped by participatory affordances and by the design of digital networks. In that way, digitally mediated participation in conflict is linked to the development of cognitive filters that shape the way we perceive social reality.

The role of internalization can be seen in the ways in which we think about conflicts and how we consider various objects in the context of a conflict. This suggests that participatory technologies offering a broad range of ways to participate in conflict not only increase the socialization of conflict (meaning an increase in the scope of participation), but also create a psychological change in users. The latter process is conceptualized here as the internalization of conflict. This internalization means that the participatory design of social networks shapes not only our views on a specific issue, but our perception of our environment in general. Aleksandr Shkurko tells us that “social cognition is fundamentally categorical” (2014). According to Shkurko, “we perceive others and regulate our behaviour according to how we position ourselves and others in the social world by ascribing them to a particular social label.” In that light, internalization means that digitally native propaganda is able to shape the structure of categorization.

As a result, digital participatory propaganda shapes our relationship with our environment beyond any specific topic (object); it changes the apparatus of cognitive optics that structures our perception of everyday life. A conflict encountered through digital propaganda becomes a point of reference for the classification of a broad spectrum of events and social interactions. It shapes interpretative frameworks in a variety of situations that are not related directly to the conflict.

Nobel Laureate Joseph Brodsky noted that humanistic classification of others should rely not on abstract categories such as a person’s nationality, culture, religion or political beliefs, but primarily on very specific categories related to their deeds; i.e., whether they are greedy or not, kind or not, cowardly or not. Conflict-driven social classification diminishes the role of individual deeds in shaping the structure of social relations and allows institutional actors and state-sponsored media to impose a dominant structure of classification. For example, friends, relatives, former classmates, and co-workers begin to be judged not on previous interactions, common experiences, their professionalism or their values, but on their positions in regard to the conflict. The activities of everyday life, whether related to work or just a common experience on the street, as well as personal frustrations and joys, are examined through the lens of a conflict. A birthday party or family meeting turns into a discussion of the conflict, which either concludes satisfactorily because everyone agrees about it, or transforms into an unpleasant and even hostile encounter if one or more individuals disagree.

One outcome of internalization is the destruction of social ties between friends by means of disconnection. It is not so much that the shape of social categories shifts, but that certain categories become increasingly significant when it comes to classification of everyday life events and social relationships. Individuals begin to view everything through a conflict-oriented cognitive filter, including issues not at all related to the conflict. Internalizing the conflict — allowing it to reshuffle the relevance of one’s social categories — supports the socialization of the conflict, through recirculating propaganda and mobilizing resources towards crowdsourced warfare projects. In that way, the internalization and socialization of conflict mutually support and reinforce each other.

Figure 1 illustrates how these processes are interrelated. Digital platforms mediate a relationship between a user in Russia or Ukraine (the subject of propaganda) and the various aspects related to perception of the conflict (the object of propaganda), e.g. the Russian annexation of Crimea. The tools that mediate these relationships offer the user a broad range of conflict-related forms of participation, from proliferation of conflict-related content to crowdfunding, hacktivism, and online volunteering. This is conflict socialization. At the same time, the participation of a user also contributes to an increase in the prominence of the conflict in the user’s everyday life, and specifically to the way conflict-related judgments shape the user’s perception of their social circles and the environment beyond the conflict. The outcome, as we discussed earlier, is that former classmates, friends, and relatives come to be identified primarily by their position vis-à-vis the conflict. The categories of that position are imposed by propaganda embedded in the news feeds of social networks, and their effect is multiplied by commenting, sharing, and generating additional propaganda-related content. Those who hold an opposite opinion on these topics are excluded from social circles. This is an outcome of conflict internalization.


Figure 1: The mechanism of digitally mediated participatory propaganda

Internalization explains the most insidious aspect of digital propaganda: the transformation in users’ cognitive structure that manifests as a shift in their classification structure. This shift, usually toward binary thinking (seeing the world in terms of whether you support or oppose the Russian statement “Crimea is ours”), affects all spheres of the user’s social relations and perceptions of the world far beyond the specific topic of propaganda. The collective and the individual psyche are interrelated. One may suggest that the more propaganda has been socialized, the more it is internalized by the subject and reproduced within the subject. Digitally mediated participation in propaganda-related activities makes propaganda a part of our “inner space” and allows it to define our perception of reality from within.

Conclusion: Beyond the USSR

Back in the USSR, propaganda sought to ensure that the state controlled the way its citizens perceived reality and mobilized their resources. This control was achieved by relying on a monopoly over informational sources. The purpose of Western “counter-propaganda” was to break the walls of informational isolation. The radio that I used as a child to search for “enemy voices” was actually my Internet — an opportunity to look for information in an open environment beyond the walls.

More than 30 years later we live in a significantly different information environment. Thanks to the proliferation of the Internet, states like Russia are not able to control the information environment by limiting the range of sources. Despite infrastructural support and major financial investment, state-sponsored TV channels have become less popular than YouTube (Ostrovsky, 2019). In addition, thanks to social networks and messenger services, personal communication relies on horizontal networks and is not limited by any physical borders. In the “space of flows,” as conceptualized by Manuel Castells (1999), information technologies challenge the state’s sovereignty not only over its territory but also, and significantly, over its citizens. In the multicultural and global information environment, state actors have no effective tools that allow total isolation of their citizens from a broad range of sources (with the exception of North Korea and Turkmenistan). Complete control over the information space through filtering and blocking is very hard to achieve.

The threat that comes from new information technologies was identified by some states at very early stages. The first document signed by Russian President Vladimir Putin in 2000 was the Information Security Doctrine, which addressed new information technologies as a potential threat to political and social stability. Concern over the loss of control in the new media environment is manifest in the way the Russian authorities try to regulate the Internet. The concept of a “sovereign Internet” seeks to equalize the scale of control over cyberspace with the scale of control over offline space. But it seems that most traditional approaches to re-creating various forms of isolation, at least within the Runet, are failing.

The need to compensate for the loss of control over the media environment and over social interactions between people has required new approaches. These seek not to restrict new information technologies, but to build on new digital affordances, which offer a direct link between propaganda and the mobilization of the resources of digital crowds. New forms of propaganda harness the participatory design of social networks, crowdsourcing and the affordances of disconnectivity. They flourish in an environment where news cannot be separated from interpersonal communication.

The purpose of the new propaganda is neither the production of reality nor of unreality. The new propaganda seeks to offer a way of restoring the state’s sovereignty over people in the new information environment and to rebuild walls that have been demolished by global horizontal networks of communication. It aims to mitigate the capacity of these networks to challenge the state’s sovereignty. If the state is not able to control the flow of information and communication, it targets the way this information is interpreted and analyzed. Conflict-based cognitive filters ensure that horizontal networks and uncontrolled flows of information do not threaten a state’s control over its citizens, and they even extend the control of state actors over individuals beyond the state’s borders.

I’ll note that this essay doesn’t present an argument against digitally mediated participation in conflict. People retain the right to disconnect, online as well as offline, from people they don’t agree with. The question addressed here, however, is whether and how these participatory and disconnective affordances can be harnessed by state actors relying on propaganda in order to achieve their political goals. One may argue, for instance, that the massive digitally mediated participation of users in the Ukrainian conflict was essential in order to protect their country from a potential security threat. It’s not my purpose to draw a line between what type of participation is genuine and what type can be considered an outcome of political manipulation. I might argue, however, that participation that is driven by non-genuine actors and information from non-transparent sources, participation that relies on fakes, and participation that harnesses emotions is likely to be considered part of participatory propaganda. The analysis of disconnective action should also focus on whether that type of action was driven by the manipulative efforts of institutional actors shaping our relationship with the environment. In that light, I argue that it is essential to understand the political goals of participatory propaganda.

Participatory propaganda restores state sovereignty from within. It aims to build walls in the inner spaces of the subject by shaping categories of perception of the environment. First, it constructs the object of a conflict that can potentially divide people. Second, relying on the design of social networks that combine information proliferation with personal interaction as well as the mediated mobility of devices, it makes this conflict an omnipresent and integral part of everyday life. Third, it offers a range of simple and immediate opportunities for participation in conflict-related activity. Fourth, it increases the importance of conflict in shaping the structure of people’s social categorizations. Finally, it relies on the affordance of disconnectivity to mitigate the capacity of horizontal networks to cross borders and challenge a state’s sovereignty.

What does this sort of propaganda do to us as a society? It is designed to implement new forms of sovereignty. It is designed to replace networked structures of society with fragmentation and polarization. It helps to pull people apart by forcing them into the role of combatants rather than citizens. It is designed to destroy horizontal relationships that offer alternative sources of information and that can potentially be transformed into independent collective action and a broad opposition to institutional actors. It is designed to divide and rule. It produces a reality with new walls and borders that can sever personal relationships and weaken critical thinking capabilities.

Participatory digital propaganda enables the private, everyday identity of users to be occupied and taken over by the institutional actors that propagate it. Addressing these effects of propaganda requires that we lessen the significance of conflict-related categorization for the interpretation of everyday life and offer alternative forms of subject-object and subject-subject relationships that are not driven by conflict. The protection of identity in a conflict-prone digital environment may rely on the user’s capacity to control the scale of their engagement in the conflict and may mitigate the role of conflict-related classification in the interpretation of social relations and everyday life. It also requires that counter-propaganda offer not an alternative view of specific events, but alternative classification structures that protect the autonomy of the subject, horizontal networks, and independent forms of collaboration.

In 2014 and 2015, something strange happened in a place apparently quite far from any conflict: the Russian-speaking segment of Tinder. One could see that an increasing number of users wrote as a part of their personal description either “Crimea ours” or “Crimea not ours.” The relationship with a conflict became not only a signifier for evaluating existing relationships, but also a driver for forming new romantic relationships and friendships. The Crimea conflict found its way into one of the most intimate aspects of life. I argue that the way to counter propaganda is not to convince others whose Crimea it is, but to weaken the role of propaganda in shaping our relations and follow Brodsky’s vision of humanistic social classification. That means we judge and love one another not on the basis of political categories that are created to divide us, but on our everyday deeds and actions.

Andrejevic, M. (2007). iSpy: Surveillance and Power in the Interactive Era . Lawrence, KS.: University Press of Kansas.

Anilovskaya, I. (2014). The War: The Correspondence Between Classmates. Kiev: Alfa Reklama.

Asmolov, G. (2018). The Disconnective Power of Disinformation Campaigns. Journal of International Affairs , Special Issue 71(1.5): 69-76.

Asmolov, G. & Kolozaridi, P. (2017). The Imaginaries of RuNet: The Change of the Elites and the Construction of Online Space. Russian Politics , 2: 54-79.

Benkler, Y., Faris, R., & Roberts, H. (2018). Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics. Oxford: Oxford University Press.

Bruntz G. C. (1938). Allied Propaganda and the Collapse of the German Empire in 1918. Stanford, CA.: Stanford University Press.

Castells, M. (1999). Grassrooting the Space of Flows. Urban Geography, 20 (4):294-302.

Coser, L. A. (1956). The Functions of Social Conflict . New York: The Free Press.

Deibert, R. J., Rohozinski, R., & Crete-Nishihata, M. (2012). Cyclones in Cyberspace: Information Shaping and Denial in the 2008 Russia-Georgia War. Security Dialogue, 43(1).

Edwards, S. & Livingston, S. (2018). Fake news is about to get a lot worse. That will make it easier to violate human rights — and get away with it. The Washington Post, April 3, 2018. https://www.washingtonpost.com/news/monkey-cage/wp/2018/04/03/fake-news-is-about-to-get-a-lot-worse-that-will-make-it-easier-to-violate-human-rights-and-get-away-with-it/

Ellul, J. (1965). Propaganda: The Formation of Men's Attitudes . New York: Alfred A. Knopf.

Habermas, J. (1989). The Structural Transformation of the Public Sphere. An Inquiry into a Category of Bourgeois Society . Cambridge MA.: MIT Press.

Haigh M., Haigh, T., & Kozak, N. I. (2017). Stopping Fake News. Journalism Studies , 1-26.

Hartmann, M. (2013). From Domestication to Mediated Mobilism. Mobile Media & Communication , 1 (1): 42–49.

Hunter, M. (2018). Crowdsourced War: The Political and Military Implications of Ukraine’s Volunteer Battalions 2014-2015. Journal of Military and Strategic Studies, Volume 18, Issue 3, 78-124.

Institute for Propaganda Analysis. A. McClung Lee & E. Briant Lee (Eds.) (1972). The Fine Art of Propaganda . New York: Octagon Books.

John, N. A. & Dvir-Gvirsman, S. (2015). ‘I Don’t Like You Any More’: Facebook Unfriending by Israelis During the Israel-Gaza Conflict of 2014. Journal of Communication, 65 (6): 953-974.

John, N. A. & Gal, N. (2018). “He’s Got His Own Sea”: Political Facebook Unfriending in the Personal Public Sphere. International Journal of Communication, 12 : 2971–2988.

Johnson, B. (2019) Deepfakes are solvable — but don’t forget that “shallowfakes” are already pervasive, MIT Technology Review, Mar 25, 2019 https://www.technologyreview.com/s/613172/deepfakes-shallowfakes-human-rights/

Kaptelinin, V. (2014). The Mediational Perspective on Digital Technology: Understanding the Interplay between Technology, Mind and Action. In S. Price, C. Jewitt & B. Brown (Eds.), The Sage Handbook of Digital Technology Research (pp. 203-217). London: Sage.

Kaptelinin, V. & Nardi, B. A. (2006). Acting with Technology: Activity Theory and Interaction Design. Cambridge, MA.: MIT Press.

Khaldarova I & Pantti, M. (2016). Fake News. Journalism Practice , 10 (7): 891-901. DOI: 10.1080/17512786.2016.1163237

Lasswell, H. (1927). Propaganda Technique in the World War. New York: Alfred A. Knopf.

Le Bon, G. (1903). The Crowd . London: Unwin.

Light, B. (2014). Disconnecting with Social Networking Sites . Basingstoke, UK: Palgrave Macmillan.

Mejias, U. A., & Vokuev, N. E. (2017). Disinformation and the Media: The Case of Russia and Ukraine. Media, Culture & Society, 39(7): 1027-1042.

Oates, S. (2016). Russian Media in the Digital Age: Propaganda Rewired. Russian Politics , 1 : 398-417.

Ostrovsky, A. (2019). Russians Are Shunning State-Controlled TV for YouTube. The Economist, March 7th. https://www.economist.com/europe/2019/03/09/russians-are-shunning-statecontrolled-tv-for-youtube

Patrikarakos, D. (2017). War in 140 Characters. How Social Media is Reshaping Conflict in the Twenty-First Century . New York: Basic Books.

Sanovich, S. (2017). Computational Propaganda in Russia: The Origins of Digital Disinformation. In S. Woolley & P. N. Howard (Eds.), Working Paper 2017.3 . Oxford: Project on Computational Propaganda.

Schattschneider, E. E. (1975). The Semisovereign People: A Realist's View of Democracy in America . Harcourt Brace College Publishers.

Schwarz, O. & Shani, G. (2016). Culture in Mediated Interaction: Political Defriending on Facebook and the Limits of Networked Individualism. American Journal of Cultural Sociology, 4 : 385–421.

Shkurko, A. V. (2014). Cognitive Mechanisms of Ingroup/Outgroup Distinction. Journal for the Theory of Social Behaviour, 45 (2): 188-213.

Silverstone, R. (1994). Television and Everyday Life . London: Routledge.

Silverstone, R. (2002). Complicity and Collusion in the Mediation of Everyday Life. New Literary History, 33 (4): 761-780.

Stuart, C. (1920). Secrets of Crewe House: The Story of a Famous Campaign . London, New York, and Toronto: Hodder and Stoughton.

Toler A. (2018) Crowdsourced and Patriotic Digital Forensics in the Ukrainian Conflict. In: Hahn O., Stalph F. (eds) Digital Investigative Journalism. Palgrave Macmillan, Cham

Vygotsky, L. S. (1978), Mind in Society: The Development of Higher Psychological Processes. Cambridge, MA.: Harvard University Press.

Woolley, S. C., & Howard, P. N. (2016). Political Communication, Computational Propaganda, and Autonomous Agents. International Journal of Communication, 10: 4882–4890.

Zuckerman, E. (2018) Four problems for news and democracy, Medium.com. https://medium.com/trust-media-and-democracy/we-know-the-news-is-in-crisis-5d1c4fbf7691

Propaganda, Persuasion and Public Relations Essay


The ethics of persuasion

Propaganda is a method of communication which is used to influence the attitudes of specific groups of individuals towards a particular cause or position (Propaganda, 2010).

In essence, rather than presenting information impartially, propaganda presents it selectively, disseminating only certain information in order to create an emotional rather than a rational response to particular issues (Propaganda, 2010).

For example, in the case of Australia’s cancellation of the FuelWatch program, Senator Xenophon utilized propaganda, stating that FuelWatch was not an effective means of helping consumers and that the big four oil companies needed to be tackled by other methods. What most people failed to notice is that he omitted the successes the FuelWatch program actually had, which suggests possible ulterior motives on his part (Battersby, 2008).

His actions resulted in the end of the national FuelWatch scheme, which to an extent could be considered a step back from giving consumers more control over how they purchase gasoline (VACC, 2008). What must be understood is that propaganda utilizes elements such as loaded questions, partial synthesis, or even lying by omission in order to gain the desired response (Wilcox & Cameron, 2009).

One use of effective propaganda can be seen in the online article “Cultural Cringe,” where the writer selectively introduces facts that lambaste and deride the Australian video presentation for its World Cup 2022 bid (Hunter, 2010).

Throughout the article there is little mention of the creativity that went into the video, the unique approach that Australia took, or the overwhelmingly positive response viewers had to it; rather, what is offered is nothing more than a continuous tirade against the commercial itself (Hunter, 2010).

It must be noted, though, that the term propaganda, as Wilcox states, has been connected with falsehoods, lies, and deception (Wilcox & Cameron, 2009). It is true that propaganda produced by various PR departments has been utilized in political campaigns as a form of political warfare in which detrimental facts about rival candidates are released to the general public (Propaganda, 2010).

On the other hand, propaganda is also used in various public information campaigns by governments, as in the Australian government’s campaign against illegal downloads, which connects downloading with stealing, and in its use by the U.S. during the invasions of Afghanistan and Iraq in the supposed “war on terror.” In essence, the use of propaganda and its effects can be associated with the ethical reasoning behind its usage.

Wilcox states that persuasion is used in the following manner: “to change or neutralize hostile opinions, to crystallize latent opinions and positive attitudes, and finally to conserve favorable opinions” (Wilcox & Cameron, 2009). As such, the importance of persuasion to successful contemporary public relations boils down to its ability to influence individuals towards a certain train of thought.

As such, it can be stated that persuasion shapes perceptions and thus the way people interpret and accept information. As the examples of propaganda show, persuasion should always follow ethical guidelines when used in public relations.

The concept of corporate social responsibility should be considered an integral part of most PR practices due to its ability to sway public opinion either for or against a particular company (Berenbeim, 2006). For PR departments, what is considered good for the company should also be directly proportional to what is beneficial for consumers.

In cases where the good of the company is put above that of the consumer, this is in itself a direct violation of the ethical guidelines of persuasion (Messina, 2007). One example of honest and effective persuasion can be seen in the Bowen article summarizing the necessity of the FuelWatch scheme and outlining exactly what it entails (Bowen, 2008).

On the other hand, an Australian example of an ethical violation of persuasion is the production and sale of vitamin water by Glacéau, in which the company states that the water being sold has been “enriched” with vitamins in order to help people attain a healthy lifestyle (Adam, 2008).

Far from actually contributing to a person’s health and well-being, vitamin water and its additives could potentially cause health problems in the future, especially if the product is consumed on a regular basis as a replacement for water (Glaceau lands Coke in deep water, 2010).

On average, a single bottle of vitamin water produced by Glacéau contains 32 grams of crystalline fructose, a derivative of high-fructose corn syrup that numerous scholarly articles and independent journals have linked to the rapid onset of obesity in various populations.

In this case, not only is the company marketing drinks with vitamins that might not even be absorbed, but the amount of sugar present in each drink is actually detrimental to a person’s future health, especially if they replace ordinary water with vitamin drinks.

This example is a clear case of what not to use persuasion for: not only is it in direct violation of corporate social responsibility, but convincing people that a drink is healthy when in fact it could cause health problems is highly unethical by most standards.

Persuasion should be used to establish an idea or to state relevant facts and modes of thought; it should not be used to lie directly to an audience and convince them to do something that could endanger their well-being (Messina, 2007).

Based on the information presented, it can be stated that effective persuasion truly does shape perceptions, and thus the way people interpret and accept information, which makes it an important tool in contemporary public relations. It must be noted, though, that just because a persuasive argument is effective does not make it ethical.

Examples such as the vitamin water case show that persuasive arguments are sometimes used in ways that are actually detrimental to people. It is up to PR practitioners to discern, through proper ethical reasoning, whether their use of persuasive skills will produce beneficial or detrimental results.

Adam, C. 2008, 'Coke uncorks water brand in Australia', B&T Magazine, 58, 2646, p. 3, Business Source Premier, EBSCOhost.

Battersby, L. 2008, 'Senate kills off FuelWatch', The Age, p. 1. Web.

Berenbeim, R. E. 2006, 'Business Ethics and Corporate Social Responsibility', Vital Speeches of the Day, 72, 16/17, pp. 501-504, Academic Search Premier, EBSCOhost.

Bowen, C. (MP) 2008, A national FuelWatch scheme, joint media release with Hon. Kevin Rudd MP, Australian Government Treasury, p. 1. Web.

'Glaceau lands Coke in deep water' 2010, Marketing Week, 33, 31, p. 12, Vocational and Career Collection, EBSCOhost.

Hunter, T. 2010, 'Cultural cringe: World Cup roo has critics hopping mad', The Age Online, p. 1. Web.

Messina, A. 2007, 'Public relations, the public interest and persuasion: an ethical approach', Journal of Communication Management, 11, 1, pp. 29-52, Business Source Premier, EBSCOhost.

'Propaganda' 2010, Columbia Electronic Encyclopedia, 6th edn, p. 1, Literary Reference Center, EBSCOhost.

VACC 2008, VACC welcomes the end of the national FuelWatch scheme, media release, p. 1. Web.

Wilcox, D. L. & Cameron, G. T. 2009, Public Relations: Strategies and Tactics, 9th edn (international edn), Pearson Education, Boston, MA, pp. 229-242.


IvyPanda. (2019, May 2). Propaganda, Persuasion and Public Relations. https://ivypanda.com/essays/propaganda-persuasion-and-public-relations-essay/


Propaganda in Animal Farm

Words: 1353 | Pages: 3 | Published: Apr 29, 2022





At The Brink

An Introduction: It’s Time to Protest Nuclear War Again

Kathleen Kingsbury, Opinion Editor

The threat of nuclear war has dangled over humankind for much too long. We have survived so far through luck and brinkmanship. But the old, limited safeguards that kept the Cold War cold are long gone. Nuclear powers are getting more numerous and less cautious. We’ve condemned another generation to live on a planet that is one grave act of hubris or human error away from destruction without demanding any action from our leaders. That must change.

In New York Times Opinion’s latest series, At the Brink, we’re looking at the reality of nuclear weapons today. It’s the culmination of nearly a year of reporting and research. We plan to explore where the present dangers lie in the next arms race and what can be done to make the world safer again.

W.J. Hennigan, the project's lead writer, begins that discussion today by laying out what’s at stake if a single nuclear weapon were used, as well as revealing for the first time details about how close U.S. officials thought the world came to breaking the decades-long nuclear taboo.

Russia’s president, Vladimir Putin, threatened in his 2024 annual speech that more direct Western intervention in Ukraine could lead to nuclear conflict. Yet an American intelligence assessment suggests the world may have wandered far closer to the brink of a nuclear launch more than a year earlier, during the first year of Mr. Putin's invasion.

This is the first telling of the Biden administration's efforts to avoid that fate and, had those efforts failed, how it hoped to contain the catastrophic aftermath. Mr. Hennigan explores what happened during that tense time, what officials were thinking, what they did and how they're approaching a volatile future.

In the first essay of the series, W.J. Hennigan lays out the risks of the new nuclear era and how we got here. You can listen to an adaptation of the piece here .

Within two years, the last major remaining arms treaty between the United States and Russia is to expire. Yet amid mounting global instability and shifting geopolitics, world leaders aren’t turning to diplomacy. Instead, they have responded by building more technologically advanced weapons. The recent intelligence on Russia’s development of a space-based nuclear weapon is the latest reminder of the enormous power these weapons continue to wield over our lives.

There is no precedent for the complexity of today’s nuclear era. The bipolarity of the Cold War has given way to a great-power competition with far more emerging players. With the possibility of Donald Trump returning as president, Iran advancing its nuclear development and China on track to stock its arsenal with 1,000 warheads by 2030, German and South Korean officials have wondered aloud if they should have their own nuclear weapons, as have important voices in Poland, Japan and Saudi Arabia.

The latest generation of nuclear technology can still inflict unspeakable devastation. Artificial intelligence could someday automate war without human intervention. No one can confidently predict how and if deterrence will work under these dynamics or even what strategic stability will look like. A new commitment to what could be years of diplomatic talks will be needed to establish new terms of engagement.

Over the past several months, I’ve been asked, including by colleagues, why I want to raise awareness on nuclear arms control when the world faces so many other challenges — climate change, rising authoritarianism and economic inequality, as well as the ongoing wars in Ukraine and the Middle East.

Part of the answer is that both of those active conflicts would be far more catastrophic if nuclear weapons were introduced into them. Consider Mr. Putin’s threat at the end of February: “We also have weapons that can strike targets on their territory,” the Russian leader said during his annual address. “Do they not understand this?”

The other answer lies in our recent history. When people around the world in the 1960s, ’70s, ’80s and early ’90s began to understand the nuclear peril of that era, a vocal constituency demanded — and achieved — change.

Fear of mutual annihilation last century spurred governments to work together to create a set of global agreements to lower the risk. Their efforts helped to end atmospheric testing of nuclear weapons, which, in certain cases, had poisoned people and the environment. Adversarial nations started talking to each other and, by doing so, helped avoid accidental use. Stockpiles were reduced. A vast majority of nations agreed to never build these weapons in the first place if the nations that had them worked in good faith toward their abolishment. That promise was not kept.

In 1982 as many as a million people descended on Central Park calling for the elimination of nuclear arms in the world. More recently, some isolated voices have tried to raise the alarm — Jamie Dimon, the chief executive of JPMorgan Chase, said last year that “the most serious thing facing mankind is nuclear proliferation” — but mostly such activism is inconceivable now. The once again growing threat of nuclear weapons is simply not part of the public conversation. And the world is less secure.

Today the nuclear safety net is threadbare. The good news is that it can be restitched. American leadership requires that Washington marshal international support for this mission — but it also requires leading by example. There are several actions that the U.S. president could take without buy-in from a Congress unlikely to cooperate.

As a first step, the United States could push to reinvigorate and establish with Russia and China, respectively, joint information and crisis control centers to ensure that misunderstandings and escalation don’t spiral. Such hotlines have all but gone dormant. The United States could also renounce the strategy of launching its nuclear weapons based only on a warning of an adversary’s launch, reducing the chance America could begin a nuclear war because of an accident, a human or mechanical failure or a simple misunderstanding. The United States could insist on robust controls for artificial intelligence in the launch processes of nuclear weapons.

Democracy rarely prevents war, but it can eventually serve as a check on it. Nuclear use has always been the exception: No scenario offers enough time for voters to weigh in on whether to deploy a nuclear weapon. Citizens, therefore, need to exert their influence well before the country finds itself in such a situation.

We should not allow the next generation to inherit a world more dangerous than the one we were given.


