Definitions
Below is a list of common terms related to misinformation, disinformation, and media manipulation, as well as specific terms the Casebook uses to code cases.
Featured Definitions
As the practice of media manipulation evolves in response to changes in policies, regulations, and politics, so do the tactics. Below are some of the ways campaign operators take advantage of our networked media ecosystem.
4
A website that hosts message boards on a variety of topics from music to politics to anime. The site is broken down into distinct message boards where mostly anonymous users post and reply to threads. Threads that garner new replies are "bumped" to the top of the respective message board. 4chan is a Casebook value under the "Network Terrain" variable in the code book.
8
A recently defunct image board owned by Jim Watkins that has been linked to the propagation of white supremacy, neo-Nazism, and the dissemination of manifestos by multiple mass shooters. In 2019, the website was taken offline after its Domain Name System (DNS) and domain registration were terminated. The site has since re-formed and rebranded itself as "8kun," now composed primarily of QAnon-related imageboards. 8chan is a Casebook value under the "Network Terrain" variable in the code book.
A
Accounts may be suspended if they violate the terms of service—also referred to as community guidelines—of a social media platform. Violations are often flagged through both automated processes and human review. Users can often appeal a suspension if they believe their account was suspended by mistake. Accounts may be permanently removed if there have been repeated violations of the terms of service. Account suspension is a Casebook value under the "Mitigation" variable in the code book.
A period of time when the normal state of affairs is interrupted by unforeseen events that are troubling and potentially dangerous. Active crises trigger confusion and require urgent action and immediate attention. Due to the increased media attention and importance of any decisions made during this time, active crises are vulnerable to being exploited by media manipulation. Active crisis is a Casebook value under the "Vulnerabilities" variable in the code book.
Individuals or groups that campaign for social, political, or legal change. They may be formally organized (e.g., registered non-governmental organization) or loosely affiliated (e.g., advocacy networks). Activists are a Casebook value under the "Attribution" variable in the code book.
Individuals or groups that campaign for social, political, or legal change. They may be formally organized (e.g., registered non-governmental organization) or loosely affiliated (e.g., advocacy networks). Activists are a Casebook value under the "Targets" variable in the code book.
Advertising is a tactic in which campaign operators pay for promotional content—often in the form of paid search results, classified ads, press releases, or celebrity/influencer endorsements. While advertising is most often used to market products or services, it can also be used to advocate for a cause, promote a political party, and/or frame issues in a particular light. The term functions as both a noun and a verb, naming both the process and its result. Advertising can be purchased on websites, on social media platforms, and in print and broadcast media. Examples include banner ads, pop-ups, advertorials, billboards, native advertising, sponsored content, press-release services such as EIN Presswire and PR Newswire, Google ads, and ads on YouTube, Facebook, Twitter, or Instagram.
Advertising is a Casebook value under the "Tactics" variable in the code book.
Astroturfing refers to the strategy by which established, politically motivated groups (such as corporations, interest groups, political campaigns, etc.) impersonate grassroots activist movements for political gain. Online or digital astroturfing is the specific use of the internet—and particularly social media platforms—to create the illusion of organic, widespread support for a specific cause or candidate. Astroturfing is a Casebook value under the "Strategy" variable in the code book.
B
Launched in 2017, Bitchute is a video-hosting platform that positions itself as a “free speech” alternative to YouTube. The site hosts videos promoting violent far-right ideology, medical disinformation, and conspiracy theories. Bitchute links are blocked on Twitter as of August 2020. Bitchute is a Casebook value under the "Network Terrain" variable in the code book.
Blocking is something one user (of a platform or website) does to another. When one user does not want to see what another user posts or how that user engages on the site, they can opt to "block" that user. After blocking, the blocked user's posts and activity will no longer appear in the blocker's account. Blocking is a Casebook value under the "Mitigation" variable in the code book.
Self-published websites or web pages, with no editorial oversight, that are usually run by an individual or small group and are regularly updated with new content. Blogs are a Casebook value under the "Network Terrain" variable in the code book.
Within the study of disinformation and media manipulation, bots typically refer to social media accounts that are automated, sometimes deployed for deceptive purposes, such as to artificially amplify a message, game a trending or recommendation algorithm, or inflate an account's engagement metrics. These accounts are typically centrally controlled or in coordination with each other. The term 'bot' is also used as a pejorative to discredit political opponents online. Bots are a Casebook value under the "Tactics" variable in the code book.
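Researchers typically operationalize this definition with behavioral heuristics such as posting rate and content repetition. Below is a minimal Python sketch of that family of heuristics; the function name and thresholds are invented for illustration and are not the Casebook's methodology.

```python
from datetime import timedelta

def looks_automated(timestamps, texts, max_posts_per_day=144, min_unique_ratio=0.5):
    """Rough heuristic: flag an account whose behavior resembles automation.

    timestamps -- sorted datetimes of the account's posts
    texts      -- the corresponding post texts
    Thresholds are illustrative, not validated.
    """
    if len(timestamps) < 2:
        return False
    # Posting rate: sustained high volume is hard for a human to maintain.
    days = max((timestamps[-1] - timestamps[0]) / timedelta(days=1), 1 / 24)
    posts_per_day = len(timestamps) / days
    # Content diversity: heavy repetition suggests scripted copy-paste posting.
    unique_ratio = len(set(texts)) / len(texts)
    return posts_per_day > max_posts_per_day or unique_ratio < min_unique_ratio
```

In practice, no single signal is reliable on its own; dedicated human posters can trip any one rule, which is partly why "bot" is so easily misapplied as an insult.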
Breaking news events are (often unexpected) occurrences of local, national or international importance, which demand special, immediate attention by the news media.
In the midst of a breaking news event—and particularly one related to a crisis, such as a mass shooting—news outlets and social media users sometimes publish information or details about the event that have not yet been confirmed or vetted by official bodies, in an effort to keep the public informed as the event is unfolding.1 This leads to a period of confusion where legitimate information and misinformation may be indistinguishable, and thus creates opportunities for malicious actors to manipulate media coverage or social media attention in order to sow confusion, target individuals, turn a profit, or shape certain narratives.2
Example cases in which campaign operators took advantage of a breaking news event to spread misinformation include “Misidentification: How The #Antifafires Rumor Caught on Like Wildfire” and “Misidentification: Parkland Shooter.”
Breaking news event is a Casebook value under the "Vulnerabilities" variable in the code book.
- 1Justin Lewis and Stephen Cushion, “The Thirst to Be First,” Journalism Practice 3, no. 3 (August 1, 2009): 304–18, https://doi.org/10.1080/17512780902798737.
- 2Jane Lytvynenko, “4. Monitoring for Fakes and Information Operations during Breaking News,” Verification Handbook For Disinformation And Media Manipulation (European Journalism Centre), https://datajournalism.com/read/handbook/verification-3/investigating-actors-content/4-monitoring-for-fakes-and-information-operations-during-breaking-news.
A business is any for-profit entity. Business is a Casebook value under the "Attribution" variable in the code book.
Butterfly attacks occur when imposters mimic the patterns of behavior of a social group (usually a group that has to fight for representation). Imposters pretend to be part of the group in order to insert divisive rhetoric and disinformation into popular online conversation or within the information networks used by these groups. Distinct from astroturfing, which tries to falsify grassroots support for an issue, butterfly attacks are designed to infiltrate existing communities, media campaigns, or hashtags to disrupt a social group's operations and discredit the group by sowing divisive, inflammatory, or confusing information. These mimicry campaigns exploit social wedges and prejudice, and encourage targeted harassment.
Coined by Patrick Ryan to describe a series of manipulation campaigns he claims to have orchestrated in 2013, the term butterfly attack is inspired by the mimicry behavior of certain species of butterflies, who impersonate the fluttering patterns of other species to confuse predators.1
Butterfly attack is a Casebook value under the "Strategy" variable in the code book.
- 1Patrick Ryan, “The Butterfly War,” October 13, 2017, https://cultstate.com/2017/10/13/The-Butterfly-War/.
C
A cheap fake is altered media that has been changed through conventional and affordable technology. Social media examples of cheap fake techniques include photoshopping (including face swapping), lookalikes, and the speeding up or slowing down of video. A cheap fake is easier to produce than a deep fake, which requires advanced technology and machine learning.1
NPR reported that the effectiveness of cheap fakes, for the most part, kept deep fakes out of the 2020 presidential election.2 Who needs to spend money on AI when basic photoshopping will do? Unlike deep fakes, cheap fakes are common. In August 2020, a video of Nancy Pelosi was slowed down to make it appear as if she were intoxicated; a video making the same claim had gone viral in May 2019. The 2020 post originated on TikTok and spread to YouTube, Facebook, and Twitter. Facebook did not remove the post, but the other three platforms did.3
- 1Britt Paris and Joan Donovan, “Deepfakes and Cheap Fakes” (Data and Society Research Institute, September 18, 2019), https://datasociety.net/library/deepfakes-and-cheap-fakes/.
- 2Tim Mak and Dina Temple-Raston, “Where Are The Deepfakes In This Presidential Election?” NPR, October 1, 2020, https://www.npr.org/2020/10/01/918223033/where-are-the-deepfakes-in-this-presidential-election.
- 3Hannah Denham, “Another Fake Video of Pelosi Goes Viral on Facebook,” Washington Post, August 3, 2020, https://www.washingtonpost.com/technology/2020/08/03/nancy-pelosi-fake-video-facebook/.
Groups or organizations engaged in advocating for certain issues, educating the wider public, holding the government accountable, or promoting civil and human rights. They may be formally organized or loosely coordinated and include non-governmental organizations (NGOs), community groups, labor unions, educational organizations, faith-based organizations, professional associations, nonprofit think tanks, and foundations.
Civil society response refers to actions taken by members or groups of civil society in an attempt to mitigate a campaign's harms or spread. Civil society response is a Casebook value under the "Mitigation" variable in the code book.
A legal proceeding by a private party or parties against another in a civil court of law that seeks remedy for a wrongdoing or harm. Civil/private lawsuit is a Casebook value under the "Mitigation" variable in the code book.
The use of scientific jargon and community norms to cloak or hide a political, ideological, or financial agenda within the appearance of legitimate scientific research. This may include the use of technical language, difficult-to-understand graphs and charts, or seemingly scientific data presented as empirical evidence to lend credibility to the claims being made. Cloaked science may be seeded onto public preprint servers, in data repositories, journals, or publications with lax review standards, through press releases, or by baiting journalists who may not be able to scrutinize the claims thoroughly.
This definition builds upon Jessie Daniels's research on cloaked websites, which she describes as "sites published by individuals or groups who conceal authorship in order to disguise deliberately a hidden political agenda,"1 and is inspired by Sarah Richardson's description of transphobic politics being "cloaked in science."2 Science scholar Timothy Caulfield uses the term "scienceploitation"3 to describe a similar phenomenon—the use of scientific language to mask otherwise unscientific motives (e.g., financial gain).
Note that cloaked science as a tactic is the deliberate use of information masquerading as science and should not be confused with “junk science,” which is a term that has been used to discredit scientific findings, claims, and data as fraudulent or misleading in a similar way that “fake news” can be used to dismiss critical news coverage.4
Cloaked science is a Casebook value under the "Tactics" variable in the code book.
- 1Jessie Daniels, “Cloaked Websites: Propaganda, Cyber-Racism and Epistemology in the Digital Era,” New Media & Society, July 21, 2009, https://journals.sagepub.com/doi/10.1177/1461444809105345.
- 2Sarah Richardson, “Transphobia, Cloaked in Science,” Blog//Los Angeles Review of Books (blog), November 8, 2018, https://blog.lareviewofbooks.org/essays/transphobia-cloaked-science/.
- 3Timothy Caulfield, “Pseudoscience and COVID-19 — We’ve Had Enough Already,” Nature, April 27, 2020, https://doi.org/10.1038/d41586-020-01266-z.
- 4Jonathan M. Samet and Thomas A. Burke, “Turning Science Into Junk: The Tobacco Industry and Passive Smoking,” American Journal of Public Health 91, no. 11 (November 1, 2001): 1742–44, https://doi.org/10.2105/AJPH.91.11.1742.
Content moderation is the large-scale screening, by automated systems and humans, of content uploaded to social media sites in order to remove material that violates the law or individual platforms' terms of service.
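As a rough illustration of the automated first pass described above, here is a hypothetical Python sketch that routes posts to removal, human review, or publication. Real systems pair machine-learned classifiers and hash-matching of known violating media with large human review teams; the patterns and thresholds here are invented.

```python
import re

# Invented example patterns for clear-cut terms-of-service violations.
BANNED = [re.compile(p, re.IGNORECASE) for p in (r"\bbuy\s+followers\b",)]

def screen(post_text):
    """Toy first-pass screen returning 'remove', 'review', or 'allow'."""
    if any(p.search(post_text) for p in BANNED):
        return "remove"                # unambiguous violation, removed automatically
    if post_text.count("http") >= 4:   # crude link-spam signal: escalate to a human
        return "review"
    return "allow"

print(screen("Buy followers cheap!"))  # -> 'remove'
```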
When groups or individuals who are targets of manipulation campaigns expose impersonation attempts or false claims made about them. These community-driven debunkings play out visibly on social media, and do not always receive press attention or acknowledgement.
Community mitigation refers to attempts by the public to counter what they perceive to be erroneous, misleading, or harmful narratives. This can happen across various platforms and take several forms, including hashtag hijacking, mass reporting of the accounts of campaign participants, and counter speech that injects an alternative narrative into the discourse. When this happens on a scale large enough to impact or overshadow the original media manipulation campaign, we deem it "community mitigation."
Community mitigation is a Casebook value under the "Mitigation" variable in the code book.
Community moderation is the "cleaning up" of a social media site by users. It involves flagging, closing, commenting, editing, and sometimes deleting posts that violate terms of service and community standards.
Individuals or groups that actively propagate unfounded or unverified narratives and frames. Conspiracy techniques often include speculation, unsubstantiated claims, and explanations predicated on secretive and powerful actors scheming with malicious intent. Conspiracists are a Casebook value under the "Attribution" variable in the code book.
A conspiracy theory is an attempt to explain the true, ultimate causes of significant social and political events as the result of secret plots by two or more powerful actors, rather than as overt activities or natural occurrences. While a "conspiracy" refers to a true causal chain of actions and events, a "conspiracy theory" refers to an accusatory perception that may or may not be true.
Content removal is the act of platforms taking down specific pieces of content, like videos, tweets, posts, etc. The platform's terms of service are often a guideline for what can be removed, though these are rarely enforced uniformly or consistently. Content removal is a Casebook value under the "Mitigation" variable in the code book.
Coordinated inauthentic behavior is a term coined by Facebook to describe the use of multiple Facebook or Instagram assets working in concert to misrepresent themselves, artificially boost the popularity of content, or engage in behaviors designed to enable other violations of the platform's community standards, where the use of fake accounts is central to the operation.
A portmanteau of “copy,” “paste,” and “pasta,” copypasta refers to any block of text that is repeatedly reposted, often on social media, messaging apps, online discussion forums, and comments sections. Copypasta is a Casebook value under the "Tactics" variable in the code book.
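Because copypasta recurs verbatim or near-verbatim, analysts can often surface it by normalizing post text and counting repeats. A minimal sketch, assuming an invented repeat threshold:

```python
from collections import Counter

def normalize(text):
    # Collapse case and whitespace so trivial edits don't hide a repost.
    return " ".join(text.lower().split())

def find_copypasta(posts, min_repeats=5):
    """Return normalized blocks of text that recur at least min_repeats times."""
    counts = Counter(normalize(p) for p in posts)
    return [text for text, n in counts.most_common() if n >= min_repeats]
```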
Corporate adoption occurs when a corporation or other business entity embraces or co-opts a phrase, term, issue, idea, or movement in order to obtain a strategic advantage over their competitors or shape a specific narrative in their interest. Corporate adoption is a Casebook value under the "Observable outcomes" variable in the code book.
A criminal investigation encompasses all activities involved in the process of investigating and prosecuting a crime, including collecting evidence or information pertaining to the crime, apprehending a suspect, and any subsequent related proceedings such as a trial or sentencing.
Critical press refers to press coverage that is critical of a manipulation campaign. Articles may debunk false claims or investigate the origins and motivations of a campaign. Critical press is a Casebook value under the "Mitigation" variable in the code book.
A person or account that disseminates political propaganda on the internet, particularly on social media platforms. Such accounts may be paid or unpaid, may work independently or in tandem with a group or campaign, and may be automated or manually operated.
D
Coined and theorized by Michael Golebiewski and danah boyd,1 this refers to unique topics or terms that return minimal, low-quality, or manipulative information from search engine queries. Data voids can pose social or technical security risks, depending on the subject matter of the query. Data void is a Casebook value under the "Vulnerabilities" variable in the code book.
- 1Michael Golebiewski and danah boyd, "Data Voids: Where Missing Data Can Easily Be Exploited," Data & Society, October 29, 2019, https://datasociety.net/library/data-voids/.
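Operationally, one might treat a term as sitting in a data void when a query returns few results overall and few from authoritative sources. A toy sketch with invented thresholds, not Golebiewski and boyd's methodology:

```python
def is_data_void(results, min_results=20, min_authoritative=3):
    """results: list of (url, is_authoritative) pairs returned for a query.
    The thresholds and the authority judgment are both assumptions."""
    authoritative = sum(1 for _url, is_auth in results if is_auth)
    return len(results) < min_results or authoritative < min_authoritative

# A fringe term with a single low-quality hit is a candidate data void:
print(is_data_void([("https://example.org/fringe-post", False)]))  # -> True
```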
Removing a link or other content from search results. The content or website in question is still available but will not be included in a search engine's, website's, or platform's results. De-indexing is a Casebook value under the "Mitigation" variable in the code book.
The removal of individuals or groups from a platform, preventing them from using the platform’s services even if they try to create new accounts. De-platforming is often enforced when a user has violated the terms of service, and may also include removing any existing content previously created by the user. De-platforming is a Casebook value under the "Mitigation" variable in the code book.
Exposing false or misleading claims, such as those promoting sham miracle drugs or exaggerated advertising slogans. Debunking includes fact-checking efforts, research and investigation, exposés, and other critical writing that attempts to correct the false claims. Debunking is a Casebook value under the "Mitigation" variable in the code book.
The use of "deep" or machine learning to hybridize or generate human bodies and faces. The word is a portmanteau of “deep learning” and “fake."
Demonetization is the removal of an account's, channel's, or individual's ability to earn revenue from their content on a platform.
Digital blackface is the use of visual or linguistic signifiers of Black technoculture by non-Black people for the purpose of deception, manipulation, or exploitation.
Discord is an instant messaging software that launched in 2015. Users can send text, video, images, and audio through its channels. Discord is a Casebook value under the "Network Terrain" variable in the code book.
Discussion forums are online places where people can ask questions and post text, images, video, or other content. As the name implies, these platforms act as forums for discussion, allowing replies to posts or other forms of engagement.
Disinformation is information that is deliberately false or misleading, often spread for political gain or profit, or to discredit a target individual, group, movement, or political party.
Distributed amplification is a tactic whereby a campaign operator explicitly or implicitly directs participants to rapidly and widely disseminate campaign materials, which may include propaganda and misinformation as well as activist messaging or verified information, via their personal social media accounts. TaSC researchers identified and named the tactic in spring 2020, based on observations of how the “Plandemic” Covid-19 hoax documentary spread online.1
The strategy relies on many campaign participants to individually circulate campaign materials throughout their own communities, either via personal social media accounts or within offline spaces (such as on community bulletin boards, in the public square, etc.). This strategy is often employed by campaign operators in an effort to complicate mitigation by seeding the campaign to as many platforms as possible (so that if one platform mitigates the campaign, it remains active on other platforms), or to overwhelm the information ecosystem with a specific narrative or message.
The term was inspired by a technique in the field of electrical engineering, where “distributed amplification” refers to a configuration of simple electron tubes that allows engineers “to obtain amplification over much greater bandwidths than would be possible with ordinary circuits.”2 Just as electrical engineers chain electron tubes to increase bandwidth, manipulation campaign operators operationalize their audiences to increase the amplification power of a disinformation campaign or artifact on social media.
Related terms may include “flooding,” “flooding the zone,” or “flooding the zone with shit,” which refer specifically to intentionally contaminating an information ecosystem with misinformation with the express goal of sowing confusion and distrust among the audience as to what is real and what is not.3 The latter term was popularized by Steve Bannon, the former head of Breitbart News and chief strategist to President Donald Trump.4 Margaret Roberts uses the term “flooding” to describe the Chinese Communist Party’s approach to internet censorship, which includes injecting “large amounts of information online to distract attention away from ongoing events.”5
As was seen in the case of “Distributed Amplification: The Plandemic Documentary,” by parceling out content into an easily shareable format and providing detailed instructions on how social media users could share that content as widely as possible, campaign operators were able to achieve greater amplification than they could as a single source. The strategy was also employed in “The Abortion-breast Cancer Myth: A Cloaked Science Case Study,” where anti-abortion activists encouraged others to distribute pamphlets containing disinformation about a nonexistent link between abortion and breast cancer.
- 1Sheera Frenkel, Ben Decker, and Davey Alba, “How the ‘Plandemic’ Movie and Its Falsehoods Spread Widely Online,” The New York Times, May 20, 2020, https://www.nytimes.com/2020/05/20/technology/plandemic-movie-youtube-facebook-coronavirus.html.
- 2E.L. Ginzton et al., “Distributed Amplification,” Proceedings of the IRE 36, no. 8 (August 1948): 956–69, https://doi.org/10.1109/JRPROC.1948.231624.
- 3Sean Illing, “‘Flood the Zone with Shit’: How Misinformation Overwhelmed Our Democracy,” Vox, January 16, 2020, https://www.vox.com/policy-and-politics/2020/1/16/20991816/impeachment-trial-trump-bannon-misinformation.
- 4Brian Stelter (@brianstelter), “I Nominate This Steve Bannon Line as the Most Memorable Quote of the Trump Years: ‘The Democrats Don’t Matter. The Real Opposition Is the Media. And the Way to Deal with Them Is to Flood the Zone with Shit.’ Https://T.Co/1esoag3VBi,” Twitter, November 1, 2020, https://twitter.com/brianstelter/status/1323044740632350722.
- 5“Molly Roberts on Censored: Distraction and Diversion Inside China’s Great Firewall,” Asia Experts Forum, October 7, 2020, http://asiaexpertsforum.org/molly-roberts-censored-distraction-diversion-inside-chinas-great-firewall/.
The act of publishing on the internet private or identifying information about a specific individual, usually with malicious intent (e.g., retaliation, punishment). This type of activity is often banned on platforms and forums. Dox is a Casebook value under the "Observable outcomes" variable in the code book.
E
Refers to the time leading up to an election when candidates have begun campaigning. Depending on the country, there may be legal limits on what constitutes a campaign period. Election period is a Casebook value under the "Vulnerabilities" variable in the code book.
An evidence collage is a collection of screenshots and text assembled into a shareable document and presented as evidence. No professional computer applications are required to make an evidence collage; they can be made with basic image-editing applications. Evidence collages are timed to coincide with breaking news events and are meant to influence both the general public and journalists, and to preempt authoritative reporting. They often serve as a call for civilian research, which itself can lead to the dissemination and propagation of unverified information.
By modeling evidence collages after infographics or official documents, and by using graphic design to introduce associated keywords and URLs (links to sites with more disinformation), campaign operators direct viewers where to investigate and invite them to support the associated campaign’s sites, resources, hashtags, and communities. Often, evidence collages include both verified and unverified information, which can make them more difficult to debunk. A Casebook example that used evidence collages can be found in “Evidence Collage: Unite the Right Rally,” where participants interpreted these images as clues and used them to incorrectly identify the driver in a fatal attack. This misidentification led to targeted harassment.
Evidence collages are a Casebook value under the "Tactics" variable in the code book.
Groups or individuals that espouse right-leaning radical or violent positions, often associated with organized white supremacy or other prejudice-driven ideologies. Extremists (right wing) is a Casebook value under the "Attribution" variable in the code book.
F
A social networking website that allows registered users to create unique profiles, have public and private conversations, join groups, create events, upload photos, etc. The company's primary source of revenue is selling ads on its websites and mobile applications. Facebook is a Casebook value under the "Network Terrain" variable in the code book.
The act of investigating information presented as fact in order to determine its veracity and correctness. In some cases, a fact check will result in some kind of public refutation, if the information investigated was found to be erroneous. Furthermore, fact checking is not always perceived as nonpartisan or universally accepted, and may be contested by interested parties.
Financial gain is an observable outcome in which there is evidence that the campaign operator made money—personally, for a business, or for a non-profit organization—as a result of the campaign. Financial gain is a Casebook value under the "Observable outcomes" variable in the code book.
A sociotechnical mechanism for reporting harmful or offensive content to an online social media platform or company. Content can be flagged by an algorithm, content moderator, or another user. If the content is flagged by another user, an employee from the company often reviews the content, and if it violates the terms of service, the user is notified and the content is removed. Flagging is a Casebook value under the "Mitigation" variable in the code book.
The creation and distribution of a fake document with intent to deceive. Forgery is a Casebook value under the "Tactics" variable in the code book.
G
A social networking service launched publicly in May 2017. It is known for its right-leaning user base, and as a safe haven for far-right extremists. Gab is a Casebook value under the "Network Terrain" variable in the code book.
Attempting to manipulate an algorithm in order to gain attention. Tactics may include elevating content onto a platform’s trending list, getting it recommended to other users, or placing it in the top ten of a search engine’s results. Gaming an algorithm is a Casebook value under the "Strategy" variable in the code book.
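A toy model illustrates why coordinated engagement works. Assume a hypothetical trending formula that weights shares and likes above views and decays with age (real platform formulas are proprietary); a small brigade acting early can then outrank much larger organic interest:

```python
import math

def trending_score(views, likes, shares, age_hours):
    """Invented trending formula: engagement-weighted and time-decayed."""
    engagement = views + 5 * likes + 10 * shares
    return engagement / math.pow(age_hours + 2, 1.5)

organic = trending_score(views=10_000, likes=400, shares=50, age_hours=6)
# A few hundred coordinated accounts liking and sharing in the first hour:
brigaded = trending_score(views=2_000, likes=800, shares=600, age_hours=1)
print(brigaded > organic)  # -> True: the coordinated post trends higher
```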
Founded in 1998 and headquartered in Mountain View, CA, the multinational company provides internet-related services and products, such as a search engine, cloud computing, online advertising technologies, and software and hardware. Google is a Casebook value under the "Network Terrain" variable in the code book and specifically refers to Google’s flagship search engine product and not to the company’s subsidiaries or other products.
H
Targeted and repeated behavior towards an individual or group of people that causes mental, physical or emotional distress. Harassment includes but is not limited to unwanted threats, insults, touching, and offensive language. Harassment is a Casebook value under the "Observable outcomes" variable in the code book.
The unauthorized use of an individual’s account, typically accessed through stolen credentials or hacking. Hijacked accounts can refer to email accounts, social media profiles, messaging apps, or any other account associated with a digital service or product. Hijacked accounts are a Casebook value under the "Tactics" variable in the code book.
Hyperpartisan press refers to press that is sharply divided along binary partisan positions, reflecting fierce disagreement with the opposing side.
I
Pretending to be another person or member of a social identity group, either by mimicking their behavior or creating a falsified online presence. Impersonation is a Casebook value under the "Tactics" variable in the code book.
Inconsistent regulatory enforcement is a vulnerability in which government regulations are not enforced broadly or evenly, and/or they lack follow-up. These discrepancies allow campaign operators to promote and spread otherwise illicit content regarding scam medical procedures, fraudulent health care opportunities, harmful and unproven therapies, and the like. These inconsistencies could include agencies pursuing certain companies but not others for similar conduct or censuring a company for a violation but never verifying whether that company has remediated as required. The failure of government agencies to consistently enforce regulatory requirements regarding the promotion, sale, or use of specific medical or health-related treatments can result in the spread of misleading or false claims, unauthorized advertising, or other non-government-sanctioned activities.
Political administrations can also be a factor in the prioritization of regulatory enforcement, even when the regulations themselves stay consistent. For example, under former U.S. President Donald Trump, the Food and Drug Administration (FDA), a federal regulatory body, sent 45% fewer “Official Action Indicated” reports (cases in which regulatory and/or administrative actions are recommended) than during an equivalent period under former President Barack Obama, 33% fewer warning letters (in which the FDA identifies a violation and makes clear that the company must correct the problem),1 and 26% fewer injunctions (a “civil judicial process initiated to stop or prevent violation of the law”).2 The FDA also sent fewer letters in Trump’s second year than in his first, indicating inconsistent enforcement of healthcare regulations even within a single administration.3
More broadly, inconsistent regulatory enforcement leads to a risk assessment by campaign operators: If profiteers expect little to no consequences for their actions or believe penalties will not necessarily harm or impede their operations, it disincentivizes adherence to government regulations. This creates a social, economic and/or political vulnerability for campaign operators to exploit.
Inconsistent regulatory enforcement is a Casebook value under the "Vulnerabilities" variable in the code book.
- 1 “About Warning and Close-Out Letters,” Food and Drug Administration, April 29, 2019, archived on Perma.cc, perma.cc/2SAA-NSYS.
- 2“Injunctions,” Food and Drug Administration, March 29, 2019, archived on Perma.cc, perma.cc/F869-BPQW; Charles Piller, “Exclusive: FDA enforcement actions plummet under Trump,” Science, July 2, 2019, archived on Perma.cc, perma.cc/7Y9N-Q5KQ.
- 3 Piller, “Exclusive: FDA enforcement actions plummet under Trump.”
A single person. Individual is a Casebook value under the "Targets" variable in the code book.
Visible pundits, journalists, or public figures who drive conversation around particular topics in broadcast media and online networks. Influencers are a Casebook value under the "Attribution" variable in the code book.
Information operations is a broad category of activity but generally refers to the strategic use of technological, operational, and psychological resources to disrupt an adversary's informational capacities. For more information, see Caroline Jack's Lexicon of Lies (Data and Society, 2017) and Martin Libicki's "What is Information Warfare" (in Toward a Revolution in Military Affairs?, edited by Thierry Gongora and Harald von Riekhoff, 2000).
Acquired by Facebook in 2012, Instagram is a social network platform that enables users to edit, upload, and comment on photos and short videos, broadcast live videos, and have private conversations using the chat feature. Instagram is a Casebook value under the "Network Terrain" variable in the code book.
K
Keyword squatting is the tactic of creating online content — including social media accounts — around a specific search-engine-optimized term in order to shape the search results for that term. Since future searches around that term will return the squatted accounts and content, manipulators are able to direct online traffic to their pages and to influence (with varying success) the narrative around the term.1
The term keyword squatting is adapted from “domain squatting,” which is the process of anticipating which domain names will become popular, buying them, and selling them for a profit when they become sought-after URLs. Both keyword and domain squatting can be methods of online impersonation: by acquiring “misleading account names, URLs, or keywords,” manipulators can appear online as their opponents or as the people/groups that they target.
Through search engine optimization, manipulators can make their hashtags and content tags appear before authentic accounts in search results. This tactic can be particularly useful in cases of data voids and hidden virality, where there is a lack of authoritative information but growing interest in a term outside the mainstream view.2 Keyword squatting allows manipulators to appropriate keywords around breaking news events, social movements, celebrities, and wedge issues. From there, they can use the accounts they created to flood conversations on the given topic with inaccurate or malicious information. Keyword squatting is thus a tactic manipulators use to instigate media coverage and shape trending conversations on social media.
A Casebook example of keyword squatting can be found in “Targeted Harassment: The Ukraine Whistleblower,” where manipulators encouraged participants to share the name and photos of an alleged whistleblower before platforms intervened. In this case, a media blackout and asymmetry within the media ecosystem created the perfect conditions for successful keyword squatting by motivated manipulators.
Keyword squatting is a Casebook value under the "Tactics" variable in the code book.
- 1Joan Donovan and Brian Friedberg, “Source Hacking: Media Manipulation in Practice” (Data and Society Research Institute, September 4, 2019), https://datasociety.net/library/source-hacking-media-manipulation-in-practice/.
- 2Brian Friedberg, “The Dark Virality of a Hollywood Blood-Harvesting Conspiracy,” Wired, July 31, 2020, https://www.wired.com/story/opinion-the-dark-virality-of-a-hollywood-blood-harvesting-conspiracy/.
L
The use of coded language to discuss topics that are often automatically or manually flagged as breaking a site's terms of service, community standards, or user policies, in an attempt to circumvent censorship, account bans, or other forms of information control.
Labelling refers to the application of informational labels to social media posts, accounts, channels, or other content by the host platform in an effort to give viewers additional context. This may include labelling content that could be potentially sensitive or graphic, affiliated with a nation state, containing false or misleading claims, or at risk of inciting violence. Labelling is a Casebook value under the "Mitigation" variable in the code book.
A lax security practice is anything that makes the user more vulnerable to security attacks or scams, like phishing. An example of a lax security practice is having a password that can be guessed easily or is repeated across multiple accounts. Lax security practice is a Casebook value under the "Vulnerabilities" variable in the code book.
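A minimal sketch of the audit this definition implies, covering the two examples given (guessable and reused passwords); the criteria, word list, and length cutoff are illustrative assumptions:

```python
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def lax_practices(password, passwords_used_elsewhere):
    """Return a list of lax-security findings for one credential."""
    problems = []
    if len(password) < 12:
        problems.append("short enough to brute-force")
    if password.lower() in COMMON_PASSWORDS:
        problems.append("appears on common-password lists")
    if password in passwords_used_elsewhere:
        problems.append("reused across accounts")
    return problems

print(lax_practices("letmein", {"letmein", "hunter2"}))  # flags all three
```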
A leak is the unauthorized release of sensitive materials or documents.
M
Self-imposed or state-mandated censorship of a certain news topic. Media blackout is a Casebook value under the "Mitigation" variable in the code book.
Media ecosystems are complex combinations of print, broadcast, digital, and social media that work together to create a self-referential information environment.
Coverage and reporting by journalists in popular or mainstream media. Media exposure is a Casebook value under the "Observable outcomes" variable in the code book.
We define media manipulation as the sociotechnical process whereby motivated actors leverage specific conditions or features within an information ecosystem in an attempt to generate public attention and influence public discourse through deceptive, creative, or unfair means. Campaigns or operations that engage in media manipulation may use several tactics, such as memes, viral videos, forged documents, or leaked information.
Media manipulation is not exclusive to any actor or group, nor is it inherently good or bad. Activists, constrained by heavy censorship in traditional media, for example, may rely on media manipulation in the digital space to circumvent such information controls. However, extremists may likewise use the same platforms and tactics to mainstream hateful and dangerous speech. Furthermore, media manipulation is a broad term in that it can be used to define a variety of other terms, such as disinformation, information operations, or influence operations.
Note that media manipulation is distinct from media control, which occurs at the top level by the state and private sector. Media control would instead refer to activity like ISP-level content blocking, government censorship agencies, media ownership, content filtering, or distribution and licensing regimes.
News and entertainment publishers that provide news and feature stories to the public and are not owned or controlled by the state. They may be distributed over broadcast (TV and radio), online, or print media. This variable includes independent and alternative media, mainstream corporate press, and publicly funded media that are free from state interference (e.g., BBC and NPR). Media outlets are a Casebook value under the "Network Terrain" variable in the code book.
Medical misinformation is, most broadly, any health claim that lacks evidence.1 Alternatively framed, it is any health claim that goes against current evidence.2 These claims—based on anecdotes, untested theories, limited research, and/or false information3—can be a means for political, social, or economic gains.
Medical misinformation has been divided into two categories: (1) health ideas that have been practiced alongside biomedicine for decades and (2) new, unproven health ideas that are spread in an attempt to replace evidence-based medicine.4 The latter category is the type of medical misinformation included in the Media Manipulation Casebook. At its worst, it is harmful—or fatal—to the people who believe it.
Current medical misinformation is most visibly and widely shared through major social media platforms, such as Facebook, Instagram, Pinterest, and Twitter. But it wasn’t the platforms that introduced medical misinformation: it predates the internet, and it has been a part of the internet since its beginning. In 1997, the study “Medical information on the internet,” published in the Journal of General Internal Medicine, concluded that “medical information available on Internet discussion groups may come from nonprofessionals and may be unconventional, based on limited evidence, and/or inappropriate.”5 In a 2000 issue of the Journal of Public Health Medicine, Dr. Vince Abbott warned, “The WWW should not be considered as a reliable source of information on subjects about which one knows little. This is especially true for medical information, as...much of what a typical user may find will be inaccurate or biased.”6
Medical misinformation drives the disinformation campaigns discussed in “Trading Up the Chain: The Hydroxychloroquine Rumor” and in “Cloaked Science: The Yan Reports.”
- 1Paul Armstrong and C. David Naylor, “Counteracting Health Misinformation: A Role for Medical Journals?” JAMA, April 22, 2019, 321(19):1863–1864, doi:10.1001/jama.2019.5168.
- 2Wen-Ying Sylvia Chou, Anna Gaysynsky, and Joseph N. Cappella, “Where We Go From Here: Health Misinformation on Social Media,” American Journal of Public Health, October 1, 2020, 110:S273-S275, https://doi.org/10.2105/AJPH.2020.305905.
- 3Victor Suarez-Lledo and Javier Alvarez-Galvez, “Prevalence of Health Misinformation on Social Media: Systematic Review,” Journal of Medical Internet Research, January 2021, 23(1):e17187, https://www.jmir.org/2021/1/e17187/.
- 4Armstrong and Naylor, “Counteracting Health Misinformation: A Role for Medical Journals?”
- 5Jean Deason Culver, Fredric Gerr, and Howard Frumkin, “Medical information on the internet,” Journal of General Internal Medicine, August 1, 1997, 12: 466–470, https://doi.org/10.1046/j.1525-1497.1997.00084.x.
- 6Vince Abbott, “Web Page Quality: Can We Measure It and What Do We Find? A Report of Exploratory Findings,” Journal of Public Health Medicine, June 2000, 22(2): 191–97, https://academic.oup.com/jpubhealth/article/22/2/191/1514240.
The intentional propagation of political memes on social media for the purpose of political persuasion or community building, or to strategically spread narratives and other messaging crucial to a media manipulation campaign. Meme war is a Casebook value under the "Strategy" variable in the code book.
Memes, a term coined by Richard Dawkins (1976), are “units of culture” that spread and evolve through the public discourse.1 Internet memes may take the form of any easily shared or repeated piece of media, including images, catchphrases, video clips, songs, etc. and are often (though not always) intended to be humorous.
As a meme spreads across various digital platforms and online communities, it is often “remixed,” or altered slightly by users other than the creator.2 Therefore, any one meme may have innumerable iterations, and is typically understood as “the collective property of the culture,” rather than belonging to any single user, group, or website.3
For this reason, internet memes are often employed by white supremacist groups such as the “Alt-Right” and other reactionary groups online as a strategic means of skirting online moderation tools and exposing more people to exclusionary politics under the guise of irony or humor.4 When memes are manipulated for purposes of political persuasion or community building, they are said to be weapons in a “meme war.”
A Casebook example of memes can be found in “Viral Slogan: It’s OK to be White.” In this case, a group of 4chan users led a formal campaign to turn a phrase rooted in racial prejudice and extremist, white supremacist rhetoric into a viral slogan, using memes, posters, and hashtags.
Memes are a Casebook value under the "Tactics" variable in the code book.
- 1Richard Dawkins, The Selfish Gene (New York: Oxford University Press, 1976).
- 2An Xiao Mina, Memes to Movements: How the World’s Most Viral Media Is Changing Social Protest and Power (Boston: Beacon Press, 2019), http://www.beacon.org/Memes-to-Movements-P1410.aspx.
- 3 Joan Donovan, “How Memes Got Weaponized: A Short History,” MIT Technology Review, October 24, 2019, https://www.technologyreview.com/2019/10/24/132228/political-war-memes-disinformation/.
- 4Viveca S. Greene, “‘Deplorable’ Satire: Alt-Right Memes, White Genocide Tweets, and Redpilling Normies,” Studies in American Humor 5, no. 1 (2019): 31–69, https://doi.org/10.5325/studamerhumor.5.1.0031.
Erroneously identifying an individual as someone else, intentionally or accidentally. Misidentification is a Casebook value under the "Observable outcomes" variable in the code book.
Misinfographics are infographics containing false or misleading information. In some cases, a misinfographic may also be classified as a forgery, if it borrows an existing organization's brand aesthetics and logo in order to make it seem as if the content came from that organization.
Within QAnon, the #SaveTheChildren hashtag uses misinfographics to publicize misleading human trafficking statistics. Vox notes that moms on Instagram sharing “aesthetically pleasing” posts have been “critical” to its spread.1
A Casebook example of a misinfographic can be found in “Misinfographic: The Spread of 'JihadChart' in India,” where a misinfographic titled “Jihad: The Holy war to spread Islam” appeared on Facebook, Reddit, and Twitter to capitalize on and further promote anti-Muslim bias.
Misinfographics are a Casebook value under the "Tactics" variable in the code book.
- 1Rebecca Jennings, “We’re in the Middle of Another Moral Panic. What Can We Learn from the Past?,” Vox, September 25, 2020, https://www.vox.com/the-goods/2020/9/25/21453036/save-the-children-qanon-human-trafficking-satantic-panic.
Attempts, measures, and other actions taken by the private sector, government, media organizations, and civil society in an attempt to contain or prevent the continuation of a campaign, its effects, or messaging. Mitigation is both a stage in the life cycle, and one of the variables in the code book under which Casebook codes fall.
Muddy the waters refers to the creation of a confusing and disorienting information environment, in which it is difficult for a layperson to distinguish what is accurate from what is false, as a direct result of a media manipulation campaign. As credible or authoritative sources are forced to compete with speculation, unfounded claims, or outright false information, the authenticity and veracity of information surrounding the target subject becomes more confusing.
The origin of the idiom is disputed, but among the earliest recorded figurative uses is William Ames’s 1633 “A Fresh Suit Against Human Ceremonies in God's Worship” (in which the phrase was “he mudds the water, and so would mislead the simple”).1 Its first appearance in the Oxford English Dictionary was in 1653. A Chinese proverb, “muddy waters make it easy to catch fish,” uses the same imagery to reference how it is possible to use a chaotic period or a crisis for personal gain.2
The expression evokes the mental image of mud being stirred up on the bottom of a river bed, making the water less clear and difficult to navigate. Similarly, this observable outcome is a helpful descriptor for when manipulated media makes verified information less clear or difficult to locate. While TaSC uses the phrase to describe a particular outcome of a disinformation campaign, its use is also common outside of media manipulation (“muddle the waters” is another form of the expression).
“Muddy the waters” is used within the Casebook whenever there is evidence that the disinformation campaign caused confusion among the public. Examples include social media posts and Q-drops causing confusion during the 2020 wildfire season in “Misidentification: How The #Antifafires Rumor Caught On Like Wildfire,” President Trump causing confusion about the efficacy of HCQ for COVID-19 treatment in “Trading Up the Chain: The Hydroxychloroquine Rumor,” and fake antifa accounts being perceived as authentic in “Butterfly Attack: The Origins of Fake Antifa Social Media Accounts.”
Muddy the waters is a Casebook value under the "Observable outcomes" variable in the code book.
- 1 “A Fresh Suit Against Human Ceremonies in God’s Worship by William Ames (1576-1633),” A Puritan’s Mind, 2021, https://www.apuritansmind.com/puritan-favorites/william-ames/a-fresh-suit-against-human-ceremonies-in-gods-worship-by-william-ames-1576-1633/.
- 2 Jane Li, “Why Chinese Youngsters Are Embracing a Philosophy of ‘Slacking-Off,’” Quartz, December 28, 2020, https://qz.com/1938809/why-chinese-youngsters-are-embracing-a-culture-of-slacking-off/.
N
Tacit coalitions or groups of people who share some, but not all, political or social positions, primarily congregate online (though not exclusively), and often come together as a swarm to act in unison. Networked factions maintain these coalitions using shared phrases, hashtags, memes, or similar media. These factions can form and dissolve according to the political or social context. Networked faction is a Casebook value under the "Attribution" variable in the code book.
Organizations operating with a mission or intent to advocate for, promote, or advance specific values or goals other than making a profit. Nonprofit organizations are a Casebook value under the "Targets" variable in the code book.
Cases where there is no or insufficient evidence to definitively identify a person or group as having been targeted by the campaign or its operators. Not applicable is a Casebook value under the "Targets" variable in the code book.
O
In-person events (such as rallies, marches, conferences, etc.) organized by campaign operators for the sake of recruitment, garnering press attention for the campaign, or furthering the campaign’s goals.
Offline gatherings are a Casebook value under the "Tactics" variable in the code book.
Open collaboration tools are services such as Pastebin or Google Docs: open-access, easy-to-use services for hosting, crowdsourcing, and sharing information. The openness of these tools presents an opportunity for campaign planners and participants, giving them a frictionless, easy-to-share repository for collaboration, coordination, and information distribution. Depending on the intentions of the campaign, they can be used for advocacy, resource-sharing, and activism, as well as for more malicious purposes, such as housing false or misleading information, unauthorized leaks of personal and private information, or harassment campaign instructions. Other examples of open collaboration tools include Dropbox, Jira, Asana, and Trello. Open collaboration tools are a Casebook value under the "Network Terrain" variable in the code book.
Platforms that have both an editorial arm and a self-publishing arm that allows users to publish and post their own articles and other content. Examples include Medium and BuzzFeed Community. Open editorial platforms are a Casebook value under the "Network Terrain" variable in the code book.
Open science is an approach to scientific inquiry that advocates for collaboration, accessibility, and transparency in an effort to increase the dissemination of scientific knowledge and the participation of individuals from diverse backgrounds. Common practices include making research data public, campaigning for open access, and communication strategies that are inclusive of a wide audience.1
Although open science as a movement confers multiple benefits,2 its openness and inclusivity can be exploited by motivated actors intent on seeding false or misleading content. Digital data repositories and preprint servers, for example, are an outcome of the movement for open science, but because they lack peer review they can be misused or abused to spread poor-quality research or disinformation masked as science.3 Publicly available data, even if credible and from authoritative sources, can also be manipulated to mislead or undermine scientific consensus.4
Open science is a Casebook value under the "Vulnerabilities" variable in the code book.
- 1“Open Science Movement | United Nations Educational, Scientific and Cultural Organization,” accessed January 3, 2021, http://www.unesco.org/new/en/communication-and-information/portals-and-platforms/goap/open-science-movement/; “What Is Open Science? Introduction,” Foster Open Science, accessed January 3, 2021, https://www.fosteropenscience.eu/content/what-open-science-introduction.
- 2Christopher Allen and David M. A. Mehler, “Open Science Challenges, Benefits and Tips in Early Career and Beyond,” PLOS Biology 17, no. 5 (May 1, 2019): e3000246, https://doi.org/10.1371/journal.pbio.3000246; Martin Lakomý, Renata Hlavová, and Hana Machackova, “Open Science and the Science-Society Relationship,” Society 56, no. 3 (June 1, 2019): 246–55, https://doi.org/10.1007/s12115-019-00361-w.
- 3 Amy Koerber, “Is It Fake News or Is It Open Science? Science Communication in the COVID-19 Pandemic,” Journal of Business and Technical Communication 35, no. 1 (January 1, 2021): 22–27, https://doi.org/10.1177/1050651920958506; Joan Donovan, Irene Pasquetto, and Jennifer Pierre, “Cracking Open the Black Box of Genetic Ancestry Testing,” Proceedings of the 51st Hawaii International Conference on System Sciences, https://doi.org/10.24251/HICSS.2018.218; Aaron Panofsky and Joan Donovan, “Genetic Ancestry Testing among White Nationalists: From Identity Repair to Citizen Science,” Social Studies of Science 49, no. 5 (October 1, 2019): 653–81, https://doi.org/10.1177/0306312719861434.
- 4Crystal Lee, Tanya Yang, Gabrielle Inchoco, Graham M. Jones, and Arvind Satyanarayan, “Viral Visualizations: How Coronavirus Skeptics Use Orthodox Data Practices to Promote Unorthodox Science Online,” CHI ’21, May 8–13, 2021, Yokohama, Japan, https://arxiv.org/pdf/2101.07993.pdf.
The open web is the part of the web that is not protected by passwords, and is thus accessible to anyone.
P
Parler was marketed as the free speech alternative to popular social media. The social media site was launched in 2018 with an interface similar to Twitter’s, and gained users rapidly after the 2020 presidential election. After the January 6, 2021, attack on the U.S. Capitol, it was removed from the Apple and Google app stores, and dropped by its server host, Amazon, based on its alleged use by people involved in the insurrection. Parler is a Casebook value under the "Network Terrain" variable in the code book.
A strong supporter or committed member of a party, cause, or person. Partisans are a Casebook value under the "Attribution" variable in the code book.
Fraudulently posing as a trustworthy entity in a malicious attempt to access confidential information such as usernames, passwords, and credit card details, usually by means of email. Phishing is a Casebook value under the "Tactics" variable in the code book.
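One common phishing tell is a mismatch between the domain a link displays and the domain it actually targets. A minimal heuristic sketch; the function is an illustrative assumption, not a complete defense:

```python
from urllib.parse import urlparse

def suspicious_link(display_text, href):
    """Flag links whose visible text names one domain while the underlying
    target points somewhere unrelated -- a classic phishing pattern."""
    target = urlparse(href).hostname or ""
    shown_url = display_text if "://" in display_text else "https://" + display_text
    shown = urlparse(shown_url).hostname or ""
    return bool(shown) and shown != target and not target.endswith("." + shown)

print(suspicious_link("paypal.com", "https://paypal.com.example.net/login"))  # True
print(suspicious_link("paypal.com", "https://www.paypal.com/signin"))         # False
```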
A platform is a web-based technology that allows users to generate content and engage in peer-to-peer conversations and other forms of engagement (e.g., likes, follows, retweets).
Political adoption occurs when a political entity (politician, political party, etc.) embraces or co-opts a phrase, term, issue, idea, or movement in order to advance a political goal or shape a specific narrative.
A Casebook example of political adoption can be found in “Trading Up The Chain: The Hydroxychloroquine Rumor,” which chronicles the popularization of an anti-malarial drug as a supposed cure for COVID-19. In this instance, Donald Trump and others who pushed for a “reopening” of the economy during the coronavirus pandemic adopted the hydroxychloroquine myth because it promised a speedy return to normalcy in the face of an increasingly chaotic situation.
Political adoption is a Casebook value under the "Observable outcomes" variable in the code book.
A group of people sharing similar ideology or political positions who participate in elections by fielding candidates that will then carry out their goals and policies. Political party is a Casebook value under the "Targets" variable in the code book.
A politician is a person engaged in party politics or occupying public office. Because of their visibility, policies, or affiliations, a politician can be the target of disinformation campaigns. Politician is a Casebook value under the "Targets" variable in the code book.
Individuals who engage in activity designed to elicit a reaction from a target purely for fun or mischief. Pranksters are a Casebook value under the "Attribution" variable in the code book.
Preconceived assumptions that are biased against an individual or group based on race, religion, sexual orientation, age, class, or other demographic identifier. These judgments are not based on facts and perpetuate negative stereotypes. Within the study of media manipulation, prejudice acts as a vulnerability because it creates divisions between groups of people that can be exploited to seed false or damaging narratives, reinforce in-group biases, or encourage harassment.
As seen in the case "Targeted Harassment: The Media-Led Witch Hunt of Rhea Chakraborty," online users accused Chakraborty of being a witch because she is from the Indian state of West Bengal, whose women are stereotyped as seductresses who practice black magic. Indian media outlets promoted prejudiced narratives against Indian women by publishing, without any evidence, misogynistic stories casting Chakraborty as a “manipulative girl.”
Other cases of prejudice can be found in "Butterfly Attack: Operation Blaxit" and "Misinfographic: The Spread of the Jihad Chart in India."
Prejudice is a Casebook value under the "Vulnerabilities" variable in the code book.
A preprint server is a data repository that hosts scholarly articles before they have been formally peer-reviewed and published. Typically, the author(s) of a paper will upload a draft version to the preprint server, which generates a publicly accessible URL where the paper and any other relevant data may be viewed, downloaded, or shared. Because the peer review process can take months or even years, preprints are a quick way to circulate research outputs and data; authors may still seek formal publication after uploading to a preprint server. Preprint servers can be discipline-specific (e.g., bioRxiv), regional (e.g., AfricArXiv), multidisciplinary (e.g., arXiv, SocArXiv), or general purpose (e.g., Zenodo). Preprint server is a Casebook value under the "Network Terrain" variable in the code book.
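Because each upload generates a public record, preprint holdings can also be inspected programmatically. As a small illustration—using only arXiv’s documented public Atom API; the helper function and search term are our own illustrative choices—the Python sketch below retrieves titles and public URLs for preprints matching a query:

```python
# Minimal sketch: fetch preprint metadata from arXiv's public Atom API
# (https://arxiv.org/help/api) using only the standard library.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"  # Atom XML namespace

def search_arxiv(query: str, max_results: int = 5) -> list[dict]:
    params = urllib.parse.urlencode({
        "search_query": f"all:{query}",
        "start": 0,
        "max_results": max_results,
    })
    url = f"http://export.arxiv.org/api/query?{params}"
    with urllib.request.urlopen(url) as resp:
        feed = ET.fromstring(resp.read())
    return [
        {
            "title": " ".join(entry.findtext(f"{ATOM}title", "").split()),
            "url": entry.findtext(f"{ATOM}id", "").strip(),  # public URL
        }
        for entry in feed.findall(f"{ATOM}entry")
    ]

if __name__ == "__main__":
    for paper in search_arxiv("misinformation"):
        print(paper["url"], "-", paper["title"])
```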
Technology that allows for peer-to-peer interactions that are private by default and require users to be invited and give consent to participate. Information sent via private messaging may or may not be encrypted.
The deliberate spread of information or ideas to influence a person, group, institution, or nation in support of—or in opposition to—a particular cause. Propaganda is often coded as "white," "grey," or "black": white propaganda is overt, with a clearly identified source; grey propaganda has muddy or unclear origins; and black propaganda disguises its origins, often portraying itself as the very target it is trying to discredit.
Publicly available information pertaining to individuals, organizations, companies, or any other entity that has been aggregated into an accessible, searchable, and organized format. Public directory is a Casebook value under the "Vulnerabilities" variable in the code book.
A private company that engages in public relations, branding, advertising and sales, or any other type of activity related to marketing, typically in service to a client. Marketing firms and public relations companies have been used in media manipulation campaigns to game engagement metrics, create a false sense of grassroots support (i.e., astroturfing), and amplify specific narratives or pieces of content for their clients.1 Where possible, the Casebook identifies the clients who have contracted or hired the marketing company. Public relations or marketing firm is a Casebook value under the "Attribution" variable in the code book.
- 1 Jonathan Ong and Jason Vincent Cabañes, “Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines,” 2018, https://doi.org/10.7275/2cq4-5396; Craig Silverman, Jane Lytvynenko, and William Kung, “Disinformation For Hire: How A New Breed Of PR Firms Is Selling Lies Online,” BuzzFeed News, January 6, 2020, https://www.buzzfeednews.com/article/craigsilverman/disinformation-for-hire-black-pr-firms; Samantha Bradshaw, Hannah Bailey, and Philip N. Howard, “Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation,” Oxford Internet Institute, 2021, https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/127/2021/01/CyberTroop-Report-2020-v.2.pdf.
R
Information that is deliberately false or misleading, intended specifically to exploit wedge issues related to race, racial justice, or communities of color.
Formalized by sociologists Michael Omi and Howard Winant in 1986, “racialization” is understood as “the sociohistorical process by which racial categories are created, inhabited, transformed, and destroyed.”1 As a tool of disinformation, “racialization” typically refers to the assignment of an issue, narrative, movement, or belief to a specific racial or ethnic group, by operators who do not identify with that group.
Immigrant rights lawyer Kimberly Grambo describes this process as a kind of “group defamation,” noting that it often “inflict[s] dignitary harm on individual members of minority groups” or “even dehumanizes individual members of those groups.”2
Much like disinformation more generally, racialized disinformation is often employed in the pursuit of political gain or profit, or to discredit a target individual, group, movement, or political party.
An example of a racialized disinformation campaign from the Casebook is the coopting of #Blaxit, an organic Black Twitter hashtag, by 4chan users attempting to sow anti-Black sentiment and suppress voter turnout for the Democratic Party.
- 1 Michael Omi and Howard Winant, Racial Formation in the United States: From the 1960s to the 1980s (Routledge & Kegan Paul, 1986), https://books.google.com/books/about/Racial_Formation_in_the_United_States.html?id=fylRAQAAIAAJ.
- 2 Kimberly Grambo, “Fake News and Racial, Ethnic, and Religious Minorities: A Precarious Quest for Truth,” University of Pennsylvania Journal of Constitutional Law 21, no. 5 (January 1, 2019): 1299, https://scholarship.law.upenn.edu/cgi/viewcontent.cgi?article=1700&context=jcl.
Recognition by target is when a target of a media manipulation or disinformation campaign acknowledges and responds to the campaign's activities or operators. Recognition by target is a Casebook value under the "Observable outcomes" variable in the code book.
Recontextualized media is any image, video, or audio clip that has been taken out of its original context and reframed for an entirely different purpose or narrative frame. While cheap fakes, more broadly, alter the media, recontextualized media uses unaltered images, video, or audio but presents them in a new or false context according to the manipulators’ agenda.
During the early protests against the murder of George Floyd in June 2020, many recontextualized images spread on social media. One showed an image from the TV show Designated Survivor but claimed it was from a Black Lives Matter protest; another photo of a McDonald’s burning in 2016 was reframed as though it was from a current protest.1
A Casebook example of recontextualized media can be found in the case “Targeted Harassment: The spread of #Coronajihad,” where videos were re-captioned to exploit bias against Muslims and to blame Muslims for the spread of coronavirus in India.
Recontextualized media is a Casebook value under the "Tactics" variable in the code book.
- 1 Jane Lytvynenko and Craig Silverman, “We’re Keeping A Running List Of Hoaxes And Misleading Posts About The Nationwide Police Brutality Protests,” BuzzFeed News, June 5, 2020, https://www.buzzfeednews.com/article/janelytvynenko/hoax-misleading-claims-george-floyd-protests.
Activities with the goal of enlisting or drawing new followers or members to a political party, social movement, extremist organization, ideology, or other distinct movement, group, or idea.
Reddit is a website where users can post information and prompts. These posts get responses and up or down votes from other users, which rank the display of the content. The website is divided into user-created categories called "subreddits." The San Francisco-based site was founded in 2005. Reddit is a Casebook value under the "Network Terrain" variable in the code book.
Reputation management is a strategy in which campaign operators issue widespread press, content, or marketing materials in order to bury negative information while amplifying positive news about their product, service, or cause. Reputation management is bigger than an individual refuting false claims about themselves: it is a concerted, deliberate, and professionalized strategy backed by resources (i.e., funding), meaning those running the campaign are paid or working on behalf of a company, client, or organization. Reputation management requires money and access to the professionals who provide these services, which puts it out of reach of all but those already in positions of (financial) power.
Reputation management occurs online (on wire services, websites, and social media platforms) as well as offline (e.g., on billboards). Tactics include search engine optimization (increasing the likelihood that specific keywords appear first in search results), using wire services to distribute materials across local news outlets, making YouTube videos that explain and promote a product, highlighting customer testimonials on a homepage, and/or paying influencers to promote a service or product.
Reputation management may also be referred to as “crisis communications,” in which an organization, individual, or company seeks to protect itself against a public threat to its image—such as the emergence of controversial or reputation-damaging news. This strategy has been used by celebrities as well as extractive industries, tobacco companies, and countries that wish to create a more positive image of themselves. For examples of reputation management, see the case studies “Astroturfing: How Hijacked Accounts And Dark Public Relations Faked Support For China's Response To Covid-19” and “Cord Blood and Medical Misinformation: The Big Business of Unproven Stem Cell Treatments."
Reputation management is a Casebook value under the "Strategy" variable in the code book.
Individual or coordinated group efforts to establish the origins and impact of a manipulation campaign. Research and investigation is a Casebook value under the "Mitigation" variable in the code book.
S
Individuals or groups involved in scientific research, medicine, or healthcare. This may include scientists, researchers, research labs, scientific organizations, health authorities, doctors, nurses, and other healthcare professionals. Scientific and medical community is a Casebook value under the "Targets" variable in the code book.
Snapchat is a mobile social media platform specializing in photo sharing and instant messaging. The app is known for its “disappearing” public posts and private messages, which are visible for a limited period of time. Notably, Snapchat was the first popular social media application to launch the “Stories” feature in 2013, which allows users to create short, temporarily visible image slideshows. The feature has since been emulated and become a staple across other popular platforms, including Instagram, Facebook, and YouTube. Snapchat is a Casebook value under the "Network Terrain" variable in the code book.
Manipulation with the aim of getting people to give up confidential information through trickery and deception rather than technical exploits. These attacks often take advantage of emotions, trust, or habit in order to convince individuals to take actions, such as clicking a fraudulent link, visiting a malicious website, or giving up login credentials. (Adapted from Forcepoint.)
Groups defined by some social, physical, or mental characteristics. Examples include race, ethnicity, gender, social class, sexual orientation, or religious beliefs. Social identity group is a Casebook value under the "Targets" variable in the code book.
Groupings of individuals or organizations that focus on political or social issues.
A network of websites and software that links internet users together, often with the intention of fostering social interaction and promoting the exchange of goods and services.
Used as an adjective, the word sociotechnical typically describes something that exists due to a mix of both social and technical conditions. Within the study of media manipulation, for example, sociotechnical often describes specific campaigns or other media phenomena because they emerge only from the combination of social conditions and technical features.
A false online identity typically created by a person or group in order to promote a specific narrative or opinion, sow division, or circumvent a previous account ban.
A versatile set of techniques for feeding false information to journalists, investigators, and the general public during breaking news events or across highly polarized wedge issues. Specifically, source hacking exploits situations and technologies to obscure the authorship of false claims. For more, read "Source Hacking" (Data and Society, 2019) by Joan Donovan and Brian Friedberg.
An entity that is part of, or operates licitly or semi-licitly on behalf of or in service of, a government agency. Within media manipulation, this may refer to state-run media outlets, operatives working for or with an intelligence or security agency (or other government agency), or other parties that are deliberately working to advance a given state’s objectives with the support, encouragement, or compulsion of the state actor. State actor is a Casebook value under the "Attribution" variable in the code book.
Media outlets that are under editorial control or influence by a country’s government. The articles and stories produced by these state media outlets may be distributed over broadcast (TV and radio), online, or print media. State-controlled media is a designation that applies when editorial freedom has been taken away by government influence, pressure, or money. These outlets can be used to push government propaganda. The label does not necessarily apply to all media that receives funding from the public. Media organizations that receive public funds but maintain their editorial freedom, such as the British Broadcasting Corporation (BBC) or Canadian Broadcasting Corporation (CBC), are not designated as state media.
State-controlled media is a Casebook value under the "Network Terrain" variable in the code book.
Best practices for ensuring responsibility and accountability when producing news content and the algorithmic systems that help spread it. For more, read "Stop the Presses? Moving From Strategic Silence to Strategic Amplification in a Networked Media Ecosystem" by Joan Donovan and danah boyd (American Behavioral Scientist, September 2019, doi:10.1177/0002764219878229).
The use of editorial discretion for the public good—for example, journalistic or editorial standards against reporting on suicide. For more, read "Stop the Presses? Moving From Strategic Silence to Strategic Amplification in a Networked Media Ecosystem" by Joan Donovan and danah boyd (American Behavioral Scientist, September 2019, doi:10.1177/0002764219878229).
When loosely organized online groups come together for specific objectives or campaigns, such as spamming a comment section, engaging in harassment, or obfuscating a hashtag. Swarming is a Casebook value under the "Tactics" variable in the code book.
T
The continuation of a media manipulation or disinformation campaign with adjustments to the tactics. Tactical adjustment is a Casebook value under the "Campaign adaptation" variable in the code book.
The redeployment of a media manipulation or disinformation campaign's tactics. Tactical redeployment is a Casebook value under the "Campaign adaptation" variable in the code book.
Coordinated and organized online harassment of an individual or groups of individuals to threaten, censor, or upset them or to disrupt their operations or behavior. Targeted harassment is a Casebook value under the "Strategy" variable in the code book.
The practice of engaging in active care and maintenance of digital places, both as a defense mechanism against manipulation and disinformation and as a way of preserving the health of intra-cultural expression online.
Telegram is a free, open source, encrypted chat application. Its uses range from communication during pro-democracy protests in Hong Kong and Iran to use by extremist groups in the United States. It was launched in 2013 for iOS and 2014 for Android. Telegram is a Casebook value under the "Network Terrain" variable in the code book.
Testimonials are a tactic in which personal stories are used to elicit an emotional reaction, prompt a behavioral change, sway opinion, or persuade a reader or viewer of a specific viewpoint. These stories, which are often highly emotive, may be found on a company’s website, to promote a product or service; as part of an advocacy campaign for a social cause or policy; on social media, regarding a rumor or claim; and in press coverage, to give perspective, insight, or confirmation to a news story. Testimonials may also be offered at in-person events such as rallies and religious gatherings. Testimonials are often quite effective. According to marketing research, “92% of consumers read online reviews and testimonials when considering a purchase. 88% of consumers trust online reviews as much as personal recommendations. 72% of them say positive reviews and testimonials make them trust a business more.”1
Research suggests testimonials may be effective because they convey authenticity, credibility, and empathy to their target audience.2 This can lead to a parasocial relationship, whereby the audience develops a one-sided relationship with the person giving the testimony.3 In “The Inclusion of Patient Testimonials in Decision Aids: Effects on Treatment Choices,” Drs. Peter Ubel, Christopher Jepson, and Jonathan Baron found that giving patient testimonials to study participants significantly influenced participants’ treatment choices.4 In “What works best: objective statistics or a personal testimonial? An assessment of the persuasive effects of different types of message evidence on risk perception,” researchers John B. F. de Wit, Enny Das, and Raymond Vet found that participants were more likely to get an HBV vaccination after hearing testimony about the risks of not receiving the vaccination than after being given objective statistics about vaccination.5
Testimonial is a Casebook value under the "Tactics" variable in the code book.
- 1 Emily Cullinan, “How to Use Customer Testimonials to Generate 62% More Revenue From Every Customer, Every Visit,” accessed November 15, 2021, archived on Perma.cc, perma.cc/2Q2T-TLQQ.
- 2 Jihyun Kim and Hocheol Yang, “How Does a Radio Host’s Testimonial Influence Media Experiences? The Indirect Effect of Social Presence,” July 3, 2019, archived on Perma.cc, perma.cc/UHZ2-CCCW; Fred W. Sanborn and Richard Jackson Harris, A Cognitive Psychology of Mass Communication, July 18, 2013, Google Books, archived on Perma.cc, perma.cc/9GDT-67NN.
- 3 David C. Giles, “Parasocial Interaction: A Review of the Literature and a Model for Future Research,” Media Psychology, November 17, 2009, Vol. 4, No. 3, pages 279-305, archived on Perma.cc, perma.cc/9JKQ-P7SB; Leslie Rasmussen, “Parasocial Interaction in the Digital Age: An Examination of Relationship Building and the Effectiveness of YouTube Celebrities,” The Journal of Social Media in Society, Spring 2018, Vol. 7, No. 1, pages 280-294, archived on Perma.cc, perma.cc/H82Y-4WE9.
- 4 P. A. Ubel, C. Jepson, and J. Baron, “The inclusion of patient testimonials in decision aids: effects on treatment choices,” January-February 2001, Vol. 21, No. 1, pages 60-68, archived on Perma.cc, perma.cc/WB8T-VH5U.
- 5 John B. F. de Wit, Enny Das, and Raymond Vet, “What works best: objective statistics or a personal testimonial? An assessment of the persuasive effects of different types of message evidence on risk perception,” Health Psychology, January 2008, Vol. 27, No. 1, pages 110-115, archived on Perma.cc, perma.cc/X5TK-KR35.
An online video-sharing platform owned by Chinese company ByteDance. After initially launching as “Douyin” within mainland China in 2016, TikTok became available to an international audience in 2017, when it was launched for iOS and Android operating systems. TikTok is a Casebook value under the "Network Terrain" variable in the code book.
Trading up the chain is the process of getting a story from a small, local, or niche platform or media outlet to a more popular, national news service.1 If an article or post gains enough traction on the small, local, or niche platform, it may be reported on by a larger site—either to be promoted or debunked.2 Regardless, the additional coverage spreads the original story to a new and larger audience. If successful, a line from a blog post or tweet can become national news.
Frequently, trading up the chain depends on a story not being fact-checked3 or on amplification on social media, where likes, shares, and retweets increase a story’s visibility and can “potentially reflect a tacit endorsement” of a post.4
P.M. Krafft and Joan Donovan describe a path of trading up the chain “from far-right blogs and forums to conservative media personalities on Twitter or YouTube to television media outlets like Fox News.”5 Alice Marwick and Rebecca Lewis describe Trump as a major amplifier of stories that have been traded up the chain: “If he tweeted about a conspiracy theory or made a false claim at a rally, it was considered newsworthy because of his candidacy.”6
Trading up the chain is described in Ryan Holiday’s 2012 book, Trust Me, I’m Lying: Confessions of a Media Manipulator.7
Disinformation campaigns using trading up the chain as a strategy can be found in "Trading Up the Chain: The Hydroxychloroquine Rumor," "Copycat Websites: The Endless Mayfly Network," "Viral Slogan: Hammer and Scorecard," "Targeted Harassment: The Ukraine Whistleblower," "Viral Slogan: 'It’s OK To Be White,'" "Viral Slogan: 'Jobs Not Mobs,'" "Misidentification: Republic of Florida Hoax," "Misidentification: How The #Antifafires Rumor Caught On Like Wildfire," and "Cloaked Science: The Yan Reports."
Trading up the chain is a Casebook value under the "Strategy" variable in the code book.
- 1 Alice Marwick and Rebecca Lewis, “Media Manipulation and Online Disinformation,” Data & Society, May 15, 2017, https://datasociety.net/library/media-manipulation-and-disinfo-online/.
- 2 Marwick and Lewis, “Media Manipulation and Online Disinformation.”
- 3 Marwick and Lewis.
- 4 P.M. Krafft and Joan Donovan, “Disinformation by Design: The Use of Evidence Collages and Platform Filtering in a Media Manipulation Campaign,” Political Communication, March 5, 2020, 37(2): 194-214, https://www.tandfonline.com/doi/full/10.1080/10584609.2019.1686094.
- 5 P.M. Krafft and Joan Donovan, “Disinformation by Design: The Use of Evidence Collages and Platform Filtering in a Media Manipulation Campaign.”
- 6 Marwick and Lewis, “Media Manipulation and Online Disinformation.”
- 7 Ryan Holiday, Trust Me, I’m Lying: Confessions of a Media Manipulator (Portfolio, 2012), https://ryanholiday.net/trust-me-im-lying-confessions-of-a-media-manipulator/.
Engaging in inflammatory, divisive, or distracting behavior in an online community with the goal of provoking readers or viewers into an emotional, often negative, response (e.g., anger, outrage, offense). Trolling is a Casebook value under the "Tactics" variable in the code book.
Individuals who engage in inflammatory, divisive, or distracting behavior in an online community with the goal of provoking readers or viewers into an emotional, often negative, response (e.g., anger, outrage, offense). Trolls are a Casebook value under the "Attribution" variable in the code book.
Twitter is an app and website where logged-in users can post up-to-280-character messages—called "tweets"—and up-to-140-second video/audio messages. These user accounts can like, comment on, and share other users' messages. Some user accounts are "verified" by the company, which bestows special privileges on these accounts, such as more moderation options. Users can choose if they want their profile to be public or private. Anyone without an account can access public tweets but cannot engage with them. The San Francisco-based site was founded in 2006. Twitter is a Casebook value under the "Network Terrain" variable in the code book.
The intentional registration of a domain name that incorporates typographical variants of the target domain name in order to deceive visitors. This may involve misspelling a domain or using a different top-level domain. Typosquatting is a form of cybersquatting, or an attempt to mislead users by fraudulently posing under someone else's brand or copyright. Typosquatting is a Casebook value under the "Tactics" variable in the code book.
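Because typosquatting turns on mechanical variations of a legitimate domain name, the technique is easy to illustrate. The Python sketch below enumerates a few plausible typo variants—character omissions, adjacent-character transpositions, and alternate top-level domains—of the kind defenders monitor or pre-register; the variant strategies and example domain are illustrative assumptions, not an exhaustive taxonomy:

```python
# Minimal sketch: enumerate plausible typosquatting variants of a domain
# so defenders can monitor or pre-register them. The strategies below
# (omission, transposition, alternate TLDs) are illustrative, not exhaustive.

def typo_variants(domain: str) -> set[str]:
    name, _, tld = domain.rpartition(".")
    variants: set[str] = set()

    # Character omission: "example.com" -> "exmple.com"
    for i in range(len(name)):
        variants.add(name[:i] + name[i + 1:] + "." + tld)

    # Adjacent-character transposition: "example.com" -> "examlpe.com"
    for i in range(len(name) - 1):
        swapped = name[:i] + name[i + 1] + name[i] + name[i + 2:]
        variants.add(swapped + "." + tld)

    # Alternate top-level domains: "example.com" -> "example.co"
    for alt_tld in ("co", "cm", "net", "org"):
        if alt_tld != tld:
            variants.add(name + "." + alt_tld)

    variants.discard(domain)  # drop the original if it was regenerated
    return variants

if __name__ == "__main__":
    for v in sorted(typo_variants("example.com")):
        print(v)
```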
U
Cases where there is insufficient evidence to definitively identify campaign operators or participants. Unclear attribution is a Casebook value under the "Attribution" variable in the code book.
Cases where there is insufficient evidence to identify or attribute specific attempts to suppress or dismiss the campaign, its effects (real or perceived), spread, and messaging. Unclear mitigation is a Casebook value under the "Mitigation" variable in the code book.
Cases where there is insufficient evidence to suggest adaptation, redeployment, or any other tactical change by campaign operators or participants. Unclear or no observable adaptation is a Casebook value under the "Campaign adaptation" variable in the code book.
Cases where there is no discernible strategy based on the available evidence. Unclear strategy is a Casebook value under the "Strategy" variable in the code book.
Media manipulation campaign operators use uncorroborated anonymous sources as a tactic to support their campaign’s goals or methods. By keeping the identity of the alleged source anonymous, the operator restricts others’ ability to critically evaluate and confirm the claim. Professional news media organizations generally name their sources and apply strict guidelines when and if they attribute news and information to anonymous ones. The Associated Press, for example, restricts the use of anonymous sources to material that is vital to a story and could not be reported otherwise. Campaign operators exploit this practice by using uncorroborated sources to cover for a lack of evidence supporting their campaign, to push a narrative, and/or to make it seem as if they have special access to secret or guarded information.
Uncorroborated anonymous sourcing is a Casebook value under the "Tactics" variable in the code book.
V
Vimeo is a video-sharing platform launched in 2004. The site does not run ads, but users pay for subscriptions to the site. Vimeo is a Casebook value under the "Network Terrain" variable in the code book.
Viral sloganeering is the tactic of creating short, catchy phrases intended to deliver persuasive, disruptive messaging. Viral slogans may highlight social wedges and sow additional divisions along political or cultural lines by capturing social media attention, provoking media coverage, and sometimes garnering institutional responses. These often divisive phrases are used on- and offline, and spread virally through memes, hashtags, posters, and videos.
To succeed, viral slogans must expand past their community of origin and creators into the public sphere. With this scale and distance, authorship and origin are often concealed, enabling mainstream media coverage and further amplification without attribution.1 Successful viral slogans often capitalize on breaking news events, and can themselves be the catalyst for news coverage. As such, the outcome of viral sloganeering is often the popularization of previously underused words or phrases—effective tools for keyword squatting or for filling data voids, which are terms and search queries about which there isn’t much content.2
Current examples of viral sloganeering include “Lock Her Up” (aimed at Hillary Clinton), “Send Her Back” (aimed at Ilhan Omar),3 and “Quarantine is when you restrict movement of sick people. Tyranny is when you restrict the movement of healthy people.”4 Casebook examples of viral sloganeering can be found in “Jobs Not Mobs” and “It’s OK To Be White,” both of which mainstreamed xenophobic and racist talking points.
Viral sloganeering is a Casebook value under the "Tactics" variable in the code book.
- 1 Joan Donovan and Brian Friedberg, “Source Hacking: In Practice” (Data & Society Research Institute, 2019), https://datasociety.net/wp-content/uploads/2019/09/Source-Hacking_Hi-re….
- 2 Michael Golebiewski and danah boyd, “Data Voids: Where Missing Data Can Easily Be Exploited” (Data & Society Research Institute, 2018), https://datasociety.net/output/data-voids-where-missing-data-can-easily….
- 3 Technology and Social Change Project, “Lock Her Up? The Long Tail of Viral Slogans,” Meme War Weekly, April 28, 2020, https://medium.com/memewarweekly/lock-her-up-the-long-tail-of-viral-slo….
- 4 Jazmine Ulloa, “How memes, text chains, and online conspiracies have fueled coronavirus protesters and ,” The Boston Globe, May 6, 2020, https://www.bostonglobe.com/2020/05/06/nation/how-memes-text-chains-onl….
Artificially boosting or lowering the ratings on websites that feature crowd voting by coordinating large groups of people to submit (often false or misleading) reviews or votes.
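One crude signal platforms can use to surface this kind of coordination is a sudden spike in an item’s vote rate relative to its own history. The Python sketch below illustrates the idea; the hourly window, threshold multiplier, and function name are arbitrary illustrative assumptions, not any platform’s actual detection logic:

```python
# Minimal sketch: flag hours in which an item's vote count jumps to a
# large multiple of its historical hourly average -- one crude signal of
# coordinated vote brigading. Window and threshold are illustrative.
from collections import Counter

def flag_vote_spikes(vote_hours: list[int], threshold: float = 5.0) -> list[int]:
    """vote_hours holds the hour index (0, 1, 2, ...) of each incoming vote."""
    per_hour = Counter(vote_hours)
    hours = sorted(per_hour)
    flagged = []
    for i, hour in enumerate(hours):
        if i == 0:
            continue  # no baseline to compare against yet
        baseline = sum(per_hour[h] for h in hours[:i]) / i
        if per_hour[hour] > threshold * baseline:
            flagged.append(hour)
    return flagged

if __name__ == "__main__":
    votes = [0] * 3 + [1] * 4 + [2] * 3 + [3] * 40  # ~3/hour, then 40 in hour 3
    print(flag_vote_spikes(votes))  # [3]
```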
W
A page or series of connected pages on the internet located under a single domain name. When employed as part of a media manipulation campaign, a website is typically used by campaign operators and participants to host campaign materials and/or instructions to disseminate the materials. These web pages are often promotional in nature, and sometimes are created for the specific purpose of promoting the campaign. Pre-existing websites that promote organizations, ideas, or networks related to the campaign are sometimes also used by operators to house campaign materials.
For the purposes of the Casebook, this code does not include online news media publications, forums, or social media platforms. If a case is coded using this variable, it is referring specifically to a discrete website, a central purpose of which is to make campaign materials available on the internet. Websites are a Casebook value under the "Network Terrain" variable in the code book.
Sometimes referred to as “hot button” or “third rail” issues, wedge issues are political or social topics of public debate that are both polarizing in nature and incredibly salient among the mass public. Often, wedge issues cause a rift or disagreement between members of what would otherwise be a unified group, such as the citizens of a country or the members of a political party.1
They are also similar to “culture war” issues in that they bisect a group of people according to strongly-held beliefs or values that are not easily reconciled.2 This creates the perception that if one side is right, the other must be wrong. In the United States, popular wedge issues include access to abortion, racial justice, and gun ownership.3
Politicians, political influencers, and those running for office often draw attention to specific wedge issues as a means of exploiting social or cultural cleavages between people, and mobilizing single-issue voters in their favor.4
Wedge issue is a Casebook value under the "Vulnerabilities" variable in the code book.
- 1 Tim Heinkelmann-Wild et al., “Divided They Fail: The Politics of Wedge Issues and Brexit,” Journal of European Public Policy 27, no. 5 (May 3, 2020): 723–41, https://doi.org/10.1080/13501763.2019.1683058.
- 2 Paul Goren and Christopher Chapp, “Moral Power: How Public Opinion on Culture War Issues Shapes Partisan Predispositions and Religious Orientations,” American Political Science Review 111, no. 1 (February 2017): 110–28, https://doi.org/10.1017/S0003055416000435.
- 3 Matt Peterson and Abdallah Fayyad, “The Irresistible Effectiveness of Wedge Politics,” The Atlantic, December 8, 2017, https://www.theatlantic.com/membership/archive/2017/12/the-irresistible-effectiveness-of-wedge-politics/547946/.
- 4 Fredel M. Wiant, “Exploiting Factional Discourse: Wedge Issues in Contemporary American Political Campaigns,” Southern Communication Journal 67, no. 3 (September 1, 2002): 276–89, https://doi.org/10.1080/10417940209373236.
A Chinese social media site that primarily hosts short user-generated content (Weibo means “microblogging”), though it also allows for longer blog posts and multimedia content.
Modeled upon Twitter, Weibo has become the most popular microblogging service in China (and boasts 573 million active users worldwide1).
Due to its popularity and public-facing features, Weibo has served as a news and information source for its users and as an arena for public discussion and breaking stories. However, it is also subject to constant censorship under China’s content laws and regulations.
Weibo is a Casebook value under the "Network Terrain" variable in the code book.
- 1“China: MAU of Sina Weibo 2021,” Statista, accessed May 18, 2022, https://www.statista.com/statistics/795303/china-mau-of-sina-weibo/.
A wire service distributes content, such as news articles, photos, videos, or press releases, to a list of syndicates. The distributed content frequently runs on the wire service’s own site as well as on outlets associated with the client list. Examples of wire services include the Associated Press, an American news wire that delivers articles to newspapers across the United States and the world, and Cision PR Newswire, which distributes press releases to news organizations. Advertising—sometimes labeled as “paid advertising,” “sponsored content,” or “press release”—can be included in a distribution package. These services can be exploited as a network terrain for media manipulation when campaign operators purchase false or malicious advertisements for syndication on a newswire.
Wire service is a Casebook value under the "Network Terrain" variable in the code book.