Our upcoming #CSCW2018 paper examining information operations on Twitter in the context of the ongoing civil war in Syria. In this work, we look at how Russian, Syrian and Iranian info ops were integrated into an online community of information activists. faculty.washington.edu/kstarbi/cscw-W…
This paper focuses on the case of the “Aleppo Boy” (Omran Daqneesh) who was photographed after his family’s home was bombed in an airstrike conducted by Syrian or Russian forces in 2016 & who re-appeared on Syrian state television almost a year later. en.wikipedia.org/wiki/Omran_Daq…
The original (2016) photo of Omran served to garner attention to and sympathy for the Syrian people living in rebel-held areas of the country, who were suffering the impacts of military actions perpetrated by the Syrian government and their allies, including Russia.
Our paper describes how a seemingly “organic” online community worked to assemble and spread “undermining” narratives that sought to disrupt the prevailing narrative about Omran’s family and the causes of their suffering.
These actions served to sow doubt and create confusion about the causes and impacts of the civil war in Syria, likely defusing sympathy from Western audiences for the Syrian people and demotivating action by Western governments.
We consider these info ops as collaborative work w/in an online crowd & conceptualize this “work” as not simply coordinated, but as an assemblage of diverse actors, driven by a variety of motivations, loosely collaborating in the production & propagation of strategic narratives.
This view extends previous descriptions of online info ops as perpetrated by armies of automated accounts (or “bots”) and factories full of paid trolls, and suggests a complex and in some ways organic system with emergent properties. Cultivation, rather than orchestration.
Excited to release our upcoming #CSCW2018 paper describing how disinformation agents from the Internet Research Agency in Russia participated in online discourse about the BlackLivesMatter movement and police-related shootings in the U.S. during 2016: faculty.washington.edu/kstarbi/BLM-IR…
Following up on network analysis showing RU-IRA agents were active on both “sides” (pro- and anti-) of the politicized BLM conversation, we conducted a deep qualitative analysis, showing how RU accounts impersonated “authentic” voices.
The content on one “side” was typically very different from the other—(on the left) exhibiting pride in African American identity, (on the right) using racist memes targeting African American people. But it did converge around some themes, e.g. attacking the "mainstream media”.
News coverage yesterday noted research documenting how Russian trolls spread anti-vaccine content (online) in 2016. I think it’s important to step back and see the larger picture… this is one of MANY conversations they infiltrated. (Thread) theguardian.com/society/2018/a…
I won’t pretend to understand ALL of their motivations, but there are a few trends here. 1) Infiltrating and cultivating online activism. 2) Fostering an epistemology that “questions more”—questioning experts in science, journalists, etc.
They cultivate online activism communities, become part of them, and then work to bridge those communities into other topics that serve their strategic interests. We’ve seen this in diverse online activism that crosses the political spectrum.
New research shows how the Russian Internet Research Agency targeted African Americans before the 2016 election, first with a benign ad fostering solidarity, then right before the election with an ad encouraging them not to vote. nytimes.com/2018/08/16/tec…
This study was led by University of Wisconsin-Madison professor Young Mie Kim. This article about that study describes how initial messaging made “identity” appeals, likely to gain trust. Later messaging aligned with RU’s strategic aims (in this case, to reduce Dem votes).
In work led by PhD student Ahmer Arif, we saw similar trajectories in Twitter accounts associated with the IRA... lots of work to enact an identity within a targeted group, and then occasional moves to push certain RU narratives, including converging on anti-Hillary & anti-media messages.
This story has a huge impact on how we understand the “fake news” ecosystem... it is very hard to disentangle financial and political motivations for the different entities in this ecosystem, and there are efforts to hide political motives beneath financial ones.
It also provides some insight into the “why” of the content sharing practices that are rampant in this ecosystem - because it’s free content. That likely incentivizes those who wish to propagate certain narratives to make their content easily accessible to these sites.
Another interesting piece is how invisible partnerships between politically motivated operatives and these websites can shape the kinds of information flowing there. We’ve theorized about these in our research, but this provides evidence that they do indeed exist.
In our research on polarized discourse we’ve seen quoted tweets enabling a kind of back alley ambush... pulling a person (the quoted author) out of their bubble & into yours, so you & your friends can dogpile them, where their friends can’t see and therefore don’t know to help.
And by *you* here I mean anyone, all of us who use the feature to challenge someone else on Twitter. Not specifically Charlie.
Adding a link. Our first paper on this talks about “framing contests” w/in the Black Lives Matter conversation, and shows how the quoted tweet function was used to challenge and reframe the other side’s narrative. faculty.washington.edu/kstarbi/Stewar…
This article on “flat earth” believers describes some of the “ways of knowing” that support this and other related ideologies and highlights how social media platforms help draw people into these epistemic communities. newyorker.com/science/elemen…
"The conference audience was frequently encouraged to 'do your own research,' which mostly seemed to involve watching more YouTube videos and boning up on Scripture."
This idea of “doing your own research” is prevalent across the alternative media ecosystems that we’ve studied… e.g. while exploring the phenomenon of denying mass shooting events and claiming those and other events are staged by “crisis actors”.