Digital Threat Digest - 2 March 2022
PGI’s Digital Investigations Team brings you the Digital Threat Digest, daily insights into disinformation, misinformation, and online harms.
Today we assess why Inception is the perfect model to build an Influence Operation around and take a look at Wikipedia truth wars.
The façade of harm
Cause and effect dictates that to achieve an impact you have to do something. Influence Operations come with a spectrum of cause, in that there are varying levels of effort you can invest to achieve an impact. And it’s not always the most complex or technically sophisticated cause that leads to the most severe impact; if anything, it’s easier than ever for the least complex cause of all, rumour, to have the highest impact.
Early in the first lockdown, everyone started using the Houseparty app – it hit 50 million downloads in April 2020. Then rumours began to circulate that the app had been hacked and was insecure. Almost immediately its user base evaporated, and just over a year later the app was fully shut down. The hack was never proven, despite Houseparty offering a USD 1m bounty to anyone who could provide evidence of it.
Signal is an encrypted messaging service which, understandably, has seen a significant uptick in use in Eastern Europe over the past couple of weeks. In many ways it’s a competitor to Telegram. And, like clockwork, earlier this week rumours began to circulate online that Signal had been compromised and that a certain invasion-prone government was monitoring comms on the platform. According to Signal, the rumours started in a coordinated fashion but subsequently began to spread organically.
To understand and eventually attribute the cause, you have to consider the effect. Rumours of compromise logically lead to service abandonment. Abandoning Signal pushes people to… Wickr? Unlikely; it’s still underused. Telegram? Almost definitely. Other, less secure alternatives that may already be compromised or are easier to compromise? For sure. An Influence Operation doesn’t have to be a full-scale content creation, content seeding, and content amplification model. Sometimes placing a single targeted idea in the right place at the right time can have the greatest impact.
The Great Edit Wars
I’m not saying I didn’t pay attention in maths at high school, but I did spend most of fourth year playing six degrees of Wikipedia, a game built on the rule of the internet that you can navigate between any two Wikipedia articles in six clicks or fewer by following links from one page to the next. These days it’s even wrapped in a nice webapp.
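Under the hood, the game is just a breadth-first search over Wikipedia’s link graph. Here is a minimal Python sketch, assuming the public MediaWiki API; the start and target titles below are arbitrary, and a serious solver would follow the API’s continuation tokens rather than settling for the first batch of links per page.

import json
import urllib.parse
import urllib.request
from collections import deque

API = "https://en.wikipedia.org/w/api.php"

def get_links(title):
    # Fetch the first batch only (up to 500 main-namespace links);
    # a full solver would follow the API's "continue" tokens.
    params = urllib.parse.urlencode({
        "action": "query", "prop": "links", "titles": title,
        "plnamespace": 0, "pllimit": "max", "format": "json",
    })
    with urllib.request.urlopen(API + "?" + params) as resp:
        pages = json.load(resp)["query"]["pages"]
    page = next(iter(pages.values()))
    return [link["title"] for link in page.get("links", [])]

def six_degrees(start, target, max_clicks=6):
    # Breadth-first search: the first path found is the shortest.
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        if len(path) > max_clicks:  # a path of n+1 titles is n clicks
            continue
        for nxt in get_links(path[-1]):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(six_degrees("Six Degrees of Kevin Bacon", "Wikipedia"))

Against the live API this is painfully slow, which is presumably why dedicated solvers work from precomputed link-graph dumps instead.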
Thankfully, my dedication to Wikipedia ended there, but there exists a whole community of dedicated users who have devoted thousands upon thousands of hours of their time to curating and maintaining the site and its content. This is great, because you have crowdsourced content curation. But it is also terrible, because you have crowdsourced moderation. It’s an unwritten rule of the internet that, much like politicians, every forum mod starts with the best of intentions and is immediately corrupted by power. It happens on gossip forums, it happens in online dictionary forums, it happens in games, and it happens on Wikipedia.
Of course, because it’s Wikipedia, all edit wars are open source and documented. Where it becomes interesting is when edit wars play out not at the behest of power-hungry individuals, but through individuals acting on either directed or self-assumed nationalism. Slate has a really good write-up of how the Russian invasion of Ukraine is playing out in edit wars on Wikipedia, and all the associated nuanced difficulties of documenting a conflict in real time for the future. Is it a conflict or an invasion? What about in the Russian-language version? And beyond that: Russia is engaged in strategic operations in the Central African Republic, so how is the invasion framed in the French-language version of Wikipedia to support its strategic goals in the region? When there’s no consensus on any political topic in the world, how can there be consensus about how we document those topics?
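Because every revision is logged, anyone can pull that paper trail programmatically. A rough sketch, again assuming the public MediaWiki API (the article title here is illustrative), that fetches recent revision metadata for a contested page:

import json
import urllib.parse
import urllib.request

API = "https://en.wikipedia.org/w/api.php"
params = urllib.parse.urlencode({
    "action": "query", "prop": "revisions",
    "titles": "Russian invasion of Ukraine",  # illustrative title
    "rvprop": "timestamp|user|comment", "rvlimit": 50,
    "format": "json",
})
with urllib.request.urlopen(API + "?" + params) as resp:
    pages = json.load(resp)["query"]["pages"]
for rev in next(iter(pages.values())).get("revisions", []):
    # Edit summaries often surface the dispute directly, e.g. reverts
    # over "conflict" vs "invasion" wording.
    print(rev["timestamp"], rev.get("user", ""), rev.get("comment", ""))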
More about Protection Group International's Digital Investigations
PGI’s Social Media Intelligence Analysts combine modern exploitation technology with deep human analytical expertise covering both the social media platforms themselves and the behaviours and intent of those who use them. Our experienced analyst team have a deep understanding of how various threat groups use social media, and follow a three-pronged approach focused on content, behaviour and infrastructure to assess and substantiate threat landscapes.
Disclaimer: Protection Group International does not endorse any of the linked content.