Digital Threat Digest - 1 September 2022
PGI’s Digital Investigations Team brings you the Digital Threat Digest, SOCMINT and OSINT insights into disinformation, influence operations, and online harms.
Why are we here?
Sometimes I think about the fact that my job relies on the existence of bad people doing bad things online. I also struggle to understand why people do this – why people are consistently out there pushing hate and lies. I’ve been angry at how casually racism, homophobia, and sexism are shared simply because an actor believes them to be the ‘truth’. Recently, I’ve had to reframe this anger, a) because I have shouted at my screen too many times for it to be normal, and b) because there has to be a deeper reason why people do bad things. I have shifted to being genuinely interested in how people find logical reasoning for what they do online, and from a behavioural point of view it is fascinating to see how this reflects human nature.
I came across an article from Psychology Today – not a regular source for me, but the title, ‘What makes people share misinformation on social media’, piqued my interest. The article covers several studies that have aimed to understand the drivers of misinformation sharing; they considered age, gender, educational background, and political ideology.
The studies found that men, older people, and people with limited education were more likely to engage with misinformation. They also found a strong correlation with narcissism, psychopathy, sociopathy, and Machiavellianism. Perhaps there’s a sense of achievement that comes from sharing a post you know is false – as if you can regain some control in life – or perhaps your lived experiences have genuinely led you to believe it’s the truth.
What I found most interesting was that the majority of people who engage with misinformation share it simply because it has been shared before – the illusory truth effect, whereby people believe information if they’ve heard it before, even if it isn’t true. Scale that up and that’s how we’ve ended up where we are today. Ultimately, there is a psychological component to it.
The point of all this is that while so much of what we look into is automated, there is a shift in the digital world towards real people doing bad things. Understanding the behaviour of these actors – why they do what they do – could really help us anticipate their next steps and stop us from focusing on short-term solutions. These people have real feelings and real beliefs, and while it might seem crazy to some of us that someone can think climate change isn’t real or that drinking bleach is a path to health, deep down some trait or life experience led them there. We have to dig deeper in order to actually combat the threat these actors pose online.
Reith 3.0
Last week, former BBC presenter Emily Maitlis accused her former employer of losing its way. She said the public broadcaster has become passive and self-censoring in an attempt to navigate the UK’s political divide, and went on to argue that the BBC has a “public duty” to challenge false claims and provide a reliable interpretation of facts.
However, we no longer live in the era of the 10 o’clock news. Social media exposes us to a constant stream of information, and so surely, these days, it is more of a personal duty to filter through our feeds and identify what’s valuable, important, and true. Basically, the 24/7 online news cycle means that ordinary people no longer need a media elite to interpret the world for them.
To caveat all of that, though, we must recognise that the information we access online is always pre-selected. I don’t just mean by platform algorithms, but by the active role of rogue states, shady PR firms, and dodgy think tanks, all setting out to infect our digital space and influence our opinions – and even our voting patterns. If you have the means and the money, social media allows you to promote your conspiracy theory or state propaganda without fear or favour.
Maitlis’ vision likewise needs to adjust to this reality. Truth and verification take time. The right lie, engineered to exploit certain biases, can spread across social media far quicker than it can be debunked. Chances are that by the time a piece of disinformation has become noteworthy enough to fact-check, it is already too late.
Ensuring the public is reliably informed is no easy task in 2022. The BBC and other like-minded outlets cannot be held entirely responsible for mis- and disinformation, nor can they counter these problems by themselves. Instead, we need a broader, whole-of-society approach, starting with more critical engagement by users and greater action to expose malicious actors.
More about Protection Group International's Digital Investigations
PGI’s Social Media Intelligence Analysts combine modern exploitation technology with deep human analytical expertise covering both the social media platforms themselves and the behaviours and intents of those who use them. Our experienced analyst team has a deep understanding of how various threat groups use social media and follows a three-pronged approach – focused on content, behaviour, and infrastructure – to assess and substantiate threat landscapes.
Disclaimer: Protection Group International does not endorse any of the linked content.