Wasn’t 2024 supposed to be the year of election disinformation? So, where were the deepfake videos? Well, we had them, but they were mostly people dancing and we all knew they were fake anyway. In the run-up to the US election, we were so primed to expect rampant disinformation that Donald Trump’s conventional-looking victory has left everyone a little speechless.
It’s not that manipulation was absent; it’s that it was already baked into the ecosystem. The right time to worry about disinformation was 2016–2022. Now, bots are like background noise. Sure, you might find them promoting a hard-right politician or a crypto scam. But you are just as likely to find them praising something as criticising it. The reasons behind such contradictory behaviour might be fairly obvious (the bots are part of a network that can be hired for the right price) or something more opaque, like a pump-and-dump scheme targeting a particular stock.
Let’s start with some definitions. The word “disinformation” ends up doing so much heavy lifting that it has become a stand-in for “stuff I don’t like”. So, let’s be more specific and talk about manipulation.
Thinking of “disinformation” as a removable layer provides a kind of comfort. It allows us to believe that with enough political will, better regulation of social media platforms, or new technology, we can purge this problem from our digital lives. In reality, it’s now an intrinsic characteristic of the information ecosystem.
When it comes to government legislation, a recent report by NATO’s Strategic Communications Centre of Excellence showed that the EU’s Digital Services Act, considered the most robust social media regulation, had “minimal impact” in the two years it has been in force. As for the social media companies themselves, pretty much every platform has been scaling back its content moderation teams.
The Need to Outcompete
If you can’t rely on governments or social media companies, the inescapable conclusion is that protecting yourself is down to you. As a report published in the Journal of Communication Management put it: “future disinformation management efforts will need to rely on content influence”.
Online manipulation, at its heart, is an effort to attract your attention and get into your head. Successful manipulation (and there is plenty that isn’t successful) is a complex endeavour: you need to know the audience well enough to understand what will appeal to them, build (or rent) a distribution network, and constantly test and refine your output.
If you understand it in these terms, you can take steps to reduce the chances of falling victim to short sellers, competitors, hostile actors, scammers, hackers, online blackmailers, and the rest.
There are three key capabilities you need to develop:
Understand What People Are Engaging With and Why
Manipulation succeeds when it connects with people’s emotions, beliefs, or fears. To counter it, we need to understand what content resonates with audiences. One of the main things bots are used for is to make us think a piece of content is genuinely popular (fake social proof), so the tools we use must be able to differentiate between real and fake interest. This requires tools that can monitor online discourse, identify bots, and recognise emerging disinformation strategies (see the sketch after this list). Current software tools often fail in these areas, leaving us blind to the little slivers of opportunity that bad actors are so quick to capitalise on.
Know What to Respond To, and How
Not every falsehood needs to be addressed. In some cases, amplifying a false narrative can do more harm than good. Knowing when to intervene, what form that intervention should take, and how to do so effectively is key. Responses must be timely, targeted, and informed by a deep understanding of the audience.
Measure Impact and Adapt
Success requires ongoing evaluation to understand what’s working and what isn’t. Metrics should focus not just on reach but on actual behaviour. Only by measuring the impact of our responses can we refine our strategies and make meaningful progress.
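To make “differentiating real from fake interest” a little more concrete, here is a minimal sketch of the kind of heuristic scoring that bot-detection tooling layers and refines at scale. Every signal, weight, and threshold below is an illustrative assumption made up for this post; it is not Ariadne’s method or any platform’s API.

```python
# A minimal sketch of heuristic bot scoring. The signals, weights, and
# thresholds are illustrative assumptions, not a production detector.

from dataclasses import dataclass

@dataclass
class Account:
    age_days: int           # how long the account has existed
    posts_per_day: float    # average posting rate
    followers: int
    following: int
    duplicate_ratio: float  # share of posts that are near-duplicates (0.0-1.0)

def bot_score(account: Account) -> float:
    """Return a rough 0.0-1.0 likelihood that an account is automated."""
    score = 0.0
    if account.age_days < 30:
        score += 0.25                       # very young accounts are higher risk
    if account.posts_per_day > 50:
        score += 0.25                       # inhuman posting cadence
    if account.following > 0 and account.followers / account.following < 0.1:
        score += 0.2                        # follows many, followed by few
    score += 0.3 * account.duplicate_ratio  # copy-paste amplification
    return min(score, 1.0)

# Example: an account amplifying near-identical posts at high volume
suspect = Account(age_days=12, posts_per_day=120, followers=40,
                  following=2000, duplicate_ratio=0.8)
print(f"bot likelihood: {bot_score(suspect):.2f}")  # 0.94 with these weights
```

In practice, single-account heuristics like these are weak on their own; serious tooling combines them with network-level signals, such as clusters of accounts posting the same content within seconds of each other.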
We built our AI tool Ariadne to help organisations deal with the information environment as it really is. Click on the button to see how it can help.
To learn more about how to protect yourself from online manipulation, click the button below to follow. In the next post, we will cover the three most common manipulation plays, and how you can avoid the traps.