By Pavlina Pavlova
Theories combining psychology and warfare are thousands of years old, but big data and social media have dramatically changed their operational landscape. As people’s lives become increasingly digitalised, the target of anyone aiming to influence democracy is clear: it is your data.
The primary goal of a military commander is to influence the decision-making of the enemy to his advantage. In the words of Chinese strategist Sun Tzu: “All warfare is based on deception”. As such, reflexive control is not the result of theoretical innovation. Formulated during the Cold War in the USSR, it bears a resemblance to an old Soviet fairy tale. The Kingdom of Crooked Mirrors (Russian: Королевство Кривых Зеркал) tells the story of a little girl, Olya, who steps into a kingdom where false mirrors show a distorted reality and the characters cannot tell truth from lies — allowing the King to reign over his subjects. Released in 1963, the film itself was allegedly aimed at the hypocrisy of western countries criticising Soviet propaganda. What makes reflexive control increasingly relevant today is how successfully it has established itself in cyberspace.
Reflexive control, also referred to as perception management, is based on behaviourism and the premise that our reality is shaped by our perceptions.1 A coordinated influence over these perceptions can consequently change decisions and guide our actions. The ‘reflex’ in the theory refers to the specific process of imitating the enemy’s reasoning and changing his view of the situation so that he takes a decision which is ultimately advantageous to his opponent. The doctrine’s key method is “information operations”: feeding targeted deceptive information to selected audiences over time and exploiting their weak points until the messages are accepted as the truth. We can imagine the process as crooked mirrors showing us altered realities — a sustained campaign that feeds us adversary-constructed messages. Do it on a large scale and you can change an election outcome or sway a referendum.
Perception management is not about advocating for a cause or changing an opinion — it is about forming a mindset, controlling attitudes and setting an outcome. In a political setting, it is an attack on democratic processes. One example is behaviour change agencies, such as the infamous Cambridge Analytica, a data analytics company at the centre of a global controversy about data misuse. Former CEO Alexander Nix boasted about black operations, such as orchestrating an effective grassroots campaign to increase political apathy among Trinidad and Tobago’s men and women and discourage them from voting in the 2013 presidential election. The most widely known use of the company’s strategy was exploiting the profiles of millions of Facebook users to target the US electorate in the 2016 presidential election.
The method behind the company’s success was applying big data and social media to an established military methodology. By harvesting private personal information from Facebook accounts, it was able to create complex psychological and political profiles. These models were then used to target selected groups of people with political ads designed to play on their particular set of values and opinions. This is a poignant example of weaponising personal data with the reflexive-control technique.
Overwhelmed by an unprecedented amount of information, we use preconceived mental frames to turn the chaos into order, subconsciously deciding which information we accept and which we cast aside, as argued among others in the research of George Lakoff.2 These frames are conceptual sets we hold as truth and which act as filters. The essential task of reflexive control is to locate the weak link in the filter and take advantage of it — for instance, our fear of the unknown, or our tendency to perceive repeated pieces of information as true, regardless of conflicting or contrary evidence. The main weapons of information warfare are individual memes, customised social media posts, fake news articles and content designed to have an emotional impact on our self-perception and our reactions to political incitement.
We are only starting to understand the structure of forces that came together to create the conditions for what the Mueller investigation of Russian interference in the 2016 US elections characterised as “information warfare”.3 One example is a study published by Oxford’s Centre on Computational Propaganda examining fake news distribution by the state as observed in that year.4 The study operated on the premise that fake news was concentrated heavily in swing states. However, it concluded that what actually tied the targeted states together was declining turnout. Voters were targeted with messages radicalising their political views. Such tactics amplify both ends of the political spectrum to produce antagonism between voters, which in turn drives up turnout. Applying the theory in a military context, foreign governments employ similar techniques to generate chaos and target selected population groups, regions and countries with fake news narratives.
The business of doing politics has changed and established voting patterns have broken down. The sharing of information operates in an imperfect digital system, in which we have involuntarily become part of psychological games. We have watched technologies that many believed would contribute to a greater democracy instead make democracy, on many occasions, more vulnerable to manipulation. The question remains how we can contest this process or minimise its negative repercussions.
One way could be far-reaching regulation based on greater accountability from providers. Social media platforms are under growing pressure and criticism for their role in spreading deceitful information, especially as it relates to elections and political debate. Hearings on the topic have taken place on both sides of the Atlantic, but the First Amendment to the US Constitution, guaranteeing freedom of speech at almost any cost, makes Europe the more probable leader in stricter privacy laws. A recent example is the European Court of Justice’s ruling from October this year that individual countries can order a host provider such as Facebook to remove illegal content, including hate speech.5 However, this landmark ruling has already been criticised over concerns that some countries could misuse it to silence critics. The argument between advocates and opponents of regulation will be a lengthy one, as public security is always a trade-off between protection and freedom or convenience.
There is a story to be told about our data and how we can keep it safe. The mounting questions related to our privacy uncover unpleasant patterns, and as citizens we have the right to know who holds the power in the digital age. But we also need to draw on our ability to think critically and to understand why certain pieces of information target us on social media platforms. Our integrity and determination to check the facts, coupled with an up-to-date digital education, can be the first step in countering manipulation. After all, when Olya defeats the evil, the Kingdom’s mirrors lose their power and its society becomes free.