We Need to Talk About Algorithms

Algorithms are becoming increasingly important to diplomacy in the 21st century. Yet few diplomats understand them, and even fewer understand their implications for the theory and practice of diplomacy. In this blog I will look at them in two particular contexts: geopolitical analysis and public diplomacy.

The value of big data in the analysis of likely future developments is ever more touted by artificial intelligence experts. The private sector is already making extensive use of big data in market and product analysis. Different approaches to predictive and prescriptive analysis are being developed. It is inevitable that governments too will increasingly rely on big data analysis to understand geopolitical trends and the reactions to their policy initiatives. The latter could be a particularly tempting solution to the challenges of evaluating the impact of public diplomacy. However, there are two problems: more information does not always make for better analysis; and big data analysis depends on algorithms which the users of the analytical product may not understand.

When I worked in Beijing I recall being invited to the Political Section of the US Embassy to exchange views on where China was going after Tiananmen Square. It was a depressing place. Because of concerns that the Chinese would be spying on them, there were no windows and no natural light. However, what it lacked in natural light it made up for in enormous quantities of data about economic and agricultural development across the different Chinese provinces. The quantity of data was exponentially greater than what we collected in the British Embassy. However, I was not, and am not, convinced that it led to better analysis. Drowning in detail, many of the American political officers could not see the big picture or broader trends, which is what both of our political masters needed. Over the years I have been in many discussions about what makes a good political analyst. I have concluded that it is the knowledge of the subject that comes from years of experience, which gives the analyst an instinctive grasp of what she is looking at, combined with a capacity for self-criticism which enables her constantly to question her own conclusions. Excessive data is not good for either.

It is clear that the vast quantities of data available on the Internet are far too great for any human analyst, or group of analysts, to manage. This is where algorithms come in. Algorithms allow the data to be scanned for key trends or indicators relating to current and future developments. Ironically, the first effort at this kind of big data analysis was carried out by the KGB in the early 1980s. Concerned that President Reagan was preparing a first strike against the Soviet Union, KGB Rezidenturas around the world were instructed to collect information on certain key factors that had been identified as predicting an attack. The failure of this exercise, which nearly led to nuclear war, should serve as a warning for big data advocates today. The KGB had in effect created their own algorithm, although without the technology that accompanies one today. The problem was that the algorithm was biased towards the prejudices and presuppositions of those who designed it. This remains a key problem.
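
To make the point concrete, here is a minimal sketch in Python of how such an indicator-based warning scheme might work. The indicators, weights and threshold are invented purely for illustration, not taken from any real system; the point is that choosing them is precisely where the designers' prejudices and presuppositions enter.

```python
# A minimal sketch of an indicator-based warning "algorithm".
# The indicators, weights and threshold below are invented for illustration;
# choosing them is exactly where the designers' assumptions are baked in.

# Indicators the designers believe predict an attack, with weights
# reflecting how strongly they believe each one matters.
INDICATOR_WEIGHTS = {
    "blood_bank_activity": 0.3,
    "leadership_movements": 0.4,
    "military_exercises": 0.2,
    "embassy_staff_recalls": 0.1,
}

ALERT_THRESHOLD = 0.6  # also a design choice, not an objective fact


def warning_score(observations):
    """Combine field reports into a single score.

    `observations` maps indicator names to whether they were observed.
    Anything the designers did not think to include simply cannot
    influence the result, however important it may be in reality.
    """
    return sum(
        weight
        for name, weight in INDICATOR_WEIGHTS.items()
        if observations.get(name, False)
    )


if __name__ == "__main__":
    reports = {"blood_bank_activity": True, "military_exercises": True}
    score = warning_score(reports)
    print(f"score={score:.2f}, alert={score >= ALERT_THRESHOLD}")
```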

Someone has to design the algorithm. The way in which the algorithm is designed and structured will affect the analytical output from scanning the data. Algorithm design and construction is highly technical. How many diplomats understand how algorithms function or how they are designed? If you do not know the basis on which the algorithm was designed, and what epistemological biases are operating in that design, you do not know how reliable the output is. Most policy officers, and government ministers, relying on big data analysis for their policy decisions will see the algorithm only as a magical black box. Because it is not human, they will regard it as more objective, overlooking the subjective element in its design. They will have excessive confidence in its conclusions.

It is not only a question of accuracy. One of the major problems in policy-making is the phenomenon of groupthink. Once a particular analytical or policy framework has been established within a group, no one is inclined to question it, and it will persist even when evidence from the world contradicts it (I have written previously about this in the context of the former Yugoslavia). The danger is that big data analysis, dependent on its algorithms, will reinforce the epistemological framework created when the algorithm was designed. In other words, the use of algorithms in big data analysis could reinforce the phenomenon of groupthink.

Challenging existing analytical frameworks is a major problem in foreign policy making. Foreign ministries, in my experience, are really not very good at sitting down and questioning their analytical assumptions about the world. They are no better at questioning the policy that is constructed on the basis of that analytical framework. This is why the US in particular has developed the technique of red teaming to question assumptions and prejudices. The term comes from war gaming during the Cold War, when the red team was the Soviet Union (and the blue team was NATO). The task of a red team is to challenge all the assumptions behind the analysis and decisions in a particular policy area. It tests the analysis or policy recommendation to the point of destruction. It gives policymakers a better idea of how good the analysis is or how resilient the policy is. The issue becomes how we can red team algorithms. Do we need analysts capable of understanding the inside of the algorithms, or do we need to develop algorithms that can themselves red team other algorithms? If we cannot find effective ways by which analysts and policy makers can challenge the algorithms behind big data analysis, we are handing our destiny to those who design the algorithms.
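
One possible way to red team such an algorithm, sketched below against the same invented warning scheme as above, is to perturb its assumed weights systematically and see whether the conclusion survives. If the alert flips under small changes to the designers' assumptions, the output is telling you more about the design than about the world. This is only an illustrative sketch, not a description of any real red teaming tool.

```python
# A minimal sketch of "red teaming" the warning algorithm above by
# perturbing its assumed weights and checking whether the conclusion holds.
# The perturbation scheme is invented purely for illustration.
import itertools


def red_team(base_weights, observations, threshold, step=0.1):
    """Try alternative weightings and report how often the alert flips."""
    names = list(base_weights)
    baseline = sum(
        w for n, w in base_weights.items() if observations.get(n, False)
    )
    flips = trials = 0
    # Perturb each weight down, not at all, or up by `step`, in every combination.
    for deltas in itertools.product((-step, 0.0, step), repeat=len(names)):
        weights = {
            n: max(0.0, base_weights[n] + d) for n, d in zip(names, deltas)
        }
        score = sum(w for n, w in weights.items() if observations.get(n, False))
        if (score >= threshold) != (baseline >= threshold):
            flips += 1
        trials += 1
    return flips / trials


if __name__ == "__main__":
    weights = {"blood_bank_activity": 0.3, "leadership_movements": 0.4,
               "military_exercises": 0.2, "embassy_staff_recalls": 0.1}
    obs = {"blood_bank_activity": True, "military_exercises": True}
    print(f"alert flips in {red_team(weights, obs, 0.6):.0%} of perturbations")
```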

Incidentally, this has happened before. One of the causes of the global financial crisis (or at least a major contributory factor) was the dependence of investment funds and banks on complex financial models. The models were designed by PhDs in maths and physics (“quants”) who knew little about the real economy. The executives who took the key investment decisions understood little, if anything, about how the mathematical models were designed or the epistemological biases they contained (this is captured when Kevin Spacey’s character in Margin Call complains that he cannot understand the data on a screen). We could be in danger of creating a similar disjunct in diplomacy.

There is one further point worth bearing in mind about the use of algorithms and big data analysis. Algorithms by their very nature must be online: they must be integrated with the Internet, where the data they are analysing is to be found. However, this makes them inherently vulnerable to cyber attack. Covertly redesigning a rival's algorithms online, changing the assumptions on which they operate or the kind of trends they look for in the data, could be an effective way of undermining that rival's analytical and policy-making capacities. This suggests that algorithmic big data analysis will not replace human analysts, who will still be needed to ensure that the algorithmic output makes sense when compared to the real world.

Algorithms also pose problems for public and digital diplomacy. Foreign ministries have, with more or less enthusiasm, adopted social media as a valuable way of getting messages across to foreign publics. There are various problems with this. The obsessive focus on social media can result in success being measured in terms of likes and retweets rather than an analysis of impact on foreign publics. More important may be the role of algorithms in social media. All the major social media companies have developed highly sophisticated algorithms to ensure that we receive the posts or tweets (and adverts!) that will most interest us. They recommend to us friends or followers who are most likely to fit in with our existing social network. Facebook, for example, is unlikely ever to suggest a Donald Trump supporter as a friend for me. As has been frequently noted, this increases the echo chamber effect, whereby we only exchange views with, or receive information from, people with whom we already agree. Over time, the effective operation of social media algorithms means that the echo chambers get ever smaller and ever better defined. The impact is worsened by the growing number of adults who get their news from social media (Pew reports that 67% of US adults get news from social media, 20% often).

The problem for public diplomacy is that if you are using social media to get your message across, you are only going to reach those who already agree with you. It also means that agile non-state social media users can associate themselves with more popular causes to undermine a country’s public diplomacy. We may have seen this to some extent with the Catalan crisis, where Catalan separatists have been able to identify themselves with anti-establishment, anti-capitalist and pro-human-rights operators on the Internet. On the one hand, this has meant their successful entry into self-reinforcing echo chambers among these groups. On the other hand, it makes it very difficult for the Spanish government to challenge the image being put across of an authoritarian Spain. The social media algorithms make it hard for the Spanish government to break into these echo chambers.
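
A toy illustration of the mechanism, with invented users and interests, is sketched below: if the recommendation algorithm always suggests whoever is most similar to you, the suggestions never leave your existing cluster, and the echo chamber tightens with every round. Real platform algorithms are of course far more sophisticated, but the underlying tendency is the same.

```python
# A toy sketch of similarity-based friend recommendation, with invented
# users and interests, illustrating how it reinforces echo chambers:
# the most "relevant" suggestion is always the most similar one.

USERS = {
    "ana":   {"eu", "remain", "human_rights"},
    "boris": {"leave", "sovereignty", "fishing"},
    "carla": {"eu", "human_rights", "refugees"},
    "dave":  {"leave", "fishing", "cricket"},
}


def similarity(a, b):
    """Jaccard similarity between two users' interest sets."""
    return len(a & b) / len(a | b) if a | b else 0.0


def recommend(user):
    """Recommend the most similar other user - never a dissimilar one."""
    others = (u for u in USERS if u != user)
    return max(others, key=lambda u: similarity(USERS[user], USERS[u]))


if __name__ == "__main__":
    # Each user is steered back towards their own cluster.
    for user in USERS:
        print(f"{user} -> recommended friend: {recommend(user)}")
```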

The fragmenting of foreign public opinion into ever more restricted echo chambers calls into question the entire purpose of public diplomacy. Whereas in the past diplomats could engage with the media in a foreign country knowing that they could reach the majority of the population, now no such assumptions can be made. As public opinion fragments, so does public diplomacy. More sophisticated digital diplomats try to get round this by using hashtags and other ways of reaching out across the echo chambers. But this is not guaranteed to work either. As Ilan Manor recently pointed out (https://digdipblog.com/2017/10/20/the-personalization-of-digital-diplomacy/), social media algorithms are increasingly personalising what we receive, so that the context in which each individual receives her posts is now very different. This shapes the context in which information put out by a government is received by individuals, and how those individuals interpret it. It makes a great deal of difference whether I see a government or embassy post on Facebook immediately after a heartwarming clip of a dog or a heart-rending clip of the wreckage of Raqqa. This suggests that future diplomats will have to be far more aware of how social media algorithms function, and of ways of gaming them. This applies equally to search engines, where far-right groups have in the past shown their ability to game how they work, ensuring that their material comes up first in Google searches.

The upshot of these examples from geopolitical analysis and public diplomacy is that tomorrow’s diplomats must be much more aware of how algorithms work and better trained in making use of them. This is all the more important given the apparent Russian success with bot factories in placing their interpretations of events in western social media.
