The misuse of Artificial Intelligence can be a threat to democracy

Lisanne Fridsma, Bob Lijnse, Mira Bandsom and Domiziana Scarfagna

Picture by Mantra AI

The 2020 United States presidential election will have one of the most digitized campaigns ever. Due to the coronavirus, political parties won’t try to gain votes by canvassing, handing out pamphlets and giving speeches in public. This election, political parties will embrace every online channel possible. Two powerful new ways to advertise online make use of artificial intelligence: political micro-targeting and bots. Micro-targeting is the practice of creating personalized messages aimed at individual voters, based on an intelligent prediction of personal preferences, usually derived from social networks. Bots, in turn, are used to leave comments on social media promoting a political party or candidate. The misuse of political micro-targeting and bots could pose a large threat to democracy. Therefore we should ask: what effects will these new applications of artificial intelligence have on the elections?

Micro-targeting

Artificial intelligence has seen much progress over the last few decades. Due to the use of social networks such as Facebook and Instagram, large data sets are being collected. This data includes your network of friends, your online behaviour and your personal preferences. Based on all this data, an algorithm can predict who you are most likely to vote for or even what kind of advertising you might be susceptible to. Political parties and other groups with political interest can customize their messages to your personal profile.
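As a rough illustration of how such a prediction might work, the core idea can be sketched as a simple logistic model that scores a voter profile and picks an ad variant. All feature names, weights and thresholds below are hypothetical and purely illustrative; real campaign systems use far richer data and models.

```python
import math

# Hypothetical weights a campaign model might learn: how strongly each
# profile feature predicts support for a given candidate.
WEIGHTS = {"likes_candidate_page": 1.2, "shares_per_day": 0.4, "age_over_50": -0.8}
BIAS = -0.5

def support_probability(profile):
    """Logistic-regression-style score: estimated probability of support."""
    score = BIAS + sum(WEIGHTS[k] * v for k, v in profile.items())
    return 1 / (1 + math.exp(-score))

def choose_ad(profile):
    """The micro-targeting step: tailor the message to the predicted leaning."""
    p = support_probability(profile)
    if p > 0.7:
        return "mobilise: reminder to vote"
    if p < 0.3:
        return "skip: unlikely to convert"
    return "persuade: tailored issue ad"

voter = {"likes_candidate_page": 1, "shares_per_day": 2.0, "age_over_50": 0}
print(choose_ad(voter))
```

The point of the sketch is that once a profile can be scored, every voter can automatically receive a different, tailored message, which is exactly what makes micro-targeting both effective and hard to oversee.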

During the 2016 Brexit referendum and the election of Trump that same year, we saw more discussion than ever about micro-targeting, fake news and whether or not Russia meddled in the vote. The question is how much this manipulative opinion-forming has affected the results. It is hard to form an objective opinion when the information you receive does not span the entire political spectrum, but is skewed towards a specific party based on your psychological profile.

Bots

Another form of artificial intelligence used in political campaigning is the bot. Bots can generate and post messages on online platforms. Even on a neutral platform, one side of an argument can appear to attract more sympathy purely because of comments made by bots. Because of this opinion-forming traffic, the line between human and artificial intelligence has become blurred. The problem is that it is hard to distinguish a bot post from a human post. Therefore, it is important to establish rules on the extent to which robots are allowed to carry human traits. Otherwise, there is a danger that social media bots will continue to be used as a deceptive tool.
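To give a feel for why distinguishing bots from humans is hard, here is a minimal sketch of one common heuristic: flagging accounts that post many near-identical messages. The function name and threshold are hypothetical; real bot detection combines many such signals and is far more sophisticated, and bots adapt to evade exactly this kind of check.

```python
from collections import Counter

def likely_bot(posts, max_duplicates=3):
    """Naive heuristic: flag an account whose posts repeat almost verbatim.

    posts: list of message strings from a single account.
    Returns True if any normalised message appears more than max_duplicates times.
    """
    counts = Counter(p.strip().lower() for p in posts)
    return any(n > max_duplicates for n in counts.values())

bot_account = ["Vote for candidate X!"] * 10
human_account = ["Interesting debate tonight", "Vote for candidate X!", "Anyone watching?"]

print(likely_bot(bot_account))
print(likely_bot(human_account))
```

A bot that slightly rewords each message already slips past this check, which illustrates why rules and transparency requirements, rather than detection alone, are needed.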

Implications for future elections

An implication of the use of micro-targeting and bots in political campaigns is a possible shift in the balance of power. The parties that invest the most in technology and work with the most advanced systems will gain an advantage in the political landscape. Whether this would translate into different voting behaviour in the polling booth is unclear. However, it is evident that there are clear risks here. For a properly functioning democracy in which people can freely form their own opinion, it is essential that people have access to truthful information and that this access is equal.

Therefore, rules and regulations should be made to keep the democratic process as fair as possible. These regulations should restrict the manipulative use of information, and transparency should be demanded of all parties applying micro-targeting and bots. For example, by making a ‘why am I seeing this ad?’ button and disclaimers stating ‘this message was generated by a bot’ mandatory, voters can better judge the source and validity of their information. Then the US presidential elections of 2020 and beyond can be protected from unwanted opinion-forming traffic.