“We always knew that this election was going to be fought on tech.” These were the words of a prominent Kenyan online influencer who used his significant online following, his ability to “game” social media algorithms, and a range of inauthentic amplification techniques, including disinformation and even subterfuge, to ensure that his candidate got maximum exposure online during Kenya’s 2022 presidential election campaign.
In Kenya’s rapidly expanding digital environment, the weaponization of information online was a tactic observed on all sides in last year’s election – a contest that pitted William Ruto against former President Uhuru Kenyatta’s favoured candidate, the veteran politician Raila Odinga. Our study for the African think tank the Institute for Security Studies (ISS), “A question of influence? Case study of Kenya elections in a digital age”, used data analytics, open source intelligence (OSINT) and field research, and examined some 8 million online documents. The results highlight the extent to which online influence has become a commodity – in many instances unrelated to ideology or conviction.
Furthermore, the study shows that although influence campaigns in Kenya’s case were largely home-grown, the same tools and techniques could in future be used by other actors, including hostile nation states, transnational criminal networks and terrorist organisations, to shape public debate and political narratives and to undermine democratic institutions – at scale.
With elections on the horizon in other digitally advanced African states – most notably South Africa in 2024 – Kenya’s experience of influence operations and disinformation offers important lessons.
There are more than 11.8 million social media users in Kenya, a figure that has grown threefold since 2014, according to Statista. While increased access to digital technology has helped to advance freedom of speech and robust political debate online, it has also provided a platform for coordinated inauthentic activity designed to distort, intimidate and essentially “hack” the electorate.
Why does this matter? It matters because some users of social media during election time (including some newsrooms) may find themselves used as unwitting foot soldiers to amplify disinformation or conspiracy theories online. It matters because one of the key pillars of any democratic state – its media, and legacy or traditional media in particular – finds itself unable to compete with, verify, contextualise or correct highly coordinated influence operations (often masquerading as “news”) in an information environment where speed is king.
During Kenya’s presidential contest last year, fact-checking organisations did an admirable job of trying to call out disinformation and inauthentic activity. Much of this activity was aimed at confusing voters or at voter suppression, that is, frightening assumed rival supporters into staying away from polling stations in crucial constituencies. But fact checkers can only do so much in a limited time, and as we head into an era of ever more advanced AI, increased automation is likely to outpace the human ability to verify information in a timely manner. In turn, this paves the way for an automated disinformation arms race, in which Africa’s most digitally connected democracies are the most vulnerable.
Our ISS research throws light on a highly lucrative market for influence and hashtags for hire in Kenya, largely driven by commercial gain rather than ideological conviction. Product influencers lend their networks to political actors in return for significant financial rewards, only to return to promoting products after the election dust has settled.
The study explores the market for influence in granular detail, setting out the prices and ecosystem of online influence during election time in Kenya. It suggests that the rise of political “influence as a service” enables a highly skilled cadre of individuals to tap into existing fears, social cleavages and conspiracy theories, and to amplify them online. Moreover, these activities often occur without users of social media being aware that they have been co-opted as cogs in a complex machinery of influence, and often of distortion. Simply by liking, sharing or retweeting a piece of content, they may be inadvertently widening the dragnet of disinformation.
Kenya is no stranger to influence operations. Remember the online work of Cambridge Analytica, which deployed a campaign that sought to tap into ethnic prejudices and the fears of Kenyan voters? Kenya is not alone. Rather, elections like this have also served as a testing ground for social media manipulation ahead of more lucrative polls, such as the 2016 US presidential election, in which the company also boasted of its involvement.
Now, with more of the Kenyan electorate online, the possibility of shaping domestic narratives at speed and at scale is vast. Kenya positions itself as a supporter of freedom of speech and has a proud tradition of a politically engaged population, along with a highly educated cadre of digital marketers. Yet the prospect of Kenya exporting its influence expertise to potentially malign actors across the region and even globally – actors who want to do more than simply push products or merchandise – should not be discounted, and needs monitoring.
Furthermore, outside an election setting – for instance during times of national crisis – the tools and tactics observed in Kenya may be exported and mirrored elsewhere, whether to justify xenophobic narratives, harden attitudes against minority communities or lend legitimacy to coups. The Wagner Group is already doing this. We are therefore likely to see more players enter this lucrative influence market, presenting a real challenge to the principles of democracy unless we act, and act now.
On 8 August 2023, from 11:00 to 12:30 South Africa time (UTC+2), the ISS will host an in-person and online discussion at which a panel will present the Kenya research. The event will be moderated by the prominent South African journalist Ferial Haffajee, and participants and panelists will explore the lessons learnt for other African states preparing to hold elections, looking in particular at South Africa as it prepares to go to the polls in 2024.
To register, click here.
To download the report, click here.
Karen Allen is research lead and report co-author. Karen is also a consultant at the Institute for Security Studies, an applied policy think tank operating across Africa. She focuses on emerging threats including disinformation and information operations, terrorism, cybercrime, justice and other conflict-related themes. Karen is a visiting fellow in the War Studies Department of King’s College London, where she obtained an MA in International Relations and Contemporary War. She was formerly a BBC foreign news correspondent based in Nairobi, Johannesburg and Kabul. Karen operates from Johannesburg, South Africa.
Jean le Roux is a co-author of the report and a disinformation researcher covering sub-Saharan Africa. He is a research associate with the Atlantic Council’s Digital Forensic Research Lab and has undertaken work on digital propaganda campaigns and disinformation. Jean has a Bachelor of Laws degree and is based in Cape Town, South Africa.
Allan Cheboi is a senior manager for investigations at Code for Africa in Kenya. He is a digital forensics specialist and contributed to the ISS study on disinformation and digital influence during Kenya’s 2022 election. Allan participates in a number of high-level forums on internet governance and has been part of a consortium seeking to develop early warning systems to flag and counter hate speech online.
Ferial Haffajee will moderate the discussion. Ferial is an accomplished journalist and newspaper editor. She is also an editor covering information, misinformation and disinformation, and has spoken, written and advocated extensively on the issue. Ferial is based in Johannesburg, South Africa.
Is there a link to the streamed version of the panel discussion on Tuesday? I missed the last part of what was a really excellent discussion.
Hi, we are not sure I am afraid, but hopefully the authors will see this and pick it up.