When we at Al Jazeera English (AJE) decided to produce ‘All Hail The Algorithm’, we approached our subject with a very specific vision and purpose – to tell the story of the phenomenon of algorithms in a manner that is relatable and clear and, crucially, in a way that amplifies the voices we aren’t hearing enough of on this subject.
In reporting this series, our multimedia journalist Ali Rae travelled from Mexico to Jordan to Australia (and a few more points in between), piecing together a picture of algorithms at work across the globe. The anecdotes and research she brings together shift the discussion of algorithms away from the primarily first-world context in which they are so often framed, revealing just how much of the development and impact of algorithms takes place in the global south. And while other news outlets headline interviews with CEOs and prominent spokespersons of tech corporations, Ali shines a light on the scholars, activists and tech-world insiders who are critiquing, reforming and pushing back against the excesses of an algorithmically powered world.
The ultimate aim was to educate, empower and engage our online audiences. Algorithms may seem like a complex topic at first – but we hoped our less formal, human-centred approach to the analysis would make the content more accessible and easily understood by a wide and varied audience. The series is presented, produced, filmed and edited by Ali Rae, alongside Executive Producer Meenakshi Ravi.
For more than six years, Ali Rae has worked across AJE’s TV programme teams to bolster their presence online. This often meant converting existing TV content into formats suited to online platforms: social videos, Instagram stories and countless tweets and threads. After seeing first-hand the growth of AJE’s online audience – and the various ways it was consuming content differently from TV audiences – she proposed a new approach: a series made bespoke for our online platforms that could also work for our global television audience.
With the backing of senior management, Meena and Ali set about experimenting with various styles and formatting techniques. There was no large camera crew or production team. Ali filmed all the interviews herself – including using an Osmo to film pieces to camera in a less formal, ‘vlogger’ style. Knowing that the first five seconds are crucial to capturing the attention of viewers scrolling through their feeds, each episode opens with a piece to camera accompanied by hints of light animation. The aim was to break down the barrier between presenter and audience, drawing viewers in from the get-go. When editing, ‘behind the scenes’ footage was purposefully included to further this stylistic approach. From Twitter threads to IGTV and even a Facebook Watch Party, the series launched online with tailored content for each platform.
Tackling questions of trust, data colonialism, the power of algorithmic design and the pervasiveness of biometrics, the series weaves what may seem like disparate tech debates into a comprehensive, human-centred narrative. For example, in episode 4 Ali journeys to Jordan’s Zaatari refugee camp, where the UN’s World Food Programme is using iris-scanning technology to provide food aid. While the ‘EyePay’ system is praised for helping track aid disbursement, smooth out payments and reduce the chances of corruption and fraud, Ali speaks to Stephanie Hare about the ethics of high-tech systems in so-called ‘low-rights’ environments. Ali then connects the dots to similar stories of biometric use across the UK, filming a Met Police facial recognition trial in London and speaking to children’s rights campaigner Pippa King about why she thinks a surveillance-compliant society starts with normalising the technology with kids in schools. By focusing on shared themes (privacy, consent, legality, accountability) across these different scenarios, Ali successfully engages non-tech-literate audiences.
Joining the dots on complex narratives is made much easier with clear, clever graphics. Animator Pierangelo Pirak not only visualised the seemingly invisible – algorithms – but also produced an animation style that is attention-grabbing and editorially robust. A subtle handwriting font is used at points to give a ‘human’ touch to the mechanical algorithmic designs. There was also a deliberate effort never to use tech clichés: 0s and 1s, personified robot figures or illuminati-esque symbols. Such imagery lends a mythical air to an issue that is, at its core, very much about human control.
As AJE’s only digital-first programme, the series was initially released online, accumulating more than 2 million organic views with platform-tailored content aimed at younger audiences. The website landing page has had over 172,000 unique views. Extended TV versions were then produced for broadcast to more than 310 million unique homes in 150 countries. Each episode was repeated in five different time slots to hit peak viewing times in different regions. Its popularity has led to a re-broadcast in early 2020. Since release, several schools and universities have requested to use the series for a variety of educational purposes.
There have been individual cases of impact. In Australia, for example, a single mother who saw episode 1 on the country’s robo-debt scheme subsequently challenged her own automated welfare debt and managed to have it significantly reduced. Amid mounting criticism on several fronts, the scheme has been overhauled and is facing a class-action lawsuit. Episode 4’s investigation adds to mounting pressure on the UK parliament to establish a legal framework for the use of biometric technology. To raise further awareness, Ali has attended UK luncheons for underrepresented groups in AI and technology, has upcoming speaking engagements (the London 2020 DTG Summit) and has attended community Q&A screenings. So far, the series has received three Lovie Awards for European internet excellence and an Australian Walkley Award for online current affairs reporting.