
Meet Mercy and Anita: African workers driving AI revolution for just $1 an hour!

July 6, 2024

Mercy took a deep breath and loaded another task onto her computer. Disturbing videos appeared on her screen, one after another. Mercy worked as a content moderator for Meta at an outsourcing office in Nairobi, where she was expected to process one ticket every 55 seconds during her 10-hour shift. This ticket was footage of a fatal car accident that had been uploaded to Facebook and flagged by users. Mercy had to determine whether it violated the company's guidelines prohibiting explicit violent or graphic material.

She watched closely as the person filming zoomed in on the crash. Just as the footage came into focus, Mercy recognised one of the faces on screen: it was her grandfather. She pushed her chair back and ran for the exit, crying, past rows of startled colleagues, and once outside began calling her relatives. No one in the family had yet heard the news. Her supervisor came out to comfort her, but also to remind her that she would have to return to her desk if she was to meet her targets for the day. Even after she explained what had happened, she was told to finish her shift.

Tickets showing the crash that killed her grandfather kept appearing on her screen. Other users had shared the same scene from different angles, with photographs and descriptions of the car, the dead and the wreckage. Gradually she recognised everything: the familiar streets of her own neighbourhood, which she had walked countless times, filmed only hours earlier at sunset. Mercy had lost four family members. Her shift seemed endless.

We spoke with workers like Mercy at three data annotation and content moderation centres run by one company across Kenya and Uganda. Content moderators manually review social media posts to remove harmful content and flag violations of company policy. Data annotators label data with relevant tags so that it can be processed by computer algorithms. Behind the scenes, these two forms of "data work" make our digital lives possible.

Mercy's story was particularly upsetting, but it is by no means exceptional. Moderators witness suicides, torture and rape almost daily. "You normalise things that just aren't normal... You become physically exhausted as well as mentally fatigued," one data worker from Nigeria said of the job.

Shifts are long, and workers must meet aggressive performance targets for speed and accuracy. The work also demands full concentration: moderators have to tag videos correctly according to strict criteria. It is not enough to spot a single violation and stop there; each video must be examined to determine which breach of Meta's policies is the most serious – violence and incitement, for instance, rank above bullying and harassment. One experienced moderator explained that it was not only the violence that was most distressing, but also the sexually explicit content.
Workers at moderation centres are continually exposed to graphic videos and images, with little time or opportunity to process what they see. They must handle between 500 and 1,000 tickets a day, and many told us they were never quite the same afterwards; the consequences can be devastating. "Most of us are psychologically damaged; some have attempted suicide; spouses have left us and we cannot get them back," said one moderator who had been dismissed by the company. Another said the company's policies were even more punishing than the work itself.

Workers at one of the content moderation centres we visited described tearing up and shaking after watching beheading videos. Management encouraged them to see a "wellness counsellor" – someone without formal psychotherapeutic training – and allowed 30-minute breaks during the week. Workers who left their desks because of what they had seen were told they had broken company policy by failing to enter an idle or bathroom-break code into their computers, which meant their productivity scores would suffer. The stories multiplied: "I collapsed at work"; "I went into a severe depression"; "I needed hospital treatment"; "they showed little concern for our welfare". Workers told us that management would check hospital records to verify whether sick days were legitimate – not out of any concern for their wellbeing.

'By using AI products we insert ourselves directly into the lives of workers across the world.' Photograph: Frank Nowikowski/Alamy

Job security at the company is low: most workers we interviewed were on rolling one- or three-month contracts that could end as soon as the client's work was completed. Up to a hundred employees work in rows on the production floors of a darkened building in a sprawling business park on the outskirts of Nairobi. Their employer is an established business process outsourcing (BPO) firm headquartered in San Francisco, with delivery centres across east Africa; Meta is one of its clients. For local employees, the work it offers is low paid and insecure. Mercy herself had lived in Kibera, Africa's largest urban slum, before joining the company, which presents itself as helping disadvantaged workers into formal employment.

Many workers are afraid to raise complaints with management for fear of jeopardising their jobs. Those who did complain, we were told, were told to shut up and reminded how easily they could be replaced. Many of the moderators we interviewed were Kenyan, but others had come from elsewhere in Africa specifically for this BPO job, to help Meta moderate content in other African languages. As foreigners, many reported being vulnerable to harassment by Kenyan police on the street, as well as to violence from criminal gangs.
One woman we interviewed recalled how members of a "liberation front" from another African country, who disapproved of moderation decisions that had been made, found the names and pictures of moderators online and made threats against them. The moderators, understandably shocked, took the evidence to the BPO. The company told Mercy and her colleagues it would look at improving security at the production facility; beyond that, it said, there was nothing more it could do – workers should simply "stay safe".

Most of us can only hope never to endure working conditions like those experienced by Mercy and her colleagues. But data work of this kind is performed by millions of workers around the globe, and conditions vary greatly; at one centre we studied, some working conditions changed considerably after our research took place. Large companies such as Meta typically contract several moderation providers, which compete for lucrative contracts.

Data work is integral to the products and services we rely on every day, from social media apps to chatbots and new automated technologies. Without content moderators, social networks would quickly be overwhelmed by violent and explicit material. Without data annotators assembling the datasets that teach AI the difference between a traffic light and a street sign, autonomous vehicles would not be permitted on our roads. Without workers training machine-learning algorithms, AI tools such as ChatGPT would not exist.

One such worker is Anita, who works at a BPO in Gulu, the largest city in northern Uganda, on a project for an autonomous vehicle company. Her job involves reviewing hours upon hours of footage of drivers at the wheel, looking for signs of lost concentration or anything resembling a "sleep state". The manufacturer uses this data to build an "in-cabin behaviour monitoring system" that measures drivers' facial expressions and eye movements. Sitting in front of a computer for hours watching footage like this is draining work. Anita often feels the boredom physically, pressing her into her chair and pulling her eyelids shut; yet, like the drivers on her screen, she must stay alert.

Data annotators here work 45 hours a week on average, under intense and stressful conditions that can include unpaid overtime, earning 800,000 Ugandan shillings a month – about US$200, or roughly $1.16 an hour. On the production floor, hundreds of annotators sit silently at rows of desks. Anyone familiar with call centres would recognise the set-up: the system of management is much the same, and the lighting has been dimmed to reduce the eye strain caused by nine hours of intense concentration. Workers annotate images and videos around the clock, their screens constantly flickering with new requests.
Like Anita, workers are trained to identify elements in an image according to client specifications – drawing polygons around objects such as traffic lights, stop signs and human faces, for example.

Near Gulu, Uganda. Many data workers take these jobs to escape rural poverty or urban slums. Photograph: Alan Gignoux/Alamy

Every aspect of Anita's and her fellow annotators' working lives is digitally monitored and recorded, from the biometric scanners that control entry to the secure facility to an extensive network of CCTV cameras. Every second of their shift is tracked by efficiency-monitoring software on their computers. Workers we spoke to believe their managers have cultivated a network of informers among the staff to detect any attempt to form a union.

Working such long hours is both physically and psychologically exhausting. Annotators have little room for self-direction: tasks are broken down into their smallest elements to maximise productivity and efficiency, and workers perform the same repetitive actions at high speed. The result is utter boredom and overwhelming anxiety at the same time. At the core of AI development are people working under oppressive surveillance to keep their jobs and support their families.

When we think of AI development, our minds might wander to engineers in airy Silicon Valley offices. But most people do not realise that nearly 80% of the time spent training AI consists of annotating datasets, and that pioneering technologies – autonomous vehicles, nano-surgery machines, drones – depend on work done in places such as Gulu. As the tech commentator Phil Jones has put it, the real substance of machine learning is the tedious, time-consuming labour of data labelling.

Data annotation has become a booming global market, estimated at $2.22bn in 2022 and expected to grow at a compound annual rate of 30-35%, exceeding $17bn by 2030. As AI becomes a mainstream part of retail, healthcare, manufacturing and other sectors, demand for quality data will only increase.

Most workers in the global south are in informal employment. Photograph: Yannick Tylle/Getty Images

Tech firms can use their wealth and power to take advantage of the uneven distribution of AI work across the global economy. In many countries of the global south, most workers are employed informally, unemployment is alarmingly high and well-paid jobs with employment protections are hard to find. Vulnerable workers tend to accept lower wages, and they are reluctant to push for better conditions because they know how easily they can be replaced.
Outsourcing work to the global south is increasingly attractive to businesses not just because it creates economic opportunity for poorer communities, but because it brings tighter workforce discipline, greater efficiency and lower costs. By using AI products, we insert ourselves directly into the lives of workers across the world. Whether we like it or not, we are all connected: just as drinking coffee places us in a global production network stretching from bean to cup, so does using a search engine, a chatbot or even a robot vacuum cleaner. Yet many tech companies take pains to conceal how their products are actually made. They present a picture of autonomous machines – computers sifting through mountains of data and teaching themselves as they go – rather than the reality: poorly paid human workers training and managing those systems in an incessant cycle.

Back home in Gulu, Anita has just arrived after an eight-hour day. She sits outside with her children in plastic chairs under her mango tree, growing more tired as the sun falls below the horizon and darkness gathers around them. Soon the children go to bed, followed shortly by Anita herself. At 5am tomorrow she must get up and begin annotating again. On the way to work she passes former colleagues selling vegetables in the market or popcorn at the roadside – a reminder of how many people would take her job if the chance arose. She just needs to stay on task, hit her targets and make sure she is not left exposed if her project winds down or the workflow changes. Perhaps another project will come her way, or she could switch to a different workflow. That would at least provide some much-needed respite, and something different to look forward to.

This is an edited extract from Feeding the Machine by James Muldoon, Mark Graham and Callum Cant (Canongate, £20). To support Guardian and Observer journalism, order a copy from guardianbookshop.com; delivery charges may apply.

