Social media

Meta’s east African content moderation hub shuts down


Meta’s east African content moderation hub is shutting down as the social media giant’s third-party contractor moves away from policing harmful content, cutting around 200 jobs and jeopardising the work permits that several employees rely on to remain in the region.

The owner of Facebook, WhatsApp and Instagram first contracted Sama in 2017 to help label data and train its artificial intelligence; the contractor hired around 1,500 employees for the work.

But within two years, the Nairobi office was moderating some of the most graphic and harmful material on Meta’s platforms, including beheadings and child abuse.

Sama staff were told on Tuesday morning that the company would focus solely on labelling work — also known as “computer vision data annotation” — which includes positioning animations in augmented reality filters, such as bunny ears.

“The current economic climate requires more efficient and streamlined business operations,” Sama said in a statement encouraging employees to apply for vacancies at its offices in Kenya or Uganda. Some Sama staff rely on work permits to remain in the region.

Sama’s content moderation services will end in March, allowing for a transition period for Meta’s new third-party contractor. Sama will continue to employ around 1,500 staff on data labelling work for Meta.

The news comes two months after Meta announced it would be cutting its global headcount by 13 per cent, or around 11,000 employees, as the social media company suffers from falling revenue, a slump in digital advertising and fierce competition from rivals, including TikTok.

One person familiar with the operations said Meta did not want a gap in services and used its position to push Sama to offer moderation services for longer than the contractor wanted.

The Nairobi office focused on content generated in the region, including posts about the civil conflict in Ethiopia, over which Meta is being sued amid claims that posts on its platforms incited violence. Meta’s policies ban hate speech and incitement to violence.

The social media group, which works with more than 15,000 content moderators worldwide, said it had a new partner in place and that its moderation capabilities would remain unchanged.

Luxembourg-based Majorel, which already provides moderation services in Africa for the short-form video app TikTok, is taking on the contract, according to two people with knowledge of the changes.

“We respect Sama’s decision to exit the content review services it provides to social media platforms. We’ll work with our partners during this transition to ensure there’s no impact on our ability to review content,” Meta added.

Sama is offering mental health support to staff affected by the cuts for 12 months after their employment ends and is paying undisclosed severance packages. Around 3 per cent of Sama staff are affected.

The cuts come as both Sama and Meta are being sued by a former employee, Daniel Motaung, who has accused the companies of failing to provide adequate mental health support for moderators or to inform them fully of the nature of the content they would be reviewing.

Motaung also claims that the companies transported workers from poorer regions of Africa to Nairobi, leaving them with no choice but to stay in their employment.

Meta has previously declined to comment directly on the lawsuit.

“We’ve seen the consequences of cut-rate moderation in the Ethiopian war — and just this week in the attack on Brazil’s democracy. These crises were fuelled by social media,” said Cori Crider, a director at Foxglove, who has been supporting Sama moderators and other Facebook moderators in legal action against the two companies.




