TikTok is set to lay off hundreds of employees globally, including a significant number in Malaysia, as the company pivots towards AI to improve content moderation efficiency amid rising regulatory scrutiny.
TikTok Announces Job Cuts as AI Takes Over Content Moderation Roles
TikTok, the immensely popular social media app owned by China’s ByteDance, has announced global layoffs affecting hundreds of employees worldwide, including a notable reduction in its Malaysian workforce. The move comes as the company shifts increasingly towards artificial intelligence (AI) to handle content moderation, aiming to improve operational efficiency.
The job cuts, which amount to fewer than 500 positions in Malaysia, were initially misreported as exceeding 700. Affected employees, many of whom were engaged in content moderation tasks, received their termination notices via email on Wednesday. This restructuring is part of TikTok’s broader strategy to streamline its global content moderation processes, with further layoffs potentially on the horizon as the company consolidates its regional operations.
Currently, TikTok relies on a mix of AI tools and human moderators to handle the large volume of user-generated content. A TikTok spokesperson said the restructuring was needed to strengthen the company’s global content moderation framework. As part of its ongoing commitment to safety and reliability, TikTok has announced a $2 billion investment in trust and safety measures globally this year, with a significant share of moderation tasks already automated. According to the company, 80% of harmful or violating content is now removed by AI-based technologies.
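TikTok has not disclosed how its automated and human review workflows interact. A common way such hybrid setups are built, however, is as a confidence-threshold triage: a classifier scores each post, high-confidence violations are removed automatically, and borderline cases are queued for human moderators. The sketch below is purely illustrative and uses hypothetical names and thresholds (score_content, 0.95, 0.40); it does not reflect TikTok’s actual systems.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Decision(Enum):
    AUTO_REMOVE = auto()   # high-confidence violation, removed without human review
    HUMAN_REVIEW = auto()  # uncertain score, queued for a human moderator
    APPROVE = auto()       # low risk, left up


@dataclass
class Post:
    post_id: str
    text: str


def score_content(post: Post) -> float:
    """Placeholder for an ML classifier returning a violation probability in [0, 1].

    A real system would call a trained model; this stub just flags a keyword.
    """
    return 0.99 if "scam" in post.text.lower() else 0.05


def triage(post: Post, remove_threshold: float = 0.95, review_threshold: float = 0.40) -> Decision:
    """Route a post based on classifier confidence (illustrative thresholds only)."""
    score = score_content(post)
    if score >= remove_threshold:
        return Decision.AUTO_REMOVE
    if score >= review_threshold:
        return Decision.HUMAN_REVIEW
    return Decision.APPROVE


if __name__ == "__main__":
    samples = [
        Post("1", "Win money fast with this guaranteed scam-free scheme!"),
        Post("2", "My cat learned a new trick today"),
    ]
    for post in samples:
        print(post.post_id, triage(post).name)
```

In a design like this, the share of content removed automatically depends on where the thresholds sit; the remainder falls to human moderators, which is the portion of the workload the layoffs appear to reduce.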
The announcement of these layoffs comes amid heightened regulatory scrutiny of tech firms in Malaysia, including TikTok. The Malaysian government recently mandated that social media platforms obtain an operating licence by January, aiming to tackle the growing problem of cyber-related offences. The country has seen a surge in harmful online content, prompting authorities to demand improved monitoring from companies such as TikTok.
In the first half of 2023, TikTok and Meta received a record number of content restriction requests from Malaysian authorities. The requests focused on removing or limiting access to posts and accounts dealing with sensitive topics such as race, religion, and royalty. Meta, which owns Facebook and Instagram, restricted approximately 3,100 pages and posts in this period, a sixfold increase over the previous six months and the highest level of content restriction since Meta began tracking such data in 2017.
Over the same period, TikTok reported receiving 340 removal requests, which led to the restriction or removal of 815 posts and accounts, roughly triple the number restricted in the second half of 2022. Malaysia led Southeast Asia in requests for content restrictions on the platform.
These developments highlight the growing challenge tech companies face in balancing operational efficiency with regulatory compliance, particularly in regions with stringent content monitoring expectations. As TikTok leans more heavily on AI, it remains to be seen how these changes will affect the platform’s ability to meet regulatory and safety requirements, especially in markets like Malaysia where government oversight is increasing.
Source: Noah Wire Services