TikTok, the popular short-video platform owned by ByteDance, disclosed that it removed 2.1 million videos uploaded by Nigerian users in the third quarter of 2024 for breaching its content guidelines.
Nigeria ranked among the top 50 countries contributing to content violations during the period. Globally, the platform removed 147.8 million videos in the same quarter, a figure TikTok says underscores its commitment to maintaining a safe and respectful digital environment.
In its Community Guidelines Enforcement Report, TikTok noted that the top 50 offending countries accounted for approximately 90% of all global content removals. Violations spanned several categories, including Integrity and Authenticity, Privacy and Security, Mental and Behavioral Health, and Civility. The platform emphasized that the offending content jeopardized user safety and the authenticity of interactions on the app.
Beyond video removals, TikTok deleted 214.8 million accounts globally in Q3 2024. Of these, 187.3 million were identified as fake accounts, while 24.3 million belonged to users suspected of being under 13 years old; the remaining 3.2 million were removed for other, unspecified reasons. TikTok also eliminated 1.3 billion comments, 1.1 billion video likes, and 57.2 million fake followers, all deemed to have been generated through automated or inauthentic means.
The platform also acted on its advertising ecosystem, removing 1.9 million ads in the third quarter for policy violations, down from 2.2 million in the previous quarter. TikTok said it continues to refine its systems to swiftly detect and remove ads that breach its Community Guidelines, Advertising Policies, and Terms of Service.
Despite these enforcement measures, TikTok continues to face government scrutiny. In October 2024, 13 U.S. states and the District of Columbia filed lawsuits accusing the platform of failing to protect young users from harm. The lawsuits allege that TikTok’s design exploits children’s vulnerabilities to maximize profits, raising questions about its commitment to content moderation and user well-being. These legal challenges further spotlight the complexities of balancing platform growth with regulatory compliance and ethical responsibilities.