NAIROBI, Kenya — TikTok removed more than 580,000 videos in Kenya between July and September 2025, underscoring the scale of content moderation on one of the country’s fastest-growing social media platforms.
In its latest transparency update released on Tuesday, February 17, the short-form video platform said the content violated its Community Guidelines, which prohibit misinformation, hate speech and other harmful material.
According to the company, 99.7% of the videos were taken down before they were reported by users, while 94.6% were removed within 24 hours of being posted.
The report further revealed that approximately 90,000 live sessions were interrupted during the period for failing to comply with platform rules, accounting for roughly 1% of all live streams in Kenya during the quarter.
“In the third quarter of 2025, TikTok removed more than 580,000 videos in Kenya for violating its Community Guidelines. Of these, 99.7% of them were proactively removed before anyone reported them, and 94.6% removed within 24 hours of posting.
Additionally, the third quarter of 2025 saw around 90,000 Live sessions interrupted for not following platform content guidelines, representing 1% of live streams in this time,” read the report in part.
TikTok said the removals were driven largely by automated detection systems supported by human moderators.
“This approach is vital in ensuring that we provide a safe platform for our community, as we uphold our policies against harmful content, including misinformation, hate speech, and other violations,” TikTok added.
Rising moderation in a rapidly growing market
The latest figures build on earlier enforcement actions in 2025. During the first quarter of the year, TikTok reported banning more than 43,000 accounts and removing over 450,000 videos in Kenya.
At the time, the company said 92.1% of the removed content was taken down before being viewed, and 94.3% was removed within 24 hours of posting.
The company described its moderation framework in that earlier report:
“By integrating advanced automated moderation technologies with the expertise of thousands of trust and safety professionals, TikTok enables faster and consistent removal of content that violates our Community Guidelines,” the company stated in its report.
Regulatory pressure and digital safety concerns
The disclosures come amid heightened scrutiny of social media platforms in Kenya and globally. Lawmakers and regulators have increasingly demanded stronger oversight of digital spaces, particularly around misinformation, online harassment and harmful live-streamed content.
Kenya’s Communications Authority has previously raised concerns about the spread of false information and inappropriate material on digital platforms, especially during politically sensitive periods.
With a young, mobile-first population, Kenya represents one of Africa’s most dynamic social media markets, but also one where online regulation debates remain active.
TikTok, owned by Chinese technology firm ByteDance, has faced mounting regulatory challenges worldwide, including tighter data governance requirements in the European Union and scrutiny in the United States over national security and content control policies.
While the company positions its enforcement figures as evidence of proactive moderation, digital rights advocates continue to debate the balance between content removal, algorithmic transparency and freedom of expression.
The removal of more than half a million videos in a single quarter illustrates both the scale of TikTok’s Kenyan user base and the complexity of moderating user-generated content in real time.
As Kenya’s digital economy expands, particularly in online commerce, influencer marketing and political discourse, platforms are likely to remain under pressure to demonstrate faster detection, clearer accountability mechanisms and stronger community safeguards.
Whether the rising removal figures reflect growing harmful content or improved moderation efficiency remains a subject of ongoing analysis.