TikTok has published its latest transparency report, covering the first half of 2025 and shedding light on how it manages content moderation in the European Union. As part of its obligations under the Digital Services Act (DSA), TikTok provides regular updates on user growth and on how it handles problematic content. This is the fifth such report TikTok has shared, and it reveals notable trends about its progress and challenges in the region.
Strong Growth in European User Base
TikTok’s presence in the European market has grown steadily. Between September 2023 and mid-2025, the number of active EU users increased by 25 percent, and the platform now serves around 170 million users across the bloc. Growth has not been uniform, however: Estonia appears to be lagging, and countries such as Sweden (+12 percent), Denmark (+15 percent), and the Netherlands (+17 percent) are growing more slowly than the regional average.
For context, TikTok’s global reach remains impressive, with roughly 100 million users each in Brazil and Indonesia. Meanwhile, its Chinese counterpart, Douyin, continues to dominate its home market with approximately 700 million users. This underlines how TikTok remains a major player worldwide, despite regulatory challenges and potential restrictions in markets such as the United States.
Increased Content Removals Reflect Better Enforcement
During the first half of 2025, TikTok removed close to 24.5 million pieces of content that violated its Community Guidelines, a 15 percent increase over the previous reporting period. Given the platform’s growing usage, some rise in removals is to be expected. At the same time, the company has faced mounting criticism in Europe for allowing harmful, inappropriate, and sexualized content to spread on the platform.
However, the report indicates that TikTok is making real progress in improving content moderation. What stands out is that these improvements are happening even as the platform reduces its reliance on human moderators.
A Stronger Focus on Youth Safety and Wellbeing
One of the key areas of improvement is how TikTok handles youth safety and mental health concerns. The platform has taken a more proactive approach by using automated systems to detect posts that could harm young users or negatively affect mental and behavioral health.
Although TikTok has reduced its human moderation team by 26 percent since September 2023, its automated systems have become far more effective. The share of mental health-related violations detected automatically has risen from 49 percent in 2023 to 90 percent today. Similarly, 77 percent of youth-safety violations are now caught automatically, up from just 38 percent in 2023.
Tougher Advertisement Monitoring
Between January and June 2025, TikTok removed 2.5 million advertisements, a notable jump from the 1.5 million removed in the second half of 2024. The platform is putting particular effort into removing ads that violate intellectual property rules; such removals rose by 100 percent. This aligns with TikTok’s strategy of expanding its in-app shopping features across Europe while keeping the environment safe and fair for businesses.
Additionally, the number of ads removed for containing adult or political content has doubled, underscoring TikTok’s commitment to stricter ad policies.
Why This Matters for Businesses and Users
As TikTok pushes to enhance its shopping capabilities in Europe, it is important for the platform to maintain a strong enforcement strategy. Business users need to feel confident that their products are being advertised in a safe and controlled environment. The increase in content and ad removals reflects TikTok’s efforts to support this goal.
Remaining Challenges
Despite these positive steps, the report highlights some ongoing issues. For example, the automated detection rate for sexualized content in advertisements has declined. This suggests that, while TikTok is moving in the right direction, it still faces challenges in specific areas of enforcement.
TikTok’s latest DSA report shows clear progress in its approach to content moderation as it continues expanding across Europe. By focusing on automated detection systems and reducing moderation costs, TikTok is improving its efficiency in handling violations. Although there is still work to be done, especially regarding sensitive content in ads, the overall strategy appears to be focused on creating a safer platform for users and businesses alike. For those interested in the full details, TikTok’s official DSA Transparency Report for January to June 2025 is publicly available.