TikTok removed more than 49 million videos in just six months last year, the social media site’s latest transparency report has revealed.

The video app said it took down 49,247,689 videos in the second half of 2019, more than 98% of which were removed before they were reported by users.

According to TikTok’s report, 25.5% of the videos removed in December 2019 broke the site’s rules on adult nudity and sexual activities, while another 24.8% violated rules around the safety of minors.

The company said it could currently publish figures on the types of violation for only one month of the period in question, December 2019, because its new content moderation system was introduced that month.

However, it said it would be able to share this data for the full period covered by each report in future.

“Around the world, tens of thousands of videos are uploaded on TikTok every minute,” the app said in a blog post.

“With every video comes a greater responsibility on our end to protect the safety and well-being of our users.

“As a global platform, we have thousands of people across the markets where TikTok operates working to maintain a safe and secure app environment for everyone.”

According to the report, just over two million of the total removed videos were from users in the UK.

TikTok added that the 49 million removed videos accounted for “less than 1%” of all the videos created in the app.

The report also revealed the number of legal requests for user information that TikTok received in the second half of 2019; such requests come from government agencies in connection with law enforcement investigations.

TikTok said it received 500 such requests in total during that period, including 10 made in the UK.

The platform has come under increased scrutiny in recent weeks because of its Chinese roots and ongoing concerns, particularly in the US, that its links to the country could pose a security risk to users.

Earlier this week, US Secretary of State Mike Pompeo said that the Trump administration was “looking at” banning Chinese apps such as TikTok.

The video app did not address the issue directly in its transparency report, but said it was “committed to taking a responsible approach” to building its platform.

“We’re working every day to be more transparent about the violating content we take down and offer our users meaningful ways to have more control over their experience, including the option to appeal if we get something wrong,” the company said.