Reporting period: January 1, 2021 – March 31, 2021
Released: June 30, 2021
About this report
Millions of people around the world come to TikTok to express themselves creatively and be entertained. In order to promote a safe and welcoming environment, we work to uphold our Community Guidelines by removing accounts and content that don’t adhere to our policies. We strive to be transparent about how we enforce our policies to continue building trust with our community members. We regularly publish these reports with that aim in mind.
Starting with this report, insights related to the enforcement of TikTok’s Community Guidelines will be reported quarterly (every three months) instead of every six months. Information related to law enforcement, government, and intellectual property removal requests will continue to be published every six months due to the nature of processing and responding to those requests.
Our Community Guidelines apply to everyone and all content on our platform. Our team of policy, operations, safety, and security experts works together to develop equitable policies that can be consistently enforced. We do not and will not make policies solely at the request of an individual or organization, nor can any one person at TikTok change our Community Guidelines. Our policies take into account a diverse range of feedback gathered from external experts in digital safety and human rights, and we are mindful of the local cultures in the markets we serve. Our ultimate goal is to create guidelines that enable authentic and creative self-expression in a safe and entertaining community environment.
This report provides visibility into the volume and nature of content and accounts removed from our platform during the first three months of 2021 (January 1 – March 31). For the first time, we’re publishing the number of suspected underage accounts removed as we work to keep the full TikTok experience a place for people 13 and over. This report also includes information about our ongoing work to counter COVID-19 and vaccine misinformation and protect the security of our platform.
In the first quarter of 2021, 61,951,327 videos were removed globally for violating our Community Guidelines or Terms of Service, which is less than 1% of all videos uploaded on TikTok. Of those videos, we identified and removed 91.3% before a user reported them, 81.8% before they received any views, and 93.1% within 24 hours of being posted.
This chart shows the five markets with the largest volumes of removed videos.
[Chart: removed videos by country/market]
We continue to rely on technology to detect and automatically remove violating content in some markets. Of the total videos removed globally, 8,832,345 were flagged and removed automatically for violating our Community Guidelines.
TikTok offers creators the ability to appeal their video’s removal. When we receive an appeal, we will review the video a second time and reinstate it if it had been mistakenly removed. Last quarter, we reinstated 2,833,837 videos after they were appealed. We aim to be consistent and equitable in our moderation practices and will continue our work to reduce false positives through ongoing education and training of our moderation team.
The following chart shows the volume of videos removed by policy violation. A video may violate multiple policies and each violation is reflected.
Of the videos removed by our safety team, the following chart shows the rate at which videos were proactively removed by policy reason. Proactive removal means identifying and removing a violative video before it’s reported to us. Removal within 24 hours means removing the video within 24 hours of it being posted on our platform.
[Chart: proactive removal rate and removal-within-24-hours rate by policy category, including Adult nudity and sexual activities; Harassment and bullying; Illegal activities and regulated goods; Integrity and authenticity; Suicide, self-harm, and dangerous acts; and Violent and graphic content]
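The two rates defined above are simple ratios over the set of removed videos. A minimal sketch with hypothetical counts (the variable names and numbers are illustrative, not drawn from TikTok’s data):

```python
# Hypothetical counts for one policy category -- illustrative only.
removed_total = 1000          # videos removed under this policy
removed_before_report = 913   # removed before any user report
removed_within_24h = 931      # removed within 24 hours of posting

# Proactive removal rate: share of removals made before a user report.
proactive_rate = removed_before_report / removed_total
# Removal-within-24-hours rate: share removed within a day of posting.
within_24h_rate = removed_within_24h / removed_total

print(f"Proactive removal rate: {proactive_rate:.1%}")    # 91.3%
print(f"Removal within 24 hours: {within_24h_rate:.1%}")  # 93.1%
```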
Adult nudity and sexual activities We strive to create a platform that feels welcoming and safe, and we remove nudity and sexually explicit content. Of the videos removed, 15.6% violated this policy, down from 20.5% in our last report. We strictly remove pornography and nudity. For content that may be appropriate for mature rather than general audiences and does not violate our policies, we may limit distribution by making the content ineligible for recommendation into anyone’s For You feed. We removed 86.1% of these videos before they were reported to us, and 89.8% were removed within 24 hours of being posted, a slight decrease from our last report.
Harassment and bullying We believe in an inclusive community and creative self-expression and do not tolerate members of our community being shamed, bullied, or harassed. Of the videos removed, 8% violated this policy, up from 6.6% in the last half of 2020. This increase reflects an update to our policies late last year that provides stronger protections against bullying behavior, combined with modest progress by the systems that detect bullying and harassment. Of these videos, 66.2% were removed before they were reported to us, and 83.8% were removed within 24 hours of being posted. To help prevent bullying, we’ve introduced a number of tools that help creators manage their TikTok experience, including commenting, reporting, and blocking tools, as we work to improve proactive detection in this area.
Hateful behavior TikTok is a diverse and inclusive community that has no tolerance for hateful behavior. Of the videos removed, 2.3% violated this policy, slightly up from 2% in the last half of 2020. We attribute this increase to both stronger protections against hate speech and hateful ideologies in our policies and a new model that’s incrementally better at detecting hate speech, resulting in more removals. We removed 67.3% of hateful behavior videos before they were reported to us, and 83.9% were removed within 24 hours of being posted.
Illegal activities and regulated goods We work to ensure TikTok does not enable criminal activities or the promotion or trade of drugs, tobacco, alcohol, and other controlled substances or regulated goods. During the first three months of this year, 21.1% of the videos removed violated this policy, which is an increase from 17.9% in the last half of 2020 as attempts to post this content on our platform have risen. Of these videos, 96% were removed before they were reported, and 95.6% were removed within 24 hours of being posted. This is an improvement over our last report thanks to continuous advancements in our detection models.
Integrity and authenticity We believe that trust forms the foundation of our community, and we do not allow content or accounts that involve fake engagement, impersonation, and misinformation. Of the videos removed, 2% violated this policy, down from 2.4% in the last half of 2020. Of these videos, 78.5% were removed before they were reported to us, and 88.9% were removed within 24 hours of being posted. We’ve continued to improve our misinformation detection models, and this work contributed to the notable increase in proactive detection during the first quarter of 2021 compared to the last six months of 2020.
Minor safety We are deeply committed to the safety of minors and regularly strengthen this policy and our processes that help keep minors safe. In the first quarter of 2021, 36.8% of content removed violated our minor safety policy, compared to 36% in the last half of 2020. Of those videos, 97.1% of videos were removed before they were reported to us, and 96.2% of videos were removed within 24 hours of being posted. This continued improvement in proactive detection is the result of advancing our models that work to identify and flag violations as content is uploaded, before it receives any views.
Suicide, self-harm, and dangerous acts We care about the well-being of the individuals that make up our community and work to foster a supportive environment. Of the videos removed, 5.7% violated these policies, which is a decrease from 6.2% in the last half of 2020. We believe this is due to a decrease in the overall volume of these videos as we make a concerted effort to quickly remove content and redirect people searching it to expert support and emergency resources. Of these videos, 93.7% were removed before they were reported, and 91.7% were removed within 24 hours of being posted.
Violent extremism We take a firm stance against enabling violence on or off TikTok. Of all videos removed, 0.5% violated this policy, compared to 0.3% in the last half of 2020. Of these videos, 82.1% were removed before they were reported, and 87.1% were removed within 24 hours of being posted. We believe this increase in removals is due to our guidelines now describing in greater detail what’s considered a violent threat and/or incitement to violence in terms of the content and behavior we prohibit. We’re developing our proactive detection to flag this type of content while we’re also reviewing reports from users and non-governmental organizations.
Violent and graphic content TikTok is a platform that celebrates creativity, not shock value or violence. Of the total videos removed, 8.1% violated this policy, which is on par with content removed during the last half of 2020. Of these videos, 93.4% were removed before they were reported, and 93.6% were removed within 24 hours of being posted. This is a modest improvement from our last report and the result of strengthened policies and improved detection of such content.
In the first quarter of 2021, 11,149,514 accounts were removed for violating our Community Guidelines or Terms of Service. This includes 7,263,952 suspected underage accounts that were removed from the full TikTok experience for potentially belonging to a person under the age of 13, which is less than 1% of all accounts on TikTok. In the US, we accommodate people 12 and under in TikTok for Younger Users, a curated viewing experience with additional safeguards and privacy protections designed especially for this audience. Through our partnership with Common Sense Networks, we work to provide these users with a safe and age-appropriate viewing experience.
In addition, we prevented 71,470,161 accounts from being created through automated means and removed an additional 12,378,928 videos posted by spam accounts.
TikTok has strict policies to protect users from fake, fraudulent, or misleading advertising. Advertiser accounts and paid advertising are held to these policies and must follow our Community Guidelines, Advertising Guidelines, and Terms of Service. In the first quarter of 2021, we rejected 1,921,900 ads for violating advertising policies and guidelines. We are committed to building a positive, authentic, and joyful experience when it comes to seeing advertising on TikTok, and we will continue to invest in more people and technology to ensure advertising on our platform meets our standards.
In October 2020, we launched our global bug bounty program as an extension of our vulnerability management program to proactively identify and resolve security vulnerabilities. This program strengthens our overall security maturity by encouraging global security researchers to identify and responsibly disclose bugs to our teams so we can resolve them before attackers exploit them.
In the first quarter of 2021, TikTok received 33 valid submissions, 29 of which have been resolved. We received 8 public disclosure requests, and all 8 were published. On average, it takes TikTok 3 days to pay out a bounty. With regard to response efficiency, TikTok has an average first response time of 8 hours and an average resolution time of 30 days.
At TikTok, we work to create age-appropriate environments by developing policies and tools that help promote safe and positive experiences on our platform. In January 2021, we introduced new default privacy settings for teens, setting user accounts ages 13-15 to private and restricting who can download their videos and engage with their content via comments, Duet, and Stitch. For user accounts ages 16-17, Duet and Stitch are set to Friends, and the default setting for who can download their videos is Off.
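The age-banded defaults described above can be sketched as a simple lookup. The setting names, values, and function below are assumptions for illustration only, not TikTok’s actual configuration schema:

```python
# Illustrative sketch of age-based default privacy settings.
# All keys and values are hypothetical names, not TikTok's real schema.
TEEN_PRIVACY_DEFAULTS = {
    "13-15": {
        "account": "private",       # accounts default to private
        "video_downloads": "off",   # downloads restricted
        "comments": "restricted",   # engagement via comments restricted
        "duet": "restricted",
        "stitch": "restricted",
    },
    "16-17": {
        "duet": "friends",          # Duet defaults to Friends
        "stitch": "friends",        # Stitch defaults to Friends
        "video_downloads": "off",   # downloads default to Off
    },
}

def defaults_for_age(age: int) -> dict:
    """Return default privacy settings for a new account of a given age."""
    if 13 <= age <= 15:
        return TEEN_PRIVACY_DEFAULTS["13-15"]
    if 16 <= age <= 17:
        return TEEN_PRIVACY_DEFAULTS["16-17"]
    return {}  # 18+: standard defaults apply
```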
This is a meaningful step toward driving higher default standards of privacy and safety for minors. These updates build on previously age-restricted TikTok features: direct messaging and live streaming, which aren’t available to accounts under 16, and virtual gifting, which isn’t available to accounts under 18. Our goal is to help people make informed choices about what they share and with whom, and by engaging teens early in their privacy journey, we hope to empower them to carefully manage and shape their online presence. We also aim to engage parents and guardians by equipping them with tools and resources to have conversations with their teens about digital literacy and safety. For instance, our Family Pairing features enable a parent or guardian to link their TikTok account to their teen’s and manage a variety of privacy, browsing, and screen time settings.
The immense diversity of our community is part of what makes TikTok a wonderful place to explore and enjoy entertaining content, from new recipes to different cultural traditions and even ideas for a summer garden. We work to foster a supportive and welcoming environment for everyone, and in that spirit we introduced new tools to promote kindness and civility among our community.
We launched a prompt that asks people to consider the impact of their words before posting a potentially unkind comment.
We introduced a way for creators to filter all comments on their content so that only the comments they approve appear on their videos.
We announced a partnership with the Cyberbullying Research Center to advance our knowledge of bullying and develop new anti-bullying initiatives.
We want people to feel comfortable and confident expressing themselves exactly as they are on TikTok. To support body inclusivity, in February we introduced access to well-being resources developed with eating disorder experts to help people identify negative self-talk, reflect on their own positive attributes and strengths, or support a friend who may be struggling. We also published permanent public service announcements (PSAs) on relevant hashtags aimed at raising awareness and fostering support for recovery among those affected by eating disorders.
Protecting platform integrity
We continue our work to counter content and behaviors that seek to jeopardize the integrity of our platform. To aid this goal as it relates to illegal activities and regulated goods, we joined the Coalition to End Wildlife Trafficking Online, which enables us to collaborate with industry peers on best practices and emerging content trends.
In addition, TikTok will remove or limit misleading information as it’s identified, and we partner with 11 organizations accredited by the International Fact-Checking Network to support fact-checking in 21 languages and 57 markets across the Americas, APAC, and EMEA. If fact checks confirm content is false, we’ll remove or limit it from our platform. In the first three months of this year, we introduced a feature aimed at reducing the spread of potential misinformation by informing the viewer when unsubstantiated content has been identified.