TikTok works with industry experts, non-governmental organisations, and industry associations around the world in our commitment to building a safe platform for our community. We collaborate with organisations in different regions to share best practices, create programs, and exchange ideas on safety-related topics.
We regularly work with experts in online security, wellness, digital literacy, and family safety to help provide advice and resources for you and your family.
We’re committed to fighting harmful misinformation. Our fact-checking partners help review and assess the accuracy of content across more than 60 markets. If content violates our misinformation policies, we either remove it from our platform and notify the creator, or make it ineligible for recommendation into anyone’s For You feed. If fact-checking results are inconclusive, we may add a prompt notifying viewers that the content could not be substantiated and encouraging them to reconsider before sharing it.
TikTok’s “Be Informed” video series addresses an important building block of digital citizenship: media literacy, which strengthens the ability to access, analyse, evaluate, create, and act using all forms of communication. With our “Be Informed” series, the TikTok community is encouraged to think critically about not only the content they come across online, but also the content that they themselves create. Learn more
TikTok connects people who are looking for support with important resources directly from our app. We redirect searches and hashtags – for terms provided by NEDA, or associated with unsafe content we’ve removed from our platform – to the NEDA Helpline, where NEDA can then provide our community with confidential support, tools, and resources. Learn more
It’s important for our community members to look after their well-being, which means having a healthy relationship with online apps and services. Our digital well-being features, including Daily Screen Time and Restricted Mode, empower our community to create the TikTok experience that’s right for them. We also worked with TikTok creators to launch in-feed screen time management reminders through our “You’re in Control” safety video series.
We recognise that having conversations about mental health or asking for help can be really hard, which is why we’ve partnered with leading mental health advocacy organisations to bring their message to TikTok and foster ongoing conversations around emotional well-being. We also offer reporting tools and support resources for people who need direct help or who see a video of someone referencing self-harm. Learn more
TikTok has important partnerships with leading youth safety organisations, including ConnectSafely, the Family Online Safety Institute, and the National PTA. Together, we help parents and caregivers learn about TikTok and how their teen can be a safe and savvy internet user. We actively educate families about the safety tools and controls built into TikTok and encourage ongoing conversations about digital well-being and technology use.
TikTok also provides grant support to PTA units across the US. These grants support online safety trainings and relief for families and communities in need of devices, meals, and other essentials. Learn more
TikTok takes an uncompromising stance against violent extremism on or off our platform. Our mission is to inspire creativity and bring joy, and any attempt to promote violence runs counter to that mission and our values. We are proud to be members of Tech Against Terrorism, which supports the tech industry in tackling terrorist exploitation of the internet, whilst respecting human rights.
Membership in Tech Against Terrorism provides practical and operational support to our Trust & Safety teams, who are tasked with preventing violent extremists from using our platform to perpetrate harm. We work with Tech Against Terrorism to strengthen our policies and stay up to date on current trends and scholarship.
If you or someone you love is going through a difficult time, please contact an organisation that can provide the support you need.
The Content Advisory Council brings together a group of independent online safety experts and thought leaders who can help us develop forward-looking policies and programs that not only address the challenges of today, but also plan ahead for the next set of issues that our industry will face.
The Council members we’ve assembled represent a diverse array of backgrounds and perspectives, and have spent much of their lives researching, studying, and analysing issues relevant to online platforms like TikTok, such as child safety, hate speech, misinformation, and bullying. We work with our Council to gain unvarnished views on, and advice about, our policies and practices as we continually work to improve. We’re excited to have members who bring legal, regulatory, and academic expertise, as well as the needs and perspectives of our diverse community.
Rob Atkinson, Information Technology and Innovation Foundation Rob Atkinson brings academic, private sector, and government experience, as well as knowledge of technology policy that can inform our approach to innovation.
Dorothy L. Espelage, Ph.D., William C. Friday Distinguished Professor of Education, University of North Carolina at Chapel Hill Dorothy Espelage is a leading international prevention science researcher and an expert in minor safety issues, including bullying, harassment, teen dating violence, and suicide prevention.
Hany Farid, University of California, Berkeley Electrical Engineering & Computer Sciences and School of Information Hany Farid is a renowned expert on digital image and video forensics, computer vision, deep fakes, and robust hashing.
Mary Anne Franks, University of Miami Law School Mary Anne Franks focuses on the intersection of law and technology and will provide valuable insight into industry challenges including discrimination, safety, and online identity.
Vicki Harrison, Stanford Psychiatry Center for Youth Mental Health and Wellbeing Vicki Harrison is a social worker at the intersection of social media and mental health who understands child safety issues and holistic youth needs.
Mutale Nkonde, Fellow at Berkman Klein at Harvard and Digital Society Lab at Stanford; Founder of AI for the People and AI Advisor to the UN Mutale Nkonde is an expert on issues relating to technology and race, with a research focus on race-based disinformation.
Dawn Nunziato, Chair, George Washington University Law School Dawn Nunziato is an internationally recognised expert in free speech and content regulation.
David Ryan Polgar, All Tech Is Human David Ryan Polgar is a leading voice in tech ethics, digital citizenship, and navigating the complex challenge of aligning societal interests with technological priorities.
Dan Schnur, USC Annenberg Center on Communication and UC Berkeley Institute of Governmental Studies Dan Schnur brings valuable experience and insight on political communications and voter information.