
Snapchat Introduces New Security Measures to Protect Our Community

Creating a safe and positive experience for Snapchatters is a top priority, which is why we’re always trying to do more to help keep our community safe. As a messaging platform for real friends, our goal is to help Snapchatters communicate with people who matter to them and to ensure that the content they view on our app is informative, fun, and age-appropriate.

With that goal in mind, today we are announcing new features to further protect 13-17-year-olds from potential online risks. These features, which will begin to roll out in the coming weeks, are designed to 1) protect teens from being contacted by people they may not know in real life; 2) provide a more age-appropriate viewing experience on our content platform; and 3) enable us to more effectively remove accounts that may be trying to market and promote age-inappropriate content through a new strike system and new detection technologies. 

In addition, we are releasing new resources for families, including an updated parents’ guide at parents.snapchat.com that covers our protections for teens and our tools for parents, as well as a new YouTube explainer series.

Safer Contact  

Most teens use Snapchat to chat with their close friends through pictures and text, much as they would over text messaging or phone calls. When a teen becomes friends with someone on Snapchat, we want to be confident it is someone they know in real life — such as a friend, family member, or other trusted person. To help encourage this, we already require teens to be Snapchat friends or phone book contacts with another user before they can begin communicating.

Building on this, we will begin to roll out additional protections for 13-17-year-olds, designed to further limit unwanted interactions or potentially risky contact, including:

In-App Warnings: We are launching a new feature that sends a pop-up warning to a teen if they add a friend when they don’t already share mutual friends or the person isn’t in their contacts. This message will urge the teen to carefully consider if they want to be in contact with this person and not to connect with them if it isn’t someone they trust.

Stronger Friending Protections: We already require a 13-17-year-old to have several friends in common with another user before they can show up in Search results or as a friend suggestion. We are raising this bar to require a greater number of friends in common based on the number of friends a Snapchatter has – with the goal of further reducing the ability for teens to connect with people they may not already know. A simplified illustration of both safeguards follows below.
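To make these two safeguards easier to picture, here is a minimal sketch of how this kind of gating could work. Snap has not published its actual rules; the thresholds, field names, and the way the required number of mutual friends scales with account size in this sketch are illustrative assumptions only.

    # Illustrative only: Snap's real thresholds and logic are not public.
    from dataclasses import dataclass, field

    @dataclass
    class User:
        friend_ids: set                                      # ids of this user's Snapchat friends
        phone_number: str = ""
        contact_numbers: set = field(default_factory=set)    # numbers in the user's phone book

    def should_warn_on_add(teen: User, target: User) -> bool:
        """Show the pop-up warning when a teen adds someone who shares no mutual
        friends and is not in the teen's phone contacts (assumed heuristic)."""
        shares_mutuals = bool(teen.friend_ids & target.friend_ids)
        in_contacts = target.phone_number in teen.contact_numbers
        return not (shares_mutuals or in_contacts)

    def required_mutuals(friend_count: int) -> int:
        """Hypothetical sliding threshold: the more friends a teen has, the more
        mutual friends another user must share to appear in Search or suggestions."""
        if friend_count < 50:
            return 2
        if friend_count < 200:
            return 5
        return 10

    def can_appear_in_suggestions(teen: User, candidate: User) -> bool:
        mutuals = len(teen.friend_ids & candidate.friend_ids)
        return mutuals >= required_mutuals(len(teen.friend_ids))

The key idea the sketch tries to capture is that the bar for discoverability rises as a teen’s friend list grows, so a large network does not make it easier for strangers to surface in Search or suggestions.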

Across Snapchat, we prohibit illegal and harmful content such as sexual exploitation, pornography, violence, self-harm, misinformation, and more. To enforce our policies and take quick action to help protect our community, we have long used a zero-tolerance approach for users who try to commit severe harms, such as threats to another user’s physical or emotional well-being, sexual exploitation, and the sale of illicit drugs. If we find an account engaging in this activity, we immediately ban it, apply measures to prevent the account holder from getting back on Snapchat, and may report that account holder to law enforcement.

New Strike System for Accounts Promoting Age-Inappropriate Content

While Snapchat is most commonly used for communicating with friends, we offer two content platforms — Stories and Spotlight — where Snapchatters can find public Stories published by vetted media organizations, verified creators, and other Snapchatters. On these public content platforms, we apply additional content moderation to prevent violating content from reaching a large audience. 

To help remove accounts that market and promote age-inappropriate content, we recently launched a new Strike System. Under this system, we immediately remove inappropriate content that we proactively detect or that gets reported to us. If we see that an account is repeatedly trying to circumvent our rules, we will ban it. You can learn more about our new Strike System here.
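As a rough illustration of how a strike system of this kind might be structured (the strike threshold and the exact enforcement actions below are assumptions, not Snap’s published policy):

    # Illustrative sketch of a content-moderation strike system.

    def remove_content(content_id: str) -> None:
        """Stub standing in for the actual takedown pipeline."""
        print(f"removed content {content_id}")

    class StrikeSystem:
        def __init__(self, ban_threshold: int = 3):
            self.ban_threshold = ban_threshold    # strikes before a ban (assumed value)
            self.strikes: dict[str, int] = {}     # account_id -> strike count
            self.banned: set[str] = set()

        def handle_violation(self, account_id: str, content_id: str) -> None:
            """Immediately remove detected or reported content and record a strike;
            accounts that repeatedly violate the rules are banned."""
            remove_content(content_id)
            self.strikes[account_id] = self.strikes.get(account_id, 0) + 1
            if self.strikes[account_id] >= self.ban_threshold:
                self.banned.add(account_id)
                # Further measures (device-level blocks, law-enforcement referral)
                # sit outside this sketch.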

Education About Common Online Risks

In recent years, and especially with online communications becoming more prevalent during the pandemic, young people have become more vulnerable to a range of sexual risks, including catfishing, financial sextortion, and the taking and sharing of explicit images.

As we continue to bolster our defenses against these risks, we also want to use our reach with young people to help them spot likely signs of this type of activity — and know what to do if they encounter it. Today we are also releasing new in-app content that helps explain these risks and shares important resources for Snapchatters, such as hotlines to contact for help. This content will be featured on our Stories platform and surfaced to Snapchatters when they search for relevant terms or keywords.

Several of our new product safeguards were informed by feedback from The National Center on Sexual Exploitation (NCOSE). Our new in-app educational resources were developed with The National Center for Missing and Exploited Children (NCMEC). We are grateful for their recommendations and contributions.

Our commitment to making our platform safer is always on, and we will continue to build on these improvements in the coming months with additional protections for teen Snapchatters and support for parents.