Snapchat is unveiling a set of new safety measures today to better protect its teenage users, following the lead of other social platforms like Facebook and Instagram. The updates aim to make it harder for strangers to contact teens, create a more age-appropriate experience, combat accounts promoting inappropriate content, and expand educational resources for teen users. The company is also rolling out additional support for parents and families, including a new website and an explainer series on YouTube.
This announcement follows events from almost two years ago, when Snap faced congressional scrutiny over its app's 13+ age rating on the App Store amid concerns about inappropriate content, including sexualized material, ads, articles about alcohol, and pornography. Snap has since faced issues with drug dealers using the platform, which has resulted in the tragic deaths of several teenagers. Earlier this year, law enforcement reports indicated an increase in child luring and exploitation on the app.
The new safeguards introduced today are designed to address some of these concerns, with features aimed at better protecting 13- to 17-year-olds from online risks. For instance, Snapchat will now show an in-app warning when a minor adds a friend who has no mutual connections with them or isn't in their contacts, encouraging teens to think carefully about who they connect with online.
Additionally, Snap is raising the bar for when minors can appear in search results. Today, Snap requires 13- to 17-year-olds to have several mutual friends with another user before they can show up in search results or be suggested as a friend. Going forward, the number of mutual friends required will scale with a teen's own friend count, making it harder for teens to connect with people they don't know.
Instagram has also introduced various features in previous years to restrict teen interactions with unknown adults, including hiding minor accounts from search and discovery.
Snap is also implementing a three-part strike system across Stories and Spotlight, the surfaces where public content can reach a wide audience. Under this system, Snap will promptly remove inappropriate content that it detects or that is reported to it. Accounts that attempt to circumvent Snap's rules will face bans.
Snapchat will now prominently surface resources, including hotlines for assistance, in its Stories and Search features to support young people encountering sexual risks, such as catfishing, financial sextortion, the sharing of explicit images, and more.
The company emphasizes its commitment to banning accounts of users involved in severe harm, such as threats to others’ physical or emotional well-being, sexual exploitation, and illicit drug sales.
Snap's new features were informed by feedback from the National Center on Sexual Exploitation (NCOSE), while the in-app educational resources were developed in collaboration with the National Center for Missing and Exploited Children (NCMEC).
For parents, Snap is launching an online resource at parents.snapchat.com and a new YouTube series to help families understand how Snapchat works and how to utilize the app’s parental controls.