Meta Rolls Out New Instagram Safety Features for Teens and Children
Meta has announced a new set of safety tools designed to make Instagram safer for teenagers and children. These updates aim to give young users and their guardians more control and better protection against suspicious behavior, scams, or unwanted contact.
đź”’ Safer DMs for Teens
One of the major updates allows teen accounts to block and report users directly from the direct message (DM) screen. Along with that, Instagram will now display safety tips right inside the chat window. These tips are designed to help teens spot warning signs and stay safe while messaging others.
Instagram will also start showing when an account was created—including the month and year it joined the platform—at the top of new chat windows. This small detail can help teen users spot new, suspicious, or fake accounts before they engage in conversation.
👤 What Are Teen Accounts?
Any Instagram profile created by someone under the age of 18 is automatically classified as a Teen Account. These accounts:
- Are set to private by default.
- Include extra security and privacy settings.
- Allow for parental controls to be enabled.
These settings are designed to protect young users from harmful content and interactions on the app.
👪 Extra Protection for Adult-Managed Accounts Featuring Children
Meta is also making changes for adult-run Instagram accounts that feature children under 13, such as:
- Kid influencers
- Family bloggers
- Parent-managed profiles
These accounts will now have:
- Strict messaging filters to block unwanted messages.
- Hidden Words automatically enabled to filter out offensive language and comments.
- Limited visibility—meaning they won’t show up in account suggestions, helping prevent unwanted attention from adults trying to contact them.
Parents running these accounts will also see a notification at the top of their feed, prompting them to review privacy settings and confirm that protections are active.
📉 Why It Matters: The Bigger Picture
These updates come as part of a broader push by Meta to address the growing safety concerns on social media platforms, especially for younger users. The company says it is committed to building safer digital environments for teens and children.
Meta shared some data that shows the scale of the issue:
- Over 1 million Instagram accounts were flagged by users under 18 in June 2025 alone.
- The “Location Notice” feature, which warns users when they’re chatting with someone in another country, was seen over 1 million times, though only 10% clicked to learn more.
- Instagram took down about 135,000 accounts for posting inappropriate messages or sending unwanted requests to profiles featuring young children.
- Another 500,000 accounts across Instagram and Facebook that were linked to those offenders were also taken down.
🚀 What’s Next
Meta says these updates will roll out gradually over the next few months. It encourages teens, parents, and guardians to stay informed and review privacy settings regularly.
The company also reminded users that the minimum age to join Instagram is 13, and that any account featuring someone under 13 must be managed by an adult and clearly state so in the bio.
With more young users joining social platforms every day, these efforts aim to build trust and safety—while addressing concerns from parents, regulators, and child safety organizations worldwide.