Meta has introduced Teen Accounts for Facebook and Messenger, building on the safety-focused experience initially launched on Instagram.
These accounts are designed to create a more secure and age-appropriate environment for younger users, with default settings that limit exposure to inappropriate content and reduce the risk of unwanted interactions.
The rollout begins in the U.S., U.K., Australia, and Canada, with plans for broader global expansion.
The initiative comes amid growing regulatory interest and public concern about the impact of social media on youth mental health.
In response, Meta has positioned Teen Accounts as part of a wider effort to implement consistent safeguards across its platforms for users under 18.
This includes automatic privacy settings, restricted interactions, and parental controls intended to align with policy discussions underway in several regions to regulate teens' access to digital platforms.
Built-In Safeguards and Privacy Defaults
Teen Accounts introduce a set of privacy-focused features that are activated by default. Messaging capabilities, for example, are limited: only people a teen has previously followed or interacted with can initiate contact.
Story replies, tags, mentions, and comments are similarly restricted to approved connections, helping limit unwanted outreach.
For users under 16, any attempt to change these default settings requires parental consent. Meta has applied the same requirement to other features: going live and turning off the automatic filter that blurs images containing suspected nudity in direct messages both require a parent's approval.
These consent requirements aim to involve parents more directly in managing teens' online experiences.
Usage Management and Screen Time Awareness
To address concerns around excessive screen time, the accounts include digital well-being tools such as a reminder to take a break after 60 minutes of use and Quiet Mode, which activates automatically during nighttime hours.
These nudges are intended to encourage healthier online behavior and reduce late-night usage, reflecting ongoing debates around the relationship between screen time and mental health in adolescents.
Parental Feedback and Adoption Metrics
Meta reports that around 54 million teens have been moved onto Teen Accounts since the system launched on Instagram.
According to internal data, 97% of users aged 13 to 15 have kept the default protections in place, indicating a high retention rate.
While the company has not detailed how many teens globally remain outside of these safeguards, it views this rollout as a step toward broader platform consistency.
Meta also cites an Ipsos survey in which 94% of U.S. parents described Teen Accounts as helpful, with 85% agreeing that the features contribute to a more positive online experience for their children.
Standardizing Protections Across Meta Platforms
The expansion of Teen Accounts to Facebook and Messenger reflects Meta’s broader strategy to unify safety measures across its digital ecosystem.
By offering a consistent set of features—spanning parental oversight, content filtering, and usage control—Meta aims to simplify how families navigate online safety across its suite of apps.
While the full scope of restrictions has not been publicly disclosed, the approach appears to combine technical safeguards with behavioral design to promote safer interactions.