Meta expands ‘teen account’ safety features to Facebook and Messenger
Meta Platforms has extended its “Teen Accounts” feature to Facebook and Messenger, broadening efforts to protect younger users online.
The company initially launched these enhanced privacy tools on Instagram last year and is now bringing them to its other major platforms amid growing scrutiny over child safety on social media.
The new safeguards aim to address ongoing concerns about teens' exposure to harmful content and the amount of time they spend online. Features include strengthened parental controls and privacy protections, such as requiring users under 16 to obtain parental approval before going live or before turning off the automatic blurring of images that may contain nudity in direct messages.
Meta’s move comes as pressure intensifies from lawmakers and advocacy groups demanding stronger protections for children and teens online. Legislators in the U.S. are advancing key bills like the Kids Online Safety Act (KOSA) and the Children and Teens’ Online Privacy Protection Act, which would hold tech companies accountable for the effects of their platforms on younger users.
Major social media companies, including Meta, TikTok (owned by ByteDance), and Google’s YouTube, are already facing hundreds of lawsuits from parents and school districts over the alleged addictive nature of their services and the psychological harm they may cause.
In 2023, 33 U.S. states, including California and New York, sued Meta for allegedly downplaying the risks its platforms pose to young users. Although the U.S. House of Representatives did not bring KOSA to a vote last year, recent hearings suggest lawmakers are still determined to push forward with new digital safety regulations.
Meta confirmed that the new Facebook and Messenger teen safety features will begin rolling out in the coming months. The company, along with other major platforms like TikTok and Instagram, currently allows users aged 13 and older to join its services.