
Meta is expanding its “Teen Accounts” feature to Facebook and Messenger starting Tuesday, introducing new privacy and safety settings aimed at younger users. The update builds on the Teen Accounts settings Meta introduced on Instagram last year.
The move comes as the tech giant faces growing criticism from lawmakers and advocacy groups, who argue that social media platforms aren’t doing enough to shield teenagers from harmful content and addictive design.
Why it matters
Meta’s decision arrives at a time when US lawmakers are preparing to revisit key legislation such as the Kids Online Safety Act (KOSA), which seeks to regulate how platforms engage with young users.
Meta, along with TikTok parent ByteDance and Google’s YouTube, is already battling hundreds of lawsuits filed by school districts and families. The lawsuits accuse these companies of harming children’s mental health by encouraging compulsive use through algorithm-driven content.
In 2023, 33 states, including California and New York, sued Meta for allegedly misleading the public about the risks its platforms pose to young people.
What’s changing
According to Meta, users under the age of 16 will need parental permission to go live on Facebook or Messenger. The platforms will also automatically blur images that may contain nudity in direct messages, a feature already available on Instagram.
“We will start including these updates in the next couple of months,” the company stated.
The feature rollout aligns with renewed momentum in Congress, where two online safety bills — KOSA and the Children and Teens’ Online Privacy Protection Act — passed the Senate last year. Although the House didn’t bring KOSA to a vote in 2024, recent hearings suggest lawmakers still intend to pursue legislation to protect children online.
Currently, platforms like Facebook, Instagram, and TikTok allow users aged 13 and above to sign up.