Apr 8, 2025

Meta expands teen accounts to Facebook and Messenger for safer usage

Highlights
  • Meta has rolled out updates to enhance parental controls and protect younger users on Instagram, Facebook, and Messenger.
  • Users under 16 are placed into Teen Accounts by default, with restricted features: parental permission is required to livestream or to turn off content protections.
  • While these changes are seen as positive for teen safety, concerns remain about their effectiveness and their actual impact in reducing exposure to harmful content.
Story

In recent months, Meta has significantly expanded the safety features it applies to younger users across Instagram, Facebook, and Messenger. Users under 16 are now placed into Teen Accounts by default, with stringent safety measures applied automatically. These measures limit content exposure and tighten privacy settings, and younger users need parental consent to use certain features, such as live streaming or turning off nudity protections in direct messages.

The Teen Accounts system first launched on Instagram in September 2024 and has since moved around 54 million teenagers globally into this more controlled setting. Accounts belonging to younger users are set to private by default and carry tighter messaging restrictions, giving parents greater assurance about who can reach their children. Meta says it is listening to parental feedback and will continue adjusting safety features in response to ongoing research into young users' experiences.

Despite these advances, campaigners and child safety experts have raised concerns about whether Teen Accounts genuinely protect adolescents from inappropriate or harmful content. Critics argue that the actual impact of the changes remains unclear, leaving parents to question whether the measures adequately shield their children from undesirable content surfaced by the platforms' recommendation algorithms.

Meanwhile, regulators, particularly in the UK with the introduction of the Online Safety Act, are pushing for firmer control of content targeting minors, putting social media giants like Meta under increasing pressure to prioritize child safety. While the new restrictions are viewed as a proactive step, experts emphasize that further accountability from tech companies is essential to ensure children are not inadvertently exposed to dangerous or harmful material online.
