Meta Expands Safety Features for Teen Accounts on Instagram, Facebook, and Messenger
Meta is rolling out enhanced restrictions aimed at protecting minors from inappropriate interactions and content on its platforms. The company is extending Instagram's existing Teen Account safeguards to Facebook and Messenger, and adding new limitations for teenage Instagram users.
Starting today, Facebook and Messenger will introduce Teen Accounts in the United States, the United Kingdom, Australia, and Canada, with plans to expand to additional regions shortly. While Meta has not detailed the exact protections, they are expected to closely resemble those introduced with Instagram's Teen Accounts. As on Instagram, the changes will apply automatically to both new and existing accounts belonging to users under 18. Older teens will be able to turn the protections off, while those under 16 will need parental consent, granted through Meta's supervision tools, to change any of the settings.
Key features of the current Teen Account protections include restrictions on messaging and interactions with unknown users, along with stricter controls on exposure to sensitive content. To encourage healthier digital habits, teens receive reminders to stop using the app after 60 minutes each day and are placed in a sleep mode that silences notifications between 10 PM and 7 AM.
These initiatives are part of Meta's broader response to growing concerns over child safety on its platforms. Both Facebook and Instagram are under investigation by the European Union for potential failures to safeguard minors, and a separate lawsuit filed in the United States in 2023 accuses Meta of fostering an environment that poses risks to children.
In the coming months, Meta plans to introduce further protections for Teen Accounts on Instagram. These will block minors from starting live broadcasts and prevent them from turning off the feature that blurs images in direct messages when nudity is suspected. Users under 16 will need parental permission to modify or remove these new restrictions, which are intended to reduce the risk of contact between children and strangers on the platform.
Source: www.theverge.com