Meta has unveiled “Teen Accounts” on Instagram, a significant step to enhance parental control and ensure safer social media use for users under 18. This initiative addresses growing concerns about the negative effects of social media on young people, particularly as mental health issues rise.
Key Features of Teen Accounts
The new Teen Accounts come with several features designed to empower parents and protect children:
- Enhanced Parental Controls: Parents can set daily usage limits and restrict app access during nighttime hours, encouraging healthier screen-time habits among teens.
- Private Accounts by Default: All Teen Accounts will be private, meaning users can only be messaged or tagged by accounts they already follow, minimizing unwanted interactions.
- Sensitive Content Restrictions: For users under 16, sensitive content settings will be at their most restrictive, and teens can only change these settings with parental permission, creating a safer online environment.
- Activity Monitoring: Parents will receive insights into their child's activity, including who their teen has messaged and the categories of content viewed, which can help facilitate important conversations about online behavior.
Addressing Growing Concerns
There is increasing scrutiny of platforms like Instagram, TikTok, and YouTube due to rising mental health issues among young users. Studies link excessive social media use to heightened anxiety, depression, and learning disabilities. In response, Meta aims to provide a safer space through these new features.
Meta plans to roll out these Teen Accounts within 60 days in key markets, including the US, UK, Canada, and Australia. This initiative aligns with recent legislative efforts, such as the Kids Online Safety Act, reflecting a commitment to a safer digital environment for young users. With these features, families can navigate social media together, ensuring a more positive online experience for teens.