- Teen Accounts: Increased Privacy and Parental Controls
- Addressing Concerns: Research and Lawsuits
- A Shift in Approach: Policy and Future Plans
- What are some of the new privacy features that Meta has implemented for Instagram accounts belonging to users under 18?
- What prompted Meta to implement these new features for Instagram accounts belonging to teenagers?
- How will these changes affect how teenagers interact with Instagram?
- How are parents involved in these changes?
- What are some of the legal challenges that social media platforms are facing?
- Moving Forward: Balancing Safety and Innovation
In a move aimed at addressing mounting concerns over the impact of social media on teenagers, Meta has introduced enhanced privacy features and parental controls for Instagram accounts belonging to users under 18. The shift reflects a growing awareness of the potential harms of excessive social media use, particularly among young people.
Teen Accounts: Increased Privacy and Parental Controls
All Instagram accounts belonging to teenagers will automatically transition to "Teen Accounts" and default to private profiles. This means teens can only receive messages and be tagged by accounts they follow or are already connected with. Additionally, settings for sensitive content will be stricter, limiting exposure to potentially harmful material.
Users under 16 years old can only modify these default settings with parental permission. Parents will also have access to a range of controls to monitor their children's interactions, restrict app usage, and limit the amount of time spent on the platform.
Addressing Concerns: Research and Lawsuits
The move comes amid a growing body of research suggesting a possible link between excessive social media use and higher rates of depression, anxiety, and learning difficulties, especially among young people. Social media platforms like Instagram, TikTok, and YouTube have faced numerous lawsuits over the addictive nature of their services, filed on behalf of children and school districts.
Last year, 33 states in the US, including California and New York, filed lawsuits against these companies, alleging that they misled the public about the potential harm of their platforms. Currently, platforms like Facebook, Instagram, and TikTok allow users aged 13 and above to register.
A Shift in Approach: Policy and Future Plans
Meta's recent move follows a three-year pause in the development of a separate Instagram app specifically for teenagers, a decision made under pressure from lawmakers and advocacy groups over safety concerns. In July, the US Senate passed two online safety bills, the Kids Online Safety Act and the Children and Teens' Online Privacy Protection Act, which would hold social media companies accountable for the impact of their platforms on children and teenagers. These legislative efforts underscore the increasing pressure on social media companies to prioritize the safety and well-being of young users.
What are some of the new privacy features that Meta has implemented for Instagram accounts belonging to users under 18?
Instagram accounts belonging to teenagers now automatically transition to "Teen Accounts" and default to private profiles. This means teens can only receive messages and be tagged by accounts they follow or are already connected with. Settings for sensitive content will also be stricter.
What prompted Meta to implement these new features for Instagram accounts belonging to teenagers?
These new features were implemented in response to concerns regarding the potential negative effects of social media on teenagers, especially the potential link between excessive social media use and higher rates of depression, anxiety, and learning difficulties.
How will these changes affect how teenagers interact with Instagram?
Teenagers will only be able to receive messages and be tagged by accounts they follow or are already connected with. They will also have stricter settings for sensitive content. These changes aim to limit exposure to potentially harmful content and interactions.
How are parents involved in these changes?
Parents will have access to a range of controls to monitor their children's interactions and restrict app usage. Users under 16 years old can only modify the default settings with parental permission.
What are some of the legal challenges that social media platforms are facing?
Social media platforms like Instagram, TikTok, and YouTube have faced numerous lawsuits concerning the addictive nature of their platforms, filed on behalf of children and school districts. Last year, 33 states in the US filed lawsuits against these companies, alleging that they misled the public about the potential harm of their platforms.
Moving Forward: Balancing Safety and Innovation
The changes implemented by Meta are a significant step towards addressing concerns about the potential risks of social media for teenagers. The move also signals a broader shift in the tech industry's approach to online safety, with platforms increasingly acknowledging the need for stronger measures to protect young users. As technology continues to evolve, finding a balance between innovation and ensuring the well-being of children remains a critical challenge for social media companies, policymakers, and parents alike.