Meta Expands Teen Safety Features with New Account Options

Meta is enhancing safety for teens on its platforms by introducing Teen Accounts on Facebook and Messenger, alongside a new School Partnership Program for educators to report bullying.

Meta is taking significant steps to improve safety for young users across its platforms. In September 2024, the company launched Teen Accounts on Instagram, which come equipped with built-in safeguards designed to limit who can contact teens, control the content they see, and manage their time spent on the app. The initial response has been overwhelmingly positive, with 97% of teens aged 13 to 15 opting to retain the default settings, and 94% of parents finding the Teen Accounts beneficial.

Following the successful introduction on Instagram, Meta is now expanding these protections to Facebook and Messenger globally. This move aims to enhance safety standards across the apps that teens frequently use, ensuring a more secure online environment.

Teen Accounts automatically implement various safety limits, addressing parents’ primary concerns while empowering teens with greater control over their online experiences. Adam Mosseri, head of Instagram, underscored the initiative’s purpose, stating, “We want parents to feel good about their teens using social media. … Teen Accounts are designed to give parents peace of mind.”

Despite these advancements, some critics argue that the measures may not be sufficient. A study conducted by child-safety advocacy groups and researchers at Northeastern University found that only eight of 47 tested safety features were fully effective. Internal documents indicated that Meta was aware of certain shortcomings in its safety measures. Critics have also pointed out that some protections, such as manual comment-hiding, place the onus on teens rather than preventing harm proactively, and they have questioned the robustness of time-management tools, which drew mixed evaluations even when working as designed.

In response to the criticisms, Meta stated, “Misleading and dangerously speculative reports such as this one undermine the important conversation about teen safety. This report repeatedly misrepresents our efforts to empower parents and protect teens, misstating how our safety tools work and how millions of parents and teens are using them today.” The company emphasized that Teen Accounts lead the industry by providing automatic safety protections and straightforward parental controls. According to Meta, teens utilizing these protections encountered less sensitive content, experienced fewer unwanted contacts, and spent less time on Instagram during nighttime hours. Additionally, parents have access to robust tools for limiting usage and monitoring interactions. Meta has committed to continuously improving its tools and welcomes constructive feedback.

Alongside the enhancements to Teen Accounts, Meta is also extending its safety initiatives to educational institutions. The newly launched School Partnership Program is now available to all middle and high schools in the United States. This program allows educators to report issues such as bullying or unsafe content directly from Instagram, with reports receiving prioritized review typically within 48 hours.

Educators who have participated in pilot programs have praised the improved response times and enhanced protections for students. Beyond the app and school initiatives, Meta has partnered with Childhelp to develop a nationwide online safety curriculum tailored for middle school students. This curriculum aims to educate students on recognizing online exploitation, understanding the steps to take if a friend needs help, and effectively using reporting tools.

The program has already reached hundreds of thousands of students, with a goal of teaching one million middle school students in the upcoming year. A peer-led version, developed in collaboration with LifeSmarts, empowers high school students to share the curriculum with their younger peers, making discussions about safety more relatable.

For parents, the introduction of Teen Accounts means that additional protections are in place without requiring complex setups. Teens benefit from safer defaults, providing parents with peace of mind. The School Partnership Program offers educators a direct line to Meta, ensuring that reports of unsafe behavior receive prompt attention. Students also gain from a curriculum designed to equip them with practical tools for navigating online life safely.

However, the pushback from critics highlights ongoing debates about whether these safeguards are adequate. While Meta maintains that its tools function as intended, watchdog organizations argue that protecting teens online necessitates even stronger measures. As teens increasingly engage with digital platforms, the responsibility to ensure their safety intensifies.

The expansion of Teen Accounts represents a significant shift in how social media platforms approach safety. By integrating built-in protections, Meta aims to mitigate risks for teens without requiring parents to manage every setting. The School Partnership Program further empowers educators to protect students in real time, while the online safety curriculum teaches children how to identify threats and respond effectively.

As the conversation around teen safety continues, the effectiveness of these new tools will be put to the test against the evolving landscape of online threats. The question remains: Are Meta’s new measures sufficient to protect teens, or do tech companies need to implement even more robust safeguards?
