Meta expands restrictions for teen users to Facebook and Messenger

Meta is expanding Teen Accounts - what it considers its age-appropriate experience for under-18s - to Facebook and Messenger. The system puts younger teens on the platforms into more restricted settings by default, with parental permission required in order to live stream or turn off image protections for messages.

It was first introduced last September on Instagram, which Meta says "fundamentally changed the experience for teens" on the platform. But campaigners say it is unclear what difference Teen Accounts has actually made.

"Eight months after Meta rolled out Teen Accounts on Instagram, we've had silence from Mark Zuckerberg about whether this has actually been effective and even what sensitive content it actually tackles," said Andy Burrows, chief executive of the Molly Rose Foundation.

He added it was "appalling" that parents still did not know whether the settings prevented their children being "algorithmically recommended" inappropriate or harmful content.

Matthew Sowemimo, associate head of policy for child safety online at the NSPCC, said Meta's changes "must be combined with proactive measures so dangerous content doesn't proliferate on Instagram, Facebook and Messenger in the first place".

But Drew Benvie, chief executive of social media consultancy Battenhall, said it was a step in the right direction.

"For once, big social are fighting for the leadership position not for the most highly engaged teen user base, but for the safest," he said.

However, he also pointed out there was a risk, as with all platforms, that teens could "find a way around safety settings".

The expanded roll-out of Teen Accounts begins in the UK, US, Australia and Canada from Tuesday.

Companies that provide services popular with children have faced pressure to introduce parental controls or safety mechanisms to safeguard their experiences. In the UK, they also face legal requirements under the Online Safety Act to prevent children from encountering harmful and illegal content on their platforms.

Roblox recently enabled parents to block specific games or experiences on the hugely popular platform as part of its suite of controls.

How Teen Accounts work depends on the self-declared age of the user. Those aged 16 to 18 will be able to toggle off default safety settings, such as having their account set to private. But 13 to 15-year-olds must obtain parental permission to turn off such settings - which can only be done by adding a parent or guardian to their account.

Meta says it has moved at least 54 million teens globally into Teen Accounts since they were introduced in September. It says that 97% of 13 to 15-year-olds have also kept its built-in restrictions.

The system relies on users being truthful about their age when they set up accounts, with Meta using methods such as video selfies to verify their information. It said in 2024 it would begin using artificial intelligence (AI) to identify teens who might be lying about their age, in order to place them back into Teen Accounts.

Findings published by the UK media regulator Ofcom in November 2024 suggested that 22% of eight to 17-year-olds lie that they are 18 or over on social media apps. Some teenagers told the BBC it was still "so easy" to lie about their age on platforms.

In the coming months, younger teens will also need parental consent to go live on Instagram or turn off nudity protection, which blurs suspected nude images in direct messages.
Concerns over children and teenagers receiving unwanted nude or sexual images, or feeling pressured to share them in potential sextortion scams, have prompted calls for Meta to take tougher action.

Prof Sonia Livingstone, director of the Digital Futures for Children centre, said Meta's expansion of Teen Accounts may be a welcome move amid "a growing desire from parents and children for age-appropriate social media".

But she said questions remained over the company's overall protections for young people from online harms, "as well as from its own data-driven and highly commercialised practices".

"Meta must be accountable for its effects on young people whether or not they use a teen account," she added.

Mr Sowemimo of the NSPCC said it was important that accountability for keeping children safe online, via safety controls, did not fall to parents and children themselves.

"Ultimately, tech companies must be held responsible for protecting children on their platforms and Ofcom needs to hold them to account for their failures."