Tech firms face demands to stop illegal content going viral

TruthLens AI Suggested Headline:

"Ofcom Proposes New Measures to Enhance Online Safety and Prevent Viral Illegal Content"

AI Analysis Average Score: 8.0
These scores (0-10 scale) are generated by TruthLens AI's analysis, assessing the article's objectivity, accuracy, and transparency. Higher scores indicate better alignment with journalistic standards.

TruthLens AI Summary

Ofcom, the UK regulator, has proposed new online safety measures aimed at preventing illegal content from going viral and strengthening protections for children online. The consultation, published on Monday, seeks input on a range of initiatives that would hold tech platforms more accountable for the content they host. Among the proposals is a requirement for certain platforms to proactively detect and manage terrorist material, as well as mechanisms that would allow users to report livestreams depicting imminent physical harm. Ofcom's online safety group director, Oliver Griffiths, emphasized that the measures are designed to keep pace with the evolving nature of technology and its associated risks. The proposals target a range of issues, including intimate image abuse and the dangers of witnessing harmful content during livestreams, with requirements tailored to platforms of different sizes and types. For instance, while all user-to-user sites would need to provide reporting mechanisms for harmful livestreams, only the largest tech firms would be required to use technology that detects content harmful to minors.

However, some critics argue that the proposed measures do not address systemic weaknesses in the Online Safety Act itself. Ian Russell, chair of the Molly Rose Foundation, said Ofcom's approach lacks ambition and fails to offer comprehensive solutions to the dangers posed by social media platforms, and called on the government to strengthen the Online Safety Act so that tech companies are compelled to identify and mitigate risks. The consultation, which remains open until 20 October 2025, aims to gather feedback from a range of stakeholders, including service providers and the public. It comes in the wake of tech platforms adjusting their services, such as TikTok raising the minimum age for livestreaming from 16 to 18 following concerns about child safety. These developments reflect a growing recognition of the need for stricter regulation to protect vulnerable users online.


Unanalyzed Article Content

Tech platforms could be forced to prevent illegal content from going viral and to limit people's ability to send virtual gifts to, or record, a child's livestream, under further online safety measures proposed by Ofcom. The UK regulator published a consultation on Monday seeking views on further protections to keep citizens, particularly children, safer online. These could also include making some larger platforms assess whether they need to proactively detect terrorist material.

Oliver Griffiths, online safety group director at Ofcom, said the proposed measures seek to build on existing UK online safety rules while keeping up with "constantly evolving" risks. "We're holding platforms to account and launching swift enforcement action where we have concerns," he said. "But technology and harms are constantly evolving, and we're always looking at how we can make life safer online."

The consultation highlighted three main areas in which Ofcom thinks more could be done. The BBC has approached TikTok, livestreaming platform Twitch and Meta - which owns Instagram, Facebook and Threads - for comment.

Ofcom's range of proposals targets a number of issues - from intimate image abuse to the danger of people witnessing physical harm on livestreams - and varies in the type or size of platform it could apply to. For example, a proposal that providers offer a mechanism letting users report a livestream whose content "depicts the risk of imminent physical harm" would apply to all user-to-user sites that allow a single user to livestream to many, where there may be a risk of showing illegal activity. Meanwhile, potential requirements for platforms to use proactive technology to detect content deemed harmful to children would apply only to the largest tech firms, which present higher risks of relevant harms.

"Further measures are always welcome but they will not address the systemic weaknesses in the Online Safety Act," said Ian Russell, chair of the Molly Rose Foundation - an organisation set up in memory of his 14-year-old daughter Molly Russell, who took her own life after viewing thousands of images promoting suicide and self-harm. He added that Ofcom showed a "lack of ambition" in its approach to regulation. "As long as the focus is on sticking plasters, not comprehensive solutions, regulation will fail to keep up with current levels of harm and major new suicide and self-harm threats," Mr Russell said. "It's time for the prime minister to intervene and introduce a strengthened Online Safety Act that can tackle preventable harm head on by fully compelling companies to identify and fix all the risks posed by their platforms."

The consultation is open until 20 October 2025, and Ofcom hopes to get feedback from service providers, civil society, law enforcement and members of the public. It comes as tech platforms look to bring their services in line with the UK's sweeping online safety rules, which Ofcom has been tasked with enforcing. Some have already taken steps to clamp down on features that experts have warned may expose children to grooming, such as livestreaming. In 2022, TikTok raised its minimum age for going live on the platform from 16 to 18, shortly after a BBC investigation found hundreds of accounts going live from Syrian refugee camps with children begging for donations. YouTube recently said it would raise its threshold for users to livestream to 16, from 22 July.

Source: BBC News