Displaying items by tag: social media platforms

Ofcom has introduced draft codes of practice requiring tech firms to shield children from "toxic" content by strengthening age checks and modifying recommendation algorithms. The move follows criticism from parents over the slow pace of regulatory change after tragic incidents in which children were harmed by online challenges. Companies that fail to comply could see their services barred to users under 18 and be publicly named. Meta and Snapchat have pointed to their existing protections for minors, but broader industry responses have been tepid. Ofcom's Dame Melanie Dawes highlighted the severity of harmful content recurring in children's social feeds, calling the new regulations a significant step towards safeguarding young users. The measures, part of the Online Safety Act, are due to be enforced from the second half of 2025, with tech companies required to assess risks and adjust their services accordingly. The UK government and Ofcom are urging tech platforms to engage immediately to prevent harmful exposure, while bereaved parents continue to campaign for stronger action and for mental health education to be included in schools.
