Displaying items by tag: Online Safety Act

Ofcom has published draft codes of practice requiring tech firms to shield children from "toxic" content by strengthening age checks and modifying their algorithms. The move follows criticism from parents over the slow pace of regulatory change after children were harmed by online challenges. Companies that fail to comply could face bans for under-18 users and be publicly named. Meta and Snapchat have pointed to their existing protections for minors, but the broader industry response has been muted. Ofcom's Dame Melanie Dawes highlighted the severity of harmful content recurring in children's social feeds, calling the new rules a significant step towards safeguarding young users. The measures, made under the Online Safety Act, are due to be enforced from the second half of 2025, with tech companies required to assess risks and adjust their services accordingly. The UK government and Ofcom are urging platforms to engage immediately to prevent harmful exposure, while bereaved parents continue to press for stronger action and for mental health education in schools.


The UK's Online Safety Act, which became law last week, introduces new rules aimed at protecting children online. Ofcom has unveiled its first draft codes of practice under the act, focusing on illegal material such as grooming content, fraud, and child sexual abuse. To limit grooming, the draft rules would remove children from suggested friend lists, keep their location data private, and restrict who can send them direct messages. Ofcom will publish further codes in the coming months, each requiring parliamentary approval, with the aim of enforcing them by the end of next year. The act also encourages the use of technology to identify illegal images of abuse and prevent their dissemination.
