The UK’s communications regulator Ofcom confirmed that more than a dozen leading platforms have pledged to adopt “highly effective” age-assurance measures by 25 July, including credit-card checks, facial recognition, and open banking verification, to prevent under-18s from encountering adult material online.
Under the Online Safety Act, sites that fail to deploy these safeguards by the deadline risk fines of up to £18 million or 10 per cent of qualifying worldwide revenue, whichever is greater, as well as being blocked in the UK.
In a statement, Oliver Griffiths, Ofcom’s Group Director of Online Safety, said: “By 25 July, all sites and apps that allow [explicit material] must use highly effective age checks to ensure children are not normally able to encounter it.”
Ofcom has indicated it will actively monitor compliance and take enforcement action against non-compliant operators.
The introduction of age checks marks a significant shift towards safer online access for children, balancing privacy for adults with stronger protection for minors.
Independent research shows widespread underage exposure to explicit material online. Ofcom has found that eight per cent of children aged eight to 14 visit adult sites in a given month, including around three per cent of eight- and nine-year-olds.
The Online Safety Act, which received Royal Assent in October 2023, mandates robust age-assurance systems for platforms hosting explicit material, as well as for social media and search services.
Earlier regulatory attempts, dating back to the Digital Economy Act 2017, were scrapped amid privacy and practicality concerns; the policy has now been revived with real enforcement teeth.
The move reflects growing public pressure to bring online content into line with real-world age restrictions, such as those for alcohol, gambling, and other age-restricted goods.