Porn users could have their faces scanned as an age verification measure, under proposals from the communications regulator Ofcom.
Ofcom made various suggestions for protecting children from viewing explicit content online, following the newly established Online Safety Act.
A survey found that the average age at which children first see pornography is 13, with one in ten seeing it by the age of nine.
Additionally, nearly 14 million Brits, roughly a fifth of the population, watch pornography online.
However, most porn sites require little to no identification as proof of age.
Leading pornography site PornHub expressed concern about the data protection of its users, saying the collection of “highly sensitive personal information” could jeopardise their safety.
The suggestion has also faced criticism from privacy campaigners, who warn implementing these changes could be “catastrophic”.
The Online Safety Act 2023 requires websites, search engines and social media platforms to prioritise the protection and safety of children from harmful content online.
The new laws will be enforced by Ofcom, which will be given the power to issue fines to those who fail to abide by the regulations.
Ofcom has since outlined its expectation that online platforms become "highly effective" at complying with the new regulations, which are expected to come into force in 2025.
Among Ofcom’s suggested methods is facial age-estimation technology, which can detect whether a user is an adult or underage.
However, many porn users have expressed concerns that having their faces and personal information on explicit sites could increase the risk of cyber blackmail.