The head of UK media regulator Ofcom has warned that “metaverse” forays from tech giants like Meta and Microsoft will be subject to incoming rules forcing platforms to protect users from online harms.
Speaking at an event in London hosted by policy consulting group Global Counsel on Tuesday, Ofcom Chief Executive Melanie Dawes said self-regulation of the metaverse, a hypothetical digital world touted by Meta and others, wouldn’t fly under UK online safety laws.
“I’m not sure I really see that ‘self-regulatory phase,’ to be honest, existing from a UK perspective,” Dawes said. “If you’ve got young people in an environment where there’s user-generated content according to the scope of the bill then that will already be caught by the Online Safety Bill.”
The Online Safety Bill is proposed legislation that seeks to curb the spread of harmful content on the internet. The rules would impose a duty of care on firms, requiring them to put in place robust and proportionate measures to deal with harmful material such as vaccine disinformation or posts promoting self-harm.
Once the bill is approved, violations of the law could lead to fines of up to 10% of annual global revenue. Further down the track, senior tech executives could also face criminal liability for the most serious breaches.
The bill is especially concerned with the protection of children, having been developed in response to the death of Molly Russell, a UK teenager who took her own life after being exposed to suicide-related posts on Instagram. In September, a coroner investigating Russell’s death reached the landmark conclusion that the “negative effects” of social media contributed to her death.