Meta under fire from UK watchdog over child safety in VR
- Child safety experts have warned of the Oculus Quest 2 headset’s lack of parental controls
- Research by the Center for Countering Digital Hate (CCDH) flagged multiple instances of abuse on VRChat, a top-selling social app for Oculus users
- Within a 12-hour period on VRChat, the CCDH found more than 100 potential violations of Meta’s policies
Many are by now familiar with the idea of the metaverse and the benefits of virtual worlds. But the damage those worlds could wreak if left to grow unchecked is not yet fully understood. Meta, the umbrella company formerly known as Facebook, is now losing the trust of parents over its Oculus Quest 2 VR headset and the risks it may pose to children.
Campaigners have highlighted the issue, pointing to the absence of parental controls on Meta’s VR headset. According to a report by The Guardian, the campaign group Center for Countering Digital Hate (CCDH) has flagged multiple instances of abuse on VRChat, a top-selling social app for Oculus users.
In another troubling finding, the CCDH recorded more than 100 potential violations of Meta’s policies over a 12-hour period on VRChat. The group said Meta was “ignoring the need to embed even minimum protection” in its metaverse plans. “The public has a right to ask how anyone in good conscience could invite people on to a new platform without real confidence it is safe for them,” CCDH chief executive Imran Ahmed told The Guardian.
To grasp the severity of the matter, it is worth noting that the £300 Oculus Quest 2 headset proved a popular gift this Christmas. The Facebook and Instagram owner has worked to bring the technology to the mainstream, but concerns linger over the risk of exposure to harmful content, given the lack of mechanisms to block unsuitable material for under-18s.
Can Meta be held responsible over its VR headset?
Following the CCDH’s revelations, the UK’s data watchdog, the Information Commissioner’s Office (ICO), plans to have “further discussions” with Meta regarding its VR headset and its compliance with a recently established children’s code that prioritizes the “best interests” of young users.
In short, the watchdog wants to determine whether Meta’s headset and VR services do enough to protect the privacy and data of young users. The children’s code, officially known as the Age Appropriate Design Code, states that the “best interests of the child should be a primary consideration” for online services likely to be accessed by a person under 18. In practice, any online service or product that uses personal data and is likely to be accessed by children is required to comply with the code’s standards.
Specifically, the code focuses on preventing websites and apps from misusing children’s data, and it also applies to “connected devices”. The code does not, however, regulate content. A breach of the code can attract a fine of up to £17.5 million or 4% of a company’s global turnover. “In the case of Meta, it would be £2.5 billion, although formal warnings and reprimands are also possible,” The Guardian stated.
An ICO spokesperson said the watchdog “are planning further discussions with Meta on its children’s privacy and data protection by design approaches to Oculus products and virtual reality services. Parents and children who have concerns about how their data is being handled can complain to us at the ICO.”
Separately, a Meta spokesperson said the internet giant was “committed” to honoring the children’s code and “confident” its VR hardware met the code’s requirements. The representative stressed to The Guardian that the terms of service do not allow children under 13 to use the products, but did not address concerns that it is too easy for children to ignore that policy. The firm has already pledged a US$50 million program to ensure its metaverse development complies with laws and regulations.