After Instagram, Meta adds parental supervision tools to its Oculus Quest VR headset

A new set of tools from Meta will allow parents to stop teenagers from accessing inappropriate games and experiences on its VR headset.
22 March 2022


  • VRChat, the most reviewed social app in the Oculus Quest store, is rife with abuse, harassment, racism and pornographic content, CCDH emphasised.
  • CCDH said Mark Zuckerberg’s pledge on the safety of Meta’s version of the Metaverse “was obviously a hollow promise”. 

In an unprecedented move earlier this month, Instagram started rolling out long-awaited parental controls for its app, after its parent company Meta Platforms Inc came under fire over the lack of children’s safety online. The issue, which had been simmering since September last year, was still unresolved when the Center for Countering Digital Hate (CCDH) also raised concerns over child safety on the Oculus Quest VR headset.

CCDH was referring to a popular third-party app called VRChat that works on a number of platforms, including Oculus. Available in the Oculus Quest VR headset’s app store, the app was the subject of CCDH research that claimed to have found instances where teenagers under 18 were among users who suffered bullying, sexual harassment and abuse; were exposed to graphic sexual content; and were groomed to repeat racist slurs and extremist talking points.

The entire issue with VRChat eventually raised substantive questions about whether Meta has met its regulatory obligations under the children’s code when it comes to its VR headset. In a statement, Meta said: “We’re working to implement the standards within the Age Appropriate Design Code (children’s code), in consultation with the Information Commissioner’s Office (ICO).”

That said, just like Instagram’s recent move, Oculus is now rolling out tools that let parents lock specific apps directly from VR to stop teenagers from accessing them, and block teenagers from downloading or purchasing age-inappropriate apps in the Quest Store. Meta is also releasing a “Parent Dashboard”, accessible from the Oculus mobile app, which allows parents to link to their child’s account if both the adult and the teenager agree.

In a blog post, CCDH chief executive Imran Ahmed emphasised how “When Facebook launched the Metaverse for Oculus just in time for Christmas shopping, its CEO, Mark Zuckerberg, pledged that privacy and safety is at the heart of Virtual Reality.” However, the group’s researchers discovered that, contrary to his promises, the Metaverse is a haven for hate, pornography and child grooming.

“In our study, Metaverse connects users not just to each other but to an array of predators, exposing them to potentially harmful content every seven minutes on average. If Metaverse is safe for predators, it’s unsafe for users, especially children. Any parent who gifted the VR Oculus headset by Meta for Christmas needs to be aware that they are potentially exposing their children to serious danger,” Ahmed reiterated.

With the Parent Dashboard, guardians will be able to view all the apps their child owns, be notified when their child makes a purchase in VR, know how much time their child is spending in VR, and view their child’s list of Oculus Friends. Overall, the new controls in both Instagram and VR are part of what Meta calls its “Family Center”.

The features will first be released in the US, and then globally in the coming months. They will allow parents to initiate these controls with their child’s consent. The first three Instagram parental-supervision tools, available this week in the US, will let parents see how much time their teens spend on Instagram and set limits.

On Instagram, parents can also view which accounts their teens follow, as well as who follows them, and teens can notify their parents when they report inappropriate behaviour. Meta plans to eventually make the tools available on its other platforms, such as Facebook.