On Wednesday, Meta will open its 3D social platform, Horizon Worlds, to preteen users, provided their parents approve the content they access.
The tech giant will also launch a new age rating system for the platform, classifying content as suitable for ages 10+, 13+, or 18+, though these ratings may vary by region.
According to reports, Meta has built in default protections and age-appropriate settings that parents can customize to safeguard preteens' experience in Horizon Worlds. For instance, voice chat is turned off by default, except with parent-approved contacts.
The platform also offers a personal boundary feature, which parents can use to prevent other users from getting too close to their child in the virtual landscape.
Meta plans to give preteens access to age-appropriate worlds, initially offering Starter Worlds rated for ages 10+. The feature is meant to provide preteens with an easy and secure way to get started in Horizon Worlds.
The platform was formerly available only in virtual reality (VR), but some experiences are now accessible from the Meta Horizon mobile app, which the tech giant recently overhauled to focus on Horizon Worlds.
Last month, Meta announced that preteens with parent-managed accounts can chat or call others in mixed and virtual reality with parental consent.
European Commission Probe Into Child Safety Concerns at Meta
Earlier this year, the European Commission launched a probe into Meta’s social media platforms over potential child safety risks, mental health harm, and addiction.
The investigation will examine the company's privacy settings and age verification tools for minors. It will also explore rabbit-hole effects, in which preteens are repeatedly exposed to damaging content.
Horizon Worlds' new age rating system and integrated parental controls are designed to shield preteens from inappropriate content; whether they prove effective once preteens gain access remains to be seen.
This is not the first time Meta has faced accusations over child safety. In June, Instagram was found recommending explicit Reels to children as young as 13 years old.
According to experts, these Reels were still being recommended despite privacy updates intended to safeguard younger users.