Users

Minors in the metaverse

As the evolving media landscape presents new and specific risks for minors, legislators and regulators across Europe are increasingly shifting their focus to youth protection.

EU, UK and local regulations on protecting minors emphasise “interaction risks” and the increased responsibility of content platforms and services that enable user interaction. Both trends affect metaverse operators, who enable and encourage users to create and share content.

For instance, recent updates to German youth protection law, applicable since 1 May 2021, impose transparency obligations on platforms that distribute films and games, place “precautionary obligations” on platforms for user-generated content and communication, and enable age-rating bodies to take non-content issues, such as surprise-based game mechanics and monetisation strategies, into account as factors in the relevant age rating.

These factors are relevant for any online game that evolves into a metaverse. Alongside the traditional game content, which may have a specific age rating, players may access concerts, films or other content within the game. They can also create their own content to share with other players.

[Image. Source: Newzoo Trend Report 2021 - Intro to the Metaverse.]

Already, virtual events attract huge crowds. Travis Scott’s Astronomical concert in Fortnite drew an audience of nearly 28 million players. The game has become the platform.

[Image. Source: Newzoo Trend Report 2021 - Intro to the Metaverse.]

Providers will, therefore, need to look beyond traditional considerations about the amount of violence or prohibited imagery in their content. German law now contains a catalogue of “precautionary measures” from which platform providers must pick and choose to create an appropriate overall safety net protecting their young users. These measures can include:

  • Providing easy-to-use flagging tools to allow players to report inappropriate content and communication.
  • Allowing users to flag their own content as mature.
  • Implementing parental control mechanisms that allow filtering of self-rated user content, and otherwise limit minors’ use and communication through the platform.
  • Providing easy-to-understand terms and conditions, and information on third-party resources that can help deal with unwanted or harmful interactions (such as harassment).
  • Using “safety by default” presets in privacy and communication settings (see the illustrative sketch after this list).
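
By way of illustration only, a “safety by default” preset for a minor’s account could look something like the sketch below. The cohort labels, setting names and default values are assumptions made for the example, not requirements taken from the statute or any platform’s actual API.

```typescript
// Hypothetical "safety by default" presets. All names and values here are
// illustrative assumptions, not taken from any statute or real platform API.
type AgeCohort = "under13" | "13to15" | "16to17" | "adult";

interface CommunicationSettings {
  voiceChat: "everyone" | "friendsOnly" | "off";
  textChat: "everyone" | "friendsOnly" | "off";
  profileVisibility: "public" | "friendsOnly" | "private";
  contentSharing: boolean; // may the user publish self-created content?
}

// Restrictive defaults for minors; users (or their parents) may later relax
// them deliberately, which is the essence of "safety by default".
function defaultSettings(cohort: AgeCohort): CommunicationSettings {
  switch (cohort) {
    case "under13":
      return { voiceChat: "off", textChat: "off", profileVisibility: "private", contentSharing: false };
    case "13to15":
      return { voiceChat: "friendsOnly", textChat: "friendsOnly", profileVisibility: "private", contentSharing: false };
    case "16to17":
      return { voiceChat: "friendsOnly", textChat: "friendsOnly", profileVisibility: "friendsOnly", contentSharing: true };
    case "adult":
      return { voiceChat: "everyone", textChat: "everyone", profileVisibility: "public", contentSharing: true };
  }
}
```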

The last two points, in particular, are evidence of another broad trend: legislators and regulators are increasingly using general legal principles to bolster youth protection. Regulators and consumer groups in Germany have targeted advertisements for in-game purchases under laws implementing the Unfair Commercial Practices Directive, and the UK’s privacy regulator, the Information Commissioner's Office (ICO), has put in place a code of practice on designing online services for children.

The ICO Age Appropriate Design Code became fully applicable on 2 September 2021, after a 12-month transition period. While it is not a formal law, it carries significantly more weight than mere guidance: the UK regulator must take the code into account when considering compliance with UK data protection laws, and has said that it will monitor conformance through proactive audits and the investigation of complaints, backed by the threat of substantial fines. Not surprisingly, the code also requires easy-to-understand information and particularly privacy-friendly default settings for communication features.

However, the requirements are not limited to typical privacy measures. Among other things, providers are expected to take a risk-based approach to verifying the age of their users and to make their services “behave” accordingly, down to quite detailed guidance on in-game messaging and techniques for driving engagement.

Lawmakers in the German federal states are now considering regulations that would impose additional obligations on providers of operating systems: they would have to offer a standardised age-information interface that communicates a user’s age. This would enable metaverse platforms to sort their users into age cohorts without each provider operating within them having to verify ages itself. This legislative project raises numerous questions, technical and constitutional, but providers would be well advised to pay attention to these developments.
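
To make the idea concrete, such an interface might look roughly like the sketch below. The interface name, fields and cohort boundaries are assumptions (the cohorts loosely follow the German age-rating brackets of 6, 12, 16 and 18); no draft legislation prescribes a concrete technical design yet.

```typescript
// Hypothetical OS-level age-information interface; every name and field here
// is an assumption for illustration, not part of any draft legislation.
interface AgeSignal {
  cohort: "under6" | "under12" | "under16" | "under18" | "adult";
  // How the operating system learned the age, e.g. parental device setup.
  source: "parentalSetup" | "verifiedId" | "selfDeclared";
}

interface AgeInformationProvider {
  // Returns the current user's age cohort, or null if none is configured.
  getAgeSignal(): Promise<AgeSignal | null>;
}

// A metaverse client could then gate features on the signal instead of
// running its own age verification:
async function applyAgeGates(os: AgeInformationProvider): Promise<void> {
  const signal = await os.getAgeSignal();
  const cohort = signal?.cohort ?? "under18"; // no signal: assume a minor
  if (cohort !== "adult") {
    // e.g. disable open voice chat and hide 18+ rated in-world content
  }
}
```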

Finally, the updated EU Audiovisual Media Services Directive (AVMSD) includes transparency and age-gate requirements, some of them remarkably similar to Germany’s long-standing media youth-protection rules. While traditional games providers may not be in the scope of this directive, national implementing legislation may not differentiate between different media services. Furthermore, where a metaverse contains films and TV series, these are likely to be subject to the rules.

Authors

Felix Hilgert, LL.M.
Partner, Germany
felix.hilgert@osborneclarke.com
+49 221 5108 4434

Philipp Sümmermann, LL.M.
Associate, Germany
philipp.suemmermann@osborneclarke.com
+49 221 5108 4504