Metaverse platforms open up new opportunities for three-dimensional virtual interaction with attractive applications for users and major economic potential for companies – but also create rights and obligations. This article explores the regulatory issues under media law associated with these platforms.
A metaverse platform is a virtual, three-dimensional environment accessible via the internet. Users can engage in economic, social and cultural activities, such as “attending” concerts or other events and playing online games, using avatars. Businesses can acquire virtual properties and contact other users (both businesses and consumers) in a variety of ways. This creates the basis for an extensive exchange of virtual goods and services (including legal services; see here).
However, the metaverse is not a uniform entity, but a blanket term applied to a range of virtual experiences. There are various platform providers within it, some of which are owned by a single legal entity, and others run by decentralised autonomous organisations (DAOs).
The rapid changes in EU law (see 1. below) bear watching, and recent decisions by courts and public authorities on the failure of ISPs and social network providers to comply with German regulatory law (see 2. below) shed light on the current regulatory framework – and on what may change in the future.
1. Regulation under EU law
a) EU Digital Services Act
The Digital Services Act (DSA) was published in the Official Journal of the European Union on 27 October 2022 (see here) and introduces a comprehensive legal framework for providers of intermediary services, with particular emphasis on “very large online platforms”. It aims to establish uniform due diligence and transparency obligations and requires procedures to be laid down for reporting and removing illegal content. The DSA also largely preserves the liability principles laid down in the e-Commerce Directive of 2000.
The DSA does not specifically mention either metaverse platforms or virtual reality. The European Commission also made it clear at the end of 2021 that it wanted to refrain for the time being from creating any regulations for the metaverse launched by Meta. According to the European Parliamentary Research Service, this leaves the question open as to whether EU rules on content moderation – which are currently being amended – would also apply to illegal or harmful content in the metaverse. This cannot be ruled out, however. The DSA may in future apply to a large number of providers operating in the metaverse, especially since it contains general rules for accessing and storing content on platforms. In purely practical terms, however, the large number of companies operating in the metaverse and the complex links between them can make it hard to identify the company responsible for the content.
Violations of the DSA may lead to substantial fines of up to 6% of total worldwide annual turnover – exceeding even the maximum penalties under the General Data Protection Regulation (GDPR). The final version of the DSA also introduces a new right to compensation for users who suffer damage or loss because a provider has violated its obligations under the DSA.
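To put the two fine ceilings into perspective, the comparison can be sketched as a simple calculation. This is an illustration only: the turnover figure below is hypothetical, and the 4% GDPR ceiling used for comparison is the higher of the GDPR's two fine brackets.

```python
# Illustrative comparison of the DSA and GDPR maximum fine ceilings.
# The turnover figure is hypothetical, chosen purely for illustration.

DSA_MAX_FINE_RATE = 0.06   # up to 6% of total worldwide annual turnover
GDPR_MAX_FINE_RATE = 0.04  # up to 4% under the GDPR's higher fine bracket

def max_dsa_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a DSA fine for a given worldwide annual turnover."""
    return annual_turnover_eur * DSA_MAX_FINE_RATE

# A hypothetical provider with EUR 10 billion worldwide annual turnover:
turnover = 10_000_000_000
print(f"DSA ceiling:  EUR {max_dsa_fine(turnover):,.0f}")
print(f"GDPR ceiling: EUR {turnover * GDPR_MAX_FINE_RATE:,.0f}")
```

For such a provider, the DSA ceiling would come to EUR 600 million against EUR 400 million under the GDPR – a 50% higher exposure at the same turnover.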
b) Draft EU Regulation on Artificial Intelligence (AI)
On 21 April 2021, the EU Commission published a proposal for a Regulation laying down harmonised rules on artificial intelligence (see here). The proposal provides for bans on certain AI practices and sets out specific requirements and obligations related to high-risk AI systems, as well as harmonised transparency rules for AI systems in general. According to the definition in Article 3, point 1 of the draft Regulation, an “artificial intelligence system” is software that is developed with one or more of the techniques and approaches listed in Annex I to the draft Regulation and can, for a given set of human-defined objectives, generate outputs such as content, predictions, recommendations, or decisions influencing the environments they interact with.
Although the draft Regulation does not include any explicit provisions on metaverse services, it is likely that artificial intelligence will be used for certain applications in the metaverse, and the provisions of the Regulation – once adopted – will apply in this respect.
The transparency obligations set out in the draft could be especially important for providers or users of such AI systems. Data subjects must, for example, generally be informed about the use of an AI system or biometric categorisation. The draft also lays down an obligation to disclose “deep fakes”, i.e. AI-generated or manipulated content that appreciably resembles existing persons, objects, places or other entities or events and would falsely appear to a person to be authentic or truthful.
c) Cybersecurity and data protection in the metaverse
The general cybersecurity and data protection risks on the internet obviously also apply to the metaverse – and perhaps even more so (see our articles “Commission’s proposal for the Data Act”, 9 March 2022, and “New rules for online businesses – Digital Services Act”, 27 October 2022).
2. Regulation under German law: Current examples
a) Telecommunications law
Current developments in German regulatory law for internet services and social networks, which are also relevant for metaverse platforms, have relatively little to do with the Telecommunications Act (Telekommunikationsgesetz, “TKG”), which was fundamentally revised in December 2021. The Act applies the principle of lex loci solutionis – it is not where the company offering the service is based that matters, but whether the activities governed by telecommunications law are carried out within the “area of application of this Act”. The definition of telecommunications services in section 3, no. 61 TKG also covers not only conventional telephone services (internet telephony and VoIP) – as before – but also e-mail, messenger and group chat services (see also our article on the Telecommunications Modernisation Act (Telekommunikationsmodernisierungsgesetz), 14 May 2021). The metaverse is not itself a telecommunications service, but some of the services offered within it may be classified as interpersonal communications services. Even then, however, the question is whether such applications merely have a secondary function that is inextricably linked with other services and therefore fall outside the TKG.
b) Telemedia Act and Network Enforcement Act
Under section 1(1), sentence 1 Telemedia Act (Telemediengesetz, “TMG”), both the operation of a metaverse platform and – depending on the specific form – commercial offers by companies to market their products and services in the metaverse are classified as telemedia. While telemedia are not subject to any authorisation or notification obligations, commercial offers (generally those provided in return for a fee) must include a legal notice (section 5 TMG). In addition, section 6 TMG stipulates that service providers must fulfil certain obligations in the case of commercial communications: both the communication itself and the person on whose behalf it is made must be easily identifiable. As hosting providers, metaverse platform operators will not be responsible for content from third parties in accordance with section 10 TMG (and in future, under the DSA; see 1.a) above).
Telemedia service providers that operate profit-oriented platforms on the internet allowing users to share content with other users or make it accessible to the public (social networks) – and that are not platforms with independent editorial content (on the latter, see the Commission’s controversial proposal for a European Media Freedom Act – see here) – will also have to comply with the requirements of the Network Enforcement Act (Netzwerkdurchsetzungsgesetz, “NetzDG”). Non-compliance with these requirements can result in substantial fines, as the Federal Office of Justice (Bundesamt für Justiz) made clear when it imposed fines totalling EUR 5.125 million on Telegram FZ-LLC in mid-October (see here); the fine notice can still be contested, however. The NetzDG obliges companies falling under its scope of application to provide reporting channels on their platforms so that users can report criminal content to the social network provider and have it reviewed. In addition, social network providers must name a person in the Federal Republic of Germany who is authorised to receive service of documents – the obvious purpose being to enable courts and authorities to serve documents with legally binding effect in Germany. According to the Federal Office of Justice, Telegram violated both of these obligations. Due to the considerable overlap with the DSA, however, the NetzDG will have to be fundamentally revised.
c) Interstate Treaty on the Protection of Minors from Harmful Media
Providers of applications and services in the metaverse may also have to comply with the Interstate Treaty on the Protection of Minors from Harmful Media (Jugendmedienschutzstaatsvertrag, “JMStV”). Recent expedited decisions handed down by the North Rhine-Westphalia Higher Administrative Court in Münster have attracted a great deal of attention in this context.
Although the country of origin principle applies under telemedia law – section 3(1) TMG stipulates that internet providers established in an EU Member State are generally subject only to the laws of that Member State – it does not apply without restriction. The dissemination of telemedia is subject to the restrictions of German law to the extent that this serves to protect certain objectives – such as public safety and order, public health or the interests of consumers and investors – and the measure is proportionate to those objectives. These objectives explicitly include the protection of minors.
In this context, the North Rhine-Westphalia Higher Administrative Court ruled on 7 September 2022 (decisions 13 B 1911/21, 13 B 1912/21 and 13 B 1913/21) that providers of pornographic websites based in Cyprus cannot invoke the country of origin principle against an injunction issued on the grounds that they had not sufficiently prevented children and adolescents from accessing their websites. The national authority had fulfilled the consultation requirements under European law vis-à-vis the providers’ country of origin before taking the measures. According to the rulings, the level of protection afforded to minors in the providers’ country of origin is also irrelevant: given the importance of protecting minors, the Member State concerned has discretionary powers when it comes to guaranteeing the level of protection.
Regulation of the metaverse – as a relatively new phenomenon – through legislation and case law is likely to move quickly in future. As a guiding principle, it is safe to say that those regulations that apply to two-dimensional social media platforms also apply to the metaverse itself and the services offered therein.