Gearing up for the Digital Decade? Assessing the Enforcement Mechanisms of the EU’s Platform Regulation Bills



Platform Regulation in the EU


The EU’s goal is to become a “global role model for the digital economy” and to promote its regulatory model beyond the EU’s borders. As stated on the European Commission’s webpage entitled Shaping Europe’s Digital Future: “The European Union will aim to become a global role model for the digital economy, support developing economies in going digital, develop digital standards and promote them internationally.” Undeniably, the synergy between digitization and globalization has led to fundamental shifts in markets and societies. In particular, Web 2.0 and the emergence of social media platforms have transformed the way people communicate and receive information. While these changes present exciting possibilities, in recent years we have moved past a naïve enthusiasm about social media platforms that aim at ‘connecting’ people globally. In light of the role social media platforms played during the US presidential election and the Brexit referendum in 2016, but also tragic events such as the Christchurch shooting in New Zealand, the pressure to reform (possibly outdated) intermediary regulation increased significantly. It became increasingly difficult to justify a state of affairs in which social media platforms did not have to take responsibility for unlawful content disseminated via their services, and in which their influence over the public sphere remained uncontrolled. Accordingly, lawmakers are now considering various options for limiting the power of ‘Big Tech’ by regulating their data-driven business models.

The EU explicitly strives to be the leading regulatory force of the 21st century on a global scale. The EU Commission’s strategy for 2019-2024 leaves no room for doubt as to the intended goals. The EU’s potential role as a global regulator raises questions as to the broader structure it is aiming to build. Specifically, given the far-reaching implications of the undertaking, and considering that the power of social media platforms over markets and societies has been at the center of policy discussions, it is necessary to take a closer look at the provisions regarding oversight and enforcement of the rules for their regulation.

A significant trend now underway in this regard is the shift from self-regulation to more specific obligations for platforms. To ensure compliance with the latter, the current proposals provide for the creation of new enforcement mechanisms and competent authorities. This emphasis on enhanced oversight and effective enforcement deserves attention. And, since the policy discussions on platforms, data, and AI are all interrelated, lawmakers should be aware of the potential advantages of cross-linking questions regarding their enforcement. For now, the pressing questions are: Will the current EU bills overlap in terms of enforcement, and will the new competent authorities coordinate?

To address these questions, the present paper examines enforcement mechanisms in light of the DSA Package (comprising the Digital Services Act (DSA) and the Digital Markets Act (DMA)), born of an initiative to update the eCommerce Directive, and the AI Act. It examines whether the new authorities created by these regulations will interface well with existing national authorities and how they will interconnect with the new authorities mentioned in the other bills. Ideally, the result would be a common front for oversight and enforcement. The paper opens by presenting the context of the current regulatory discussion and the proposals, taking a closer look at enforcement mechanisms. Next, it examines the proposals in terms of coordination at the vertical and horizontal levels to determine whether (as currently drafted) they would lead to harmonization or fragmentation. Finally, it concludes by looking into the question of whether this model could become a de facto standard in the global regulatory context.

Platform Governance and the Current EU Bills

Although digital platforms have been operating for almost 20 years, there is still an ongoing debate about how to treat them. While platforms tend to present themselves as mere technology companies, scholars argue that their activities make them media companies. Consequently, legal scholars are still discussing how to categorize them in (national) legal frameworks. The more platforms gain power over markets, data, and the way people receive information and form their opinions, the more they will be compared to states. This contributes to an ongoing “constitutionalization” of social media platforms, whereby regulation attempts to hold them accountable to the principles of the rule of law.

The momentum to regulate digital platforms first shifted into high gear after unpleasant events and negative headlines around Big Tech, such as the Cambridge Analytica scandal, when lawmakers began to challenge platforms’ efforts to combat unlawful content. In the context of elections, these concerns became quite pressing. In 2017, Germany chose to move forward with the Network Enforcement Act (NetzDG), a law requiring platforms to implement flagging mechanisms and examine complaints over illegal content swiftly. A few months later, France passed a law against information manipulation on digital platforms. Both laws require platforms to provide transparency reports to the supervisory authority, which in turn makes them available to the public. It was in light of these initiatives by individual Member States that the EU decided to “update” its rules for digital platforms and to engage in a strategic process of designing rules that could potentially be replicated by other states, giving rise to the legislation examined in this paper.

The debate about regulating platforms extends beyond law. It is a debate about how to govern digital platforms without disproportionately restricting fundamental rights, so that platforms can host publicly accessible spaces for communication and deliberation without fueling hate and polarization. In other words, the nature of the services these digital platforms provide leads us to a new form of (corporate) governance for the digital age. In order to investigate this new form, the governance research that preceded the regulation first looked at the Internet as a whole. However, it soon became clear that more specific research was necessary, focusing on the governance of digital platforms. While governance can be understood as “reflexive coordination,” regulation can be defined as targeted interventions, and can thus be included in a broad definition of governance.

Platform governance specifically is a relatively new term, bringing together multidisciplinary perspectives from media and communication, law, governance, and tech research. Because digital platforms govern much of social interaction online, it is necessary to conceptualize the governance of platforms beyond traditional regulatory means. Platform governance can be defined as “a concept intended to capture the layers of governance relationships structuring interactions between key parties in today’s platform society, including platform companies, users, advertisers, governments, and other political actors.” This concept includes the governance of platforms and the governance by platforms (i.e., self-regulation of platform content). Here, we will focus on the governance of platforms, while bearing in mind that it might ultimately influence governance by platforms, because platform governance requires the mutual recognition of regulatory propositions.

Social Media Governance and the EU’s Intermediary Liability Regime

Over the past five years, we have been witnessing an increasing demand for stricter regulation of digital platform companies, beyond the self-regulatory approach previously followed by regulators around the world. The current European intermediary liability regime under Art. 14 and 15 eCommerce Directive allows social media platforms to self-regulate and limits state interference with regard to speech regulation. The eCommerce Directive was adopted in mid-2000, following a regulatory trend set by the US legislator with Sec. 230 CDA that was intended to facilitate innovation and the growth of the digital economy. Under Art. 14 eCommerce Directive, service providers are exempt from liability for user-generated content on condition that they have no knowledge of unlawful content or that they act “expeditiously to remove or to disable access to the information” upon notice (so-called “notice and take-down”). Under Art. 15 (1), Member States shall not oblige providers “to monitor the information which they transmit or store” or “to seek facts or circumstances indicating illegal activity.” This liability regime has been very conducive to initiatives by digital platforms to institute their own rules. It also facilitated the global development of digital platforms, because for years they only had to follow light national liability regimes and could largely rely on an exemption from liability (albeit under certain conditions).

In general, the EU can only regulate in areas in which the Member States grant it competences. It has no competence to confer new competences upon itself, but depends on the tasks allocated to it by the Member States. One of the EU’s original tasks is the internal market based on four freedoms: the free movement of goods, persons, services and capital, also known as the EU single market. This also includes the digital single market. Although this area has traditionally been assigned to economic regulation, it is now a cross-sectoral matter that has permeated many aspects of everyday life. We have seen this with the public debate about the Directive on Copyright in the Digital Single Market in 2019: Suddenly, the impact of sectoral regulation and its enforcement (i.e. “upload-filters”) on the digital public sphere became a much-discussed and broadly publicized topic. In view of platforms’ impact on the opinion-forming process of EU citizens, the above-mentioned question of whether social media platforms should be subjected to media regulation arises again and again.[1] In Germany, for example, there was a discussion regarding the NetzDG on the topic of whether the regulation of online platforms should be a matter of media regulation rather than law enforcement. The same questions arise for the enforcement of the DSA. And yet, it seems like the new rules will not bring any substantial change, as I will elaborate below.[2]

Either way, the EU sees itself as having a duty to design and establish a regulatory framework for a rule-based digital economy. This process has been ongoing, of course, but there now is an even stronger push for “a powerful transparency and a clear accountability framework for online platforms.” The current bills employ various regulatory logics and have different goals. What is relevant to this paper, however (which does not aim at providing an analysis of all specificities regarding the regulatory objectives), is that all three aim to apply legal standards to Big Tech and, possibly, to secure a first-mover advantage on the international scene. This paper, then, connects the general vision of a “Europe fit for the digital age” to the planned means of oversight and enforcement. It also explores the question of whether the European legislator, in drafting these new rules, has paid sufficient attention to the enforcement aspect, and whether, accordingly, it has a vision with regard to the governance of platforms.

The EU’s New Systemic Approach 

In light of the technological as well as political, social, and economic developments of the past 20 years, the EU announced in June 2020 that it would update the rules of the eCommerce Directive. As previously mentioned, the eCommerce Directive has shaped Europe’s digital economy to a great extent, and, due to the importance of the European market, its influence has extended to digital policy around the globe. Reforming such a set of rules must not be taken lightly. First of all, further legislation and case law has developed on the basis of the eCommerce Directive. In other words, there is more to take into consideration than solely the law on the books. Legislation from the early 2000s does indeed need updating. But the law currently in practice has further developed and has a different meaning than it had 20 years ago, simply because the targeted digital services steadily evolved. Furthermore, the large-scale consequences of readjusting the platform economy must be considered for all those potentially affected. The EU’s regulation affects not only Big Tech from Silicon Valley, but also small and medium-sized enterprises, although the latter certainly did not trigger the reform. The consequences for the whole platform industry might not be fully apparent yet. It is therefore important to meet the challenges of our time with new regulation, but also to ensure legal certainty by adapting the regulatory framework adequately and where necessary.

From the eCommerce Directive (2000) to, most recently, the GDPR, the EU has been eager to offer one-size-fits-all regulation that allows Member States to adapt according to their national particularities while safeguarding the cross-border character and viability of both the EU Single Market and, especially, digital platforms. Of course, the scope of discretion regarding the application varies depending on the type of legislation: directives (like the eCommerce Directive) delegate the choice of means of implementation to the Member States, while regulations (like the GDPR) allow a higher degree of harmonization and more clarity for all actors involved because they do not require an implementing act.

To update the eCommerce Directive, the Commission, as mentioned, proposed two legislative initiatives: the Digital Services Act (DSA) and the Digital Markets Act (DMA).[3] Together they form the DSA Package. The heart of the regulatory package is the DSA, containing quite detailed rules for platforms and building on a broad spectrum of mechanisms. According to the EU Commission, the DSA’s key goals are better protection of “consumers and their fundamental rights online,” to “establish a powerful transparency and a clear accountability framework for online platforms” and to “foster innovation, growth and competitiveness within the single market.” The Commission chose to keep the current liability regime of Art. 14 and 15 eCommerce, while adding additional obligations to serve the protection of the digital public sphere. To do so, it proposed a systemic approach, matching the type of platform with “asymmetric due diligence obligations” according to its “role, size and impact in the online ecosystem.”

The new rules shall apply to all online intermediaries operating on the EU single market, starting from “intermediary services” providing basic network infrastructure. This broad category includes “hosting services,” which in turn include the category of “online platforms.” The latter is the category that has been most in the spotlight of the public debate because social media platforms, app stores and marketplaces are considered the new gatekeepers of the digital public sphere. They are expected to implement measures for a better handling of user complaints regarding unlawful content. Section 3 DSA provides for complaint-handling (Art. 17) and dispute-settlement systems (Art. 18), closer attention to trusted flaggers (Art. 19) as meaningful partners for content moderation, and stricter sanctions against users who frequently misuse the platform (Art. 20). A central provision is the obligation to publish transparency reports on content moderation activities under Art. 13 and 23.

There are additional and stricter rules in the DSA for “very large online platforms” (VLOP) that, according to the Commission, “pose particular risks in the dissemination of illegal content and societal harms” because “they have acquired a central, systemic role in facilitating the public debate and economic transactions due to their reach.” According to Art. 25 (1) DSA, VLOPs are platforms that provide their services to at least 45 million users within the EU, a threshold corresponding to roughly 10% of the EU’s population. They will be subject to additional obligations such as risk assessments (Art. 26), mitigation measures (Art. 27), independent audits (Art. 28) and provision of user-friendly information regarding the parameters used for their recommender systems (Art. 29) as well as for advertisements on their platform (Art. 30). These enhanced transparency obligations extend to providing data “necessary to monitor and assess compliance with this Regulation” to regulators and academics (Art. 31) and more frequent reports (Art. 33). In sum, the DSA ties enhanced responsibility to the probability of online harms, but sticks with the category of unlawful content instead of broadening it to harmful content.[4]

Supervision and Enforcement

The legislative initiatives by the EU would be considered a paper tiger if they did not provide adequate enforcement mechanisms. Indeed, enforcement, defined as “a matter of deploying a strategy or mixture of targeted strategies for securing desired results on the ground,” is key for successful regulation. Enforcement means are not limited to penalties. In fact, in many situations, other techniques can be preferable. Public authorities can have far-reaching powers to sanction infringements with penalties, but they might fail if they lack information from peers or a superordinate authority. Practicable and effective enforcement mechanisms are even more pressing in the present regulatory context considering the very high density of new rules. For example, some fear a regulatory inflation that will come at the expense of platforms that do not employ large departments devoted to the implementation of legal requirements. Compliance, oversight and enforcement require clarity both 1) regarding the obligations for platforms mentioned above and 2) regarding the expectations of the new authorities in charge of supervising and enforcing. A gap between the laws’ wording and their interpretation, both by those who implement and those who enforce them, might be inevitable. However, this gap should be kept to a minimum by ex ante communication and sufficiently clear rules. When public authorities and corporations both share their respective assessments of the situation, they have a common point of reference to build on. The DSA’s preamble stresses this point with regard to VLOPs due to the way they influence many aspects of online activities.[5] Finally, the EU legislator clearly wants to avoid the mistakes made with the GDPR by introducing more than one enforcement approach. The following section will provide an overview of the enforcement competencies in the DSA and the DMA, as well as the AI Act.

1. Enforcing the Digital Services Act

Enforcing the DSA is one of the main challenges of the legislative initiative, given the many elements of co-regulation that further empower platforms’ internal processes. Although lawmakers clearly want to end the period of self-regulation, they still conceive platform regulation as a task for both the state and private actors. So far, one of the most important criticisms from politicians and civil society has been that large platforms did not do enough to prevent illegal content. While platform regulation should not encourage over-blocking, which is why overly strict sanctions are discouraged, at the same time, previous attempts at self-regulation have failed to be effective, which has spurred lawmakers to react. As already mentioned, the German NetzDG, too, has its origins in the vacuum created by social networks’ failure to fulfil their voluntary commitments to combat online hate speech.

For purposes of supervision and enforcement of the measures provided for in the DSA, stipulated in Chapter 4 of the draft law, new authorities, known as Digital Services Coordinators (Art. 38 to 46), are to be created at the national level. The coordinators of the various Member States are to form the European Board for Digital Services at the EU level (Art. 47 to 49). According to Art. 38, it is left to the Member States to create the competent authority acting as Digital Services Coordinator (DSC) and, possibly, to attach it to an existing body. Under Art. 39, the DSCs are to perform their tasks in “an impartial, transparent and timely manner,” as well as to “act with complete independence.” At the national level, the DSA mandates one central competent authority to perform these tasks, although the designation of more than one authority is also permitted. So far, only a few Member States have passed a law (beyond the eCommerce Directive) on intermediaries, so this is indeed a novelty for most Member States. Others have made progress in this area in recent years, such as Germany, in the form of the NetzDG. The question in Germany is whether the authority responsible for the NetzDG (the Federal Office of Justice) should also become a DSC. Another example is France, which has entrusted its media authority (CSA) with the oversight of platform regulation.

The DSCs are to be given far-reaching powers of supervision, enforcement and, if necessary, the imposition of sanctions. Accordingly, whether this task is entrusted to an authority with a focus on law enforcement or media regulation is a question of paramount importance. Moreover, the requirements laid out in Art. 39 DSA could constitute an obstacle to designating an agency like the German Federal Office of Justice, due to a lack of independence, because it is subordinate to the Federal Ministry of Justice and Consumer Protection. The same requirements might also constitute an argument against the central role of the Commission as a regulator and, instead, in favor of a centralized EU Media Agency.

With regard to local competence, the draft follows the “country of origin principle” (Art. 40 DSA). According to this principle, the DSC of the country in which the service provider has its registered office or its legal representative according to Art. 11 DSA is responsible. If, in contravention of Art. 11 DSA, the provider has not appointed a representative, each Member State is to be responsible for DSA enforcement.

2. Enforcing the Digital Markets Act

Although the DMA’s regulatory goal is quite different from the DSA’s, the two are closely intertwined and together form what the Commission calls the “DSA Package.” The DMA prohibits large platforms, which by nature function as gatekeepers, from employing “unfair practices towards the business users and customers that depend on them to gain an undue advantage.” Particularly with regard to VLOPs, the scopes of application can overlap. According to Art. 3 DMA, a “provider of core platform services shall be designated as gatekeeper if: it has a significant impact on the internal market; it operates a core platform service which serves as an important gateway for business users to reach end users; and it enjoys an entrenched and durable position in its operations or it is foreseeable that it will enjoy such a position in the near future.” Assuming that a platform service has the most impact on the market when it has high user numbers, many VLOPs will probably meet the requirements for gatekeepers, and vice-versa.

Unlike the DSA, the DMA does not stipulate the creation of a new competent authority in each Member State. Instead, the Commission will be the competent regulatory body – which comes as no surprise from a competition law perspective. The DMA’s scope of application is much narrower than the DSA’s because it targets gatekeepers only. The risk of divergent interpretation and enforcement across the EU is of course the main argument against a decentralized enforcement model. A Digital Markets Advisory Committee (DMAC) shall be created to assist the Commission by providing national expertise (Art. 32). The role of the DMAC is purely advisory. It shall be a Committee within the meaning of Regulation (EU) No 182/2011, which means it can be consulted at the Commission’s discretion and deliver non-binding opinions on technical questions.

3. Enforcing the AI Act

In April 2021, the EU Commission published its first draft of an Artificial Intelligence Act (AIA). It was (and still is) the first proposal worldwide to regulate AI and, therefore, much discussed by experts. Its very broad scope of application and its unusual structure are just some of its features that raise a lot of questions. From a platform regulation perspective, it seems necessary to include this proposal in the debate and not to overlook the interrelation between online platforms, market dynamics and the future of technology. Since social media platforms already use AI systems, for instance to perform content moderation activities (among others), there is an existing connection, and we should expect the further integration of AI applications into online platforms. Platforms will be subject to specific transparency obligations under Art. 52 (1) AIA regarding the possible interaction between human users and ‘bots.’ Additionally, social media platforms might also fall under the obligations for users of AI systems, since their business models are based on user-generated content.

Regarding the enforcement of compliance rules, there are similarities with the DSA in the governance mechanisms. Art. 56 AIA stipulates the creation of a European Artificial Intelligence Board (EAIB) and the designation of national competent authorities (Art. 59). Under Art. 59 AIA, the national supervisory authority shall “act as notifying authority and market surveillance authority.” The text is silent on the question as to whether the national competent authority could be the same as the DSA’s DSC. Complementing the Explanatory Memorandum (sec. 5.2.6.), the AIA’s Preamble is more explicit on the oversight and enforcement mechanisms. According to recital 77, “Member States hold a key role in the application and enforcement.” Indeed, the appointment and the configuration of the competent bodies’ responsibilities and functions is the responsibility of the Member States. The problem is that the expected shortcomings regarding enforcement mirror other weak points of the draft law. Due to the fact that it is “stitched together from 1980s product safety regulation, fundamental rights protection, surveillance and consumer protection law,” its enforcement risks falling short of expectations when it comes to law in action.

Enforcement Coordination: a Blind Spot?

The EU Commission’s overall goal is a comprehensive legal framework for a digital public sphere that would promote the protection of fundamental rights and European values. It was not a coincidence that the Commission published the proposals within a few months of each other. Moreover, it is quite explicit about its ambition to provide a framework that could serve as a model for non-European countries. Given this strategic positioning, one would expect it to take sufficient account of the possible synergies in the enforcement of these new regulatory frameworks and to propose a consistent (and not only sector-specific) governance model. The next subsections will assess whether the current proposals provide sufficient coordination mechanisms for a coherent platform governance model.

1. Among the New Authorities

First, the draft laws could contain a proposal for a horizontal coordination between the new authorities created, as well as with existing public authorities in the digital sector such as the Data Protection Authorities (under GDPR).

Art. 38 (2) DSA stipulates that the DSCs “shall cooperate with each other, other national competent authorities, the Board and the Commission.” Cross-border cooperation among DSCs is planned under Art. 45. Furthermore, all DSCs are mandated to form a European Board for Digital Services (EBDS) (Art. 47 to 49 DSA). The role of the EBDS is to advise the DSCs and the Commission in order to achieve the consistent application of the DSA, especially with regard to the supervision of VLOPs. The EBDS thus represents an opportunity for DSCs to exchange and consult at EU level on the supervision and enforcement of the DSA.

Regarding horizontal coordination with the bodies proposed in the DMA and the AIA, there is nothing in the texts so far, although there is a high probability that competences of the various regulations will overlap, for the following reasons:

  • The DSA does not mention the DMAC. The DMA mentions the DSA as the complementary law and the coherence between the two as far as regulatory objectives are concerned, but not the possible cooperation between the DSCs and the DMAC. On the one hand, the missing nexus is understandable, as the DMAC is not an enforcement agency, but merely fulfils a consultative function. On the other hand, a formalized connection between the EBDS and the DMAC is desirable, in order to possibly issue joint statements and recommendations.
  • The AIA, too, mentions its consistency with the DSA, but provides no coordination mechanism between the EAIB and the EBDS or the DMAC. It does, however, stipulate cooperation with the European Data Protection Supervisor.

2. Between the European Commission and the New Authorities

Second, consistent enforcement of new regulation could be achieved by virtue of a vertical coordination between the EU Commission and the new authorities. In this respect, the three drafts analyzed are somewhat more informative.

In the DSA context there are a couple of connections between the Commission and the new authorities:

  • As already mentioned, Art. 38 (2) stipulates that the DSCs shall cooperate with the Commission. They are effectively reporting to the Commission, e.g. about certifications of out-of-court dispute settlement bodies (Art. 18 (5)) and trusted flaggers (Art. 19), about annual reports (Art. 44) and in cases of cross-border cooperation (Art. 45).
  • The Commission is involved in many aspects of the DSA, such as issuing guidance for trusted flaggers under Art. 19 (7) or the mitigation of risks by VLOPs (Art. 27 (3)). Sometimes the Commission will function in a manner detached from the DSCs, as in the development of standards and codes of conduct (Art. 34-37). However, most actions are planned to be taken in agreement with the DSCs.
  • The connection between the Commission and the EBDS is expected to be close. Under Art. 48 DSA, the EBDS will be chaired by the Commission, which will also set the agenda.

Under Art. 59 (6) AIA, the Commission will be in charge of facilitating the exchange among the national competent authorities, but not in a cross-sectoral manner, that is, between the AIA and the DSA – although (as previously mentioned) there might be overlaps with regard to the addressees. Regarding the DMAC and the EAIB, the connection to the Commission is very close, given that they act as advisory bodies to the Commission. They, too, shall be chaired by the Commission (Art. 57 (3)), which will also set the agenda.

3. Conclusion: All Roads Lead to Brussels?

As mentioned in the beginning, the DSA is at the heart of the EU’s platform regulation. Hence, it is reassuring that, at least within the DSA, both horizontal and vertical cooperation between the DSCs themselves and between the DSCs and the Commission are intended. Of concern, however, is the fact that there is no discernible institutional cooperation between the new authorities of the different bills. It seems as if the conception is that the laws will be enforced completely independently of each other and therefore there is no need for coordination. The lack of horizontal cooperation gives the impression that the EU – contrary to its stated intentions – does not have a strategy thought through to the end. Perhaps this shortcoming could be balanced by the central role of the Commission. But should all of the responsibilities ultimately lie with the EU Commission?

Generally speaking, experts have welcomed the proposals and, especially in the case of the DSA, lawmakers’ focus on oversight and enforcement mechanisms. Many have highlighted the need to learn from the experience with the GDPR. Most have stressed the need to further conceptualize the proposals’ ideas and to think ahead of their interpretation by enforcers and courts. The brief assessment of vertical and horizontal coordination in this paper confirms this impression. There is currently a window of opportunity to “lay out a more detailed system of EU-wide cooperation” and to create an “ecosystem of oversight.”

Outlook: The EU’s Normative Soft Power

The EU wishes to expand its normative power and to develop laws that could become a de facto standard in technology regulation and platform governance. Some perceive the EU approach as “an extraordinarily expansionist regulatory agenda,” powered by Europe’s aspirations for “digital sovereignty.” It is still too early to say whether the DSA, the DMA and the AIA will become a de facto standard in the global regulatory context. Considering the global “success” of the GDPR, the chances are quite high that they will. This is especially true for those bills touching upon new technologies, such as the AIA. Many aspects still require clarification, and this paper addresses just one of many questions on the regulatory horizon.


Baldwin, Robert, Cave, Martin & Lodge, Martin (2012). Understanding Regulation: Theory, Strategy, and Practice. Oxford University Press.

Bureau Européen des Unions de Consommateurs/BEUC (2021). “Position Paper on the Digital Services Act.”

Broadbent, Meredith (2021). “AI Regulation: Europe’s Latest Proposal is a Wake-Up Call for the United States,” May 18, 2021, retrieved from….

Busch, Christoph, Graef, Inge, Hofmann, Jeanette, & Gawer, Annabelle (2021). Uncovering Blindspots in the Policy Debate on Platform Power: Final report. European Commission.

Celeste, Edoardo, Heldt, Amélie & Iglesias-Keller, Clara (forthcoming 2022). Constitutionalising Social Media. Hart Bloomsbury.

Cornils, Matthias (2020). “Designing platform governance: A normative perspective on needs, strategies, and tools to regulate intermediaries,” AlgorithmWatch, retrieved from

De Streel, Alexandre & Ledger, Michèle (2021). “New Ways of Oversight for the Digital Economy,” CERRE Issue Paper, retrieved from

Die Medienanstalten (2021). Stellungnahme: “Digital Services Act und Digital Markets Act der EU – Eingabe der Medienanstalten im Konsultationsprozess der EU-Kommission,” March 30, 2021, retrieved from….

European Regulators Group for Audiovisual Media Services/ERGA (2020). “ERGA Position Paper on the Digital Services Act, Subgroup 1 – Enforcement.”

Flew, Terry, Martin, Fiona, & Suzor, Nicolas (2019). “Internet Regulation as Media Policy: Rethinking the Question of Digital Communication Platform Governance.” Journal of Digital Media and Policy, 10 (1), 33-50.

Gasser, Urs & Schulz, Wolfgang (2015). “Governance of Online Intermediaries: Observations from a Series of National Case Studies.” Berkman Center Research Publication No. 2015-5.

Gillespie, Tarleton (2017). “Governance of and by Platforms.” SAGE Handbook of Social Media, 254-278.

Gorwa, Robert (2019). “What is Platform Governance?” Information, Communication & Society, Vol. 22, No. 6, pp. 854–871.

Hennemann, Moritz (2020). “Wettbewerb der Datenschutzrechtsordnungen – Zur Rezeption der Datenschutz-Grundverordnung,” 84 Rabels Zeitschrift für ausländisches und internationales Privatrecht 4, 864-895.

Hofmann, Jeanette, Katzenbach, Christian & Gollatz, Kirsten (2017). “Between Coordination and Regulation: Finding the Governance in Internet Governance,” New Media & Society, ISSN 1461-7315, Vol. 19: 9, 1406-1423.

Jaursch, Julian (2021). “The DSA Draft: Ambitious Rules, Weak Enforcement Mechanisms Why a European Platform Oversight Agency Is Necessary,” retrieved from

Kettemann, Matthias C., Schulz, Wolfgang & Fertmann, Martin (2021). “Anspruch und Wirklichkeit der Plattformregulierung – Kommissionsentwürfe der Rechtsakte zu digitalen Diensten und Märkten,” Zeitschrift für Rechtspolitik, 5:138-141.

Napoli, Philip & Caplan, Robyn (2017). “Why Media Companies Insist They’re not Media Companies, Why They’re Wrong, and Why it Matters.” First Monday, 22(5).

Schulz, Wolfgang (2008). “Von der Medienfreiheit zum Grundrechtsschutz für Intermediäre?,” Computer & Recht, 24 (7), 470-476.

Schulz, Wolfgang & Dreyer, Stephan (2020). “Governance von Informations-Intermediären – Herausforderungen und Lösungsansätze, Bericht an das BAKOM,” retrieved from

Veale, Michael & Zuiderveen Borgesius, Frederik (2021). “Demystifying the Draft EU Artificial Intelligence Act,” pre-print, July 2021, Version 1.2, forthcoming in (2021) 22 (4) Computer Law Review International.

Vergnolle, Suzanne (2021). “Enforcement of the DSA and the DMA: What Did We Learn from the GDPR?,” VerfBlog, 2021/9/03.

[1] In France, for instance, the oversight has so far been attributed to the media regulator: Conseil Supérieur de l’Audiovisuel.

[2] See also: Die Medienanstalten, Digital Services Act und Digital Markets Act der EU – Eingabe der Medienanstalten im Konsultationsprozess der EU-Kommission, 30.03.21, retrieved from….

[3] Note that all references in this text to the DSA, DMA and AI Act refer to the proposals in their first-draft version.

[4] Unlike the UK: Draft Online Safety Bill, retrieved from….

[5] Preamble to the DSA Proposal, recital 56, p. 31.

The opinions expressed in this text are solely those of the author(s) and do not necessarily reflect the views of the Heinrich Böll Stiftung Tel Aviv and/or its partners.