3rd April 2023

Mandating Big Tech accountability through the Digital Services Act (DSA)


The first phase of a new landmark piece of European Union legislation was completed in February and is expected to have a significant impact on how Big Tech operates in Ireland when it comes fully into effect, as expected, later this year. 

Many of the world’s largest tech companies like Google, Meta, TikTok and Twitter are US-based but have their European headquarters in Ireland, meaning they will be subject to the new regulatory framework of the Digital Services Act (DSA). 

The DSA will require tech companies to remove illegal content and ensure that users have access to redress mechanisms if they are harmed by content on social media platforms. It will also require companies to be more transparent about how they use and protect user data. 

If found to be in breach of the regulations, online intermediaries like social media platforms, as well as online marketplaces and search engines, could face massive fines. 

It’s hoped that the regulations will help to create a safer and more transparent online environment for users. But what are the new rules exactly and how will they be enforced? 

Liability vacuum 

Under the DSA’s precursor, the E-Commerce Directive, online intermediaries have to date been exempt from liability for the information transmitted through their services or posted by their users.  

In other words, if a TikTok user posted content glorifying suicide on their account, the social media platform carried zero liability. That’s set to change. It has become clear to decision-makers that online intermediaries need to shoulder responsibility for policing the content on their sites. 

The DSA won’t replace the E-Commerce Directive outright, but it will introduce new responsibilities. Specifically, it sets out new obligations to remove and moderate illegal or harmful content, introduces protections for users’ rights online, and places digital platforms under a new transparency and accountability framework.   

Expanded workload for Big Tech 

As part of the DSA, online intermediaries will be required to submit data on EU-based users at six-month intervals.  

Intermediaries with 45 million or more monthly active users in the EU, which will be designated as very large online platforms and very large online search engines (VLOPs and VLOSEs), will be subject to stricter regulation and accountability protocols.  

This includes banning targeted advertising aimed at children, quickly removing harmful or illegal content once it has been flagged and opening up data on algorithms and “dark patterns” to independent auditors.  

This list is not exhaustive, but it is clear that the DSA introduces a new and extensive moderating workload for the likes of Twitter, Google and Meta.  

At a time when market jitters are causing uncertainty across the tech sector, this new legislation will likely require companies to significantly expand or reassign staff to fulfil these new functions. 

A safer online experience?  

The DSA will set out what is defined as illegal and harmful content, and leave it up to the online intermediaries to design the processes to remove and moderate the outlawed content on their sites. It will also provide some clarity around the internal workings of opaque advertising algorithms. 

In this sense, rather than enforcing strict processes on Big Tech, the EU hopes to work in partnership with large service providers. While this relatively “soft touch” approach to regulation might be seen as more palatable to online intermediaries, it has faced criticism that it leaves too much discretion in the hands of these online platforms. 

Ireland’s role  

The EU hopes that the DSA, as a first-of-its-kind piece of legislation, will set a new global gold standard for online regulation.  

Under the DSA, enforcement will be the joint responsibility of both the EU and the bloc’s member states. In Ireland, the Government has assigned domestic oversight to the newly created Media Commission, established via the Online Safety and Media Regulation Act 2022. 

Ireland is positioned to be at the forefront of the DSA’s implementation. The Country-of-Origin principle will see Ireland’s Media Commission responsible for regulating Google and Facebook due to the location of their European headquarters. 

It will not face this challenge alone, however, with Brussels also responsible for supervising those very large entities classified as VLOPs and VLOSEs. 

Funding new regulation 

To fund this enforcement, social media platforms will have to pay a supervisory fee of 0.01% of annual turnover. It is anticipated that this will generate up to €30 million in funds per annum for the EU. To put this in perspective, the resource-stretched Irish Data Protection Commission was allocated a budget of €22 million in 2022.  

In other words, it represents a very modest budget for delivering on these newfound responsibilities, particularly given the complex, cross-jurisdictional basis of the legislation and the deep war chests of those who will look to challenge it.   

Ireland’s desirability 

On one hand, the increased regulatory burden, the prospect of increased corporation tax and an acute housing crisis pose questions over Ireland’s desirability as a location for tech companies to do business. 

On the other, Ireland has already come under criticism from other EU member states for what is perceived to be an overly cosy, if not soft-touch, approach to Big Tech, one which the likes of Google and Meta will no doubt look to exploit in how the DSA is applied.  

In any case, the influence that Big Tech wields should not be underestimated, particularly given revelations that in 2022 alone, five tech giants spent €29.5 million lobbying EU institutions. 

Given the Irish economy’s heavy reliance on multinationals and FDI, how Irish regulators balance competing interests and walk the tightrope between keeping both Brussels and the tech giants happy will make for interesting viewing.  

Collective reaction 

The hope in Brussels is that equivalent legislation will be developed in other jurisdictions, including the US and UK.  

The UK is currently working on its own version of the DSA, the Online Safety Bill (OSB), which seeks to introduce stricter controls on harmful content online. Meanwhile, the US government has called for enhanced legislation to strengthen antitrust enforcement and control of Big Tech’s collection of personal data.  

However, the road ahead for the US and UK looks particularly bumpy. One key obstacle is defining what qualifies as harmful content. The EU has navigated this issue by keeping its definition of harmful content broad and deferring to domestic legal definitions. For instance, Meta will be required to remove swastikas from Facebook in Germany but not in Ireland.  

The US and UK will not have this luxury, and the UK’s OSB has already been subject to almost five years of political wrangling, which has seen the Bill drop a requirement for platforms to take down “legal but harmful” content. 

The EU is demonstrably intent on making Big Tech accountable for what appears on their platforms.  

Given that many of the world’s largest tech companies call Ireland their home in Europe, and that the Irish Data Protection Commission was responsible for issuing more than a third of the €2.92bn in EU data breach fines last year, Ireland is on the frontline of the EU’s battle with Big Tech.  

Only time will tell how successful this new legislation will be, but for now, all eyes move to London and Washington DC to see how they grasp the nettle.  

Darragh Duncan 360-FINN

As a Senior Account Executive at 360-FINN, Darragh advises a range of clients in recruitment, construction, food and the health sector on their communications, including strategy, crisis management, media relations, corporate messaging, public policy and public affairs, stakeholder engagement, and campaign success monitoring and reporting.
