The Digital Services Act (DSA), which entered into force on 16 November 2022, marks a milestone in the effort to create a safe online environment. As a Regulation, the Act is directly applicable and will therefore ensure a harmonised approach throughout the EU. The Act not only tackles illegal activities but also imposes new restrictions on the use of data. The DSA will apply in full from 17 February 2024. This article outlines which types of organisation will be in scope, the core obligations, and what action organisations need to take and by when.
Scope of the DSA
The DSA sets out obligations and rules for providers of intermediary services, such as online marketplaces, social media platforms, very large online platforms (VLOPs) and very large online search engines (VLOSEs). The rules are tiered so that larger intermediary services with a greater societal impact are subject to stricter obligations.
Given the nature of these services, the Act is not restricted to intermediary services with an establishment in the EU. It also applies to intermediary services that have what the Act defines as a ‘substantial connection to the Union’. This substantial connection can be established by:
- The targeting of activities towards Member States (such as the use of language or currency generally used in one or more Member States or the possibility of purchasing goods and services from one or more Member States)
- There being a significant number of recipients of the service in one or more Member States (significant relative to the population of the Member State(s) concerned)
However, the Act specifies that the mere technical accessibility of a website from the EU is not in itself grounds for establishing this substantial connection.
What does the DSA aim to achieve?
The obligations set out in the DSA aim to ensure there is a safe, predictable and trustworthy online environment. Some of the measures from the DSA create new data protection obligations for intermediary service providers, such as:
- The protection of minors, by restricting platforms’ use of targeted advertising based on minors’ personal data.
- Restricting the use of targeted advertising based on the profiling of special category data (data relating to political opinions, racial or ethnic origin, health, sexual orientation, etc.).
- Providing transparent information on the key parameters used to determine which advertisements or content selections are presented to which users, including algorithmic transparency where applicable.
- Providing transparent information on the key parameters of the decision-making algorithms used to determine what content is presented to users (applicable to VLOPs/VLOSEs only).
- Offering users at least one option for determining content and recommendations that is not based on profiling (applicable to VLOPs/VLOSEs only).
The DSA also imposes other obligations, beyond data protection, aimed at holding intermediary service providers accountable for any detrimental societal impact, such as:
- Tackling the sale of illegal products and services.
- Reacting quickly where necessary and taking measures to counter illegal content, while respecting fundamental rights.
- Banning misleading practices aimed at disseminating misinformation, as well as ‘dark patterns’, which the Act describes as practices that ‘distort or impair…the ability…to make autonomous and informed choices…’
- Assessing any systemic risks in relation to: the dissemination of illegal content; detrimental effects on electoral processes and fundamental rights; and harmful impacts relating to gender-based violence or mental health (applicable to VLOPs/VLOSEs only).
- Ensuring there is a crisis response mechanism to respond appropriately to serious threats to public health and security, such as a pandemic or a war (applicable to VLOPs/VLOSEs only).
What powers does the DSA give?
Each Member State will need to appoint a Digital Services Coordinator (DSC), who will be responsible for overseeing the activities of intermediary services and ensuring compliance with the DSA. Member States will be responsible for specifying what penalties can be levied by DSCs. For VLOPs and VLOSEs, the Commission will be responsible for supervision and enforcement, and for the most serious infringements it will be able to impose fines of up to 6% of the provider’s annual worldwide turnover.
What should organisations do now?
The obligation to report user counts, or as the Act terms them, average monthly active recipients (AMARs), will take immediate effect. This data will be used by the Commission to determine which platforms are deemed to be VLOPs or VLOSEs. Rules governing VLOPs and VLOSEs are likely to apply from July 2023. Service providers that fall outside of this will have until February 2024 to bring themselves into compliance with the DSA.
To prepare for these deadlines, intermediary service providers should complete a thorough analysis of current practices against the requirements of the DSA to highlight any gaps where they are not currently meeting the standards. They should then put in place an action plan to bring their practices into line with the Act ahead of the February 2024 deadline. With regard to their data protection obligations, there will be a lot to unpack. We would therefore recommend that intermediary service providers take immediate action and, as a first step, map out all use of profiling and algorithms. A key step on the journey towards compliance with the DSA is ensuring that there is internal clarity, transparency and oversight over the use of profiling and algorithmic tools, especially with respect to advertising and content selection.
Trilateral’s Explainable AI Service maps the performance, transparency, justifiability and accountability of your algorithms as part of your organisation’s data protection compliance programme, helping organisations respond appropriately to regulatory change. Contact our advisors today to discuss how we can assist your organisation on its compliance journey.