“Data scraping” investigation results in €265m data protection fine for Meta



Claire Looney | Data Protection Advisor

Date: 28 December 2022


Meta Platforms Ireland Limited (“Meta”, formerly Facebook Ireland Limited) is the subject of another fine from the Irish Data Protection Commission (“the DPC”), following what it termed its “Data Scraping” investigation into Meta’s platforms. The DPC commenced this investigation in spring 2021 after news broke of a leaked dataset online containing the personal data of more than 500 million Facebook users, which had been ‘scraped’ from publicly available data on Meta’s platforms. The collated dataset included the phone numbers, dates of birth and email addresses of Facebook users.

The DPC published its final decision in November confirming the imposition of administrative fines on Meta totalling €265m and a range of corrective measures, requiring Meta to bring its processing activities into compliance. This article will consider how Meta’s processing was found to be in breach of the GDPR in allowing “bad actors” to collate and further process data made publicly available on its platforms.

Data Protection by Design and by Default

Article 25 GDPR requires that a data controller ensures its systems or products are designed at the outset to safeguard the privacy and data protection rights of data subjects. To do so, they should implement appropriate technical and organisational measures that effectively incorporate the principles of data protection. It further requires that, by default, only personal data that are necessary for each specific purpose of the processing are processed. The DPC examined the Facebook search, Facebook Messenger contact importer and Instagram contact importer features and found that Meta, the data controller for these platforms, was in breach of these provisions of Article 25.  

The investigation considered that the scope of Meta’s processing created a ‘severe’ risk of bad actors using the platforms to obtain personal data which exposed users to significant risks of fraud, spamming and impersonation. The DPC found that Meta failed to design its systems in a manner that would safeguard against these risks. Its systems allowed these “bad actors” to use random combinations of numbers and letters to identify valid phone numbers or email addresses and, where a match occurred, to identify the Facebook user connected with that phone number or email address.

While Meta had implemented some preventative measures for “bot detection”, the DPC deemed these insufficient. It found that, because other measures were available to Meta that could have prevented this data scraping and mitigated the severe risk to its users, Meta had failed to implement appropriate measures. In reaching this finding, the DPC had regard to the principles of the GDPR, including integrity and confidentiality, and purpose limitation.
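To illustrate the kind of technical measure at issue, the sketch below shows a minimal token-bucket rate limiter of the sort commonly used to throttle automated lookups. This is a generic, hypothetical example, not a description of Meta’s actual systems or of the specific measures the DPC considered; the class name and parameters are invented for illustration.

```python
import time

class TokenBucket:
    """Hypothetical per-client rate limiter: allows a burst of `capacity`
    lookups, then refills at `rate` tokens per second."""

    def __init__(self, capacity: int, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens in proportion to the time elapsed since the last check.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Keeping one bucket per client lets a handful of manual searches through
# while throttling high-volume automated enumeration of phone numbers.
bucket = TokenBucket(capacity=5, rate=0.1)  # 5-lookup burst, then ~1 per 10s
results = [bucket.allow() for _ in range(10)]
```

The design intuition is that a legitimate user searching for a contact makes a few lookups, while scraping requires millions; a limiter of this kind makes the enumeration described above impractically slow without blocking ordinary use.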

With respect to the confidentiality of the data, while users had made their information ‘public’, the DPC found that they had not done so to allow “strangers” to search for them, nor for their data to be ‘scraped’. The DPC held that Meta’s features facilitated this “unauthorised access” to personal data, in breach of the principle of integrity and confidentiality.

The DPC also found that Meta should have employed further measures appropriate to the significant risk that its users’ personal data could be processed in a manner incompatible with the purposes for which they were collected. The design of its systems allowed bad actors to process the data for just such an incompatible purpose. The DPC therefore found that Meta had failed in its obligation under Article 25 to implement the purpose limitation principle through appropriate measures.

In its submissions, Meta argued that its features were “only problematic when exploited in an automated way by fake accounts” and so its processing of personal data was not the problem. This argument did not persuade the DPC, nor any of the other concerned supervisory authorities, which all agreed with the DPC’s decision.

Meta is also the data controller for the Instagram and WhatsApp platforms, which have been the subject of recent fines of €405m and €225m respectively. While Meta has appealed the €405m fine issued following the DPC’s investigation into Instagram’s processing of children’s data, these recent fines presently amount to almost €1bn in total. Meta stated that it had cooperated fully with the DPC’s investigation and that it had already taken steps to address the issue, making changes aimed at preventing the ‘scraping’ of personal data from its platforms.


This decision confirms that data controllers must ensure their systems are designed to safeguard against the risk of third parties unlawfully accessing and/or further processing personal data. To ensure compliance with Article 25 GDPR, controllers should:

  • Ensure data protection principles are “baked in” at the design and implementation stage – employing robust penetration testing, risk workshops and brainstorming with a variety of stakeholders to identify how personal data might be improperly accessed and misused;
  • Assess the severity of the associated risks for data subjects. The greater the risks (e.g., fraud, financial loss etc.), the more robust the technical and organisational measures must be; and
  • Monitor processing activities to ensure that safeguards are still appropriate given emerging risks and state of the art controls available.

Trilateral’s Data Protection and Cyber-risk team have data protection specialists with extensive expertise and experience in reviewing the lawfulness of data processing activities. Please feel free to contact our advisors, who would be happy to speak with you about your compliance needs in light of this decision.
