Clubhouse, founded in May 2020, is an audio-only social networking iOS app that allows users to listen in to virtual rooms in which other users are speaking, or to create rooms of their own.
The app has become popular because it facilitates ‘live chat’ that purportedly cannot be listened to again. At present, individuals can only join the app by receiving an invite from an existing user, and users reportedly include the likes of Kanye West, Mark Zuckerberg and Oprah Winfrey. The BBC reported in February 2021 that Clubhouse had 2 million users and was valued at over £0.7 billion.
It sounds great, so what’s the problem?
A number of critics, in particular the Stanford Internet Observatory (SIO), have raised a host of data protection concerns regarding the Clubhouse app, including:
- Clubhouse does not allow users to invite their friends to the app until they grant Clubhouse permission to access their contacts list, meaning that those contacts can have their personal data shared with Clubhouse without their consent;
- it records audio as a moderation control (i.e. to identify and remove trolls), but this may not be transparent to users;
- it may lack appropriate technical and organisational security measures, for example:
  - it may not encrypt recordings end-to-end;
  - it may transmit users’ unique Clubhouse ID number and chatroom ID in plaintext (see the illustrative sketch after this list);
  - users’ microphones may remain active if they switch to another app without leaving the virtual room;
- a user was able to live stream a conversation involving Elon Musk in January 2021, whilst another was able to stream audio feeds from “multiple rooms” into their own third-party website in February 2021; and
- Shanghai-based Agora, which processes Clubhouse’s data traffic and audio production, may be obliged by Chinese cybersecurity laws to assist in locating and storing audio for national security purposes.
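To illustrate the kind of technical measure the security points above concern, the sketch below shows application-level encryption of identifiers before transmission, so that they are not sent in plaintext. It is a minimal, purely illustrative Python example using the widely used `cryptography` library (Fernet): the payload fields and key handling are hypothetical and do not represent Clubhouse’s or Agora’s actual implementation, where transport-layer encryption (TLS) and managed key storage would normally be the standard remedy.

```python
# Illustrative sketch only: hypothetical payload fields and ad-hoc key handling;
# not Clubhouse's or Agora's actual code.
import json
from cryptography.fernet import Fernet

# In a real deployment the key would come from a key-management service,
# not be generated on the fly by the client.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical identifiers of the kind reportedly transmitted in plaintext.
payload = json.dumps({"user_id": "12345", "chatroom_id": "room-67890"}).encode("utf-8")

token = cipher.encrypt(payload)   # opaque ciphertext; useless to anyone intercepting the traffic
print(token)                      # e.g. b'gAAAAAB...'

# Only a holder of the key can recover the identifiers.
restored = json.loads(cipher.decrypt(token))
assert restored["chatroom_id"] == "room-67890"
```

An organisation assessing an app against the checklist in the conclusion below would look for protections of this kind, or for identifiers carried only over TLS-encrypted connections, rather than identifiers sent in the clear.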
In February 2021, Clubhouse acknowledged: “With the help of researchers at the SIO, we have identified a few areas where we can further strengthen our data protection…” Agora maintain that they do not “store or share personally identifiable information.”
The Hamburg Commissioner for Data Protection and Freedom of Information announced that it is coordinating with other German data protection authorities to review the app’s GDPR compliance. The Commissioner stated: “Unfortunately, it happens again and again that providers from the USA are pushing into the European market or are simply successful with their products and services without complying with the most basic data protection regulations of the European digital market . . . It is in the interests of all European users to be able to use services that do not infringe their own or third-party rights . . .”
Conclusion
In light of the above, it is important that organisations seeking to use internationally developed applications assess those systems and their data flows to:
- ensure that they implement appropriate technical and organisational measures with respect to the apps that their employees are permitted to use for business purposes;
- consider whether it is appropriate to undertake a data protection impact assessment (DPIA) of new apps; and
- ensure that apps adhere to ‘data protection by design and by default.’
Trilateral’s Data Governance and Cyber-Risk Team has significant experience supporting organisations in implementing appropriate security measures in respect of personal data and in raising internal awareness of the importance of data protection. We offer a range of data governance services that can help your organisation to develop policies and procedures for ongoing compliance. Trilateral can help audit existing practices, perform gap analyses, draft DPIAs and data sharing agreements, and offer compliance support. For more information, please feel free to contact our advisors, who would be more than happy to help.