Ireland’s Data Protection Commission (DPC) has opened another ‘Big Tech’ GDPR investigation: the regulator announced Wednesday that it has launched two inquiries into TikTok, the video sharing platform. The first examines how TikTok handles children’s data and whether that processing conforms with the European Union’s General Data Protection Regulation (GDPR).
The DPC also said it will examine TikTok’s transfers of personal data to China, where its parent company is based, to assess whether the company complies with the GDPR’s rules governing transfers of personal data to third countries. A spokesperson for the firm said that TikTok prioritizes the privacy and safety of its youngest users, that the company has policies and safeguards in place to protect user data, and that it relies on approved methods for transferring data out of Europe. TikTok added that it will cooperate fully with the investigation.
The Irish regulator’s two “own volition” inquiries follow concerns raised by other EU data protection agencies and consumer protection organizations about how TikTok handles user data in general, and children’s data in particular.
In Italy, TikTok was ordered in January to reconfirm the age of every user in the country after the data protection watchdog used GDPR powers to initiate an emergency procedure over child safety concerns. TikTok complied with the injunction, removing almost half a million accounts where it could not confirm that the users were not minors. European consumer protection organizations have also raised concerns this year about child safety and privacy on the platform, and in May EU lawmakers said the company’s terms of service would be reviewed.
The GDPR puts constraints on how children’s data can be handled and sets an age threshold below which children cannot themselves consent to the processing of their data. The threshold varies per EU Member State: the GDPR’s default is 16, but Member States may lower it to a hard floor of 13.
TikTok responded to the DPC’s announcement by citing its use of age-gating technology and other measures for detecting and removing underage users from its platform. It also pointed to recent changes it has made to children’s accounts and data, such as making children’s accounts private by default and restricting certain features that encourage interaction with other TikTok users to those aged 16 and over.
TikTok says it employs “authorized techniques” for international data transfers. However, the situation is more complicated than that assertion suggests. Because there is no EU-China data adequacy agreement in effect, transfers of Europeans’ personal data to China are legally fraught: any such transfer must be accompanied by additional “appropriate safeguards” to ensure the data is protected to the standard required in the EU.
When no adequacy agreement exists, data controllers may rely on mechanisms such as Standard Contractual Clauses (SCCs) or Binding Corporate Rules (BCRs); TikTok’s statement specifically mentions SCCs.
However, personal data transfers out of the EU to third countries have faced significant legal uncertainty and increased scrutiny since a landmark CJEU ruling last year, which invalidated the flagship EU-US data transfer agreement (Privacy Shield) and made clear that DPAs such as Ireland’s DPC have a duty to step in and suspend transfers if they suspect people’s data is flowing out of the EU to third countries without adequate protection.
While the CJEU did not invalidate mechanisms like SCCs outright, it held that all transfers to third countries must be evaluated on a case-by-case basis, and that a DPA with concerns must intervene and suspend any transfer that is not secure. The judgment means that merely using a mechanism like SCCs is not enough to establish a transfer’s legality. It also places a greater onus on EU regulators, such as Ireland’s DPC, to proactively assess risky data transfers.
The European Data Protection Board issued final guidance earlier this year detailing the supplementary measures a data controller may be able to apply to raise the level of protection around a specific transfer so that data can be lawfully sent to a third country. But given that TikTok’s platform and algorithms continuously mine users’ data to tailor the content they see and keep them engaged with TikTok’s ad network, it is unclear how a social media firm like TikTok could apply such a fix.
Another recent development is the passage of China’s first data protection law.
However, this is unlikely to change much for EU transfers. The Chinese state’s ongoing appropriation of personal data through extensive digital surveillance laws makes meeting the EU’s strict requirements all but impossible for China.
When it comes to the EU’s enforcement of its data privacy rules, TikTok can take some consolation from the fact that it likely has time on its side. The Irish DPC has a large backlog of cross-border GDPR inquiries targeting several tech companies. Only recently did the regulator announce its first decision against a Facebook-owned company, imposing a $267 million fine on WhatsApp for violating GDPR transparency requirements, and only years after the first complaints had been lodged.
The DPC handed down its first decision in a cross-border GDPR case involving Big Tech at the end of last year, fining Twitter $550k over a data breach dating back to 2018. The Irish regulator still has a number of open cases involving tech companies such as Apple and Facebook, so the new TikTok probes will join the back of a much-criticized bottleneck, and decisions on them are unlikely for several years.
TikTok could also face tougher scrutiny in Europe over its handling of children’s data: in this area, the UK added some “gold-plating” to its version of the EU GDPR and has stated that platforms must meet its recommended standards starting this month.
It has warned that platforms that do not fully comply with its Age Appropriate Design Code may face penalties under the UK GDPR. The code is credited with spurring a slew of recent improvements in how social media firms handle children’s data and accounts.