TikTok hit with multi-million dollar fine over child data safety
There’s a sense in which it was inevitable – it’s been coming for some time – but TikTok has been hit with a $15.9 million fine by the UK’s Information Commissioner’s Office (ICO).
Tensions are running increasingly high around the most popular social media platform not based in the West. Support is growing in the US government for the idea of banning the app completely, and countries around the world are banning it from government devices – the latest national government to jump on that bandwagon being Australia’s.
The inattentive security risk?
But most of that tension centers on the idea that TikTok is an inherent security risk because its parent company, ByteDance, is headquartered in Beijing – and, at least in principle, the Chinese Communist Party government could demand access to any data held by companies headquartered within the country.
The ICO’s fine, by comparison, is rather more “business as usual” in the world of social media than it is “global espionage plot.”
That’s not to make light of the seriousness of the deficiencies that have brought TikTok its mega-fine. The ICO says that between 2018 and 2020, TikTok did not do enough to check who was using its platform – meaning it did not act to remove children under 13, who are not supposed to have access to TikTok, and so was deemed to have used those children’s data improperly.
While the world takes a moment to appreciate the irony of the platform accused of being a security risk by virtue of its use of user data… being fined for doing radically too little interrogation of its available user data, it’s worth remembering that this fine by the ICO is by no means TikTok breaking new and dangerous ground.
The Western playbook.
Certainly, any time the data of children and young people is used inappropriately, the company responsible should be fined into actual responsibility. But in essence, TikTok is walking a path and using a playbook learned from leading Western-based social media platforms.
In September 2022, Meta’s Instagram platform was fined a much more significant $443 million for exposing the contact details of teenagers who used it.
Like many of the Meta cases, the TikTok fine relates to data from at least two years ago, and it is a straightforward consequence of insufficient data protection for users (albeit, in TikTok’s case, inappropriate users) – it takes no account of what subsequently happened to that data. We do not yet know whether any of the data of the inappropriately young users was used in especially inappropriate ways. What’s safe to assume is that it was used in much the same ways as TikTok uses data from its above-board, above-age users.
It’s safe to assume that because, as the basis of the fine makes clear, TikTok did nothing special with the under-13s’ data – it did no interrogation of that data that would have recognized those users as inappropriately young and flagged them for removal from the platform.
Beware the conflation.
Could the under-13s have viewed content too mature for them to understand and/or consent to? Absolutely – hence the fine. But it’s worth guarding against conflating that issue with the overall environment of whipped-up suspicion over TikTok and its collection and use of user data.
John Edwards, the UK Information Commissioner, in issuing the fine, said that “TikTok should have known better. TikTok should have done better.”
That seems relatively inarguable – though it’s also less than surprising that TikTok is arguing the point (again, following a playbook set out by Meta and the like). A spokesperson for the company told news outlets including CNN that the company “invests heavily to help keep under-13s off the platform.”
The spokesperson added that “Our 40,000-strong safety team works around the clock to help keep the platform safe for our community.”
The point being that nobody is arguing that that isn’t the case – only that, in the particular cases for which the company is being fined, it either didn’t happen or a significant number of cases slipped past all 40,000 members of the safety team.
The wounded tiger?
The TikTok response feels like lashing out from a company already being put through the wringer over its data use in other areas, and therefore disproportionately determined not to admit wrongdoing on data of any kind.
That might look to outsiders like an unfortunate position to take, when the fine itself is relatively small potatoes for the company. But it’s also fair to say that TikTok will now have to fight a conflation in the media – and possibly in the governmental narrative – that it cannot be trusted with the data of either a) American citizens or b) government ministers anywhere, because it’s been shown to have insufficient control over its user data even to monitor whether users are old enough to be allowed on the platform.
The argument might now be made more convincingly that all TikTok’s assurances that user data will not be accessed by the Chinese government are not worth the paper they’re printed on, because the company doesn’t – or at least, didn’t two years ago – have enough control over user data to know what’s happening to it, beyond the operation of the algorithm that uses data about a user to hone the content they see.
Any promise of data safety has to be based on data knowledge – and in that, this fine has shown TikTok to be woefully lacking.
20 February 2024