Is Big Tech falling in line with UK’s child digital privacy laws?

Big tech companies amend apps following UK's new digital privacy regulations for children
3 September 2021


Digital privacy concerns continue to take up media real estate around the world. Regulators continue to push companies to take privacy seriously, especially when it comes to protecting the personal data of the most vulnerable consumers.

In the past couple of months, big tech players have been subjected to numerous fines in Europe over data privacy issues. WhatsApp Ireland was recently fined €225 million for data protection breaches, while France fined Google US$120 million and Amazon US$42 million for dropping tracking cookies without consent.

While most data privacy issues concern the use of data of the general population, regulators are now looking at how they can protect children from targeted and inappropriate advertising, as well as tactics designed to keep children online for long periods. China recently announced that online gaming for those under the age of 18 will be banned on weekdays and limited to an hour a day on weekends and public holidays.

Authorities said that the restrictions were put in place to help prevent young people from becoming addicted to video games. However, gaming addiction is not the only digital problem affecting children. The growth and influence of social media on children also remains a worry, particularly where the digital privacy of minors is concerned.

At the same time, concerns about abusive content involving children have seen regulators require social platforms and technology providers to change their policies. Interestingly, when Apple announced new mechanisms to curb child abuse content, data privacy advocates felt Apple might itself be infringing on privacy by scanning photographs on its devices for such material.


Protecting children online

Nevertheless, ensuring digital privacy for children is a priority that needs to be, and is, being addressed globally. The US was the first country to enact such a policy via the Children’s Online Privacy Protection Act (COPPA), which took effect in 2000, with an amended rule coming into effect in 2013.

COPPA ensures parents have control over personal information collected from their children online. The rule applies to commercial websites, online services, and mobile apps that collect, use, or disclose the personal information of children. The EU’s GDPR likewise requires parental consent before information society service providers can process the personal data of children under 16 years of age.

Meanwhile, the UK’s Age Appropriate Design Code, written into law as part of the Data Protection Act 2018, has officially come into effect. The code mandates that websites and apps take the best interests of child users into account, or face potential fines of up to 4% of their annual global turnover.

Speaking to The Guardian, British Information Commissioner Elizabeth Denham said, “One in five UK internet users are children, but they are using an internet that was not designed for them. In our research conducted to inform the direction of the code, we heard children describing data practices as ‘nosy’, ‘rude’ and a ‘bit freaky’.”

According to a BBC report, the new code will require companies whose services are likely to be accessed by children to design those services to be age-appropriate and in children's best interests. Companies must also consider whether their use of data keeps children safe from commercial and sexual exploitation, and offer a high level of privacy by default.

Companies must also stop using design features that nudge children into providing more data, switch geolocation tracking off by default, and map any personal information they collect from UK-based children.

Tech companies are making amendments

Following the UK’s new ruling, major tech platforms have implemented several changes to their apps, particularly on how they treat child users. Here’s what some of them are doing:

  • TikTok – Restricting sharing options of younger users and disabling notifications from the app after bedtime for those under 18.
  • Google – Anyone under 18, or their parents, may request the removal of their images from search results. Google will also keep Location History switched off for users under 18.
  • YouTube – Updated default privacy settings and turned off the autoplay option for users aged 13 to 17.
  • Facebook – Under-18 users to be exempted from targeted advertising, receive tighter default sharing settings, and get protection from potentially suspicious accounts.
  • Instagram – Preventing adults from messaging children who do not follow them, and defaulting all child accounts to private. Instagram will also require users who have not already done so to provide their date of birth.

Following the ruling, it was reported that the Data Protection Commission in the Republic of Ireland is preparing similar regulations. Other countries may also be looking at ways to better protect the digital privacy of those under 18.