Meta pixel at heart of data privacy cases – again

Tiny piece of useful code just can't stop pinging confidential data to Facebook.
24 July 2023

Healthcare, taxes, and now victims of serious crime – what data ISN’T the Meta pixel harvesting?

• The Meta pixel is at the heart of new data privacy cases.
• In the UK, it sent data of victims reporting crimes to Facebook.
• And in the US, the Meta pixel is being blamed for transmitting sensitive tax data on tens of millions of taxpayers.

The question of data privacy has been posed before when it comes to Facebook and the existence of something called a Meta pixel.

In June 2022, journalists at The Markup revealed that the Meta pixel – a tiny piece of data-tracking code – had been installed on online healthcare systems across America. The Meta pixel’s only job in this world is to collate some data about an online event and then, in what would seem to be a clear violation of patient confidentiality, let alone data privacy laws, send that packet of information across to Facebook – while potentially tracking individual internet users around the web to enrich the available data picture of them.
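For context, the pixel itself is nothing exotic: it’s a short JavaScript snippet a site owner pastes into their pages, which loads Meta’s tracking script and then reports events back to Facebook’s servers along with cookies and browser details. Below is a minimal sketch of the calls involved – the pixel ID shown is a placeholder, not a real account.

```typescript
// Minimal sketch of the calls a Meta pixel embed makes once Meta's loader
// script has run on a page. The pixel ID below is a placeholder.
declare function fbq(...args: unknown[]): void;

fbq('init', '000000000000000'); // ties events fired on this page to one advertiser account
fbq('track', 'PageView');       // reports the visit itself, plus cookie and browser metadata
```

Every page that carries those two lines quietly reports each visit back to Meta’s servers – which is what makes the pixel’s presence on appointment forms and crime-reporting pages so consequential.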

Back when the discovery was made, the Meta pixel was found to be present in the online systems of 33 of Newsweek’s top 100 American hospitals. When patients made an online appointment, the Meta pixel gathered information on patients’ names, their doctors’ names, their medications – and the reason for the appointment. That information was also linked to an IP address. And then, without informing either the patient or the healthcare facility, the pixel sent that data to Facebook.

The scale of Meta pixel use.

The Meta pixel is currently installed on around 30% of the world’s most popular websites. It could, entirely without your knowledge, send packets of data about you to Facebook whenever you use them.

That’s enough of a violation of your right to privacy as it stands, but when it was discovered in a healthcare setting – in a world in which bad data brokers exist, and in an America still reeling from the overturning of Roe v. Wade, one of the most basic protections of bodily autonomy available to women in the land of the assault rifle and the home of the incel – it was distinctly disturbing.

Facebook faced class action lawsuits as a result of the Meta pixel’s discovery on the healthcare systems, and settled the cases to the tune of $18.4m.

The UK too has a history of using Meta pixels in healthcare settings. 20 National Health Service (NHS) trusts were previously discovered to have shared the private information of patients with Facebook through Meta pixels. 17 of the trusts, when presented with evidence of the improper use of their Meta pixels, said they would stop using them.

All of which, while shocking, is by way of a prologue.

In July 2023, the use of the Meta pixel sank to a new low.

The Meta pixel with access to victim data.

The UK’s Metropolitan Police Service (Met Police), which polices Greater London, was found to have been using the Meta pixel, ostensibly as part of a recruitment campaign.

However, the Meta pixel was also used when people (the vast majority of them women) used the Met Police’s online reporting system to record rapes, domestic abuse, and other crimes.

Despite all appearances of security, the form people filled in collected data as it was entered, and sent a packet of that data to Facebook.

The Meta pixel also sent data about content viewed and buttons clicked on pages linked to contacting the police and accessing victim services – as well as advice pages on rape, assault, stalking and fraud.
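To make that concrete, here is a hedged sketch of the kind of listener a tracking tag can attach to a page once it’s present – the form selector, field name, and custom event name below are hypothetical illustrations, not code taken from the Met Police site.

```typescript
// Hypothetical illustration of page-level tracking forwarding a visitor's
// form selection. None of the selectors or names below come from the Met Police site.
declare function fbq(...args: unknown[]): void;

document
  .querySelector<HTMLFormElement>('#report-form')
  ?.addEventListener('submit', (event) => {
    const form = event.currentTarget as HTMLFormElement;
    const category =
      form.querySelector<HTMLSelectElement>('[name="incident-type"]')?.value;
    // The event travels to Facebook alongside the visitor's browser cookies,
    // linking the selected category to an identifiable person.
    fbq('trackCustom', 'ReportStarted', { category });
  });
```

Note that fbq('trackCustom', …) is a genuine part of the pixel’s public interface; only the page-specific names above are invented for illustration.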

The Meta pixel case, on top of recent proven corruption and misogyny, has led some to call for the Met Police to be overhauled.

After a degree of outrage from former Victims’ Commissioner Dame Vera Baird, the Met Police has said it will be removing the Meta pixel from everything but its recruitment pages.

The Meta pixel is like a Roomba, where dust equals data.

Tax and the Meta pixel.

The Meta pixel has run into trouble in the US in July, too – Congressional Democrats have claimed that three large tax preparation firms have sent “extraordinarily sensitive” information on tens of millions of US taxpayers to Facebook over the course of at least two years by using the Meta pixel.

The Democrats are urging federal agencies to investigate and potentially even to take legal action against H&R Block, TaxAct and TaxSlayer over their unnecessary data-sharing with Facebook through the Meta pixel.

Seven Democrats have sent a letter to leading figures in the IRS, the Justice Department, the Federal Trade Commission and the IRS watchdog, saying their findings “reveal a shocking breach of taxpayer privacy by tax prep companies and by Big Tech firms.”

The healthcare, the Met Police, and the tax prep cases all highlight one central issue – the Meta pixel is, for most of its life, a useful tool. The Met Police said it used it to identify areas where visitors to its website struggled for clarity or a smooth journey. But even without any malign intent along the way, it’s also a piece of technology that can result – and has resulted – in private, privileged, and intensely personal data being funnelled back to Facebook.

Whether Facebook then does anything unethical or malign with that data is almost beside the point – though it has form when it comes to targeting healthcare-based advertising at people whose healthcare data it received via the Meta pixel.

The Meta pixel is a key piece of code used by Meta to gather data.

Meta – we’d like to tell you the street address was just a happy accident. Source: Justin Sullivan/Getty Images North America/Getty Images via AFP

The point is that while there is fury from victims, no one is talking about refining the coding of the Meta pixel so it can do the useful things it does without packaging up specific data that private citizens would rather was not known to the wider world.

The wider world contains bad actors and unscrupulous data brokers, and whether or not there’s malign intent in the collection of such data without express consent, having that data exist somewhere outside the boundaries of that consent must be seen as being as much of a concern for private citizens as the potential leaking of privileged data through generative AI is for companies.

It should also be treated as at least as serious a threat as the risk of US user data being sent to the Chinese government by TikTok.

You should feel free to tell us again how dangerous TikTok is…