How to win friends and not violate Europe’s GDPR legislation

It may be time to get holistic about your data. Yes, all of it.
24 January 2023

A new GDPR judgment costs Meta over $400 million – while it lays off thousands of staff.

Meta hit the walls of Europe’s GDPR legislation on data use and data security again recently – by no means the first time, and probably not the last. The result: a fine of $414 million. We spoke to Sasha Grujicic, Chief Operating Officer at NowVertical, a company specializing in big data and analytics, to ask why social media and big tech companies have such trouble respecting the rules of other countries and blocs, particularly Europe.

In the course of our conversation, Sasha also explained the potential benefits companies can unlock by being prepared to meet the demands – or at least the spirit – of data privacy rules like Europe’s GDPR legislation. Among them: the opportunity not to be fined $414 million while working with other people’s data.

Different lenses of interpretation.

THQ:

Why do social media companies have such trouble obeying other people’s rules on data privacy?

SG:

Their business is predicated on the use of personal information. So whenever there’s any type of legislation put in place about the use of personal information, like Europe’s GDPR legislation, the interpretation of that legislation as it relates to the company and the way in which it operates is always a point of… I don’t want to say tension, but I’d say interpretation. They’re trying to interpret the spirit and the letter of the legislation as it relates to their business, and because their business is predicated on personally identifiable information (PII), theirs is often a slightly… different… perspective from that of the people who framed the legislation.

Also, the space tends to move very, very quickly, and social media companies are actually trying to manage a wide swathe of stakeholder interests, including the people who are present on their platforms, people who have created businesses connected to their platforms, advertisers, and obviously governments and other external stakeholders. So they’re trying to manage all of these different factors, while interpreting the letter and the spirit of the legislation, while trying to run a business that’s not a public utility.

People have scathing views about social media companies and their use of data, to be sure, but those companies are very different from enterprises whose businesses aren’t predicated on the use of PII. It’s not that it’s easier for those other enterprises to manage their operations within the spirit and the letter of the GDPR legislation. It’s just that a slightly different set of stakes is connected to it.

So I think that’s probably part of the answer. And then, layer on top of that the scale and breadth of their operations, which are global, rather than national or regional. So, not to be an apologist for the social media companies, but I understand what they’re up against.

The business model of social media platforms.

THQ:

That’s the thing, isn’t it? The difference in perception between the ordinary social media user and the technologist. Social media companies run up against Europe’s GDPR legislation time and time again, and they usually collect a huge fine, which users hope will make their data more private and safer – as if the fine will act as a deterrent. But as you say, social media companies (and other big tech operators too) have a business model that absolutely depends on being able to use your data.

SG:

Yes – like anything, it’s run through this lens of interpretation, and that’s where a lot of the back and forth tends to happen. It tends to play itself out in a very public manner, because people are interpreting things from their own vantage point, and the way they interpret things like consent, or the right to be forgotten, may be different from the way those things are interpreted through the lens of the companies who run the platforms.

So it’s not always just a case of “Of course, you need to go and do that.” There’s the one version of “that” – and then there are many, many interpretations of “that.” Those differences of interpretation, I think, are always points of tension, regardless of industry or technology.

The potential of jail time.

THQ:

Were you aware there are moves in the UK to add a layer of potential jail time to the maximum fines that managers of social media platforms can accrue if platforms are found not to have adequately protected children from seeing things that children shouldn’t see? There’s talk of mandating anything up to two years of prison time in those cases, and it seems to have widespread support.

SG:

I mean… that would be a very interesting turn. And that would undoubtedly raise the stakes of what’s allowable. I think that also challenges the notion of whether these platforms are media companies or not. Are they going to be judged as media platforms? As if they were the ones that generated the content themselves? Which would obviously increase the levels of scrutiny under which they would come and might well limit the personal freedoms of the individuals that are on these platforms.

Welcome to an interconnected world, I guess.

Holistic impacts.

THQ:

Talk to us about the value of holistic data protection impact assessments. That’s not only a heck of a mouthful, but as we understand it, it can help companies and platforms minimize the data protection risks to which they’re exposed? How can it do that, and how confident are you that companies at various levels would agree to do one?

SG:

A holistic data protection impact assessment is part of the data governance work that’s increasingly required. It’s incredibly important, especially as we see advancements in new types of technologies that come with a data input prerequisite: if you want to do anything interesting with your information or data, it first has to be held to a specific governance and quality standard.

For many organizations, just getting started on that is a daunting task. Simply wrapping your arms around all the data you have within your organization (and within your connected set of organizations, depending upon how you’re mobilizing your data) is important.

And part of the problem is just getting your arms around it, and – not to be self-serving – that’s why we bought a company that specifically did that work, because we know how critical it is to be able to look at both structured and unstructured data across enterprises’ data estates. You need to understand what you have, so that you can secure it, mobilize it, and utilize it in a way that actually drives growth and efficiency across the company, not to mention saving money and reducing risk.

Because there’s redundant data, dark data, and legacy data held beyond retention requirements. All that stuff just sits within an organization – which makes it both a risk point and a cost point. And when you start to increase the level of complexity by introducing other stakeholders – meaning you want to do something with that data with an external partner – it’s important to understand what you have to begin with, and where it is.

Taking the legwork out of data identification.

That’s historically been a very manual exercise, which is tough for companies. So to be able to use technologies to accelerate that work and help them manage the governance of that work is important. And that’ll enable them to work with their data in a confident manner.
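As a rough illustration of the kind of automated scan Sasha describes, here’s a minimal sketch in Python. Everything in it is an assumption for illustration – the PII patterns, the seven-year retention period, and the /data root – and it’s naive pattern matching, not a description of NowVertical’s actual tooling.

```python
import os
import re
import time

# Illustrative-only patterns and thresholds for a toy data-estate scan.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"\+?\d[\d\s().-]{7,}\d"),
}
RETENTION_DAYS = 365 * 7  # hypothetical seven-year retention policy

def scan_estate(root: str):
    """Walk a directory tree, flagging files that appear to contain PII
    and files held beyond the assumed retention period."""
    now = time.time()
    findings = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            age_days = (now - os.path.getmtime(path)) / 86400
            hits = set()
            try:
                # Sample the first ~1 MB of text; binary junk is ignored.
                with open(path, encoding="utf-8", errors="ignore") as f:
                    text = f.read(1_000_000)
                hits = {label for label, rx in PII_PATTERNS.items()
                        if rx.search(text)}
            except OSError:
                pass  # unreadable files are still worth logging in practice
            if hits or age_days > RETENTION_DAYS:
                findings.append((path, sorted(hits), round(age_days)))
    return findings

if __name__ == "__main__":
    for path, pii, age in scan_estate("/data"):
        print(f"{path}: pii={pii or 'none'}, age={age} days")
```

Even a toy scan like this makes the point: the output is an inventory of what you have and where it is, which is the prerequisite for everything that follows.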

THQ:

What’s the benefit of that confidence?

SG:

It means they can make sure they’re adhering to legislation and some of the restrictions associated with PII, like Europe’s GDPR legislation.

THQ:

Ahh – you can’t obey the rules on data unless you can “put your arms around” all the data you have, and know a) that you have it, and b) where it is?

SG:

Exactly. So – you asked how a holistic data protection impact assessment can help minimize data protection risks? That’s how – it lets you see what data you have, and so what data risks you might be running, and how to minimize them.

The airbag of data assessments.

THQ:

How can it minimize them?

SG:

Well, for instance, once you know where your data protection risks are, if you’re working with a third party, you can make sure you highlight privacy as an issue, you can ensure you have safe data sharing across a compute landscape, and you can find privacy-safe ways to collaborate and generate business value together. The promise is enormous, in terms of ways to safely generate value from data, but the prerequisite is just to understand what you have and how it’s governed, because of the risks and the costs associated with that.
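One common privacy-safe technique of the kind Sasha alludes to is pseudonymizing direct identifiers with a keyed hash before a dataset leaves the organization, so a partner can join records without ever seeing raw emails. The sketch below is hypothetical – the column name, the key handling, and the file layout are all assumptions – and it’s worth remembering that under the GDPR, pseudonymized data still counts as personal data.

```python
import csv
import hashlib
import hmac

# Assumption: in practice this key lives in a secrets manager, stays
# with the data owner, and is rotated per sharing agreement.
SECRET_KEY = b"example-key-store-me-in-a-vault"

def pseudonymise(value: str) -> str:
    """Keyed hash (HMAC-SHA256) of an identifier. Deterministic, so two
    datasets hashed with the same key still join on the same person."""
    return hmac.new(SECRET_KEY, value.strip().lower().encode(),
                    hashlib.sha256).hexdigest()

def export_for_partner(src: str, dst: str, id_column: str = "email") -> None:
    """Copy a CSV, replacing the identifier column with its pseudonym."""
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            row[id_column] = pseudonymise(row[id_column])
            writer.writerow(row)
```

The keyed hash matters: a plain SHA-256 of an email can be reversed by hashing guessed addresses, whereas the HMAC can’t be recomputed without the owner’s key.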

THQ:

Like for instance whether the GDPR is going to cause you to be walloped with a multi-million-dollar fine?

SG:

Yes. It lets you talk about the actual compute requirements of generating higher-order models that try to use a composite of information to generate net-new applications. Those requirements are starting to get quite big, very expensive, and, frankly, costly to the environment. So, yeah, understanding what data you need to input into a model is incredibly important.

In Part 2 of this article, we’ll dive deeper into processes that can help companies steer clear of both legal and ethical issues when dealing with significant landscapes of data.