Who’s keeping tabs on regulatory compliance for the little guys developing AI?

Who's who and what's what in AI compliance?
29 August 2023

While the rules are tiered and apply to all operators, smaller companies using the technology could sneak through – or can they? (Image – Shutterstock)

  • Europe has the strictest regulatory compliance rules.
  • However, enforcement so far has mostly meant heavy fines for big tech companies.
  • Smaller and independent companies are not taking data privacy and transparency seriously enough. 

Europe has the strictest regulatory compliance requirements when it comes to data usage and privacy. In fact, other countries around the world are crafting their privacy laws and regulations based on Europe’s General Data Protection Regulation (GDPR).

The GDPR governs how the personal data of individuals in the European Union (EU) may be processed and transferred. Since its inception, many companies, especially big tech enterprises, have been penalized for how they use and manage data. The biggest GDPR fine to date was levied on Meta Platforms in May 2023, when the tech giant was fined a record-breaking 1.2 billion euros (around US$1.3 billion).

While the GDPR ensures regulatory compliance on data and privacy, the rise in the adoption of artificial intelligence by organizations around the world has also raised concerns. In Europe, AI-specific rules are still being discussed, with legislation on the technology already proposed. According to a report by the World Economic Forum, the proposed Artificial Intelligence Act would classify AI systems by risk and mandate various development and use requirements.

The EU also recently announced updates to its Digital Services Act (DSA). The act applies to any digital service operating in the EU, making these companies legally accountable for everything from fake news to the manipulation of shoppers.

Tweet by the President of the European Commission on the Digital Services Act.

Ensuring regulatory compliance for all

While the DSA mostly targets the industry's tech giants, questions have been raised about how smaller tech companies, including startups and individual app developers, will be affected by its requirements. The rules are tiered and apply to all operators, but smaller companies using the technology could still sneak through.

Or can they?

A recent study from the Data Protection Excellence Centre, the research arm of Straits Interactive, revealed significant privacy concerns in generative AI desktop applications. Covering 113 popular apps, the study recorded regulatory compliance concerns, particularly among startups and individual developers.

Focused primarily on apps from North America and the EU, the study was conducted from May to July this year and underscores the potential risks to which users might unwittingly expose their data. The apps were categorized as core apps (industry leaders in the generative AI sector), clone apps (typically built by startups, individual developers or small developer teams), and combination apps (existing applications that have incorporated generative AI functionalities).

Some 12% of the apps from startups and individual developers lacked a published privacy policy. Of those that did publish one, 69% identified a legal basis (such as consent or contract performance) for processing personally identifiable information (PII).

Only half of the apps meant for children considered age restrictions and aligned with child privacy standards such as the Children’s Online Privacy Protection Act (COPPA) in the United States and/or the GDPR in the EU.
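
For illustration, an age gate that respects those thresholds can be fairly small. The sketch below is a hypothetical Python example – the function and flow names are invented, and real compliance involves far more than an age check (verifiable consent mechanisms, data handling, notices):

```python
from datetime import date
from typing import Optional

# Real statutory thresholds, hypothetical gating logic.
COPPA_MIN_AGE = 13     # COPPA treats US users under 13 as children
GDPR_CONSENT_AGE = 16  # GDPR Art. 8 ceiling; member states may set 13-16

def age_on(birthdate: date, today: Optional[date] = None) -> int:
    """Whole years elapsed between birthdate and today."""
    today = today or date.today()
    return today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )

def signup_flow(birthdate: date, region: str) -> str:
    """Route a new user to the appropriate consent flow (illustrative)."""
    age = age_on(birthdate)
    if region == "US" and age < COPPA_MIN_AGE:
        return "verifiable_parental_consent"  # required by COPPA
    if region == "EU" and age < GDPR_CONSENT_AGE:
        return "parental_consent"             # GDPR Art. 8
    return "standard_signup"
```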

Another issue revealed by the research was that while 63% of the apps cited the GDPR, only 32% appeared to fall within the GDPR's purview. The majority alluded to the GDPR without understanding when it applies outside the EU. Of those to which the GDPR did seem relevant, a mere 48% were compliant, with some overlooking the GDPR's international data transfer requirements.

This highlights that most of these apps and their developers, especially startups and individuals, still do not understand how the GDPR works – and they are probably just as unsure about other regulatory compliance requirements. Yet these apps continue to operate.
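
For a sense of why so many developers get this wrong, the GDPR's territorial scope (Article 3) boils down to more than where a company is based. The Python sketch below is deliberately simplified and illustrative only – real applicability analysis requires legal advice, not a three-branch function:

```python
# Simplified sketch of the GDPR's territorial scope (Article 3).
def gdpr_applies(established_in_eu: bool,
                 offers_to_eu_users: bool,
                 monitors_eu_users: bool) -> bool:
    # Art. 3(1): processing in the context of an EU establishment.
    if established_in_eu:
        return True
    # Art. 3(2): offering goods or services to, or monitoring the
    # behaviour of, people in the EU - even from outside the EU.
    return offers_to_eu_users or monitors_eu_users

# A non-EU app with no EU users is out of scope; a non-EU app that
# serves or tracks people in the EU is not.
assert gdpr_applies(False, False, False) is False
assert gdpr_applies(False, True, False) is True
```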

In terms of data retention – an area where users often share proprietary or personal data – 35% of the apps did not specify retention durations in their privacy policies, as required by the GDPR and other data protection laws.
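
As a hypothetical illustration of what honoring a declared retention duration can look like in practice, the Python sketch below purges records once their stated period lapses. The table names, column name and durations are all invented for the example:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule mirroring what a privacy policy
# might declare, e.g. "chat logs are kept for 30 days".
RETENTION_DAYS = {
    "chat_logs": 30,
    "uploaded_files": 90,
    "account_events": 365,
}

def purge_expired_rows(conn: sqlite3.Connection) -> None:
    """Delete rows older than their table's declared retention period."""
    now = datetime.now(timezone.utc)
    for table, days in RETENTION_DAYS.items():
        cutoff = (now - timedelta(days=days)).isoformat()
        # Assumes each table stores an ISO-8601 UTC 'created_at' column,
        # so lexicographic string comparison orders timestamps correctly.
        conn.execute(f"DELETE FROM {table} WHERE created_at < ?", (cutoff,))
    conn.commit()
```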

Does GDPR ensure regulatory compliance for small companies?

The GDPR is the strictest set of regulatory rules in the EU. (Image – Shutterstock)

The lack of transparency

When it comes to transparency in using AI, fewer than 10% of the apps disclosed their AI use or model sources. Of the 113 apps, 64% remained ambiguous about their AI models, and only one clarified whether AI influenced decisions about user data. Transparency about model sources remains a big problem, as companies and users adopting these apps increasingly want to know where the underlying code comes from. Failure to acknowledge the source could lead to more problems in the future.

Currently, OpenAI, Stability AI and Hugging Face, as well as big tech providers like AWS with its Bedrock service, disclose their AI models to ensure users are aware of the code's origins. However, most AI apps seem to rely primarily on established AI APIs, such as those from OpenAI, or integrate multiple models.
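
As a rough sketch of what model-source disclosure could look like for an app built on one of those APIs, the example below uses OpenAI's current Python client and attaches the reported model identifier to each reply shown to the user. The disclosure format and function name are invented for illustration:

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def answer_with_disclosure(prompt: str) -> dict:
    """Return the reply together with the model that produced it."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return {
        "reply": response.choices[0].message.content,
        # The API reports the exact model version used; surfacing it
        # to users removes the ambiguity the study flagged.
        "model_source": f"OpenAI / {response.model}",
    }
```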

The study shows a tendency among apps to collect excessive user PII, often beyond what their primary function requires. With 56% using a subscription model and 31% leaning on advertising revenue, user PII becomes invaluable. The range of collected data – from specific birth dates, interaction-based inferences and IP addresses to online and social media identifiers – suggests potential ad-targeting objectives.

“This study highlights the pressing need for clarity and regulatory compliance in the generative AI app sphere. As organizations and users increasingly embrace AI, their corporate and personal data could be jeopardized by apps, many originating from startups or developers unfamiliar with privacy mandates,” commented Kevin Shepherdson, CEO of Straits Interactive.

The launch of the DSA – the newest regulatory compliance measures from Europe. Executive Vice-President Margrethe Vestager and Commissioner Thierry Breton prepare to protect user data.

Meanwhile, Lyn Boxall, a legal privacy specialist at Lyn Boxall LLC and a member of the research team, pointed out the significance of the fact that 63% of the apps reference the GDPR without understanding its extraterritorial implications.

“Many developers seem to lean on automated privacy notice generators rather than actually understanding their app’s regulatory alignment. With the EU AI Act on the horizon, the urgency for developers to prioritize AI transparency and conform to both current and emerging data protection norms cannot be overstated,” said Boxall.