The sticky scenario of scaling enterprise AI
- To reap the real benefits, organizations need to be able to scale AI solutions.
- A scalable AI can be built by both tech and business users and can run anywhere at high speed and low latency.
Today, almost every organization seems to be working on a transformation driven by Artificial Intelligence (AI). Hype aside, AI is undeniably a game-changer, but scaling and mastering the technology to unlock its full potential remains a challenge for many businesses.
A 2019 study by Accenture found that companies that strategically scale AI generate five times the return on investment (ROI) of companies that are unable to scale. 86% of executives believe they won’t achieve their growth objectives unless they can scale their AI, and three-quarters of C-level executives believe that if they don’t move beyond experimentation to aggressively deploy AI across their organizations, they risk going out of business by 2025.
The problem, however, is that not every company understands the challenges of creating a first-class, scalable AI organization. Some large corporations are starting to see results from their efforts to embed AI into their business applications, but most of them struggle to scale. It is mainly tech giants such as Google and Amazon that are able to scale AI and reap the benefits.
For context, McKinsey estimates that AI will add US$13 trillion to the global economy in the next decade. The full value of AI can only materialize when firms are able to offset the upfront costs of developing AI with substantial business gains from its widescale deployment. Indeed, three-quarters of organizations with large ROI have scaled AI across business units. Most companies, however, are struggling: McKinsey’s survey shows that most have run only pilot projects or applied AI in just a single business process.
The adoption of AI exhibits the same patterns as past technology revolutions. A truly scalable AI can be built by both tech and business users and can run anywhere at high speed and low latency.
McKinsey reckons that “many organizations aren’t spending the necessary (and significant) time and resources on the cultural and organizational changes required to bring AI to a level of a scale capable of delivering meaningful value—where every pilot enjoys widespread end-user adoption and pilots across the organization are produced in a consistent, fast, and repeatable manner. Without addressing these changes upfront, efforts to scale AI can quickly derail.”
The main reasons scaling AI is so challenging come down to customization and data. Building AI systems from scratch for every problem and domain requires a great deal of customization work, while businesses that instead buy off-the-shelf solutions not optimized for their specific needs compromise on performance and outcomes. It is important to recognize that AI requires customization for every project and every business problem, and that a key part of operationalizing AI is making the customization process as efficient as possible.
On data, Accenture argues that scaling new ideas quickly requires a well-constructed data strategy – “which we define as a design and intent that underpins what data is being captured, in what way and for what purpose”. Like a house built on weak foundations, an AI solution built on weak data with no solid strategy may deliver some near-term value but stands little chance of scaling or delivering results in the long term.
Having a data strategy to underpin your AI strategy is critical for competitive advantage and will ultimately help accelerate your time to value. In fact, 72% of Strategic Scalers (those who are successfully scaling AI in their organizations) said a core data foundation has been key to their success.