Artificial intelligence (AI) is poised to radically remake the digital enterprise, but the transition will not happen all at once. Like most technology initiatives, it will see gradual implementation across the myriad platforms and processes that currently inhabit the workplace.
This means that for a certain period of time, AI will have to integrate into legacy environments, which will likely have the twin effects of bolstering certain activities while straining against the slow pace of non-intelligent processes.
But how should the enterprise navigate this process? Are there ways to make the most of an AI investment without facing the financial burdens and disruptive nature of a full forklift upgrade?
Planning the Integration
According to My Tech Decisions’ Isla Sibanda, AI integration is similar to previous technology integrations in a number of ways. To begin with, you’ll need a plan. This requires a careful assessment of business objectives, stakeholder needs, existing resource configurations and dependencies, and a host of other factors.
From there, you’ll need to determine how to enhance data quality and availability to a level that can best support both the intelligent and non-intelligent sides of the enterprise, paying particular attention to issues like data ownership, policy and governance management, and privacy and security issues along the entire information pipeline.
Cultural factors will also come into play as this new working paradigm is put into place. Upskilling employees is vital for AI to achieve its full potential, as is the need to direct the new technology at solving problems that allow people to become better at their jobs or transition to new, more important ones.
All the while, the enterprise must ensure that its AI models are behaving in a responsible and ethical manner and are compliant with all applicable laws and regulations.
Means of Communication
In order for AI to integrate smoothly with legacy applications, it must be able to communicate. This is where the application programming interface (API) comes in. Salesforce's Joel Davenport argues that an API-led AI implementation strategy resolves not only many of the integration headaches that exist at the outset but also those that arise as environments scale and become more complex.
Using APIs, organizations will find it easier to break down the monolithic structure of most AI deployments, which in turn allows for a more modular approach that promotes flexibility and reliability across the entire enterprise ecosystem. Monolithic platforms, after all, require long development cycles and usually cannot be reused beyond their initial deployment objectives.
With an API-led framework, you can create any number of interchangeable AI modules, each of which can add fresh insight and capabilities to the varied requirements of a typical process.
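To make the modular idea concrete, here is a minimal sketch in Python of what "interchangeable AI modules behind a common API" can look like. The interface, class names, and placeholder logic are all illustrative assumptions, not any vendor's actual API; real modules would call out to model endpoints rather than compute results inline.

```python
# Each AI capability sits behind the same small interface, so callers
# depend on the contract rather than on any one model or vendor.
from typing import Protocol


class InsightModule(Protocol):
    """Contract every AI module exposes to the rest of the enterprise."""

    def analyze(self, record: dict) -> dict: ...


class SentimentModule:
    """Placeholder module; a real one would call a model endpoint."""

    def analyze(self, record: dict) -> dict:
        text = record.get("text", "")
        score = 1.0 if "great" in text.lower() else 0.0
        return {"module": "sentiment", "score": score}


class RiskModule:
    """A second, interchangeable module behind the same contract."""

    def analyze(self, record: dict) -> dict:
        amount = record.get("amount", 0)
        return {"module": "risk", "flagged": amount > 10_000}


def run_pipeline(record: dict, modules: list[InsightModule]) -> list[dict]:
    # Because every module honors the same interface, modules can be
    # added, removed, or swapped without touching the pipeline itself.
    return [m.analyze(record) for m in modules]
```

The design point is the one the article makes: unlike a monolith, each module here can be reused in any pipeline that honors the contract, and replacing a model means swapping one class, not rebuilding the deployment.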
AI is different from previous technology initiatives in one significant way, however: it will not just create new functions to augment legacy systems but will rework their operations from the inside. Understanding how this will change their functionality is crucial to managing a successful integration.
Sudeep Srivastava, co-founder and director of tech consultancy Appinventiv, recently pointed out three ways AI can alter legacy apps:
- Reasoning: Through real-time decision-making, AI improves the user experience and streamlines operational processes;
- Recommendations: Enhanced data analytics allows AI to offer products and services that build trust and brand retention;
- Behavioral: Differentiating between normal and abnormal behavior improves performance and security.
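The "behavioral" point above is the most readily illustrated. A hedged sketch, using a simple z-score baseline as a stand-in for a real model: learn what normal activity looks like from history, then flag departures from it. The threshold and the login-count scenario are assumptions for illustration only.

```python
# Differentiate normal from abnormal behavior by measuring how far a new
# observation sits from the historical norm, in standard deviations.
from statistics import mean, stdev


def is_anomalous(history: list[float], value: float,
                 threshold: float = 3.0) -> bool:
    """Flag a value that deviates sharply from historical behavior."""
    if len(history) < 2:
        return False  # not enough data to define "normal" yet
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return value != mu  # any change from a constant baseline is abnormal
    return abs(value - mu) / sigma > threshold


# e.g. hourly login counts: ~10 is normal, a sudden spike is not
logins = [10, 12, 11, 9, 10, 11]
```

A production system would use a learned model rather than a fixed threshold, but the security and performance payoff is the same: behavior outside the learned norm gets surfaced instead of ignored.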
Ultimately, successful integration of AI should diminish much of the fear, uncertainty, and doubt (FUD) that is hampering AI adoption, while at the same time, it should enable apps to function in ways that are more relevant to user needs and produce higher levels of satisfaction.
Still, it’s important to keep in mind that integration is only the first step in the march toward the real goal of AI implementation: full intelligent orchestration of the business model.
David Linthicum, chief cloud strategy officer at Deloitte Consulting, noted that most AI deployments to date have been following the path of least resistance by merely coupling intelligence to applications and data.
Going forward, we can expect these architectures to be loosened up under lightweight orchestration services in order to provide the flexibility for AI to generate new solutions through self-reconfiguration rather than deep development.
Crucial to this effort is the development of specialized tools that can integrate source and target databases with individual AI systems, most likely using a low-code or no-code orchestration engine. These tools will have to be tailored toward individual cloud architectures, enabling technologies, and other elements in order to provide optimal performance.
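The defining trait of such low-code orchestration is that the flow from source database through AI system to target is declared as data, not written as bespoke code. The sketch below illustrates that idea only; the step registry and step names are hypothetical, not the API of any real orchestration engine.

```python
# The registry maps declared step names to implementations; real steps
# would connect to source databases, model endpoints, and targets.
STEP_REGISTRY = {
    "extract": lambda payload: payload["rows"],
    "score": lambda rows: [dict(r, score=len(str(r))) for r in rows],
    "load": lambda rows: {"written": len(rows)},
}

# The "low-code" part: a pipeline is just an ordered list of step names,
# editable without touching the engine itself.
PIPELINE = ["extract", "score", "load"]


def orchestrate(payload: dict, pipeline: list[str]) -> dict:
    """Run each declared step in order, piping output to the next step."""
    result = payload
    for step in pipeline:
        result = STEP_REGISTRY[step](result)
    return result
```

Reordering or swapping steps then becomes a configuration change rather than a development cycle, which is what lets AI "self-reconfigure" a workflow without deep development.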
Since AI is intended to create highly individualized operating environments for the enterprise, there isn’t likely to be a standard template for complete integration and orchestration. Aside from the most basic commonalities, every organization is pretty much on its own to define and refine the way AI will meld with its legacy ecosystems.
As experience with technology grows, however, organizations should gain an increased sense of what works and what doesn’t.
Eventually, this will unleash the real power of AI – not to allow you to do what everyone else is doing but to do what no one else can.