Artificial intelligence (AI) represents a fundamental shift in how computing technology is developed and how it functions. But at its heart, it is a new way for humans to interact with the digital universe.
Much has been written about how AI introduces a conversational means of engaging with computers, as opposed to the typing, clicking, and tapping that have prevailed up to this point. In the future, users are expected to need little, if any, training or specialized skill to communicate with a digital entity. We’ll speak our desires, and the bot, be it ChatGPT or another intelligent virtual assistant, will, or at least should, obey.
This is not merely a new way of doing things, however. It is a fundamental shift in the relationship between biological intelligence (us) and the artificial intelligence we’ve created.
What Is a User Interface?
The user interface is simply the means by which we engage a computer. Whether on a desktop/laptop, cell phone, ATM, or any other digital device, the UI is how we make our desires known. Some UIs are very simple, such as a keypad or even a single button, while others demand complex language skills or the ability to navigate highly layered menu architectures.
And some UIs require no deliberate action on the part of the user at all: a retinal scanner, for example, opens a door the moment the user looks into it, and a toll transponder triggers a sequence of actions as a car simply drives by.
How Has the UI Changed Over the Years?
The earliest UIs date back to the earliest computers. ENIAC, developed at the University of Pennsylvania during World War II, used a plugboard to convey programmers’ instructions to its vacuum-tube circuitry, much the way telephone operators once connected calls.
Throughout most of the 1950s and 1960s, keyboards modeled on typewriters were the preferred means of engaging with computers, with various specialty function keys evolving over time. Most of these systems relied on batch processing and punch cards to turn ideas into programs before gravitating toward command-line interfaces whose output could be viewed instantly on video screens.
The first graphical user interface (GUI) came out of Xerox PARC in the 1970s and has been continuously refined since then by Apple, IBM, Microsoft, and others for all manner of computing devices.
What Is an Intent-Based UI?
Each of these earlier UIs simplified and streamlined what is essentially the same basic approach to programming: convey the exact steps by which the computer is to produce the expected outcome. AI represents an entirely new approach, however, in that we no longer need to tell the device how to do something; we tell it what we want done, and the intelligent algorithms inside the machine take it from there.
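A minimal Python sketch makes the distinction concrete. The first function below dictates how a result is produced, step by step; `fulfill_intent` is a purely hypothetical stand-in for an AI planning back end, not a real API.

```python
# "How" versus "what": an imperative procedure next to a stated intent.
# `fulfill_intent` is hypothetical; it stands in for an AI back end.

def sort_scores_how(scores: list[int]) -> list[int]:
    """The old way: dictate the exact procedure (a selection sort)."""
    result = list(scores)
    for i in range(len(result)):
        smallest = min(range(i, len(result)), key=lambda j: result[j])
        result[i], result[smallest] = result[smallest], result[i]
    return result

def fulfill_intent(intent: str) -> str:
    """The new way: state the goal; the model decides the steps."""
    return f"[AI plans and executes: {intent!r}]"

print(sort_scores_how([3, 1, 2]))                        # [1, 2, 3]
print(fulfill_intent("Rank these scores from lowest to highest"))
```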
This Intent-Based UI cedes much of the control to the AI model as it seeks to fulfill its mandate. Interaction between the user and the computer is reduced dramatically, and only if the results are wrong or unsatisfactory will the user have to re-engage. Usually, that happens when the original intent was expressed incorrectly or unclearly.
What Are the Practical Advantages of Intent-Based UIs?
Not only does the Intent-Based UI save a lot of time and effort, it democratizes the awesome power of today’s digital environment, opening it to people who would otherwise have no means of accessing it. Even the ubiquity of cell phones has not given everyone access to the entirety of the connected world; only those who take the time and effort to learn all the functions and features of their chosen UI enjoy that access.
Using an Intent-Based UI, however, all one has to do is speak or type a request, and the device will complete the task. Many users have already seen the utility of asking, “Where is the cheapest petrol near me?” rather than launching an app, typing the request, and then zooming in and around a map to get the answer. As AI applications become more sophisticated, this same paradigm will apply to more complex requests like, “Create an original graphic depicting sales for the next quarter” or “Write a program that compares customer satisfaction to our most recently added features.”
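What might that look like under the hood? Below is a hedged Python sketch of the flow; every function is hypothetical, and a real assistant would call a language model plus mapping and fuel-price services rather than these stubs.

```python
# Hypothetical sketch of an intent-based front end replacing the
# launch-app/type/zoom workflow. None of these functions are real APIs.

def parse_intent(utterance: str) -> dict:
    """Pretend NLU step: extract the task and its constraints."""
    return {"task": "find_fuel", "metric": "price", "order": "min",
            "location": "current"}

def execute(plan: dict) -> str:
    """Pretend execution step: a real system would query a fuel-price
    service and a map service, then compose the answer."""
    return "Example station on Main St, 1.47/litre, 0.8 km away"

answer = execute(parse_intent("Where is the cheapest petrol near me?"))
print(answer)   # the user never opens an app, fills a form, or pans a map
```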
What Are Some of the Drawbacks?
Even with an Intent-Based UI, a computer is still a computer. Its output is only as good as its programming and the data it can gather. Poorly trained AI with access to limited resources will produce faulty results, and no UI can change that.
And while AI is becoming more adept at deciphering human speech, it is still far too early for it to comprehend the vast array of idioms and turns of phrase that populate languages around the world. That means engagements with AI through the Intent-Based UI will have to take place at some mid-point between what people say and what intelligent algorithms can understand. Of course, this is far less burdensome on the user than learning a programming language or wading into the depths of a menu-based platform.
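One way to picture that mid-point is a clarification loop: when the system cannot map a colloquial phrase onto a known intent with enough confidence, it asks the user to restate rather than guessing. The sketch below is illustrative only; the intents, the confidence threshold, and the toy classifier are all invented.

```python
# Illustrative sketch of the "mid-point" negotiation between colloquial
# speech and machine-readable intents. All names and numbers are invented.

KNOWN_INTENTS = {"find_fuel", "create_chart", "write_report"}

def interpret(utterance: str) -> tuple[str | None, float]:
    """Toy classifier returning an (intent, confidence) pair."""
    if "petrol" in utterance or "gas" in utterance:
        return "find_fuel", 0.92
    return None, 0.20    # idioms like "I'm running on fumes" land here

def handle(utterance: str) -> str:
    intent, confidence = interpret(utterance)
    if intent in KNOWN_INTENTS and confidence > 0.75:
        return f"executing {intent}"
    return "Sorry, could you rephrase that?"  # user meets the machine halfway

print(handle("Where's the cheapest petrol near me?"))  # executing find_fuel
print(handle("I'm running on fumes out here"))         # asks for a rephrase
```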
What Is Likely to Come After the Intent-Based UI?
Some experts are talking about zero-command interfaces, similar to the scanners and toll transponders mentioned above but geared toward more complex operations. In this view, AI will know what we want without us having to express our desires at all. If it’s time for the quarterly sales meeting, then, of course, sales data from the previous quarter are needed, and they will have to be as comprehensive and forward-looking as can reasonably be expected. If it’s Tuesday and Tuesday is shopping day, then a list must be made, transportation organized, and coupons and discounts gathered.
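In code, a zero-command interface amounts to context triggers firing pre-learned routines. The Python sketch below is speculative: the context predicates and routines are invented for illustration, and a real system would read calendars, habits, and sensors rather than just the clock.

```python
# Speculative sketch of a zero-command interface: no request is made;
# the system watches context and fires routines it has learned.
# Every trigger and action below is invented for illustration.

import datetime

ROUTINES = {
    "quarterly_meeting_today": "compile last quarter's sales report",
    "tuesday_shopping_day": "draft shopping list, arrange transport, pull coupons",
}

def detect_context(now: datetime.datetime) -> list[str]:
    """Toy context engine; a real one would read calendars and habits."""
    contexts = []
    if now.weekday() == 1:                 # Tuesday
        contexts.append("tuesday_shopping_day")
    return contexts

for context in detect_context(datetime.datetime.now()):
    print(f"[no command given] {ROUTINES[context]}")
```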
This may be a tall order, but AI trends are driven by the demand for greater efficacy and convenience. The trick will be to train the algorithms to do what needs to be done, but not to the point where they take detrimental actions that must be undone later.
Efforts are also underway to enable AI to read brain scans and decipher thoughts, and we are even seeing very basic direct brain-computer interfaces (BCIs). Both of these developments are still at very rudimentary stages, and both raise all manner of ethical questions.