How do chatbots deal with accents?
With the emergence of more sophisticated chatbots over the past few years, people in many industries are watching how these systems are advancing, how they are improving interactive voice response (IVR), and how that progress is affecting retail and numerous other fields.
One of the big relevant questions is how chatbots are dealing with accents. Regional and world language accents have been a stumbling block for these technologies since the beginning. Particularly, when chatbots were more rudimentary in terms of natural language processing (NLP) algorithms, they were easily confounded by an accent that significantly changes the phonemes of speech. Today, with ever-evolving algorithms, chatbots have become much more resilient.
Here are some main ways that engineers and stakeholders have worked to help chatbots handle accents.
The first is through targeting. Many companies dealing with a diverse clientele will set up multiple systems – they’ll try to move consumers or other end users toward the system that matches their dialect and language, to avoid cross-language problems.
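As a rough illustration of targeting, the routing step can be as simple as a lookup from the caller's locale to a dialect-specific system. This is a minimal hypothetical sketch; the system names and the `route_caller` function are illustrative assumptions, not any real product's API.

```python
# Hypothetical sketch: route a caller to the IVR system tuned for their
# declared locale, to avoid cross-language and cross-dialect confusion.
# System names and locales here are illustrative, not a real API.

IVR_SYSTEMS = {
    "en-US": "us-english-bot",
    "en-IN": "indian-english-bot",
    "es-MX": "mexican-spanish-bot",
}

DEFAULT_SYSTEM = "us-english-bot"

def route_caller(locale: str) -> str:
    """Pick the chatbot matching the caller's locale, falling back to a
    default when no dialect-specific system exists."""
    return IVR_SYSTEMS.get(locale, DEFAULT_SYSTEM)
```

In practice the locale might come from the caller's phone number, account settings, or an opening menu prompt, but the principle is the same: match users to the system built for their dialect before recognition even begins.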
However, targeting can only do so much. Another key way that companies work on chatbot refinement is triangulation – and this is something that has helped chatbots conquer the accent problem.
Triangulating phonemes helps to produce more specific results. Think of it this way: if a chatbot encounters a caller who moved from India to the United States and speaks English with a distinct Indian accent, the machine has to handle systematic differences, for example, the flat, broad "a" sound of American English that many speakers of Indian languages render differently. A chatbot with enough sophistication to isolate individual phonemes can pick out these trouble spots and more accurately "diagnose" them, rather than missing the entire word or phrase. In this respect an algorithm can actually outperform a person: many human listeners are simply confused by an unfamiliar accent.
By isolating and dealing with phonemes in more depth, the technology can produce more accurate responses. But there is another important way chatbots can handle an accented voice, or any other comprehension "problem."
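The triangulation idea above can be sketched in a few lines: rather than rejecting a whole utterance, score each recognized phoneme and substitute known accent variants for the low-confidence ones before matching against the vocabulary. The variant table, confidence scores, and threshold here are all illustrative assumptions, not a real speech-recognition API.

```python
# Hypothetical sketch of phoneme-level "triangulation": keep high-confidence
# phonemes as-is, and branch on accent variants for low-confidence ones,
# producing candidate sequences to match against the vocabulary.
# The variant table and threshold are assumptions for illustration.

ACCENT_VARIANTS = {
    "ae": ["aa", "ah"],  # the broad/flat "a" realized differently by accent
}

def repair_phonemes(phonemes, confidences, threshold=0.6):
    """Return candidate phoneme sequences, swapping in accent variants
    wherever recognition confidence falls below the threshold."""
    candidates = [[]]
    for ph, conf in zip(phonemes, confidences):
        if conf >= threshold:
            options = [ph]
        else:
            options = [ph] + ACCENT_VARIANTS.get(ph, [])
        candidates = [seq + [opt] for seq in candidates for opt in options]
    return candidates
```

For the word "cat" heard with an uncertain middle vowel, `repair_phonemes(["k", "ae", "t"], [0.9, 0.4, 0.9])` expands only the uncertain phoneme, so the recognizer can test each candidate against its dictionary instead of discarding the word outright.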
When comprehension is less than full, one key factor is how the technology responds. The more basic IVR chatbots of yesteryear were inclined to repeat "I'm sorry, I didn't understand that" over and over. Today's refined chatbots are more likely to provide iterative responses: escalating the call to a human, offering partial answers or, again, attempting to isolate the problem.
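This graded fallback behavior can be sketched as a simple policy: act on partial understanding when possible, and escalate to a human after repeated failures rather than looping forever. The thresholds and action names below are illustrative assumptions.

```python
# Hypothetical sketch of a graded fallback policy for an IVR chatbot.
# Instead of repeating "I didn't understand," the bot chooses an action
# based on recognition confidence and how many attempts have failed.
# Thresholds and action names are assumptions for illustration.

def respond(confidence: float, failures: int, escalate_after: int = 2) -> str:
    """Choose the next action given recognition confidence (0.0-1.0)
    and the number of failed attempts so far."""
    if failures >= escalate_after:
        return "escalate_to_human"       # stop looping; hand off the call
    if confidence >= 0.8:
        return "answer"                   # confident enough to respond fully
    if confidence >= 0.4:
        return "confirm_partial_answer"   # e.g. "Did you mean ...?"
    return "ask_to_rephrase"              # isolate the problem and retry
```

The key design choice is the escalation counter: even a caller the system never fully understands reaches a human after a bounded number of attempts, instead of being trapped in a loop.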
With targeting, triangulation, and good triage, chatbots can get much more accurate about dealing with accents and any other idiosyncrasies callers may have. This will revolutionize the world of “virtual assistants” that has, in the past, been less than impressive to most hapless callers.