How do chatbots deal with accents?

By Justin Stoltzfus | Last updated: January 14, 2022

With the emergence of newer and more sophisticated chatbots over the past few years, people in many industries are watching how chatbots are advancing, how they are driving progress in interactive voice response (IVR), and how that's affecting retail as well as numerous other industries.

One of the big relevant questions is how chatbots deal with accents. Regional and world language accents have been a stumbling block for these technologies since the beginning. In particular, when chatbots relied on more rudimentary natural language processing (NLP) algorithms, they were easily confounded by an accent that significantly changes the phonemes of speech. Today, with ever-evolving algorithms, chatbots have become much more resilient.

Here are some main ways that engineers and stakeholders have worked to help chatbots handle accents.

The first is through targeting. Many companies dealing with a diverse clientele will set up multiple systems – they’ll try to move consumers or other end users toward the system that matches their dialect and language, to avoid cross-language problems.
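The targeting idea above can be sketched in a few lines of code. This is a hypothetical illustration, not a real chatbot API: the model names and locale codes are invented, and a production system would route on richer signals than a declared locale.

```python
# Hypothetical sketch of "targeting": route a caller to the speech system
# that matches their dialect, falling back to a generic model otherwise.
# Model names and locales below are illustrative assumptions.

MODELS = {
    "en-US": "asr-english-us",
    "en-IN": "asr-english-india",
    "en-GB": "asr-english-uk",
}
DEFAULT_MODEL = "asr-english-generic"

def pick_model(locale: str) -> str:
    """Return the accent-matched model for a locale, or a generic fallback."""
    return MODELS.get(locale, DEFAULT_MODEL)

print(pick_model("en-IN"))  # targeted Indian-English system
print(pick_model("fr-CA"))  # no targeted system, generic fallback
```

The fallback is the important design choice: targeting only helps when a matching system exists, which is why the article turns next to triangulation.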

However, targeting can only do so much. Another key way that companies work on chatbot refinement is triangulation – and this is something that has helped chatbots conquer the accent problem.

Triangulating the phonemes helps to produce more specific results. Think of it this way: if a chatbot encounters a speaker who learned English in India and speaks with a distinct Indian accent, the machine will have to deal with differences such as the flatter, broader "a" sound that many Indian English speakers render differently. A chatbot sophisticated enough to isolate individual phonemes can pick out the trouble spots and more accurately "diagnose" them, so that it doesn't lose the entire word or phrase. In this respect the algorithm can actually outperform a person, since many human listeners are thrown off by unfamiliar accents.
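One way to picture this phoneme-level matching is as a nearest-neighbor search over phoneme sequences: rather than rejecting a word because one vowel is off, score candidate words by how close their phonemes are to what was heard. The tiny lexicon and ARPABET-style phoneme strings below are invented for illustration; a real system would use trained acoustic and language models, not edit distance.

```python
# Illustrative sketch of phoneme "triangulation": an accented vowel should
# shift a word's score slightly, not sink the whole word. The lexicon and
# phoneme spellings here are assumptions made for the example.

def edit_distance(a, b):
    """Levenshtein distance over two phoneme sequences."""
    dp = list(range(len(b) + 1))
    for i, pa in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, pb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (pa != pb))  # substitution
    return dp[-1]

LEXICON = {
    "can't": ["K", "AE", "N", "T"],   # the flat American "a"
    "cat":   ["K", "AE", "T"],
    "tent":  ["T", "EH", "N", "T"],
}

def best_match(heard):
    """Return the lexicon word whose phonemes are nearest to what was heard."""
    return min(LEXICON, key=lambda w: edit_distance(heard, LEXICON[w]))

# A broader accented vowel ("AA" for "AE") no longer loses the word:
print(best_match(["K", "AA", "N", "T"]))  # → can't
```

The point of the sketch is the scoring: the accented vowel costs one substitution, so "can't" still wins over every other candidate instead of the utterance being discarded outright.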

By isolating and dealing with the phonemes in more depth, the technology can produce more accurate answers and responses. But there's another important way that chatbots can handle the problem of responding to an accented voice, or any other comprehension "problem."

When comprehension is less than full, one key factor is how the technology responds. The more basic IVR chatbots of yesteryear were inclined to repeat "I'm sorry, I didn't understand that" over and over. Today's refined chatbots are more likely to provide iterative responses: escalating the call to a human, providing partial answers or, again, attempting to isolate the problem.
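That triage behavior can be sketched as a simple decision on recognition confidence. The thresholds and the transcript/confidence inputs below are assumptions for illustration; real IVR platforms expose confidence scores and transfer hooks in their own ways.

```python
# Hedged sketch of IVR "triage": act on recognition confidence instead of
# looping on "I didn't understand." Thresholds here are illustrative.

def triage(transcript: str, confidence: float) -> str:
    if confidence >= 0.85:
        # High confidence: answer normally.
        return f"ANSWER: {transcript}"
    if confidence >= 0.50:
        # Middling confidence: give a partial answer / confirm the trouble spot.
        return f"CONFIRM: Did you say '{transcript}'?"
    # Low confidence: escalate to a human rather than repeat an error prompt.
    return "ESCALATE: transferring you to an agent"

print(triage("check my balance", 0.92))
print(triage("check my balance", 0.60))
print(triage("(unintelligible)", 0.20))
```

The design choice worth noting is the middle branch: confirming only the uncertain portion is what replaces the old all-or-nothing "I didn't understand" loop.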

With targeting, triangulation, and good triage, chatbots can get much more accurate about dealing with accents and any other idiosyncrasies callers may have. This will revolutionize the world of “virtual assistants” that has, in the past, been less than impressive to most hapless callers.


Written by Justin Stoltzfus | Contributor, Reviewer

Justin Stoltzfus is a freelance writer for various Web and print publications. His work has appeared in online magazines including Preservation Online, a project of the National Historic Trust, and many other venues.

