Bidirectional Encoder Representations from Transformers (BERT)

Last updated: December 7, 2021

What Does Bidirectional Encoder Representations from Transformers (BERT) Mean?

Bidirectional Encoder Representations from Transformers (BERT) is a deep learning strategy for natural language processing (NLP) that helps artificial intelligence (AI) programs understand the context of ambiguous words in text.

Applications that use BERT can predict the intended meaning of an ambiguous word (such as a homonym like "bank") by processing the surrounding text in both left-to-right and right-to-left directions simultaneously.
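To see what processing in both directions buys in practice, the sketch below compares BERT's context-dependent vector for the word "bank" in different sentences. It is a minimal illustration that assumes the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint, neither of which this article prescribes:

    # Minimal sketch: compare BERT's contextual embeddings for one word in
    # different contexts. The library choice (Hugging Face transformers) and
    # the bert-base-uncased checkpoint are assumptions for illustration only.
    import torch
    from transformers import BertTokenizer, BertModel

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased")
    model.eval()

    def embed_word(sentence, word):
        """Return the contextual embedding of the first occurrence of `word`."""
        inputs = tokenizer(sentence, return_tensors="pt")
        with torch.no_grad():
            hidden = model(**inputs).last_hidden_state[0]  # [seq_len, 768]
        tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
        return hidden[tokens.index(word)]

    river = embed_word("she sat on the bank of the river", "bank")
    money1 = embed_word("he deposited the check at the bank", "bank")
    money2 = embed_word("she opened a savings account at the bank", "bank")

    cos = torch.nn.functional.cosine_similarity
    print(cos(river, money1, dim=0))   # lower similarity: different senses of "bank"
    print(cos(money1, money2, dim=0))  # higher similarity: same financial sense

Because the encoder reads the whole sentence at once, the same surface word ends up with different vectors in different contexts, which is what lets downstream applications pick the intended sense.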

Techopedia Explains Bidirectional Encoder Representations from Transformers (BERT)

Google engineers used tools like TensorFlow to create the BERT neural network architecture. Before BERT, most NLP language models were unidirectional, which means they could only process text in a single direction, typically from left to right.

BERT's bidirectionality, combined with a masking strategy that teaches the model to predict hidden words from their surrounding context, allows deep learning neural networks to learn from large amounts of unlabeled text (an unsupervised, or more precisely self-supervised, approach) and produce pre-trained NLP models that can later be fine-tuned for specific tasks.
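The masked-word objective is easy to demonstrate with a pre-trained model. The sketch below uses the Hugging Face fill-mask pipeline (again an assumption, not something the article specifies) to let BERT guess a hidden word from the words on both sides of it:

    # Minimal sketch of masked-word prediction with a pre-trained BERT model.
    # The Hugging Face "fill-mask" pipeline and the bert-base-uncased
    # checkpoint are illustrative assumptions.
    from transformers import pipeline

    unmasker = pipeline("fill-mask", model="bert-base-uncased")

    # BERT reads the context on both sides of [MASK] before ranking candidates.
    for candidate in unmasker("He deposited the check at the [MASK]."):
        print(candidate["token_str"], round(candidate["score"], 3))

During pre-training, this same prediction task is applied to randomly masked words across huge amounts of unlabeled text, which is how the network learns its general-purpose language representations.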

This approach to natural language understanding (NLU) is so effective that, according to Google, a pre-trained BERT model can be fine-tuned into a state-of-the-art question answering system in about 30 minutes, provided enough labeled training data is available.
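As a rough illustration of that claim, the sketch below loads a BERT checkpoint that has already been fine-tuned on the SQuAD question answering dataset and asks it a question about a short passage. The specific checkpoint name comes from the public Hugging Face model hub and is an illustrative assumption, not the exact recipe Google describes:

    # Hedged sketch: question answering with a BERT model fine-tuned on SQuAD.
    # The model name is a publicly shared checkpoint used here for illustration.
    from transformers import pipeline

    qa = pipeline(
        "question-answering",
        model="bert-large-uncased-whole-word-masking-finetuned-squad",
    )

    result = qa(
        question="What does BERT learn to predict during pre-training?",
        context=(
            "BERT is pre-trained with a masking strategy in which random words "
            "are hidden and the network learns to predict them from the "
            "surrounding context in both directions."
        ),
    )
    print(result["answer"], round(result["score"], 3))

Fine-tuning follows the same general pattern: start from the pre-trained encoder, add a small task-specific output layer, and train briefly on labeled examples for the target task.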
