Bidirectional Encoder Representations from Transformers

What Does Bidirectional Encoder Representations from Transformers Mean?

Bidirectional Encoder Representations from Transformers (BERT) is a deep learning strategy for natural language processing (NLP) that helps artificial intelligence (AI) programs understand the context of ambiguous words in text.


Applications that use BERT are able to predict the correct meaning of an ambiguous word because the model processes text in both the left-to-right and right-to-left directions simultaneously.
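As an illustration, the following minimal sketch uses the Hugging Face transformers library (an assumption; the article names no specific toolkit) to show that BERT assigns the word "bank" different vectors depending on its context, which is what lets downstream applications disambiguate it.

```python
# Minimal sketch, assuming the Hugging Face "transformers" and "torch" packages:
# the same word "bank" receives different contextual vectors from BERT.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

sentences = [
    "She deposited the check at the bank.",
    "They had a picnic on the bank of the river.",
]

embeddings = []
for text in sentences:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Locate the token "bank" and keep its hidden state from the final layer.
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    idx = tokens.index("bank")
    embeddings.append(outputs.last_hidden_state[0, idx])

# A cosine similarity noticeably below 1.0 shows the two "bank" vectors differ,
# because BERT reads the context on both sides of the word.
similarity = torch.nn.functional.cosine_similarity(
    embeddings[0], embeddings[1], dim=0
)
print(f"Similarity between the two 'bank' vectors: {similarity.item():.3f}")
```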

Techopedia Explains Bidirectional Encoder Representations from Transformers

Google engineers used tools like TensorFlow to create the BERT neural network architecture. Until BERT, most NLP language models were unidirectional, which means they could only process text from left to right.

BERT's bidirectionality, combined with a masking strategy that teaches the model how to predict the meaning of an ambiguous term, allows deep learning neural networks to use unsupervised learning techniques to create new NLP models.
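A rough sketch of the masking idea, again assuming the Hugging Face transformers library rather than any tool named in the article: a word is hidden behind a [MASK] token and BERT predicts it from the context on both sides.

```python
# Minimal sketch, assuming the "transformers" package: BERT fills in a masked
# word using the context both before and after the mask.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT sees the words on both sides of [MASK], so it can infer the missing term.
for prediction in fill_mask("The doctor wrote a [MASK] for the patient."):
    print(prediction["token_str"], round(prediction["score"], 3))
```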

This approach to natural language understanding (NLU) is so powerful that Google suggests users can fine-tune BERT into a state-of-the-art question-answering system in about 30 minutes, provided they have enough training data.
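The sketch below shows what such a question-answering system looks like in use. It relies on the Hugging Face transformers pipeline and an already fine-tuned public BERT checkpoint (both assumptions on my part; Google's own tutorial uses TensorFlow and the SQuAD dataset, and the fine-tuning step itself is skipped here).

```python
# Minimal sketch, assuming the "transformers" package and a publicly available
# BERT checkpoint already fine-tuned for question answering on SQuAD.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "BERT is a deep learning strategy for natural language processing "
    "developed by Google. It reads text in both directions at once."
)

# The model extracts the answer span from the context passage.
answer = qa(question="Who developed BERT?", context=context)
print(answer["answer"], round(answer["score"], 3))
```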
