Bidirectional Encoder Representations from Transformers

What Does Bidirectional Encoder Representations from Transformers Mean?

Bidirectional Encoder Representations from Transformers (BERT) is a deep learning strategy for natural language processing (NLP) that helps artificial intelligence (AI) programs understand the context of ambiguous words in text.

Applications that use BERT are able to predict the intended meaning of an ambiguous word by processing text in both left-to-right and right-to-left directions simultaneously.
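
As a concrete illustration (not part of the original definition), the snippet below relies on the open-source Hugging Face Transformers library and the public bert-base-uncased checkpoint, both assumptions for this sketch, to show that BERT's guess for a blanked-out word changes with the words on either side of the blank.

# Minimal sketch: BERT fills in [MASK] using context on BOTH sides of the blank.
# Assumes the Hugging Face "transformers" package and the public
# "bert-base-uncased" checkpoint (neither is mentioned in this article).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# Same blank position, different surrounding context -> different predictions,
# because the model reads each sentence left-to-right and right-to-left at once.
for sentence in [
    "He deposited the check at the [MASK] before lunch.",
    "They had a picnic on the [MASK] of the river.",
]:
    best = fill_mask(sentence)[0]  # highest-scoring candidate
    print(sentence, "->", best["token_str"], round(best["score"], 3))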

Techopedia Explains Bidirectional Encoder Representations from Transformers

Google engineers used tools like TensorFlow to create the BERT neural network architecture. Until BERT, most NLP models were unidirectional, which means they could only process text in a single direction, typically from left to right.

BERT's bidirectionality, combined with a masking strategy that teaches the model how to predict the meaning of an ambiguous term from its surrounding context, allows deep learning neural networks to use unsupervised learning techniques to create new NLP models.
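
For readers curious what that masking strategy looks like in practice: the original BERT recipe hides roughly 15% of the input tokens, and of those, about 80% are replaced with a [MASK] token, 10% with a random token, and 10% are left unchanged, after which the network is trained to recover the original words. The following is a simplified, self-contained sketch of that step; the token IDs are arbitrary examples, and the two constants match the bert-base-uncased vocabulary.

import random

MASK_ID = 103       # id of the [MASK] token in the standard BERT vocabulary
VOCAB_SIZE = 30522  # size of the bert-base-uncased vocabulary

def mask_tokens(token_ids, mask_prob=0.15):
    """Simplified BERT-style masking: returns (corrupted_inputs, labels).

    Labels are -100 (a common "ignore" value for the loss) everywhere except
    at masked positions, where they hold the original token id to predict.
    """
    inputs, labels = list(token_ids), [-100] * len(token_ids)
    for i, original in enumerate(token_ids):
        if random.random() < mask_prob:
            labels[i] = original
            roll = random.random()
            if roll < 0.8:                    # 80%: replace with [MASK]
                inputs[i] = MASK_ID
            elif roll < 0.9:                  # 10%: replace with a random token
                inputs[i] = random.randrange(VOCAB_SIZE)
            # remaining 10%: keep the original token unchanged
    return inputs, labels

print(mask_tokens([7592, 1010, 2026, 3899, 2003, 5870]))  # arbitrary example ids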

This approach to natural language understanding (NLU) is so powerful that Google suggests users can apply BERT to train a state-of-the-art question-answering system in about 30 minutes, provided they have enough training data.
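
To give a sense of what such a finished system looks like (an illustrative sketch, not a claim about Google's own setup), the few lines below assume the Hugging Face Transformers library and a publicly available BERT checkpoint that has already been fine-tuned on the SQuAD question-answering dataset.

# Illustrative sketch of querying a BERT model already fine-tuned for QA.
# Assumes the public SQuAD-fine-tuned checkpoint named below is available.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "BERT is a deep learning technique for natural language processing. "
    "It reads text in both directions at once, which helps it resolve "
    "the meaning of ambiguous words from their surrounding context."
)
answer = qa(question="How does BERT read text?", context=context)
print(answer["answer"], round(answer["score"], 3))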

Margaret Rouse
Technology expert

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret’s idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.