Transfer Learning

What Does Transfer Learning Mean?

In artificial intelligence (AI), transfer learning is a process that allows a pre-trained machine learning (ML) model to be used as a starting point for training a new model. Transfer learning reduces the cost of building the new model from scratch and speeds up the training process.

Transfer learning is particularly useful when there is a limited amount of data for training the new model. It is most effective when the source task the original model was trained on and the target task the new model will perform are closely related.
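
To make the idea concrete, here is a minimal sketch in PyTorch/torchvision: a network pre-trained on ImageNet is reused as the starting point, its layers are frozen, and only a small new output layer is trained on the new task. The number of classes, the dummy data, and the hyperparameters are placeholders for illustration.

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a model pre-trained on ImageNet (the "source" task).
base = models.resnet18(weights="IMAGENET1K_V1")

# Freeze the pre-trained layers so their weights are reused as-is.
for param in base.parameters():
    param.requires_grad = False

# Replace the final classification layer for the new task
# (here, a hypothetical 5-class problem).
base.fc = nn.Linear(base.fc.in_features, 5)

# Only the new layer's parameters are updated during training.
optimizer = torch.optim.Adam(base.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One training step on a batch of new-task data (dummy tensors here).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 5, (8,))
loss = criterion(base(images), labels)
loss.backward()
optimizer.step()
```

Because most of the network is reused rather than retrained, far less labeled data and compute are needed than training the same architecture from scratch.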

Transfer learning can be applied to a wide range of domains and tasks, including computer vision (CV), natural language processing (NLP), sentiment analysis, and natural language generation (NLG).

Techopedia Explains Transfer Learning

Because each domain and task has its own unique challenges and requirements, there are several different approaches to transfer learning.

Types of Transfer Learning

Inductive transfer learning – knowledge learned from the source task is used to fine-tune the model for the target task. This approach is useful when the parameters of the pre-trained model can serve as the starting point for the target task, or when the features learned on the source task can be used as input for the target task.
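
As an illustrative sketch of the second case (using learned features as input), a pre-trained network can act as a fixed feature extractor feeding a simple downstream classifier. The dataset, label count, and classifier choice below are stand-ins.

```python
import torch
from torchvision import models
from sklearn.linear_model import LogisticRegression

# Use the pre-trained network as a fixed feature extractor:
# drop its classification head and keep the learned representations.
backbone = models.resnet18(weights="IMAGENET1K_V1")
backbone.fc = torch.nn.Identity()
backbone.eval()

# Dummy stand-ins for a small labeled dataset from the target task.
images = torch.randn(32, 3, 224, 224)
labels = torch.randint(0, 2, (32,)).numpy()

with torch.no_grad():
    features = backbone(images).numpy()  # 512-dim source-task features

# Train a lightweight classifier on the extracted features.
clf = LogisticRegression(max_iter=1000).fit(features, labels)
```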

Multi-task transfer learning – a single model is trained on multiple related tasks simultaneously. This approach is useful when the target tasks have the same underlying structure as the initial task.
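
A rough sketch of the idea: one shared encoder feeds several task-specific heads, and the losses from all tasks are combined so the shared layers learn structure common to them. The layer sizes and tasks below are purely illustrative.

```python
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """One shared encoder feeding two task-specific heads."""
    def __init__(self, in_dim=100, hidden=64):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.head_a = nn.Linear(hidden, 3)  # e.g., a 3-class task
        self.head_b = nn.Linear(hidden, 1)  # e.g., a regression task

    def forward(self, x):
        h = self.shared(x)
        return self.head_a(h), self.head_b(h)

model = MultiTaskModel()
x = torch.randn(16, 100)
y_a = torch.randint(0, 3, (16,))
y_b = torch.randn(16, 1)

out_a, out_b = model(x)
# The combined loss updates both heads and the shared layers together.
loss = nn.CrossEntropyLoss()(out_a, y_a) + nn.MSELoss()(out_b, y_b)
loss.backward()
```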

Domain adaptation transfer learning – knowledge from a source domain is transferred to a target domain but the data distributions between the domains are different. This approach is useful when there is a difference between how data is distributed in the source and target domains, but there is still a relationship between them.
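
One common way to express this in code is to add a term to the training loss that penalizes the gap between source-domain and target-domain feature distributions. The sketch below uses a very simple (linear) maximum mean discrepancy for that term; the data, weighting, and architecture are placeholders.

```python
import torch
import torch.nn as nn

def mmd(source_feats, target_feats):
    """Linear-kernel maximum mean discrepancy: the squared distance
    between the mean feature vectors of the two domains."""
    return (source_feats.mean(dim=0) - target_feats.mean(dim=0)).pow(2).sum()

encoder = nn.Sequential(nn.Linear(20, 32), nn.ReLU())
classifier = nn.Linear(32, 2)

# Labeled source-domain data and unlabeled target-domain data (dummies here).
x_src = torch.randn(64, 20)
y_src = torch.randint(0, 2, (64,))
x_tgt = torch.randn(64, 20) + 0.5  # shifted distribution

f_src, f_tgt = encoder(x_src), encoder(x_tgt)
task_loss = nn.CrossEntropyLoss()(classifier(f_src), y_src)

# Encourage the encoder to produce domain-invariant features.
loss = task_loss + 0.1 * mmd(f_src, f_tgt)
loss.backward()
```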

Zero-shot transfer learning – knowledge is transferred from a source task to a target task that has new classes that do not exist in the source task. This approach is useful in situations when acquiring labeled data for all the new classes is difficult, time-consuming or expensive.
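
For instance, a pre-trained natural language inference model can score text against candidate labels it was never explicitly trained on. The sketch below assumes the Hugging Face transformers library is installed; the example text and labels are made up.

```python
from transformers import pipeline

# A pre-trained NLI model scores text against arbitrary labels,
# so the new classes need no labeled training examples.
classifier = pipeline("zero-shot-classification")

result = classifier(
    "The battery drains within two hours of unplugging the laptop.",
    candidate_labels=["hardware issue", "billing question", "shipping delay"],
)
print(result["labels"][0])  # highest-scoring label
```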

Examples of Transfer Learning in Use

Transfer learning has proven effective in a variety of applications. Here are a few examples:

ChatGPT uses transfer learning to complete new tasks with relatively small amounts of labeled data.

DALL-E uses a pre-trained transformer-based language model as a starting point for training its image generation model.

Amazon’s Alexa uses transfer learning to improve its natural language processing abilities.

Siri uses transfer learning to improve its speech recognition and natural language processing capabilities.

Margaret Rouse
Technology Specialist

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other's highly specialized languages.