Learning Algorithm

Definition - What does Learning Algorithm mean?

A learning algorithm is an algorithm used in machine learning to help the technology imitate the human learning process. Combined with technologies like neural networks, learning algorithms make sophisticated learning programs possible.

Techopedia explains Learning Algorithm

Logistic regression, linear regression, decision trees and random forests are all examples of learning algorithms. Algorithms like "nearest neighbor" likewise shape how decision-making and learning happen in machine learning programs. In general, what all of these algorithms have in common is their ability to extrapolate from test or training data in order to make projections or build models in the real world. Think of these algorithms as tools for "pulling data points together" from a raw mass of data or a relatively unlabeled background.
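As a rough illustration of that idea, the sketch below fits two of the learning algorithms named above on a small synthetic data set and then scores their projections on data they have not seen. The use of scikit-learn, the toy data, and the parameter values are all assumptions for illustration; the article itself does not prescribe a library or data set.

```python
# Hypothetical sketch: two learning algorithms from the list above, fit on toy data.
# scikit-learn is an assumed library choice, not one named in the article.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

# Generate a small labeled data set to stand in for "training data".
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each learning algorithm extrapolates from the training data to a model
# that can make projections on examples it has not seen before.
for model in (LogisticRegression(max_iter=1000), KNeighborsClassifier(n_neighbors=5)):
    model.fit(X_train, y_train)
    print(type(model).__name__, "test accuracy:", model.score(X_test, y_test))
```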

While learning algorithms are useful in both supervised and unsupervised machine learning, they are used in different ways in each discipline. Supervised machine learning benefits from data that has already been labeled and isolated, so the learning algorithms that are applied will differ in some ways from those used on unlabeled data. The bottom line is that engineers put these learning algorithms together as building blocks of a particular technology or program that seeks to understand more about the data sets it digests.
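The supervised/unsupervised distinction can be made concrete with a short sketch: below, the same kind of data is handled once by a supervised algorithm (which is given labels) and once by an unsupervised one (which is not). Again, scikit-learn and the specific algorithms chosen here are illustrative assumptions rather than anything mandated by the definition.

```python
# Hypothetical sketch: supervised vs. unsupervised use of learning algorithms.
# scikit-learn is an assumed library choice, not one named in the article.
from sklearn.datasets import make_blobs
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans

X, y = make_blobs(n_samples=150, centers=3, random_state=0)

# Supervised: the labels y are already available, so the algorithm learns a mapping
# from features to labels.
tree = DecisionTreeClassifier(random_state=0).fit(X, y)
print("Decision tree training accuracy:", tree.score(X, y))

# Unsupervised: no labels are provided, so the algorithm groups the raw data on its own.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("K-means cluster assignments (first 10):", kmeans.labels_[:10])
```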
