An Intro to Machine Learning for IT Pros

Gradient Descent and Backpropagation

When learning about machine learning for the first time, you quickly come across some fairly heavy terms that describe how the results of a machine learning project get fine-tuned.

“Gradient descent” sounds obscure and esoteric. It's not a term that's easy to interpret, especially without a certain kind of scientific or mathematical background. Machine learning experts will say cryptically that engineers use a stochastic gradient descent process to minimize a loss function, which can sound like Greek to a lot of people.

At the same time, they'll talk about backpropagation as a way to train a feedforward neural network.

Here's the key thing to keep in mind – gradient descent and backpropagation are two terms for closely related parts of a process that corrects the errors a machine learning model makes as it does its job – or, you could say, that optimizes its results.

Either way, the idea is that, through these algorithms, the technology looks back at its earlier work in order to improve it. Backpropagation is shorthand for “backward propagation of errors”: it traces the error at the output back through the network to work out how much each weight contributed to it. A backpropagation algorithm is typically paired with gradient descent, which uses those contributions to adjust the weights.
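
To make that concrete, here is a minimal sketch in Python of a tiny, invented network – one input, one sigmoid hidden unit and one output weight. All the numbers (the input of 1.0, the label of 0.8, the starting weights and the learning rate) are assumptions made up for illustration, not anyone's real model. The forward pass makes a prediction, the backward pass propagates the error back through the layers, and a gradient descent step nudges each weight to shrink the loss.

```python
import math

# One made-up labeled example: the input 1.0 should produce the label 0.8.
x, target = 1.0, 0.8
w1, w2 = 0.3, 0.3      # starting weights for the hidden and output layers
learning_rate = 0.5    # how big a correction to make on each pass

for step in range(200):
    # Forward pass: input -> hidden unit -> prediction.
    hidden = 1.0 / (1.0 + math.exp(-w1 * x))   # sigmoid activation
    prediction = w2 * hidden
    loss = (prediction - target) ** 2          # how wrong the prediction is

    # Backward pass: propagate the error back through each layer (chain rule).
    d_loss_d_pred = 2 * (prediction - target)
    d_loss_d_w2 = d_loss_d_pred * hidden               # output-layer weight
    d_loss_d_hidden = d_loss_d_pred * w2               # error sent backward
    d_loss_d_w1 = d_loss_d_hidden * hidden * (1 - hidden) * x  # hidden weight

    # Gradient descent: move each weight against its gradient.
    w2 -= learning_rate * d_loss_d_w2
    w1 -= learning_rate * d_loss_d_w1

print(round(loss, 6))  # shrinks toward zero as the prediction approaches 0.8
```

Each pass through the loop is the model “looking back at its original work” – measuring how wrong the last prediction was and adjusting the weights so the next one is less wrong.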

So what do these algorithms do?

One of the simplest ways to explain backpropagation and gradient descent is that these algorithms adjust the weights of the inputs – the numbers that decide how much influence each input has on the output.

Suppose a supervised learning project trains a model on a set of labeled data. Over time, engineers can see where the wrong parts of an input were amplified, or where changes could make the system more precise. Using these algorithms, they adjust the input weights to minimize the loss and improve the accuracy of the results.
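
As a rough sketch of that loop, here is an example with invented labeled data: four training examples, each with two input features and a target generated from the rule 3 × feature one + 1 × feature two. Everything here (the data, the starting weights of zero, the learning rate) is an assumption for illustration. The weights are updated one example at a time, which is the “stochastic” part of stochastic gradient descent.

```python
# Invented labeled data: (feature_1, feature_2) -> target, from 3*x1 + 1*x2.
labeled_data = [
    ((1.0, 2.0), 5.0),
    ((2.0, 0.0), 6.0),
    ((0.0, 3.0), 3.0),
    ((1.0, 1.0), 4.0),
]

weights = [0.0, 0.0]   # starting weights for the two input features
learning_rate = 0.05

for epoch in range(200):            # repeated passes over the training set
    for (x1, x2), target in labeled_data:
        prediction = weights[0] * x1 + weights[1] * x2
        error = prediction - target
        # The gradient of the squared error for each weight is the error
        # scaled by the input that weight multiplies; step the other way.
        weights[0] -= learning_rate * 2 * error * x1
        weights[1] -= learning_rate * 2 * error * x2

print([round(w, 2) for w in weights])   # settles near [3.0, 1.0]
```

After a couple hundred passes over the data, the weights settle near 3.0 and 1.0 – the values that make the predictions match the labels, which is exactly the “changing weighted inputs” the jargon is describing.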

You don't always get this from reading about gradient descent and backpropagation. Discussions of these processes are full of inside-baseball terminology. But for someone who is just starting to learn about machine learning, understanding that all of this technical-speak comes down to tweaking and changing weighted inputs can go a long way.


Written by Justin Stoltzfus

Justin Stoltzfus is a freelance writer for various Web and print publications. His work has appeared in online magazines including Preservation Online, a project of the National Historic Trust, and many other venues.