Why are GPUs important for deep learning?

Presented by: AltaML


Q:

Why are graphics processing units (GPUs) important for deep learning?

A:

Graphics processing units (GPUs) are particularly important for the field of deep learning. The reason lies in how deep learning systems are built and the kind of work they are intended to do.

Experts define deep learning as a type of machine learning in which algorithms pass data through multiple layers, with each layer progressively extracting higher-level features from the input.

Some cite particular examples, such as convolutional neural networks (CNNs), whose layers involve operations like convolution (filtering), padding, striding and max pooling.
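To make that layer vocabulary concrete, here is a minimal NumPy sketch of the convolution and max-pooling operations. The function names and sizes are illustrative only; real frameworks implement these far more efficiently on the GPU.

```python
import numpy as np

def conv2d(image, kernel, stride=1, padding=0):
    """Slide a filter over an image: the filtering, padding and striding steps."""
    if padding:
        image = np.pad(image, padding)  # add a border of zeros on all sides
    kh, kw = kernel.shape
    out_h = (image.shape[0] - kh) // stride + 1
    out_w = (image.shape[1] - kw) // stride + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            patch = image[i*stride:i*stride+kh, j*stride:j*stride+kw]
            out[i, j] = np.sum(patch * kernel)  # each output cell is independent
    return out

def max_pool(feature_map, size=2):
    """Keep the largest value in each size-by-size block (max pooling)."""
    h, w = feature_map.shape
    trimmed = feature_map[:h - h % size, :w - w % size]
    return trimmed.reshape(h // size, size, w // size, size).max(axis=(1, 3))
```

Note that every output cell in both operations can be computed independently of the others, which is precisely the kind of work that maps well onto a GPU's many cores.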

In a broader sense, tasks such as image processing and natural language processing rely on multi-step, multi-algorithm procedures, many of which take the form of the neural networks that machine learning professionals learn to identify and analyze.

As we’ve noted in a prior article, GPUs are generally valued in machine learning because of their parallel processing ability. As machine learning progressed, hardware design was also shifting from the model of a single powerful CPU core to many units working in parallel, which can handle large amounts of computational work more quickly.
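Why does parallelism matter so much here? A neural-network layer is essentially one big matrix multiply, and each output neuron's dot product is independent of the others, which is exactly the kind of work a GPU's thousands of cores can do simultaneously. This NumPy sketch (with hypothetical layer sizes) shows the same idea on a CPU: one vectorized call replaces an explicit loop over independent computations.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((512, 256))  # hypothetical layer: 512 neurons, 256 inputs each
inputs = rng.standard_normal(256)

# Sequential view: compute one neuron's output at a time.
loop_out = np.array([weights[i] @ inputs for i in range(512)])

# Parallel-friendly view: a single matrix-vector product.
# On a GPU, the 512 dot products inside this one call run concurrently.
vec_out = weights @ inputs
```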

With deep learning systems embracing higher-level generative models such as deep belief networks, Boltzmann machines and echo state networks, there’s a specific need for parallel processing and specialized core design. You could say that the use of GPUs is somewhat similar to the use of Advanced RISC Machine (ARM) chips in other types of processing: customizing hardware to a particular workload makes a good deal of sense.

In addition to their utility for deep learning, these same types of processors are becoming popular in work toward a fundamental change in computing structure known as quantum computing, for example in simulating quantum systems on classical hardware.

Here again, it’s the complexity and higher-level ordering of computing power that requires the parallel processing capability. In quantum computing, traditional bits are replaced by qubits, which can have a value of 1, 0 or a superposition of both. This sort of “Schrödinger’s bit” forms the basis for a computing model that could turn the world of IT on its head.
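A minimal sketch of what "1, 0 or a combination" means mathematically: a qubit's state is a two-component complex vector (alpha, beta) with |alpha|² + |beta|² = 1, and measurement yields 0 with probability |alpha|² and 1 with probability |beta|². The values below are illustrative.

```python
import numpy as np

# An equal superposition: neither 0 nor 1, but both with equal amplitude.
state = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)

# Measurement probabilities come from the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2  # [0.5, 0.5]: unlike a classical bit, not simply 0 or 1
```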

For those with an interest in emerging technologies, it will be key to watch the use of GPUs and their successors in such systems as deep learning networks and quantum computing setups. Both of these, many experts would say, are in their infancy and will mature and bring results in the years to come.



Written by Justin Stoltzfus
Justin Stoltzfus is a freelance writer for various Web and print publications. His work has appeared in online magazines including Preservation Online, a project of the National Trust for Historic Preservation, and many other venues.