Question

Why are GPUs important for deep learning?

Answer

The use of graphics processing units (GPUs) has particular importance for the field of deep learning. The reason has to do with how deep learning systems are set up, and what they are intended to do.

Experts define deep learning as a type of machine learning in which algorithms use multiple processing layers to progressively extract higher-level features from data.
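To make that layered idea concrete, here is a minimal sketch in Python using the PyTorch library (an assumed choice of framework; the layer widths are arbitrary, illustrative values rather than anything from the original):

```python
import torch.nn as nn

# "Multiple layers for progressive data analysis": each layer refines
# the representation produced by the one before it. Widths are arbitrary.
model = nn.Sequential(
    nn.Linear(784, 256),  # raw input -> low-level features
    nn.ReLU(),
    nn.Linear(256, 64),   # low-level -> higher-level features
    nn.ReLU(),
    nn.Linear(64, 10),    # higher-level features -> class scores
)
```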

Some cite particular examples, such as convolutional neural networks (CNNs), whose layers involve convolution filtering, max pooling, padding, striding and other operations.
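A hedged sketch of those CNN operations, again assuming PyTorch (the channel counts and kernel size here are illustrative choices, not prescribed values):

```python
import torch
import torch.nn as nn

# A tiny convolutional block illustrating the operations named above.
conv_block = nn.Sequential(
    nn.Conv2d(
        in_channels=3,    # e.g. an RGB image
        out_channels=16,  # 16 learned filters
        kernel_size=3,    # 3x3 filtering window
        stride=1,         # striding: step size of the window
        padding=1,        # padding: preserve spatial dimensions
    ),
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2),  # max pooling: downsample by 2
)

x = torch.randn(1, 3, 32, 32)  # one 32x32 RGB image
print(conv_block(x).shape)     # torch.Size([1, 16, 16, 16])
```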

In a broader sense, tasks like image processing and natural language processing rely on multi-step, multi-algorithm procedures, many of which resemble the layered neural networks that machine learning professionals learn to identify and analyze.

As we’ve noted in a prior article, GPUs are generally valued in machine learning because of their parallel processing ability. As machine learning progressed, hardware design was also moving away from the idea of a single powerful CPU core and toward many units working in parallel, which can handle large amounts of computational work more quickly.
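A minimal sketch of what that parallelism buys in practice, assuming PyTorch and a CUDA-capable GPU (hardware availability is an assumption here, so the GPU path is guarded):

```python
import torch

# Large matrix multiplication: the core workload of neural network training.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# On a CPU, this runs across a handful of cores.
c_cpu = a @ b

# If a CUDA-capable GPU is available, the identical operation is
# spread across thousands of GPU cores in parallel.
if torch.cuda.is_available():
    c_gpu = (a.cuda() @ b.cuda()).cpu()
```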

With deep learning systems embracing higher-level generative models such as deep belief networks, Boltzmann machines and echo state networks, there’s a specific need for parallel processing and specialized core design. You could say that the use of GPUs is somewhat similar to the use of ARM (Advanced RISC Machine) chips in some other types of processing: customizing hardware to a particular use makes a good deal of sense.

In addition to their utility for deep learning, you also see these same types of processors gaining ground in work toward a fundamental change in computing structure known as quantum computing.

Here again, it’s the complexity and higher-level ordering of computing power that calls for parallel processing capability. In quantum computing, traditional bits are replaced by qubits, which can represent 1, 0 or a superposition of both. This sort of “Schrödinger’s bit” forms the basis for a computing model that could turn the world of IT on its head.
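To make that superposition concrete, here is a minimal sketch in plain Python with NumPy (an illustration assumed for this answer, not code from any quantum framework): a qubit’s state is a two-component complex vector, and measurement probabilities come from the squared amplitudes.

```python
import numpy as np

# A qubit's state as a complex vector: alpha|0> + beta|1>,
# where the amplitudes satisfy |alpha|^2 + |beta|^2 = 1.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)
state = np.array([alpha, beta], dtype=complex)

# Measurement probabilities for 0 and 1 (here an even superposition):
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]
```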

For those with an interest in emerging technologies, it will be key to watch the use of GPUs and their successors in such systems as deep learning networks and quantum computing setups. Both of these, many experts would say, are in their infancy and will mature and bring results in the years to come.

Justin Stoltzfus
Contributor

Justin Stoltzfus is an independent blogger and business consultant assisting a range of businesses in developing media solutions for new campaigns and ongoing operations. He is a graduate of James Madison University. Stoltzfus spent several years as a staffer at the Intelligencer Journal in Lancaster, Penn., before the merger of the city’s two daily newspapers in 2007. He also reported for the twin weekly newspapers in the area, the Ephrata Review and the Lititz Record. More recently, he has cultivated connections with various companies as an independent consultant, writer and trainer, collecting bylines in print and Web publications, and establishing a reputation…