How does Occam's razor apply to machine learning?

Q:

How does Occam's razor apply to machine learning?

A:

The principle known as Occam's razor is attributed to William of Ockham, an English friar and philosopher of the late 1200s and early 1300s – it's the idea that the simplest and most direct solution should be preferred, or that, among competing hypotheses that account for the evidence equally well, the one with the fewest assumptions should be chosen.

However, Occam's razor also has modern applications to state-of-the-art technologies – one example is its application to machine learning. In machine learning, engineers train computers on sets of training data, to enable them to learn and go beyond the limits of their original programming. This involves choosing algorithms and data structures and building training systems so that computers can improve on their own and produce evolving results.

With that in mind, some experts feel that Occam's razor can be useful and instructive in designing machine learning projects. Some contend that Occam's razor can help engineers to choose the best algorithm to apply to a project, and also help with deciding how to train a program with the selected algorithm. One interpretation of Occam's razor is that, given more than one suitable algorithm with comparable trade-offs, the one that is least complex to deploy and easiest to interpret should be used.
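As an illustrative sketch (not a prescribed method), the idea of preferring the least complex of several comparable models can be written as a simple selection rule. The data, the candidate polynomial degrees and the 10% tolerance below are all hypothetical choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: a noisy linear trend.
x = np.linspace(0, 1, 60)
y = 2.0 * x + 0.5 + rng.normal(scale=0.1, size=x.size)

# Split into training and validation halves.
x_tr, y_tr = x[::2], y[::2]
x_va, y_va = x[1::2], y[1::2]

def val_error(degree):
    """Fit a polynomial of the given degree; return validation RMSE."""
    coeffs = np.polyfit(x_tr, y_tr, degree)
    pred = np.polyval(coeffs, x_va)
    return np.sqrt(np.mean((pred - y_va) ** 2))

degrees = [1, 3, 6]
errors = {d: val_error(d) for d in degrees}

# Occam-style rule: among models within 10% of the best validation
# error, prefer the lowest-degree (simplest) one.
best = min(errors.values())
chosen = min(d for d, e in errors.items() if e <= 1.1 * best)
print(chosen, {d: round(e, 3) for d, e in errors.items()})
```

Since the underlying trend here is linear, the rule ends up selecting a low-degree model even when higher-degree fits score about as well on validation data.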

Others point out that simplification procedures such as feature selection and dimensionality reduction are themselves examples of the Occam's razor principle at work – simplifying models to get better results. Still others describe deliberate trade-offs in which engineers reduce complexity at some expense to accuracy, but argue that this Occam's razor approach can nevertheless be beneficial.
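As a hedged sketch of how dimensionality reduction "simplifies" a model's inputs, the following uses plain NumPy to perform principal component analysis (PCA) via the singular value decomposition. The dataset, its two latent directions and the 95% variance threshold are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dataset: 200 samples, 10 features, but only 2 underlying
# directions of variation plus a little noise.
latent = rng.normal(size=(200, 2))
mixing = rng.normal(size=(2, 10))
X = latent @ mixing + 0.05 * rng.normal(size=(200, 10))

# PCA: center the data, take the SVD, and measure how much variance
# each principal component explains.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

# Keep the fewest components that explain 95% of the variance --
# a "simpler" representation of the same data.
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
X_reduced = Xc @ Vt[:k].T

print(k, X_reduced.shape)
```

Because only two directions actually vary, the reduction collapses ten features down to two with almost no loss of information.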

Another application of Occam's razor involves the parameters set for certain kinds of machine learning, such as Bayesian methods. By limiting the set of parameters for a project, engineers could be said to be "using Occam's razor" to simplify the model. Another argument goes that when teams brainstorm how to assess the business use case and limit the scope of a project before choosing algorithms, they're using Occam's razor to whittle down the complexity of the project from the very beginning.
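One hedged way to make the "limiting parameters" idea concrete is ridge regression: its L2 penalty has a standard Bayesian reading as a zero-mean Gaussian prior that pulls the weights toward zero. Everything below (the data, the penalty strength `alpha`) is an illustrative assumption, not a recipe from the article:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: 20 noisy samples, 5 features, sparse true weights.
X = rng.normal(size=(20, 5))
w_true = np.array([1.5, 0.0, 0.0, -2.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=20)

def ridge(X, y, alpha):
    """Closed-form ridge regression. The L2 penalty alpha corresponds
    to a zero-mean Gaussian prior on the weights -- a Bayesian version
    of "limiting the parameters" of the model."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

w_plain = ridge(X, y, alpha=0.0)   # ordinary least squares
w_reg = ridge(X, y, alpha=10.0)    # prior shrinks the weights

print(np.linalg.norm(w_plain), np.linalg.norm(w_reg))
```

The regularized solution always has a smaller weight norm than the unpenalized one; the prior is doing the "razoring."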

Yet another popular application of Occam's razor to machine learning involves the "curse of overly complex systems." This argument goes that making a model more intricate and detailed can also make it fragile and unwieldy. A well-known problem here is overfitting, where a model becomes so complex that it fits the noise in its training data rather than the underlying pattern, and therefore performs poorly on new data. This is another example where someone might cite Occam's razor in the deliberate design of machine learning systems, to make sure that they don't suffer from overcomplexity and rigidity.
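A minimal sketch of overfitting, with hypothetical data: a high-degree polynomial drives its training error down, but the gap between its training and held-out error grows compared with a model matched to the underlying (quadratic) trend:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical data: a quadratic trend plus noise, split into
# interleaved training and test halves.
x = np.linspace(-1, 1, 40)
y = x**2 + 0.1 * rng.normal(size=x.size)
x_tr, y_tr = x[::2], y[::2]
x_te, y_te = x[1::2], y[1::2]

results = {}
for degree in (2, 12):
    coeffs = np.polyfit(x_tr, y_tr, degree)
    train_err = np.sqrt(np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2))
    test_err = np.sqrt(np.mean((np.polyval(coeffs, x_te) - y_te) ** 2))
    results[degree] = (train_err, test_err)
    print(degree, round(train_err, 3), round(test_err, 3))
```

The degree-12 model "wins" on training data by memorizing noise, which is exactly the failure mode the Occam's razor argument warns against.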

On the other hand, some point out that applying Occam's razor indiscriminately can reduce the effectiveness of machine learning programming. In some cases, complexity is necessary and beneficial. It all has to do with examining the particular project's scope and goals, and looking at the inputs, the training sets and the parameters, in order to apply the most targeted solution for the desired result.
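To illustrate the caveat that complexity is sometimes necessary, the hypothetical example below fits a genuinely nonlinear signal; the "simplest" (linear) model underfits badly, while a more flexible polynomial captures the pattern:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical data: the true relationship is nonlinear (a sine wave),
# so oversimplifying the model hurts rather than helps.
x = np.linspace(-2, 2, 50)
y = np.sin(2 * x) + 0.1 * rng.normal(size=x.size)

def fit_rmse(degree):
    """Fit a polynomial of the given degree; return RMSE on the data."""
    coeffs = np.polyfit(x, y, degree)
    return np.sqrt(np.mean((np.polyval(coeffs, x) - y) ** 2))

simple = fit_rmse(1)    # linear model: underfits the sine wave
flexible = fit_rmse(7)  # higher-degree model: tracks the pattern
print(round(simple, 3), round(flexible, 3))
```

Here the extra complexity is earning its keep: the flexible model's error approaches the noise floor, while the linear model misses the structure entirely.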

Written by Justin Stoltzfus
Justin Stoltzfus is a freelance writer for various Web and print publications. His work has appeared in online magazines including Preservation Online, a project of the National Historic Trust, and many other venues.