Debunking the Top 4 Myths About Machine Learning

KEY TAKEAWAYS

Machine learning is about to infiltrate the tech world. But before we can gauge whether it will lead to a digital paradise or techno-tyranny, we must understand what it can and cannot do.

Machine learning (ML) is going to be either a boon or a bane to the enterprise, depending on who you talk to. On one hand, it will bring a wide range of new capabilities to digital processes – everything from automated workflows to self-managing infrastructure. On the other, it will displace jobs and leave organizations powerless to make corrections when things go awry.

The truth is probably somewhere between these two extremes, but to really get a handle on what ML can and cannot do, it is necessary to dispel some of the myths that have grown up around the technology. (With so much to offer, why isn't everyone using ML? Find out in 4 Roadblocks That Are Stalling Adoption of Machine Learning.)

Myth 1: Machine learning and artificial intelligence are one and the same.

While it is true that they both utilize the same fundamental technology, AI is an umbrella term that encompasses a wide range of disciplines. According to Dr. Michael J. Garbade, CEO of Education Ecosystem, AI encompasses not only ML, but neural networks, natural language processing, speech recognition and a host of other emerging technologies. ML has the distinction of being able to alter its own behavior, typically by adjusting the parameters of its models, based on experience, changes to its environment or the introduction of new objectives; this is essentially the “learning” aspect of machine learning.

“The intention of machine learning is to enable machines to learn by themselves using the provided data and make accurate predictions,” he said. “It is a method of training algorithms such that they can learn how to make decisions.”

Machine learning, therefore, is the way in which data systems become intelligent. But since learning is a process, knowledge workers will have to get used to the idea that future technologies will not offer full functionality right out of the box, but will gravitate toward increasingly optimized performance as time goes by.
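To make that "learning is a process" point concrete, here is a minimal sketch in plain Python. The task, fitting the line y = 2x + 1 from noisy samples, is invented purely for illustration and is not from any source cited above; it simply shows a model that starts out knowing nothing and improves as it repeatedly sees the data.

```python
# A minimal sketch of the kind of training loop Dr. Garbade describes:
# an algorithm that starts out wrong and improves with experience.
# The toy task (recovering y = 2x + 1 from noisy samples) is illustrative only.

import random

random.seed(42)
xs = [i / 50 for i in range(50)]
data = [(x, 2 * x + 1 + random.gauss(0, 0.05)) for x in xs]

w, b = 0.0, 0.0   # the model begins with no "knowledge" of the pattern
lr = 0.1          # learning rate: how far each correction moves the model

for epoch in range(201):
    for x, y in data:
        err = (w * x + b) - y
        # Nudge the parameters in the direction that shrinks the error.
        w -= lr * err * x
        b -= lr * err
    if epoch % 50 == 0:
        mse = sum(((w * x + b) - y) ** 2 for x, y in data) / len(data)
        print(f"epoch {epoch:3d}: mse={mse:.4f}  w={w:.2f}  b={b:.2f}")

# The printout shows performance improving over time: the error falls
# and (w, b) drift toward the true values (2, 1) that generated the data.
```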

Myth 2: Machine learning cannot be controlled.

This ability to “learn” has naturally given rise to the fear that ML-powered systems will start to make decisions and take actions beyond what users intended. But stories about killer robots running amok or computer overlords wiping out pesky humans are more science fiction than reality. What has been known to happen is that biases in the data that ML is exposed to can cause it to make poor decisions, as evidenced by the case of Tay, a Microsoft chatbot for Twitter that users quickly taught to spout racist views.

But as IV.AI CEO Vince Lynch noted on TechCrunch recently, this is not a lack of control, but a failure to implement the proper controls. By choosing the right learning models and data sets, and then subjecting the system to rigorous oversight, organizations should be able to safely deploy ML without catastrophic consequences. In fact, properly implemented ML algorithms could alert users to the inherent biases that exist in most data sets, leading to a more rational basis for key commercial and industrial operations.
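As a rough illustration of what one such control might look like, the sketch below audits a training set for imbalance before any model is trained on it. The loan-application labels and counts are invented for this example; they are not from Lynch's piece, and a real audit would cover far more dimensions than a single label's frequency.

```python
# A minimal sketch of one possible data control: checking label balance
# before training. The "loan" labels and counts here are hypothetical.

from collections import Counter

training_labels = ["approve"] * 940 + ["deny"] * 60

counts = Counter(training_labels)
total = sum(counts.values())

for label, n in counts.items():
    share = n / total
    flag = "  <-- potential bias risk" if share < 0.10 else ""
    print(f"{label:8s}: {n:4d} ({share:.0%}){flag}")

# A model trained on this skewed sample could "learn" to deny almost
# nothing -- not because denials are rare in the world, but because they
# are rare in the data it was shown. Catching that before training is
# exactly the kind of oversight Lynch is talking about.
```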

Myth 3: Machine learning will destroy jobs.

While some jobs may be lost, it is more accurate to say that ML will redesign work, not replace it, says Tom Relihan of MIT’s Sloan School of Management. For most people, ML will take over the mundane, boring tasks that make work tedious, but not the actual job itself. There is a key difference between narrow artificial intelligence – that which is designed to suit highly targeted functions – and general AI, which can function in a largely human-like manner. Narrow AI is what we have now, while the general variety won’t be ready for decades, if at all. So no matter how much better ML is at doing certain things, it will not be able to replace humans entirely, and in fact will make us more productive.

Clearly, this will impact some occupations more than others, and it will not necessarily be the less complex work that gets automated. Radiologists, for example, might see key functions like reading medical images give way to ML, while massage therapy will likely remain a hands-on profession for some time to come. (For more on how jobs will change due to AI, see New Jobs in the AI Era.)

Myth 4: Machine learning is actually learning.

Remember, as Dr. Garbade pointed out, ML is just algorithms. Actual human learning is far more mysterious, so much so that even the world’s leading neuroscientists cannot fully explain it. As the Brookings Institution’s Chris Meserole points out, human learning relies on experience and the ability to gauge probabilities rather than on pure logic and reason. Computers happen to be very good at calculating probabilities, so in this sense a machine can “learn” to speak and read and recognize faces in much the same way that we do.

The key difference, though, is that an algorithm will never make the leap from simple data analysis and prediction to a fully realized understanding of what it all means. From its perspective, it is all just numbers. So an ML system can scan an image of, say, a cat, convert the image into a sequence of numbers representing each point of the image in terms of color, shading, etc., then compare that sequence to all other known sequences just to come up with a probability as to whether it is a cat or a dog or a rhinoceros. Meanwhile, a three-year-old girl, who may have seen only one cat in her lifetime, can look at a crude line drawing for barely a second and, with little computation and practically zero energy consumed, tell you for certain that it is a cat.
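Here is a toy sketch of that process. The tiny 3x3 "images" and the labels are invented for illustration, but the mechanics mirror the passage above: the system reduces everything to sequences of numbers, measures which known sequence the new one most resembles, and its answer is only ever a probability.

```python
# A toy version of the cat example: an "image" reduced to numbers,
# compared against known examples, yielding only a probability -- never
# an understanding. The 3x3 pixel grids and labels are invented.

import math

def distance(a, b):
    """Euclidean distance between two flattened pixel grids."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Tiny grayscale "images": each value is a pixel's brightness (0-1).
known = {
    "cat":        [0.9, 0.1, 0.9, 0.5, 0.5, 0.5, 0.7, 0.7, 0.7],
    "dog":        [0.8, 0.8, 0.8, 0.4, 0.4, 0.4, 0.2, 0.2, 0.2],
    "rhinoceros": [0.1, 0.1, 0.9, 0.1, 0.1, 0.5, 0.1, 0.1, 0.1],
}

mystery = [0.9, 0.2, 0.9, 0.5, 0.4, 0.5, 0.7, 0.6, 0.7]

# Convert distances into pseudo-probabilities (closer = more likely).
weights = {label: 1 / (distance(mystery, img) + 1e-9)
           for label, img in known.items()}
total = sum(weights.values())

for label, wgt in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{label:12s}: {wgt / total:.1%}")

# The system never "sees" a cat; it only reports which sequence of
# numbers the new sequence most resembles.
```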

And this is the reason why, in the final analysis, we can conclude that ML will be a boon to the enterprise from the CEO to the entry-level worker. It will never replace human labor, but will make it richer and more rewarding.

Arthur Cole
Technology Writer

Arthur Cole is a freelance technology journalist who has been covering IT and enterprise developments for more than 20 years. He contributes to a wide variety of leading technology web sites, including IT Business Edge, Enterprise Networking Planet, Point B and Beyond and multiple vendor services.