Pattern Recognition

Definition - What does Pattern Recognition mean?

In IT, pattern recognition is a branch of machine learning that emphasizes the recognition of patterns or regularities in data. As a subdivision of the field, it should not be confused with machine learning as a whole. Pattern recognition can be either “supervised,” where previously known patterns are found in given data, or “unsupervised,” where entirely new patterns are discovered.
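The supervised/unsupervised distinction can be sketched in a few lines of pure Python. This is a minimal illustration, not a production algorithm: the labeled classifier predicts by nearest class mean, the unlabeled routine splits values into two groups around two evolving means, and all of the data values and labels below are made up.

```python
# Minimal sketch of supervised vs. unsupervised pattern recognition
# on one-dimensional feature values. All data is illustrative.

def nearest_centroid(labelled, x):
    """Supervised: predict the label whose class mean is closest to x."""
    centroids = {label: sum(vals) / len(vals) for label, vals in labelled.items()}
    return min(centroids, key=lambda lbl: abs(centroids[lbl] - x))

def two_means(values, iters=10):
    """Unsupervised: split values into two groups around two evolving means."""
    lo, hi = min(values), max(values)
    for _ in range(iters):
        groups = ([], [])
        for v in values:
            # Assign each value to whichever mean it is closer to.
            groups[abs(v - lo) > abs(v - hi)].append(v)
        if not groups[0] or not groups[1]:
            break  # guard against an empty group
        lo = sum(groups[0]) / len(groups[0])
        hi = sum(groups[1]) / len(groups[1])
    return groups

labelled = {"cold": [2, 4, 5], "hot": [28, 31, 35]}
print(nearest_centroid(labelled, 30))      # -> hot
print(two_means([2, 4, 5, 28, 31, 35]))    # -> ([2, 4, 5], [28, 31, 35])
```

Note that the supervised routine needs labeled examples up front, while the unsupervised routine discovers the two groups from the raw values alone.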

Techopedia explains Pattern Recognition

The objective of pattern recognition algorithms is to provide a reasonable answer for all possible inputs and to classify input data into objects or classes based on certain features. Rather than requiring an exact match, the algorithm compares the key features of a new sample against known samples and selects the “most likely” match.
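A “most likely” match can be illustrated with a nearest-neighbor comparison of feature vectors. The shape classes and (width, height) measurements below are hypothetical; the point is only that the new sample is assigned to whichever known example its features are closest to.

```python
import math

# Hypothetical (width, height) feature vectors for two known classes.
known = {
    "square":    [(1.0, 1.0), (2.0, 2.1)],
    "rectangle": [(1.0, 3.0), (2.0, 5.9)],
}

def classify(sample):
    """Return the class of the known example closest to the sample."""
    best_label, best_dist = None, float("inf")
    for label, examples in known.items():
        for example in examples:
            d = math.dist(example, sample)  # Euclidean distance in feature space
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label

# (2.1, 2.0) matches no stored example exactly, but its nearest
# neighbor is (2.0, 2.1), so it is classified as a square.
print(classify((2.1, 2.0)))  # -> square
```

Real systems use far richer features and statistical models, but the principle is the same: classification by closeness, not by identity.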

Pattern recognition and pattern matching are sometimes confused with each other when, in fact, they are not the same thing. Whereas pattern recognition looks for the most similar or most likely pattern in given data, pattern matching looks for exactly the same pattern. Pattern matching is not considered part of machine learning, although in some cases it leads to similar results as pattern recognition.
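The contrast between the two can be shown on strings. In this illustrative sketch, exact matching returns only identical strings, while a (deliberately crude, made-up) similarity score lets recognition pick the closest word even when no exact match exists.

```python
# Contrast: exact pattern matching vs. approximate pattern recognition.
# The word list and similarity measure are illustrative only.

def exact_match(words, pattern):
    """Pattern matching: only strings identical to the pattern count."""
    return [w for w in words if w == pattern]

def closest_match(words, pattern):
    """Pattern recognition: return the most similar word, even if not identical."""
    def similarity(a, b):
        # Count positions with equal characters, penalize length differences.
        return sum(x == y for x, y in zip(a, b)) - abs(len(a) - len(b))
    return max(words, key=lambda w: similarity(w, pattern))

words = ["colour", "color", "collar"]
print(exact_match(words, "colr"))    # -> [] (no exact match)
print(closest_match(words, "colr"))  # -> color
```

The exact matcher finds nothing for the misspelled input, while the recognizer still produces a reasonable answer, which is precisely the behavior the paragraph above describes.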

This definition was written in the context of Computer Science