Edge Detection

Definition - What does Edge Detection mean?

Edge detection is a technique used in digital image processing. The term covers a collection of algorithms and tools that do a particular thing: identifying and enhancing the edges of objects in an image using mathematical models.

Techopedia explains Edge Detection

Edge detection is built into many new televisions. It is also common in the software industry, where it forms one part of a larger pool of filters and techniques advanced by newer technologies such as neural networks. A convolutional neural network, for example, applies many filters and processing steps to an image to improve computer image processing.

Edge detection dovetails with related concepts such as edge enhancement to form the basis for new developments in image processing. For example, scientists have studied how edge detection can improve visual broadcasting for the visually impaired. All of this is part of the fundamental progress being made in image processing in the age of big data, artificial intelligence and automation.

Edge detection works on the principle of identifying places in an image where brightness changes suddenly or sharply. These "discontinuities" in brightness can often be linked to other discontinuities in the scene, such as changes in depth or surface orientation. Using edge detection in image processing therefore helps in interpreting the image.
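As a minimal sketch of this principle (assuming NumPy is available; the function name and the tiny test image are illustrative, not from any particular library), the example below applies the classic Sobel kernels and returns the gradient magnitude. Large values mark pixels where brightness changes sharply, i.e. likely edges:

```python
import numpy as np

def sobel_edges(img):
    """Return the gradient magnitude of a 2-D grayscale image using Sobel kernels."""
    # Sobel kernel that responds to horizontal brightness changes (vertical edges).
    kx = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]], dtype=float)
    # Its transpose responds to vertical brightness changes (horizontal edges).
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    # Slide the 3x3 kernels over every interior pixel (no padding, for simplicity).
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    # Combine the two directional responses into one edge-strength map.
    return np.hypot(gx, gy)

# Illustrative test image: dark left half, bright right half,
# so the only edge is the vertical boundary between them.
img = np.zeros((5, 6))
img[:, 3:] = 255.0
edges = sobel_edges(img)
```

Here `edges` is nonzero only in the columns straddling the dark-to-bright boundary and zero in the uniform regions, which is exactly the "discontinuity in brightness" the paragraph above describes. Production code would normally use an optimized convolution routine rather than explicit Python loops.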
