Discretization is the process of replacing a continuum with a finite set of points. In the context of digital computing, discretization takes place when continuous-time signals, such as audio or video, are reduced to discrete signals. The process of discretization is integral to analog-to-digital conversion. Discretization is closely related to quantization: discretization samples a signal at discrete points in time, while quantization rounds each sample's amplitude to one of a finite set of levels.
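The two steps above can be illustrated with a minimal Python sketch (not part of the original article; the function name and parameters are illustrative). It samples a continuous-time signal at a fixed rate and rounds each sample to the nearest of a finite set of amplitude levels, assuming amplitudes lie in [-1, 1]:

```python
import math

def sample_and_quantize(signal, duration_s, sample_rate_hz, num_levels):
    """Discretize a continuous-time signal in time (sampling) and
    in amplitude (quantization). Assumes amplitudes in [-1, 1]."""
    samples = []
    num_samples = int(duration_s * sample_rate_hz)
    step = 2.0 / (num_levels - 1)  # spacing between quantization levels
    for n in range(num_samples):
        t = n / sample_rate_hz           # discrete time instant
        x = signal(t)                    # continuous amplitude at that instant
        level = round((x + 1.0) / step)  # index of the nearest level
        samples.append(-1.0 + level * step)
    return samples

# A 1 Hz sine tone, sampled 8 times per second at 16 amplitude levels.
digital = sample_and_quantize(lambda t: math.sin(2 * math.pi * t), 1.0, 8, 16)
```

Real analog-to-digital converters work the same way in principle, just at far higher sample rates and bit depths (CD audio, for example, uses 44,100 samples per second and 65,536 levels).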
Mathematicians have been busy dividing and quantifying things for thousands of years. They ran into problems from the beginning. The Greek philosopher Zeno of Elea proposed the “dichotomy paradox,” later recorded by Aristotle. Suppose someone wants to walk home. To get there, one must first walk halfway home. To walk halfway home, one must first walk one-fourth of the way home. Since the distance home is infinitely divisible, one must complete an infinite number of tasks to get there. So theoretically, one can never walk home.
A related problem in modern times is called discretization error. Computers can perform only a finite number of evaluations, so a continuous quantity must be approximated by finitely many values; the gap between that approximation and the true value is the discretization error, and it limits the accuracy of numerical methods. Mathematicians describe the problem in much more elaborate equations today, but never more colorfully and simply than the ancient paradox. The evaluation of continuity and infinitesimals raises philosophical as well as mathematical problems.
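A short Python example (an illustration added here, not from the original article) makes discretization error concrete. Approximating the derivative of sin(x) with a forward difference over a finite step h leaves an error that shrinks as h shrinks, but never vanishes entirely:

```python
import math

def forward_difference(f, x, h):
    """Approximate f'(x) using a finite step h. The gap between this
    and the true derivative is the discretization error, which is
    roughly proportional to h for smooth functions."""
    return (f(x + h) - f(x)) / h

true_derivative = math.cos(1.0)  # exact derivative of sin(x) at x = 1
for h in (0.1, 0.01, 0.001):
    approx = forward_difference(math.sin, 1.0, h)
    error = abs(approx - true_derivative)
    print(f"h = {h:<6} discretization error = {error:.6f}")
```

Halving the step halves the error, but making h arbitrarily small eventually runs into the finite precision of the computer's arithmetic, which is exactly the tension the paradox anticipated.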
Nonetheless, discretization and quantization make mathematics and computing possible. For instance, lane one of a standard running track is defined as 400 meters long, so the route a runner takes in lane one can be divided into 400 discrete one-meter lengths. A runner who completes any portion or multiple of the course can be credited with a specific distance in meters. And once all runners have covered the same distance, each can be assigned a time, because time itself has been divided into discrete segments of hours, minutes, seconds and milliseconds.
Discretization and quantization are necessary to digitization. They break things down into manageable parts. They bring with them the challenges faced by mathematicians since the beginning of the discipline.
Margaret is an award-winning technical writer and teacher known for her ability to explain complex technical subjects to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles by the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret's idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.