What Does Rasterization Mean?

Rasterization is the process by which most modern display systems convert electronic data or signals into the images they show, such as video or still graphics. In practice, this means converting a description of an image into a grid of pixels (a raster) sized for the specific media configuration, then allocating resources so that the image is rendered efficiently on the display device.


Techopedia Explains Rasterization

The origin of image rasterization dates back to the early days of television technology. In the mid-twentieth century, televisions typically used cathode ray tube (CRT) monitors, which built up complete images by scanning successive lines across the screen. CRT monitors remained among the most common electronic display hardware for the rest of the century, but mainstream computers did not commonly use them until the 1980s and '90s.

Rasterized graphics are often contrasted with vector graphics. Rasterization builds an image from scan lines or pixels stored in a bitmap, whereas vector graphics use mathematical functions to describe images in terms of geometric shapes, angles and curves. The sketch below illustrates the difference.
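
As a rough illustration of that difference, the short Python sketch below (not from the original article; the function name rasterize_circle is hypothetical) takes a circle defined mathematically by a center and radius, the vector-style description, and samples it onto a fixed grid of pixels, the raster-style bitmap.

# A minimal sketch contrasting a vector description of a shape with its
# rasterized form: the circle is defined mathematically, then sampled
# onto a fixed grid of pixels (a bitmap).

def rasterize_circle(cx, cy, radius, width, height):
    """Return a width x height bitmap; 1 marks a pixel covered by the circle."""
    bitmap = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # Sample each pixel at its center and test it against the
            # circle's mathematical definition: (x - cx)^2 + (y - cy)^2 <= r^2.
            if (x + 0.5 - cx) ** 2 + (y + 0.5 - cy) ** 2 <= radius ** 2:
                bitmap[y][x] = 1
    return bitmap

if __name__ == "__main__":
    # Print the rasterized circle as text: '#' for covered pixels, '.' otherwise.
    for row in rasterize_circle(8, 8, 6, 16, 16):
        print("".join("#" if px else "." for px in row))

The vector description (a center and radius) stays exact at any scale, while the rasterized bitmap is tied to the 16 x 16 grid it was sampled onto; this is why enlarging a raster image produces visible pixelation, whereas a vector image can be re-rasterized cleanly at any size.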


