Rasterization

What Does Rasterization Mean?

Rasterization is the process by which most modern display systems convert electronic data or signals into a grid of pixels that forms the displayed image, such as video or still graphics. This typically involves identifying the requirements of a specific media configuration, then allocating resources so that images are rendered efficiently and optimally on the display device.

Techopedia Explains Rasterization

The origin of image rasterization dates back to the early days of television technology. In the mid-twentieth century, televisions were typically built around cathode ray tube (CRT) displays, which scanned lines across the screen that gradually accumulated into a complete image. CRT monitors remained among the most common electronic display hardware for the rest of the century, but mainstream computers did not commonly use them until the 1980s and ’90s.

Rasterized graphics are often contrasted with vector graphics. While rasterization typically compiles scan lines or pixels into a bitmap, vector graphics use mathematical functions to describe images in terms of geometric shapes, angles, and curves.
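
To make that contrast concrete, the short sketch below rasterizes a vector line segment, described only by its two endpoints, into discrete pixels on a bitmap using Bresenham's line algorithm, a classic rasterization technique. It is a minimal illustration in Python; the function and variable names are our own and do not come from any particular graphics library.

# A minimal sketch of rasterization: converting a vector line segment
# (defined by two endpoints) into discrete pixels on a bitmap.
# Names here are illustrative, not from any specific graphics library.

def rasterize_line(x0, y0, x1, y1):
    """Bresenham's line algorithm: return the grid pixels that
    approximate the ideal mathematical line from (x0, y0) to (x1, y1)."""
    pixels = []
    dx = abs(x1 - x0)
    dy = -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        pixels.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:  # step horizontally toward the endpoint
            err += dy
            x0 += sx
        if e2 <= dx:  # step vertically toward the endpoint
            err += dx
            y0 += sy
    return pixels

# The vector description is just two endpoints; the raster result is
# a list of pixel coordinates that can be plotted on a bitmap.
bitmap = [["." for _ in range(12)] for _ in range(6)]
for x, y in rasterize_line(0, 0, 11, 5):
    bitmap[y][x] = "#"
for row in bitmap:
    print("".join(row))

Running the sketch prints a small character grid in which the "#" pixels approximate the ideal mathematical line. This is, in miniature, the step a display system performs at far higher resolution when it rasterizes vector data for the screen.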

Margaret Rouse
Technology Specialist

Margaret is an award-winning writer and educator known for her ability to explain complex technical topics to a non-technical business audience. Over the past twenty years, her IT definitions have been published by Que in an encyclopedia of technology terms and cited in articles in the New York Times, Time Magazine, USA Today, ZDNet, PC Magazine, and Discovery Magazine. She joined Techopedia in 2011. Margaret’s idea of a fun day is helping IT and business professionals learn to speak each other’s highly specialized languages.