Definition - What does Rasterization mean?
Rasterization is the process by which most modern display systems turn electronic data or signals into viewable images, such as video or still graphics. In practice, it means converting an image description (for example, geometric shapes or an incoming signal) into a raster: a grid of pixels or scan lines sized to match the resolution of the target display device, so that the image can be drawn efficiently on screen.
Techopedia explains Rasterization
The origin of image rasterization dates back to the early days of television technology. In the mid-twentieth century, televisions typically used cathode ray tube (CRT) monitors, which swept an electron beam across the screen line by line, gradually building up a complete image from those scan lines. CRT monitors remained among the most common electronic display hardware for the rest of the century, although mainstream computers did not adopt them widely until the 1980s and '90s.
Rasterized graphics are often contrasted with vector graphics. While rasterization typically compiles scan lines or pixels into a bitmap, vector graphics instead use mathematical functions to describe images in terms of geometric shapes, angles and curves, which must themselves be rasterized before they can appear on a pixel-based display.
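To illustrate the contrast, the classic Bresenham line algorithm is one common way a vector description (a line segment between two endpoints) gets converted into the discrete bitmap pixels that approximate it. The sketch below is a minimal, illustrative implementation; the function name and coordinates are chosen for this example and are not from the article.

```python
def rasterize_line(x0, y0, x1, y1):
    """Bresenham's line algorithm: turn an ideal line segment
    (a vector description) into the grid pixels that best
    approximate it, using only integer arithmetic."""
    pixels = []
    dx = abs(x1 - x0)
    dy = -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1   # step direction along x
    sy = 1 if y0 < y1 else -1   # step direction along y
    err = dx + dy               # running error term
    while True:
        pixels.append((x0, y0))
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:            # error says: step in x
            err += dy
            x0 += sx
        if e2 <= dx:            # error says: step in y
            err += dx
            y0 += sy
    return pixels

# A shallow line from (0, 0) to (4, 2) becomes five pixels:
print(rasterize_line(0, 0, 4, 2))
# → [(0, 0), (1, 1), (2, 1), (3, 2), (4, 2)]
```

Note that the vector form (two endpoints) is resolution-independent, while the rasterized output is tied to one specific pixel grid; redrawing the same segment at a higher resolution requires rasterizing it again.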