What Does Rasterization Mean?
Rasterization is the process by which most modern display systems convert electronic image data or signals into viewable images, such as video or still graphics. In practice, this means mapping an image description onto a raster, a grid of pixels sized to a specific display configuration (its resolution and color depth), so that the image can be rendered efficiently on the display device.
Techopedia Explains Rasterization
The origin of image rasterization dates back to the early days of television technology. In the mid-twentieth century, televisions typically used cathode ray tube (CRT) displays, which drew pictures by sweeping an electron beam across the screen in scan lines that gradually accumulated into complete images. CRT monitors remained among the most common electronic display hardware for the rest of the century, but they did not become standard equipment on mainstream computers until the 1980s and ’90s.
Rasterized graphics are often contrasted with vector graphics. Rasterization builds an image by filling in scan lines or pixels on a bitmap; vector graphics instead describe an image with mathematical functions defining geometric shapes, angles and curves, which can then be rasterized for display, as the sketch below illustrates.
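As a minimal illustration of the distinction, the following Python sketch rasterizes a vector description of a line, just two endpoint coordinates, onto a small bitmap using Bresenham's line algorithm; the grid dimensions and endpoint values are arbitrary choices made for this example.

```python
def rasterize_line(x0, y0, x1, y1, width, height):
    """Convert a vector line segment (two endpoints) into lit pixels
    on a bitmap using Bresenham's line algorithm."""
    bitmap = [[" "] * width for _ in range(height)]
    dx = abs(x1 - x0)
    dy = -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        bitmap[y0][x0] = "#"  # light the pixel nearest the ideal line
        if x0 == x1 and y0 == y1:
            break
        e2 = 2 * err
        if e2 >= dy:          # step horizontally
            err += dy
            x0 += sx
        if e2 <= dx:          # step vertically
            err += dx
            y0 += sy
    return bitmap

# The vector form is just two endpoints; the raster form is a pixel grid.
for row in rasterize_line(1, 1, 14, 6, width=16, height=8):
    print("".join(row))
```

Because the vector form is only a pair of endpoints, the same description can be re-rasterized onto a larger grid for a higher-resolution display; this resolution independence is what distinguishes vector graphics from a fixed bitmap.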