Video Graphics Array (VGA)
Definition - What does Video Graphics Array (VGA) mean?
Video Graphics Array (VGA) is a display standard originally developed in 1987 by IBM for its PS/2 range of computers. VGA's single-chip design allowed it to be embedded directly on a PC's system board with minimal supporting components. VGA later became the de facto standard for graphics systems in PCs.
VGA was the last IBM graphics standard to be adopted by most manufacturers of clone computers. It was eventually superseded by Super Video Graphics Array (SVGA) and Extended Graphics Array (XGA).
Techopedia explains Video Graphics Array (VGA)
VGA was designed as an application-specific integrated circuit (ASIC) that outputs analog signals, in contrast to the digital signals used by the earlier Monochrome Display Adapter (MDA), Color Graphics Adapter (CGA) and Enhanced Graphics Adapter (EGA) standards. As a result, VGA systems are not compatible with monitors built for those older standards.
A VGA connector has 15 pins. In text mode, a VGA system typically displays 720x400 pixels. In graphics mode, it provides a resolution of 640x480 pixels with 16 colors or 320x200 pixels with 256 colors.
Additional VGA specifications include:
- 256 KB video random access memory (VRAM)
- 262,144 total colors
- 16-color and 256-color modes
- Master clock operating at 25.175 MHz or 28.322 MHz
- Planar mode
- Packed-pixel mode
- Up to 800 horizontal pixels
- Up to 600 lines
- Split screen support
- Refresh rates with a maximum of 70 Hz
- Support for smooth hardware scrolling
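The framebuffer arithmetic behind these modes can be sketched as follows; this is an illustrative calculation only, and the function name `framebuffer_bytes` is a label chosen here, not part of any VGA specification:

```python
# Sketch: how much video memory each standard VGA graphics mode needs.
# Illustrative arithmetic only; not tied to any particular VGA programming API.

def framebuffer_bytes(width, height, bits_per_pixel):
    """Bytes needed to store one full frame at the given color depth."""
    return width * height * bits_per_pixel // 8

# 640x480 with 16 colors -> 4 bits per pixel, stored across four bit planes
planar = framebuffer_bytes(640, 480, 4)   # 153,600 bytes (38,400 per plane)

# 320x200 with 256 colors -> 8 bits per pixel, one byte per pixel ("packed")
packed = framebuffer_bytes(320, 200, 8)   # 64,000 bytes

print(planar, packed)
```

Both figures fit comfortably within VGA's 256 KB of video RAM, which is why the 320x200 packed-pixel mode, with exactly one byte per pixel, was especially convenient for software to address.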
VGA supports both All Points Addressable (APA) graphics modes and alphanumeric text display modes. Its 256-color mode became the standard target for most PC games of the era.