What is VGA?

VGA (Video Graphics Array) is a display standard developed by IBM (International Business Machines Corporation) and introduced in 1987. It was the interface most widely used between a PC and its monitor before DVI, HDMI, and DisplayPort. VGA carries analog signals and supports only lower resolutions and lower display quality than its digital successors. It launched with 640 x 480 pixels at 16 colors (or 256 colors at 320 x 200), and a VGA mode is still what PCs boot into.

VGA has many variants, each with its own resolution; the most common are listed below.

  • The Extended Graphics Array (XGA) is an IBM display standard introduced in 1990 and designed to replace the older 8514/A video standard. It provides the same resolutions (640 x 480 or 1024 x 768 pixels) but supports more simultaneous colors (65,536, compared to the 8514/A's 256). In addition, XGA allows monitors to be non-interlaced; this is an advantage because although interlacing increases resolution, it also increases screen flicker and slows reaction time.

    XGA-2 added a 24-bit DAC (Digital-to-Analog Converter), but the DAC was only used to extend the available master palette in 256-color mode.

    All standard XGA modes have a 4:3 aspect ratio with square pixels, although this does not hold for certain standard VGA and third-party extended modes (640 x 400, 1280 x 1024).
  • Wide XGA (WXGA) is a set of non-standard resolutions derived from the XGA display standard by widening it to a widescreen aspect ratio. WXGA is commonly used for low-end LCD TVs and LCD computer monitors.
  • XGA+ stands for Extended Graphics Array Plus and is a computer display standard, usually understood to refer to the 1152 x 864 resolution with an aspect ratio of 4:3. Until the advent of widescreen LCDs, XGA+ was often used on 17-inch desktop CRT monitors.

    XGA+ is the next step after XGA, although it is not approved by any standards organization.
  • WXGA+ and WSXGA are non-standard terms referring to a computer display resolution of 1440 x 900. Occasionally manufacturers use other terms to refer to this resolution; the Standard Panels Working Group refers to 1440 x 900 as WXGA(II).
    Both have a 16:10 (widescreen) aspect ratio.
  • Super XGA (SXGA) is a standard monitor resolution of 1280 x 1024 pixels. This display is one step above the XGA resolution that IBM developed in 1990.

    The 1280 x 1024 resolution has a 5:4 aspect ratio, not the standard 4:3. A 4:3 monitor using this resolution will have rectangular rather than square pixels, so unless the software compensates, the picture will be distorted, causing circles to appear elliptical.

    SXGA is the most common native resolution of 17 inch and 19 inch LCD monitors. An LCD monitor with SXGA native resolution will typically have a physical 5:4 ratio, preserving a 1:1 pixel aspect ratio.

    Sony manufactured a 17-inch CRT monitor with a 5:4 aspect ratio designed for this resolution; it was sold under the Apple brand name.
  • WSXGA+ stands for Widescreen Super Extended Graphics Array Plus; it is found on the 17-inch Apple PowerBook G4 and the unibody 15-inch MacBook Pro. The resolution is 1680 x 1050 pixels with a 16:10 aspect ratio.
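The pixel-distortion point made for SXGA above can be checked numerically: the pixel aspect ratio is the monitor's physical aspect ratio divided by the aspect ratio of the pixel grid, and a value of 1 means square pixels. A minimal Python sketch (the function name is my own, not part of any standard):

```python
from fractions import Fraction

def pixel_aspect(display_ratio, width, height):
    """Pixel aspect ratio: physical display aspect divided by
    the pixel-grid aspect. Exactly 1 means square pixels."""
    return display_ratio / Fraction(width, height)

# SXGA's 5:4 grid on a physically 4:3 monitor: pixels come out
# wider than tall, which is why circles render as ellipses.
print(pixel_aspect(Fraction(4, 3), 1280, 1024))   # 16/15
# The same grid on a physically 5:4 monitor: square pixels.
print(pixel_aspect(Fraction(5, 4), 1280, 1024))   # 1
```

This is why SXGA LCDs were built with a physical 5:4 panel: it keeps the 1:1 pixel aspect ratio without any software scaling.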

In recent years, however, companies have started to move away from VGA-based resolutions. Apple, for example, began using variants of the 16:9 aspect ratio in order to provide a consistent pixel density across screen sizes: the iPhone X introduced a 2436 x 1125 resolution, i.e., approximately 19.5:9.

Some air traffic control monitors these days use displays with a resolution of 2048 x 2048, with an aspect ratio of 1:1.
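The aspect ratios quoted in this article can be recovered from a resolution by dividing width and height by their greatest common divisor. A short Python sketch (helper name is my own):

```python
from math import gcd

def aspect(width, height):
    """Reduce a width x height resolution to its simplest W:H ratio."""
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

print(aspect(2048, 2048))  # 1:1      (air traffic control displays)
print(aspect(1280, 1024))  # 5:4      (SXGA)
print(aspect(1680, 1050))  # 8:5      (WSXGA+, i.e. 16:10)
print(aspect(2436, 1125))  # 812:375  (iPhone X, roughly 19.5:9)
```

Note that the iPhone X ratio reduces to 812:375 (about 2.166), which marketing rounds to 19.5:9 (about 2.167); the two are close but not exactly equal.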
