VGA (Video Graphics Array) became the industry standard when the GUI (Graphical User Interface) began to emerge in the PC industry as the direction of future operating systems (Windows 9x). Before this, many essentially proprietary video standards existed. By establishing an open-architecture video standard, the industry assured the end user that a graphics-based OS would load, boot, and be viewable on any PC regardless of the video card installed in it.
VGA is formally defined by its resolution and color depth: VGA is "640 x 480 in 16 of 256." The first two numbers are the screen resolution, in pixels across by pixels high, so the VGA screen is a maximum of 640 pixels across by 480 pixels high. The third number is the maximum number of colors the video controller can display on screen at one time; for VGA this is 16. The last number is the palette: VGA supports a color palette of 256 possible colors.
The video controller can be instantly switched from one set of 16 colors, assigned to the 16 possible values of each 4-bit pixel, to any other arrangement of 16 colors for those values. So VGA displays 16 colors at a time and can switch among any combination of 16 colors drawn from the 256-color palette.
At any time one, some, or all of the 16 colors can be redefined with a new color from the palette. Even the palette itself can be reloaded, though doing so during a program's execution would probably not be efficient. The full definition of VGA is then: 640 columns by 480 rows of pixels, where each pixel can be one of 16 preset colors taken from a palette of 256 possible colors. In shorthand, remember: 640 by 480 in 16 of 256. The industry has also added 800 x 600 in 16 of 256 as part of VGA.
If each pixel can be one of 16 possible colors, how many bits are required to store each pixel? 2^4 = 16, so each pixel is held in memory as a 4-bit value. The video RAMDAC chip on the video controller constantly scans the video RAM buffer; as it does so, a value's position in the RAM array determines which pixel it belongs to, and the value stored there determines that pixel's color.
In this example, the RAMDAC is displaying the underlined bytes as the colors shown: the first four bits of the first byte, the hexadecimal digit "A," are defined as the color yellow; the next four bits, hexadecimal "F," are black; the next hex digit, 4, is red; the next pixel in the row, hex digit 2, is green; the next, hex digit 3, is blue; and 6 is orange.
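The 4-bit packing and palette lookup described above can be sketched in a few lines of Python. This is only an illustrative sketch: the palette mapping mirrors the example's assignments (A = yellow, F = black, 4 = red, 2 = green, 3 = blue, 6 = orange), not the actual VGA default palette.

```python
# Decode packed 4-bit (16-color) VGA-style pixel data.
# PALETTE mirrors the example above (A=yellow, F=black, 4=red, 2=green,
# 3=blue, 6=orange); it is illustrative, not the real VGA default palette.
PALETTE = {0xA: "yellow", 0xF: "black", 0x4: "red",
           0x2: "green", 0x3: "blue", 0x6: "orange"}

def unpack_pixels(data: bytes) -> list:
    """Split each byte into two 4-bit pixel values, high nibble first."""
    pixels = []
    for byte in data:
        pixels.append(byte >> 4)      # high nibble: first pixel
        pixels.append(byte & 0x0F)    # low nibble: second pixel
    return pixels

row = bytes([0xAF, 0x42, 0x36])       # the three example bytes
print([PALETTE[p] for p in unpack_pixels(row)])
# ['yellow', 'black', 'red', 'green', 'blue', 'orange']
```

Redefining one of the 16 on-screen colors, as described earlier, amounts to changing one entry in this table: every pixel storing that value changes color on the next scan without touching the pixel data itself.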
This ATI Mach64 series AGP card demonstrates clearly that the edge connector sits farther back than a PCI card's and features two rows of contacts stacked on top of each other across the card edge.
Below is a Pentium 4 motherboard with a typical AGP slot; brown in color and closest to the CPU. Note it is farther from the back of the system than the PCI slots.
The AGP slot was developed as the high-speed video card expansion bus for the Pentium-based motherboard. Only certain chipsets feature the port, and quite often OEM-brand PCs build the AGP video controller directly into the motherboard without providing an AGP slot for future expansion of the computer's video capabilities. Within the chipset families there was a quick evolution of AGP technology. The original slot supports 1x and 2x cards. The slot stays at a fixed clock frequency while the card uses a clock multiplier, similar to what the CPU does with its internal clock multiplier; the chipset needs only a slight modification to accommodate video cards that clock-double.
The AGP slot is usually brown and closest to the CPU. However there are many customized motherboards that now deviate from the standard coloration of the motherboard connectors and components so color is not the only clue to go by. Some motherboards have one or more PCI slots closer to the CPU than the AGP slot. The slot is similar in size and connector pins to a PCI slot but the divider is different and it is farther away from the back of the motherboard than a PCI slot.
Comparison of AGP slots (motherboard rear is to the left)
AGP 1.0 slot, 3.3V 1X/2X speeds (slot divider closer to rear)
AGP 2.0 slot, 1.5V 4X/8X speeds (slot divider closer to front)
Dual function AGP slot 1.5V or 3.3V (no slot divider to block wrong card)
Soon a 4x slot was developed, with 8x capabilities to follow. In addition, another enhancement called the AGP Pro slot appeared, which provides additional power pins for large, power-hungry video cards. These video controllers, part of the AGP 2.0 specification, also run at a reduced voltage of 1.5V. Because of the different voltage requirement, the keying of the AGP 2.0 slot differs from that of the 3.3V AGP 1.0 slots and cards that run at 1x or 2x. Unlike VESA, the AGP slot connects to the motherboard chipset circuitry rather than running directly to the processor pins. Because of this arbitration, and because the slot is dedicated to video cards, the technology has greatly improved video throughput within the system, which has translated into vastly improved graphics since its introduction. All AGP ports have a 32-bit data bus running at 66 MHz.
AGP Port Type | Voltage | Bits/clock | Throughput |
AGP | 3.3V | 1 | 266 MB/sec |
AGP 2x | 3.3V | 2 | 533 MB/sec |
AGP 4x | 1.5V | 4 | 1066 MB/sec |
AGP 8x | 1.5V | 8 | 2133 MB/sec |
Now, with the introduction of PCI-Express based motherboard chipsets, the standard expansion bus is once again powerful enough for current video controllers. AGP 8x slots support a throughput of 2133 MB/sec, which is certainly impressive, but PCI-Express supports 250 MB/sec per data channel, and an x16 slot has 16 data channels. While the channels are not synchronous, the system can use all 16 to send data to the card and let the card sort the data back into proper order for display: 250 MB/sec x 16 = 4000 MB/sec. PCI-Express video controllers will continue to come down in price and should eliminate AGP within a few years. But the VESA people may yet come up with another high-speed expansion slot for their own technologies.
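The bandwidth figures above follow from simple arithmetic: a 32-bit (4-byte) bus at a base clock of roughly 66.67 MHz, transferring 1, 2, 4, or 8 times per clock. A quick sketch:

```python
# Peak AGP bandwidth: 32-bit bus (4 bytes) x 66.67 MHz base clock x
# transfers per clock; truncating reproduces the figures in the table.
AGP_BASE_MHZ = 66.67
BUS_BYTES = 4

def agp_throughput(multiplier: int) -> float:
    """Peak AGP throughput in MB/sec for a given transfer multiplier."""
    return AGP_BASE_MHZ * BUS_BYTES * multiplier

for mult in (1, 2, 4, 8):
    print(f"AGP {mult}x: {int(agp_throughput(mult))} MB/sec")

# PCI-Express for comparison: 250 MB/sec per lane, 16 lanes in an x16 slot
print(f"PCIe x16: {250 * 16} MB/sec")   # 4000 MB/sec
```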
DVI is a newer interface connector for attaching video cards to display devices, usually LCDs. Since LCDs are digital devices, converting video data into analog red-green-blue voltage scan signals is wasteful: the LCD must contain circuitry that converts those analog signals back into digital information to energize the transistors that control the display's crystals. It is much more efficient to skip the conversion to analog, which otherwise forces the display to carry what is essentially a set of CRT circuits just to convert the signal back to digital again. Instead, the video controller can transmit the video information in digital format directly to the display, without the wasted conversion to analog and back. This reduces part counts, reduces the heat generated by the display, and improves performance.
The maximum transfer rate of analog data from a video controller is dictated by the video RAMDAC circuitry and is roughly limited by the refresh rate of an analog-driven monitor, which is comparatively poor. A purely digital interface can achieve a much higher data transfer rate, which leads to much better on-screen graphics performance than is possible with analog signaling. Modern monitors and video controllers are equipped with a digital interface so the video card can transfer pure digital image data that the monitor processes internally for rendering on screen. The main standard digital interface is DVI.
There are several different DVI interfaces. These are the three major types:
- DVI-D – DVI Digital (digital signals only)
- DVI-I – DVI Integrated (provides both digital and analog signals)
- DVI-A – DVI Analog (analog signals only)
The four separate pins surrounding the cross-blade pin are the analog section. This is the section that sends standard RGB raster scan voltages to a CRT. The other 24-pin section of square pins carries the digital data and the Plug-and-Play interface signals.
3D acceleration refers to developing the on-screen image in layers. In this way a background layer can be built up and manipulated independently of the foreground layers, making scenery easier to handle separately from characters in full-screen animation applications such as arcade-style fully animated games. The layer depth is usually referred to as 16-bit, 24-bit, or 32-bit Z-buffer depth. Further, the buffering can be double or triple. The greater the resolution and "flat" 2D color depth, the larger the flat video buffer required. Add 3D depth, in which multiple 2D images are layered on top of each other and double- or triple-buffered in video RAM, and still more video RAM is needed.
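The video RAM requirement described here is straightforward arithmetic: resolution times bits per pixel times the number of buffers. A small sketch (the example figures are illustrative):

```python
def vram_bytes(width: int, height: int, bits_per_pixel: int,
               buffers: int = 1) -> int:
    """Video RAM in bytes to hold `buffers` copies of one full screen."""
    return width * height * bits_per_pixel * buffers // 8

# Flat VGA: 640 x 480 at 4 bits per pixel, single buffer
print(vram_bytes(640, 480, 4))              # 153600 bytes (150 KB)

# 1024 x 768 in 32-bit color, double buffered
print(vram_bytes(1024, 768, 32, buffers=2) / 2**20, "MB")   # 6.0 MB
```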
2D and 3D accelerator graphics cards indicate that they support an industry-standard technique for manipulating graphics. This can be software acceleration, in which a driver depends on the CPU to perform all pixel calculations, or hardware acceleration, in which the video card includes its own onboard graphics processing unit, or GPU. The GPU is a high-speed specialized CPU in its own right and can receive shorthand instructions from the system rather than have the system calculate and describe each pixel, which is an enormous drain on the CPU; in fact most CPUs simply cannot manage full-screen animation on their own.
Instead, the system can send the video card a command that says "Draw a circle centered at pixel 17, 109 with a radius of 78 pixels and 24-bit color F0F000" (yellow, by the way). The GPU handles the actual calculation of each pixel and writes the values into its onboard RAM, which the video RAMDAC is scanning and rendering out to the monitor. This allows the software to spend more of its time on calculations concerning the game itself rather than on managing the colors of millions of pixels on screen.
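What that single command saves the CPU can be seen by spelling out the per-pixel work in software. The brute-force outline below is only a sketch: real GPUs use faster methods (such as the midpoint circle algorithm), and the framebuffer here is a plain Python list standing in for video RAM.

```python
# A naive software version of the "draw a circle" command from the text:
# center (17, 109), radius 78, 24-bit color 0xF0F000 (yellow).
# A real GPU performs this per-pixel work in hardware; this loop just
# shows the burden the CPU is spared.
WIDTH, HEIGHT = 640, 480
framebuffer = [[0x000000] * WIDTH for _ in range(HEIGHT)]

def draw_circle_outline(cx: int, cy: int, r: int, color: int) -> None:
    """Plot the pixels whose distance from (cx, cy) rounds to r, clipped
    to the screen edges."""
    for y in range(max(0, cy - r), min(HEIGHT, cy + r + 1)):
        for x in range(max(0, cx - r), min(WIDTH, cx + r + 1)):
            if round(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5) == r:
                framebuffer[y][x] = color

draw_circle_outline(17, 109, 78, 0xF0F000)
lit = sum(row.count(0xF0F000) for row in framebuffer)
print(f"{lit} pixels set")
```

Note that part of this particular circle falls off the left edge of the screen; the clipping in the loop bounds handles that, another detail the game software never has to think about.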
The various industry standards have become more or less terms referring to their pixel resolutions. They are:
- SVGA – Super Video Graphics Array – A pseudo-standard in which the card must adhere strictly to the published VGA standard so that all systems can boot and display information using the standard VGA text mode; after that, drivers can switch the video controller into its native mode, which can be any resolution and any method of displaying graphics on the monitor. There are many standards above VGA, listed below, and all of them are technically SVGA. SVGA cards are also sometimes marketed as "Ultra VGA," but that is a marketing term, not a true standard.
- XGA – eXtended Graphics Array – Developed for the video controllers of the IBM PS/2 line of personal computers, close cousins to the PC. Aside from its low-level access ports and command set, it defines resolutions of 1024 x 768 in 256 colors or 640 x 480 in high color (16 bits per pixel = 65,536 possible colors).
- XGA-2 – An improvement on the original XGA controller that allows 1024 x 768 in high color (16 bits per pixel = 65,536 colors).
- SXGA – Super eXtended Graphics Array – Adds the 1280 x 1024 resolution.
- UXGA – Ultra eXtended Graphics Array – Adds the 1600 x 1200 resolution.
- QXGA – Quad eXtended Graphics Array – Adds the 2048 x 1536 resolution.
- WXGA – Wide eXtended Graphics Array – Allows wide resolutions from 1280 to 1366 pixels across by 720 to 768 pixels high, which lets the image fit a widescreen LCD without leaving unused black bands at the sides or above and below the image area, as standards with conventional resolutions may do.
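One way to see what separates these standards is the raw pixel count and aspect ratio of each mode. A quick sketch (the 1366 x 768 WXGA figure is one of several variants in use):

```python
from math import gcd

def aspect(width: int, height: int) -> str:
    """Reduce a resolution to its simplest aspect ratio, e.g. '4:3'."""
    g = gcd(width, height)
    return f"{width // g}:{height // g}"

MODES = {"VGA": (640, 480), "XGA": (1024, 768), "SXGA": (1280, 1024),
         "UXGA": (1600, 1200), "QXGA": (2048, 1536), "WXGA": (1366, 768)}

for name, (w, h) in MODES.items():
    print(f"{name}: {w} x {h} = {w * h:,} pixels, aspect {aspect(w, h)}")
```

Note that SXGA (5:4) and WXGA are the odd ones out; the other modes are all 4:3, matching the face of a conventional CRT.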
Aside from the resolution of the video controller, the display device or monitor must support the controller's resolution, or the chosen setting in the case of controllers that support more than one. The CGA, the first color graphics adapter of them all, supported more than one resolution: 640 x 200 in 2 colors and 320 x 200 in 4 colors. The monitor must support 640 x 200 pixels, or that setting of the CGA video controller will not be displayed properly and the screen might even go totally black.
Monitors also have a measurement called dot pitch. Dot pitch refers to the distance between the color triads on the surface of the cathode ray tube that form the colors of the pixels themselves. The larger the number, the farther apart the triads are and the poorer the image quality on screen. Large-dot-pitch monitors are less expensive to manufacture and therefore priced lower, but the on-screen image is "fuzzy," while small-dot-pitch monitors have very crisp, clear, sharp images; these, however, cost more.
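The effect of dot pitch on usable resolution can be estimated by dividing the viewable width of the tube by the pitch, which bounds how many distinct dots fit across the screen. The figures below are illustrative assumptions, not measurements of any particular model:

```python
def max_horizontal_dots(viewable_width_mm: float, dot_pitch_mm: float) -> int:
    """Upper bound on distinct horizontal dots the tube can resolve."""
    return int(viewable_width_mm / dot_pitch_mm)

# A 17" CRT with roughly 312 mm of viewable width (an assumed figure):
print(max_horizontal_dots(312, 0.28))   # 1114 -> a 1024-wide mode fits
print(max_horizontal_dots(312, 0.31))   # 1006 -> 1024-wide will look fuzzy
```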
The two methods of separating the red, green, and blue phosphors on the surface of the tube are called the shadow mask and the aperture grille. Shadow masks lead to a higher curvature of the tube's surface, while aperture grilles allow the very flat surfaces of the modern PC monitor. Aperture grilles were first introduced by Sony in its "Trinitron" line of televisions, and these CRTs were also used in the manufacture of Sony computer monitors. Aperture grille CRTs, aside from being much flatter than the curved shadow-mask types, show very faint visible lines across the screen; these are the fine wires beneath the glass surface that support the grille. Even though the wires can be seen, the overall image of an aperture grille monitor is far sharper and brighter than that of a shadow-mask CRT.
The CRT uses an electron beam to draw the image on the surface of the tube, starting at the top left corner and sweeping across the top row, then swinging back to the far left side to draw the second row, and so forth until it reaches the bottom of the screen. Then the beam swings back to the top left corner and starts again.
The number of times per second the monitor draws an entire screen of information is its refresh rate, given in Hz (Hertz). A 60Hz refresh rate means the monitor redraws the screen 60 times per second. This was the old VESA standard from the VL-bus era of video controllers and monitors, but it causes screen flicker that is noticeable out of the corner of the eye during rapid eye movements. Look just past the upper corner of the monitor at a distant object, then sweep your eyes away; if the monitor has a low refresh rate you will see it flicker at the edge of your vision.
The reason the flicker is clearer at the edge of your field of vision has to do with the structure of the retina and the brain's processing of the information coming from its center, near the focal point. The flicker is always present; your brain simply adjusts to it. However, flicker can cause headaches and can even trigger seizures in people with epilepsy. VESA has since revised the standard, and 75Hz is now the accepted minimum, with 85Hz and even higher-capability monitors available. People who suffer headaches from long hours at the computer, and especially people with epilepsy, should consider an active-matrix LCD screen instead, since each individual pixel stays lit rather than being refreshed and therefore does not flicker at all.
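The refresh rates discussed above translate directly into how long each full screen image persists before being redrawn. A quick sketch:

```python
def frame_time_ms(refresh_hz: float) -> float:
    """Milliseconds between full-screen redraws at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 85):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per redraw")
```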
The size measurement itself is questionable at best. A monitor's size is supposed to be the diagonal from the top left to the bottom right corner of the tube, but the "viewable portion" of this diagonal can be quite a bit smaller than the size the manufacturer claims. A CRT of a particular size can only handle a certain resolution in pixels. Exceptions exist, but in general these guidelines hold for the majority of CRT displays in the PC industry. Typical resolutions for monitors are:
Monitor size | Resolution (pixels width x height) |
14" | 640 x 480 |
15" | 800 x 600 |
17" | 1024 x 768 |
19" – 21" | Varies, 1024 x 768 or better |
> 21" | Dedicated to graphics, very high resolution monitor |
The color depth a CRT supports varies greatly from one manufacturer and price range to the next. Older VGA monitors cannot be expected to handle color depth much greater than the VGA standard's 256-color palette, although some can handle high color. The most destructive setting for a monitor is the vertical refresh rate, the number of times per second the video controller attempts to redraw the entire screen image. Older monitors adhere to the VGA standard of 60Hz, which causes very noticeable screen flicker, and should only be used on machines whose screen output is not heavily depended upon, like servers locked away in a closet.

When such a monitor is attached to a video controller driving a 75Hz or higher refresh-rate signal, the tube cannot handle it: the screen image will be stretched and will roll too fast to be seen, and worse, the signal can literally destroy the monitor's circuitry. When connecting a new monitor to a system whose video card could be set to a very high resolution, color depth, or refresh rate, start Windows in Safe Mode to bypass the video card drivers and leave it in the default VGA settings, which will not damage a monitor. The settings can then be checked by right-clicking the desktop and choosing Properties. If the settings are too high or in doubt, record the current settings and then lower them to settings the current monitor can handle.