
IBM’s XGA

Sept. 9, 2019
The Graphics Chip Chronicles Vol.2 No.1 - The personal computer industry owes a debt of gratitude to IBM for establishing the standards that came to define all of the graphics processors following the XGA, which was introduced in 1990.

Series: The Graphics Chip Chronicles

IBM introduced the eXtended Graphics Array (XGA) graphics processor and add-in board (AIB) in late 1990, and it would become the last graphics processor IBM would produce after having set all the standards for the industry it created.

Developed for the PS/2 along with the VGA, the XGA was referred to as a Type 2 video subsystem (the VGA being Type 1). The XGA was a high-resolution graphics chip capable of displaying 1024 × 768 pixels, which IBM called PELs, a contraction of "picture element." It could display 256 colors from a palette of 262,144 (6 bits per primary color). The XGA also added support (beyond the 8514/A) for high color (65,536 colors, 16 bits per pixel) at 640 × 480.

The second revision, the XGA-2 of 1992, was an upgrade offering higher refresh rates (75 Hz and higher, non-interlaced, up to at least 1024 × 768), improved performance, and a fully programmable display engine capable of almost any resolution within its physical limits.

IBM’s XGA combined an upgraded version of the VGA with features from the 8514/A. The initial implementation of the XGA was as an on-board chip in the new PS/2 Models 90 and 95. A standalone upgrade AIB (the IBM PS/2 XGA Display Adapter/A) was also available for existing PS/2s. The price was $1,095 for an XGA with 512KB of VRAM, plus an additional $350 for a memory upgrade to 1MB of VRAM, which translates to about $2,400 today.

Big Changes Behind IBM’s XGA Chip

The big philosophical and architectural change in the XGA was the integration of the VGA subsystem. In a way this was an admission of defeat, said Michal Necasek of the OS/2 Museum, as “IBM’s strategy of providing an on-board VGA chip with an additional high-resolution accelerator such as the 8514/A clearly hadn’t worked out.” It was also the way most subsequent non-IBM graphics chips would be constructed.

Because of the VGA integration, the XGA was not backward compatible with the 8514/A. Also, unlike the 8514/A, the entire XGA framebuffer could be directly accessed by the host CPU. Furthermore, up to eight XGAs could be run in one system through a bus-mastering capability in the XGA. However, because the VGA used fixed I/O and memory mapped address spaces, only one VGA could be active at a time in a system.

The XGA had several unique features, including a new 64 × 64 hardware sprite used as the mouse cursor. Earlier chips such as the EGA, VGA, and 8514/A used software to manage the mouse overlay, which at the time was not a trivial challenge.

Components of IBM’s XGA

The XGA video subsystem components included:

  • System bus interface
  • Memory and CRT controller
  • Coprocessor
  • Video memory
  • Attribute controller
  • Sprite controller
  • Alphanumeric (A/N) font and sprite buffer
  • Serializer
  • Palette
  • Video digital-to-analog convertor (DAC)

The coprocessor provided hardware drawing-assist functions throughout real or virtual memory. The following functions were built in with the XGA:

  • Line drawing
  • Area filling
  • Logical and arithmetic mixing
  • Map masking
  • Scissoring
  • X and Y axis addressing

The XGA used a 32-bit data bus for all system memory and I/O addresses, while the VGA subsystem used either an 8-bit or 16-bit data bus. With a 16-bit data bus, XGA used a 512KB video display buffer, with a 32-bit data bus it used a 1MB video display buffer. Access to the XGA was accomplished via two sets of registers: The first set was mapped into the system’s I/O space, while the other set of registers mapped into memory.

The original XGA supported 1, 2, 4, and 8 bits per pixel at 1024 × 768 interlaced; in non-interlaced modes it could also support 16 bits per pixel. The XGA output went directly to a VGA connector, either on the AIB or from the system board.

The serializer (Figure 2) and DAC converted the data in the video display buffer to the screen image. The video data was stored in the video display buffer in 1-, 2-, 4-, 8-, or 16-bit pixels. The number of bits per pixel was determined by the video mode the computer was operating in. Each memory location in the buffer held one pixel and corresponded to a specific location on the screen. The binary value of each 1-, 2-, 4-, or 8-bit pixel was used as an index into the palette to determine the color to be displayed at that location. If the computer was in direct color mode, each pixel was 16 bits, and it did not use the palette to determine the colors. The XGA offered lots of options to the application developer.

The serializer took the data from the video display buffer and converted it into a serial bit stream. If the pixels were 1, 2, 4, or 8 bits, the binary value of each pixel corresponded to one of the 256 memory locations in the palette. Each memory location contained 18 bits, divided into three 6-bit values that represented specific intensities of red, blue, and green. In the direct color mode (palette bypass mode), each 16-bit pixel was divided into a 5-bit red intensity value, a 5-bit blue intensity value, and a 6-bit green intensity value, for a total of 65,536 possible colors. The DAC then converted the digital color-intensity values to analog values for the monitor.

Although the XGA was targeted at OS/2-based PS/2s, IBM recognized that not everyone was signing up for OS/2 and provided drivers for Windows 2.1 and 3.0, OS/2 1.2, and popular software packages such as AutoCAD.

IBM’s Upgrade to XGA-2

In 1992 IBM upgraded the chip and introduced the XGA-2, which had built-in support for non-interlaced 1024 × 768 resolution and 1MB of VRAM standard. The XGA-2 included a programmable PLL circuit and could support pixel clocks up to 90MHz, which enabled refresh rates up to 75Hz at 1024 × 768. The 800 × 600 resolution was also supported, at up to 16bpp. The XGA-2 also had an improved DAC with 8 bits per channel, rather than 6 bits like the original XGA, which increased the displayable colors to 16 million.

The PS/2E introduced in 1993 featured a full-sized internal PC speaker, two SIMM sockets, and an extended bank of memory soldered directly to the motherboard. It featured 1 MB of video memory for the onboard XGA-2 graphics adapter, which was attached to the ISA bus instead of the usual MCA bus.

The XGA was the subject of speculation in the late 1980s, as word of its development leaked out of IBM’s Hursley Labs in the UK. Its architecture was quite advanced for the time, with a linear framebuffer aperture, flexible bus mastering, a draw engine, and a hardware sprite cursor. When it was released, most PC graphics AIBs were dumb-framebuffer, upgraded VGA AIBs (with resolutions up to 800 × 600) known as SuperVGAs.

It took the rest of the PC graphics hardware industry several years to catch up with the XGA’s capabilities. In many ways, the XGA was a classic 1990s design, even if it never reached its full potential: it could easily have supported up to 4MB of VRAM as well as 24/32-bpp true-color pixel formats, although it didn’t support the Truevision Targa format.

IBM Offers to License XGA

When the XGA came out, IBM floated the idea that it would sell the chip to other companies and allow them to build AIBs with it as a second source. However, in the spring of 1991, the company changed its position and instead offered to license the design. The rumor at the time was that IBM did not want to reveal how much it cost to build the chip and be accused of dumping.

Instead of selling the chips, IBM fully documented the hardware specifications and the register interface, and licensed the design to SGS-Thomson (Inmos), Intel, and a few smaller companies such as Integrated Information Technology. But the market had moved on from the character-based user interface to the bit-mapped graphical user interface (GUI), and the new chips and AIBs coming to market were known as GUI accelerators.

Due to the software support IBM developed for the XGA, it only worked in a 386-based PS/2 (including 386SX-based models), and by 1992 the 486 was the CPU of choice.

The XGA was the end of IBM’s dominance in the PC market, a decade after it introduced the PC. IBM sold its PC product line 12 years later to Lenovo, which is currently the world's third largest PC vendor by sales, highlighting that one company’s castoff is another’s gold.

The next major development in PC graphics came from 3D graphics accelerators ranging from 3Dfx’s Voodoo to Nvidia’s Riva 128. But the PC graphics industry owes a debt of gratitude to IBM for establishing standards for all graphics chips.


About the Author

Jon Peddie | President

Dr. Jon Peddie heads up Tiburon, Calif.-based Jon Peddie Research. Peddie lectures at numerous conferences on topics pertaining to graphics technology and the emerging trends in digital media technology. He is the former president of Siggraph Pioneers, and is also the author of several books.  Peddie was recently honored by the CAD Society with a lifetime achievement award. Peddie is a senior and lifetime member of IEEE (joined in 1963), and a former chair of the IEEE Super Computer Committee. Contact him at [email protected].
