Chapter 15: Graphics Hardware Advancements
15.1 Early Hardware
For a little more than a decade at the beginning of computer graphics history, images were created and displayed as vectors – straight lines connecting points on the screen of a CRT. These displays were “refreshed” from a display list, a portion of memory that the display controller read to determine what to draw. A change in the image was accomplished by a change in the contents of the display list, which could be done fairly rapidly.
The “democratization of computer graphics”, as one author put it, was made possible by the introduction of raster graphics. This technology used a standard television CRT driven by a video controller that scanned the image from top to bottom, turning the individually addressable points on the screen on or off and supplying the color of each point. The video controller obtained its information from a memory array whose contents mapped one-to-one to the points on the screen. This memory was called a frame buffer, and it was significantly larger than the display list memory of vector devices: every point on a line in a raster system had to be stored, not just the endpoints required by the vector displays. An early standard TV had approximately 250,000 addressable points (called pixels) on the screen, so the memory was rather large, particularly when color was included (which increased the size by a factor of three).
As was mentioned earlier, core memory was the dominant technology through the 1960s. There were some early implementations of a frame buffer built from core arrays, but they were bulky and expensive. The first breakthrough in affordable memory technology came with the introduction of integrated circuits, at which point some experimental “shift-register” frame buffers were introduced. Each location on the screen was represented in the shift register as a bit. A refresh cycle for a scan line on the raster device was accomplished by shifting all the bits in the register, reading the one at the end, placing it back at the beginning, and shifting again. The intensity of a screen element could only be changed when its bit came up at the end of the register, which resulted in a delay for screen updates.
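The recirculating refresh described above can be sketched in Python (a toy model for illustration, not any particular hardware design): a bit is readable only when it reaches the read-out position, after which it is shifted back around to keep circulating.

```python
from collections import deque

def refresh_scanline(register):
    """Read out one full scan line from a toy shift-register frame buffer.

    Each bit becomes readable only when it reaches the register's output;
    after being read, it recirculates to the other end.
    """
    scanned = []
    for _ in range(len(register)):
        scanned.append(register[0])   # the bit currently at the output
        register.rotate(-1)           # shift; the read bit recirculates
    return scanned

line = deque([1, 0, 1, 1, 0, 0, 1, 0])
out = refresh_scanline(line)
# After a full refresh the register is back in its original order,
# ready for the next scan – but a write to any pixel must wait until
# that pixel's bit circulates around to the output again.
```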
The real contribution came with the introduction of the Random Access Memory (RAM) chip a few years later. Instead of the sequential access required by the shift register, RAM allowed the computer or display processor to access any bit at any time – randomly. A 1K (1024) bit RAM chip was available in 1970, allowing for the affordable construction of a frame buffer that could hold all of the screen data for a TV image and could be updated rapidly because each bit was individually accessible. To increase the complexity and realism of the image on the screen, even larger frame buffers were built, supporting displays as large as 1024×1024 pixels. The depth (the number of bits used to represent the intensity and color of each pixel) increased from 1 to 8 to accommodate color, and then to 24 (8 bits for each of the RGB values) and upwards.
Because of the cost of the frame buffer at this time, a user who needed the image complexity of the 24-bit version had to look for other approaches. (24-bit color is sometimes referred to as “true” color; with 256 levels each of red, green and blue, a total palette of 16.7 million colors is possible.) One approach was to use an 8-bit frame buffer together with a “look-up table”.
In an 8-bit buffer, 8 bits of data represented all the information for a pixel. A typical configuration allotted 3 bits each to the Red and Green color channels and 2 bits to Blue. Thus Red and Green could each achieve 8 levels (2×2×2), and Blue had 4 levels (2×2), for a total palette of 256 colors – barely capable of representing realistic color. Better realism was achieved through the use of an “adaptive” palette: the image was analyzed, and the best-fitting 256 colors were placed in a look-up table (LUT), where each color was associated with an 8-bit address. Rather than representing the intensity/color directly, each 8-bit element of the frame buffer recorded the address of the intensity/color stored in the table. To refresh the screen, the address was read from the frame buffer, the processor pulled the intensity value from the table at that address, and the image was refreshed accordingly.
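The 3-3-2 packing and the LUT-based refresh can be sketched as follows (a hypothetical Python model; the helper names and the sample palette values are illustrative assumptions, not a specific historical design):

```python
def pack_332(r, g, b):
    """Pack 8-bit R, G, B channels into one byte: 3 bits red, 3 green, 2 blue."""
    return ((r >> 5) << 5) | ((g >> 5) << 2) | (b >> 6)

def unpack_332(byte):
    """Recover coarse 8-bit channels from a 3-3-2 byte
    (8 levels of red and green, 4 levels of blue)."""
    r = (byte >> 5) & 0b111
    g = (byte >> 2) & 0b111
    b = byte & 0b11
    return (r * 255 // 7, g * 255 // 7, b * 255 // 3)

# Indexed color via a look-up table: the frame buffer stores 8-bit
# addresses, and the refresh logic reads the actual color from the
# LUT entry at each address.
lut = [(0, 0, 0)] * 256
lut[17] = (200, 120, 40)       # a palette entry chosen by image analysis
frame_buffer = [17, 17, 0]     # three pixels, each one LUT address
refreshed = [lut[addr] for addr in frame_buffer]
```

The adaptive-palette gain is visible in the round trip: direct 3-3-2 storage quantizes every pixel to one of 256 fixed colors, while the LUT lets those 256 slots hold any 24-bit colors the image actually needs.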
Still, early frame buffer designs were constrained by the high bandwidth required to refresh the entire screen. Software and hardware modifications made this less of an issue, with implementations like the BitBLT (bit block transfer) operator, which allowed larger portions of the image to be updated as a unit, and word organizations in which the buffer was arranged into words containing multiple pixels. Later, frame buffers were enhanced with additional hardware. The z-buffer, introduced by Catmull in 1974, was implemented in hardware with RAM representing not only the RGB color of each pixel but also its z depth from the 3D geometry, to be used for update comparisons. Hardware such as the enhanced frame buffer described by Whitted was added to the memory to perform operations such as the z-comparison separately from the CPU.
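The z-comparison that these enhanced frame buffers performed in hardware can be illustrated in software (a toy Python model with made-up dimensions and colors):

```python
# Toy z-buffer: each pixel stores a color and a depth; a new fragment
# replaces the stored pixel only if it lies closer to the viewer.
WIDTH, HEIGHT = 4, 4
color_buf = [[(0, 0, 0)] * WIDTH for _ in range(HEIGHT)]
z_buf = [[float("inf")] * WIDTH for _ in range(HEIGHT)]  # "far" = infinity

def plot(x, y, z, rgb):
    """The per-pixel update comparison the hardware performed."""
    if z < z_buf[y][x]:          # closer than what is already stored?
        z_buf[y][x] = z
        color_buf[y][x] = rgb    # keep the nearer surface

plot(1, 1, 5.0, (255, 0, 0))     # red surface at depth 5
plot(1, 1, 9.0, (0, 0, 255))     # blue surface behind it: rejected
plot(1, 1, 2.0, (0, 255, 0))     # green surface in front: replaces red
```

Because the comparison is purely local to each pixel, it was a natural candidate to move out of the CPU and into logic attached directly to the frame buffer memory.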
The frame buffer was developed and expanded in both commercial and proprietary environments from the late 1960s through the early 1980s. Bell Labs developed a 3-bit system in 1969; Dick Shoup developed an 8-bit frame buffer at Xerox PARC for the SuperPaint system (Shoup later founded Aurora Systems); NYIT developed the first 24-bit RGB buffers; a flexible system was designed at North Carolina State University and later modified at Ohio State; companies like Evans and Sutherland, Genisco and Raster Tech developed commercial versions; and the NCSU buffer evolved into a commercial programmable display developed by Nick England called the Ikonas system, which later became the Adage system. In 1984, Thomas Porter and Tom Duff formalized the use of the alpha channel in the frame buffer, allowing images to be efficiently composited with antialiasing.
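The compositing that the alpha channel enables can be illustrated with the “over” operator on premultiplied RGBA values (a minimal Python sketch, not any particular system’s implementation):

```python
def over(fg, bg):
    """Composite a premultiplied RGBA foreground over a background.

    Each output component is  fg + (1 - fg_alpha) * bg  – the "over"
    operator: the foreground contributes fully, and the background
    shows through in proportion to the foreground's transparency.
    """
    fr, fg_, fb, fa = fg
    br, bg_, bb, ba = bg
    inv = 1.0 - fa
    return (fr + inv * br, fg_ + inv * bg_, fb + inv * bb, fa + inv * ba)

# A half-covered red edge pixel (premultiplied) over opaque white:
result = over((0.5, 0.0, 0.0, 0.5), (1.0, 1.0, 1.0, 1.0))
# → (1.0, 0.5, 0.5, 1.0): pink, the antialiased blend of red and white
```

Storing the coverage fraction per pixel is what makes this blend possible after rendering; without alpha, an edge pixel would have to commit to either pure red or pure white.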
Most installations used the same workflow configuration during this time: a mainframe or minicomputer was accessed by multiple users, each with a terminal but sharing a frame buffer that was connected to the computer through a bus interface. The concept of the single-user workstation, configured with its own internal frame buffer, was still on the horizon.
Catmull, Ed. “A Subdivision Algorithm for Computer Display of Curved Surfaces.” Ph.D. thesis, Report UTEC-CSc-74-133, Computer Science Department, University of Utah, Salt Lake City, UT, 1974.
Whitted, Turner. “Hardware Enhanced 3-D Raster Display System.” Proceedings of the 7th Man-Computer Communications Conference, 1981.
Fuchs, Henry. “Distributing a Visible Surface Algorithm Over Multiple Processors.” Proceedings of the ACM National Conference, 1977.
For a discussion of paint systems and the development of hardware support, see Alvy Ray Smith’s article “Digital Paint Systems: An Anecdotal and Historical Overview,” IEEE Annals of the History of Computing, April 2001.