Take a good look at this paragraph. You're reading it thanks to the magic of a computer display, whether it be LCD, CRT, or even a paper printout. Since the beginning of the digital era, users have needed a way to view the results of the programs they run on a computer--but the manner in which computers have spit out data has changed considerably over the last 70 years. Let's take a tour.
While almost every early computer provided some sort of hard-copy printout, the earliest days of digital displays were dominated by rows of blinking indicator lights--tiny light bulbs that flashed on and off when the computer processed certain instructions or accessed memory locations.
Photos: Computer History Museum, Deutsches Museum
The ENIAC, among other early electronic computers, used Hollerith punched cards as both input and output. To write a program, an operator typed on a typewriter-like machine that encoded the instructions into a pattern of holes punched into a paper card. A person then dropped a stack of cards into the computer, which read and ran the program. For output, the computer punched encoded results onto blank punch cards, which operators then had to decode with a device like the IBM 405 tabulator (shown at right), which tallied and printed card values onto sheets of paper.
Photos: Computer History Museum, IBM, Benj Edwards
As an alternative to punched cards, many early computers used long rolls of paper tape punched in a pattern that represented a computer program. Many of those same computers also punched the program's results onto the same type of paper tape. An operator then fed the tape through a machine like the one shown here, whose electric typewriter automatically typed the computer's output in human-readable form (numerals and letters) onto larger rolls of paper.
Photos: Ed Bilodeau, Creed & Company
Cathode-ray tubes first appeared in computers as a form of memory, not as displays (see Williams tubes). It wasn't long before someone realized that they could use additional CRTs to show the contents of that CRT-based memory (as shown in the two computers on the left). Later, designers adapted radar and oscilloscope CRTs for use as primitive graphical displays (vector only, no color), such as those in the SAGE system and the PDP-1. They were rarely used for text at that time.
Photos (clockwise from top): Computer History Museum, MITRE, DEC, Onno Zweers
Prior to the invention of the electronic computer, people had been using teletypes to communicate over telegraph lines since 1902. A teletype is an electric typewriter that communicates with another teletype over wires (or later, over the radio) using a special code. By the 1950s, engineers were hooking teletypes up to computers directly, to use them as display devices. The teletypes provided a continuous printed output of a computer session. They remained the least expensive way to interface with computers until the mid-1970s.
Photo: Systems Engineering Laboratories
Sometime in the early 1960s, computer engineers realized that they could use CRTs as virtual paper in a virtual teletype (hence the term "glass teletype," an early name for such terminals). Video displays proved far faster and more flexible than paper, and such terminals became the dominant method of interfacing with computers in the early to mid-1970s. The devices hooked up to computers through a cable that typically carried codes for text characters only--no graphics. Until the 1980s, few supported color.
Photos: UNIVAC, Grant Stockly, DEC
Teletypes (even paper-based ones) cost a fortune in 1974--far out of reach of the individual in the do-it-yourself early PC days. Seeking cheaper alternatives, three people (Don Lancaster, Lee Felsenstein, and Steve Wozniak) hit on the same idea at the same time: Why not build a cheap terminal device using an inexpensive CCTV video monitor as a display? It wasn't long before both Wozniak and Felsenstein built such video terminals into computers (the Apple I and the Sol-20, respectively), creating the first computers with factory video outputs in 1976.
Photos: Steven Stengel, Michael Holley
In addition to RF television output, many early home PCs supported composite-video monitors (shown here) for a higher-quality image. (The Commodore 1702 also offered an alternative, higher-quality display through an early S-Video connection.) As the PC revolution got into full swing, computer makers (Apple, Commodore, Radio Shack, TI) began to design and brand video monitors--both monochrome and color--especially for their personal computer systems. Most of those monitors were completely interchangeable.
Photos: Radio Shack (left), Shane Doucette
With video outputs came the ability to use ordinary television sets as computer monitors. Enterprising businesspeople manufactured "RF modulator" boxes for the Apple II that converted composite video into a signal that simulated an over-the-air broadcast--something a TV set could understand. The Atari 800 (1979), like video game consoles of the time, included an RF modulator in the computer itself, and others followed. However, bandwidth constraints limited the useful output to low resolutions, so "serious" computers eschewed TVs for dedicated monitors.
Photo: Apple
In the 1960s, an alternative display technology emerged that used a gas trapped between two glass plates. When a voltage was applied across the plates at certain locations, the gas glowed, forming a visible pattern. One of the earliest computer devices to use a plasma display was the PLATO IV terminal. Later, companies such as IBM and GRiD experimented with the relatively thin, lightweight displays in portable computers. The technology never took off for PCs, but it resurfaced years later in flat-panel TV sets.
Photos: Simon Bisson, Corestore, Steven Stengel
Yet another alternative display technology--the liquid crystal display--arrived on the scene in the 1960s and made its commercial debut in pocket calculators and wristwatches in the 1970s. Extremely energy-efficient, lightweight, and thin, LCDs were a natural fit for the early portable computers of the 1980s. Early LCDs were monochrome and low in contrast, and they required a separate backlight or direct illumination for users to read them properly.
Photos: PC-Museum.com, Old-Computers.com, Steven Stengel
In 1981, the IBM PC shipped with a monochrome display standard (MDA) whose directly attached monitor rivaled a video terminal in sharpness. For color graphics, IBM designed the CGA adapter, which hooked up to a composite-video monitor or to the IBM 5153 display (which used a special RGB connection). In 1984, IBM introduced EGA, which brought with it higher resolutions, more colors, and, of course, new monitors. Various third-party IBM PC video standards competed with these in the 1980s--but none won out the way IBM's did.
Photos: IBM (left), Steven Stengel
The first Macintosh (1984) included a 9-inch monochrome monitor that crisply rendered the Mac's 512-by-342-pixel bitmapped graphics in either black or white (no shades of gray here). It wasn't until the Macintosh II (1987) that the Mac line officially supported both color video and external monitors. The Mac II video standard was similar to VGA. Mac monitors continued to evolve with the times, always known for their sharpness and accurate color representation.
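As a rough, back-of-the-envelope illustration (not from the original article), this short Python sketch shows just how little memory that one-bit, 512-by-342 display actually needed:

    # Estimate the original Macintosh's screen buffer size.
    # Figures come from the text: 512 x 342 pixels, pure black or white.
    width_px, height_px = 512, 342   # resolution of the built-in 9-inch display
    bits_per_pixel = 1               # 1 bit per pixel: black or white, no grays

    framebuffer_bytes = width_px * height_px * bits_per_pixel // 8
    print(framebuffer_bytes)         # 21888 bytes -- about 21 KB of the Mac's 128 KB of RAM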
Photos: Apple
The 1980s saw the launch of PC competitors to both the Macintosh and the IBM PC that boasted sharp, high-resolution, color graphics. The Atari ST series and the Commodore Amiga series both came with proprietary monochrome and RGB monitors that allowed users of those systems to enjoy their computer's graphics to the fullest.
Photos: Bill Bertram (left), Steven Stengel
In the early days of the IBM PC, users needed a different monitor for each display scheme, be it MDA, CGA, EGA, or something else. To address this, NEC invented the first multisync monitor (called "MultiSync"), which dynamically supported a range of resolutions, scan frequencies, and refresh rates all in one box. That capability soon became standard in the industry.
In 1987, IBM introduced the VGA video standard and the first VGA monitors alongside its PS/2 line of computers. Almost every analog video standard since then has built on VGA (and its familiar 15-pin connector).
Photos: NEC, IBM
When LCDs first appeared, they were low-contrast monochrome affairs with slow refresh rates. Throughout the 1980s and 1990s, LCD technology continued to improve, driven by a boom in the laptop market. The displays gained higher contrast, better viewing angles, and advanced color capabilities, and they began to ship with backlights for night viewing. The LCD was soon poised to leap from the portable sector into the even more fertile ground of the desktop PC.
Photos: Altima, Texas Instruments
In the mid-1990s, just about all monitors--for PCs and for Macs--were beige. This was the era of the inexpensive, color, multisync VGA monitor that could handle a huge range of resolutions with aplomb. Manufacturers began experimenting with a wide assortment of physical sizes (from 14 inches to 21 inches and beyond) and shapes (the 4:3 ratio or the vertically oriented full-page display). Some CRTs even became flat in the late 1990s.
Photos: Radius, ViewSonic
Computer companies had experimented with desktop LCD monitors in small numbers since the 1980s, but those monitors tended to cost a lot and perform poorly in comparison with the far more prevalent CRTs. That changed around 1997, when vendors such as ViewSonic (left), IBM (center), and Apple (right) introduced color LCD monitors whose quality could finally begin to compete with CRT monitors at a reasonable price. These LCDs used less desk space, consumed less electricity, and generated far less heat than CRTs, which made them attractive to early adopters.
Photos: ViewSonic, IBM, Apple
Today, LCD monitors (many widescreen) are standard across the PC industry (except for tiny niche applications). Ever since desktop LCD monitors first outsold CRT monitors in 2007, their sales and market share have continued to climb. Recently, LCD monitors have become so inexpensive that many people experiment with dual-monitor setups like the one shown here. A recent industry trend emphasizes monitors that support 3D through special glasses and ultrahigh refresh rates.
With most TV sets becoming fully digital, the lines between monitor and TV are beginning to blur just as they did in the early 1980s. You can now buy a 42-inch high-def flat-panel display for under $999 that you can hook to your computer, something that would make anyone's head explode if you could convey the idea to people in the 1940s--back when they were still using paper.
Photos: Asus, Go.Video, Samsung