Output Devices


Monitors

Monitors (also known as "displays," or sometimes just "screens") used to be big, heavy, bulky things known as CRTs (Cathode Ray Tubes). CRTs were based on old TV/video technology and had large, glass "tubes." They took up a lot of space on your desk and were very difficult to move.

Slowly, CRTs were replaced with newer flat-screen monitors, known as LCD (Liquid Crystal Display) or TFT (Thin Film Transistor). The early flat-screen monitors were worse in most ways: the images they presented were less sharp; they were less flexible in showing different resolutions (see below for the meaning of "resolution"); and for a long time, they were more expensive, sometimes much more. However, LCD screens had one important feature: they were thin. They didn't take up much space on people's desks, and that was very important for a lot of people. Eventually, CRTs just disappeared, and today it is difficult to find them any more.

In order to understand monitors better, we should learn some basic terms: pixel, resolution, and interlaced/progressive scan.

Let's take a quick look at each one:


Pixels

A pixel (short for "picture element," or "PIX-ELement") is, simply, one dot on your screen. It can show one color at a time.

These dots are usually too small for you to see. If you look very closely at the screen, you might see a "screen door" effect; that is the pixel grid. You are seeing the black spaces around the pixels.

By putting these colored dots together, you can make a larger picture.

Each pixel has three basic colors--red, green, and blue--and each can go from completely dark to full brightness. Using these three basic colors of light, known as RGB (Red-Green-Blue), a pixel can combine them to make any color. For example, red plus green equals yellow. I know that sounds strange, but see the color wheel below and you will see.
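The additive mixing described above can be sketched in a few lines of Python, assuming 8-bit channels (each color running from 0, fully dark, to 255, full brightness):

```python
# Additive (RGB) color mixing: each channel runs from 0 (dark) to 255 (full brightness).
def mix(color_a, color_b):
    """Combine two lights by adding their channels, capped at 255."""
    return tuple(min(a + b, 255) for a, b in zip(color_a, color_b))

RED   = (255, 0, 0)
GREEN = (0, 255, 0)
BLUE  = (0, 0, 255)

print(mix(RED, GREEN))              # red + green gives yellow: (255, 255, 0)
print(mix(RED, mix(GREEN, BLUE)))   # all three together give white: (255, 255, 255)
```

This is why mixing *light* behaves differently from mixing paint: adding more light always moves the result toward white.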

You have probably seen these colors before. Maybe you looked very closely at your TV screen--it has these three colors. Or you might have seen a giant-screen TV on the street or in a sports stadium, and could see the red, green, and blue lights up-close.


"Resolution" simply means the number of pixels on the screen. Resolution is usually shown with two numbers--how many pixels horizontally (left-to-right), and how many pixels vertically (top-to-bottom). For example, a standard 17-inch monitor might have a resolution of 1280 x 1024--meaning that if you could count the pixels from the left to the right, you would count 1,280 of them, and 1,024 from top to bottom.

Resolution comes from counting the number of pixels on a screen. The image above shows a full-size chart of standard resolutions.

The basic rule is: a higher number means higher resolution, and higher resolution generally means a better-quality image.

Interlaced vs. Progressive Scan

I will not go into a detailed explanation of what these mean. I will simply point out a few facts:

  • Progressive is higher quality than interlaced
  • Progressive means that you see a whole picture flashed 60 times per second
  • Interlaced means that you see half a picture flashed 60 times per second
  • Progressive means a sharper picture, and is standard for computers
  • Interlaced is older technology, still used mainly because changing would be expensive
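One way to picture the difference is to count how many lines of the image each refresh actually draws. A toy sketch (counting lines only, not real video):

```python
# Progressive scan: every refresh draws all of the picture's lines.
# Interlaced scan: each refresh draws only half of them (odd lines, then even lines).
def lines_per_refresh(total_lines, progressive):
    return total_lines if progressive else total_lines // 2

# In one second at 60 refreshes per second, a 480-line picture delivers:
print(60 * lines_per_refresh(480, progressive=True))   # progressive: 28800 lines of image
print(60 * lines_per_refresh(480, progressive=False))  # interlaced:  14400 lines of image
```

Interlacing delivers half as much picture data per second, which is exactly why it was cheaper to broadcast.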

Computer monitors have always used only progressive scan. Interlacing is only important when thinking about TV sets. Some TV sets use interlaced-scan technology, and others use progressive-scan. New TV sets can use both.

When talking about resolution for computers, you see two numbers--horizontal and vertical--such as 1280 x 1024, or 1920 x 1200. When talking about resolution for TV sets, only one number--vertical--is shown, with a small "i" or "p" to show whether the image is interlaced or progressive. Here is a chart with TV resolutions:

Type                              Resolution
NTSC (American and Japanese TVs)  480i
PAL (European TVs)                576i
HDTV 720                          720p
HDTV 1080                         1080i
Blu-ray Video                     1080p

As you can see, older TVs used in Japan and America (480i) have the worst resolution, because they have the fewest lines and use interlaced scan. Now, the question is, which is better: 720p or 1080i? 720p has progressive scan, but 1080i has a higher resolution. The answer: it depends on the situation. In fact, most people can't tell the difference.
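The 720p-versus-1080i trade-off can be put into rough numbers by comparing how many pixels each format delivers per refresh--progressive sends the whole frame, interlaced sends half the lines. A simplified sketch (it ignores deinterlacing tricks that real TVs use):

```python
def pixels_per_refresh(width, height, progressive):
    # Interlaced delivers only half of the lines on each refresh.
    lines = height if progressive else height // 2
    return width * lines

print(pixels_per_refresh(1280, 720, True))    # 720p:  921600 pixels each refresh
print(pixels_per_refresh(1920, 1080, False))  # 1080i: 1036800 pixels each refresh
```

The two numbers come out quite close, which helps explain why most viewers can't tell the formats apart.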

However, there is a special case, as you can see from the chart above: Blu-ray video. Blu-ray videos are shown at 1080p, a combination of higher resolution and progressive scan. If you watch older movies, this won't make much difference, because older movies are often not very sharp and focused. But some new movies, especially computer-animated movies, are extremely sharp--and at 1080p, they look much sharper and better. When I first got an HDTV set myself, I thought the quality was great. However, when I saw my first Blu-ray movie (it was Kung Fu Panda, if you are curious), I was amazed by the picture quality.

What this also means is that you can use your Hi-vision (HDTV) set as a giant computer monitor. Most HDTVs have VGA and HDMI ports in the back. Plug your computer in, and you will probably see the computer screen in bright, giant detail!



Printers

Personal computer printers started with a very basic and low-quality type: the dot matrix printer. This printer had rows of pins which poked an ink ribbon, typing characters by building them up from little dots. The first such printers were very low quality. In the example at right (a travel itinerary of mine from 1983), the dot matrix printer I used made letters only 7 dots tall.

Over time, dot matrix printers improved, but they were always of poor quality--and they made a very loud noise when printing.

Another printer type was available at the time, but it was very expensive: the laser printer. Many people imagine that a laser burns the letters onto the page, but this is not how it works. Instead, the laser "paints" the page to be printed onto a metal drum; the drum picks up ink where the laser painted; and the ink is then pressed onto the paper using heat.

The advantage of laser printers was clear: they were fast and produced high-quality prints. The only problem was that they were much too expensive for most people--at thousands of dollars, they were more expensive than the computers! Over time, prices went down, and today, cheap laser printers start at about ¥10,000. The ink for laser printers, called "toner," is still expensive, especially for color toner.

The next printer technology to appear, in the late 1980s, was the inkjet printer. Inkjet printers cost less than laser printers for the average person, and in the 1990s, color inkjet printers gave the technology another big advantage--the ability to make cheap color prints with photo quality. Today, inkjets start at about ¥4,000.

One problem with all modern printers is the ink. Whether it is toner for laser printers or ink cartridges for inkjet printers, you will probably spend more for one full set of ink than you paid for the printer itself. Inkjet cartridges used to be refillable, but the printer companies, looking for profits, made that much more difficult.


Measuring Quality

With monitors and printers, one way of measuring the quality of the output is in how many dots per inch you see. With video monitors, this is called PPI, or "pixels per inch"; with printers, it is called DPI, or "dots per inch."

Do not confuse PPI with resolution; they are very different things. Resolution is the total number of pixels. PPI is the number of pixels across one inch--in other words, PPI refers to the size of the pixels. A good way to remember it: the smaller the pixels, the better the quality. A higher PPI means a sharper, clearer picture.

For example, computer monitors commonly have a "pixel density" at or under 100 pixels per inch. The iPhone 3G had a screen with 163 ppi, so it was better quality than most computer monitors. The iPhone 4 had an even higher-quality screen--the same size as the iPhone 3G's, but with four times the number of pixels, for 326 ppi. This pixel density is close to the limit of what the human eye can distinguish.
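Pixel density can be estimated from a screen's resolution and its diagonal size. A sketch, assuming the iPhone 3G's 480 x 320 screen and the iPhone 4's 960 x 640 screen, both 3.5 inches diagonal (the simple geometric result is approximate; Apple's official figures are 163 and 326 ppi):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch: diagonal pixel count divided by the diagonal size in inches."""
    diagonal_px = math.hypot(width_px, height_px)  # Pythagorean theorem on the pixel grid
    return diagonal_px / diagonal_inches

print(round(ppi(480, 320, 3.5)))  # iPhone 3G: about 165
print(round(ppi(960, 640, 3.5)))  # iPhone 4:  about 330
```

Doubling the pixel count in both directions doubles the ppi, because the same inch now holds twice as many, half-sized pixels.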

You can see an example of this at right. The image shows an icon on the 3G's 163 ppi display (left), and the same icon on the iPhone 4's 326 ppi display (right). It is fairly easy to see the difference in quality.

Printers use a different but similar measure, DPI, or "dots per inch." Some printers advertise very high DPI rates, up to 9600 x 2400 dpi for inkjet printers. However, this is a maximum, and does not mean that you can print 9600 pixels of image data in one inch. To make one image pixel, an inkjet must print several dots to get the color just right. Also, "9600 dpi" might refer to the highest-quality printing, using the highest-quality paper, which is expensive and takes a lot of time and ink.
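The gap between printer dpi and image ppi can be made concrete: if a photo is printed at 300 image pixels per inch on a printer rated 9600 x 2400 dpi, each image pixel gets a small grid of printer dots to build up its color. A rough sketch:

```python
def dots_per_pixel(printer_dpi_h, printer_dpi_v, image_ppi):
    # How many printer dots fit inside one image pixel, in each direction.
    return (printer_dpi_h // image_ppi, printer_dpi_v // image_ppi)

print(dots_per_pixel(9600, 2400, 300))  # (32, 8): a 32 x 8 patch of dots per image pixel
```

Those 32 x 8 = 256 tiny dots are what let the printer mix its few ink colors into one accurate pixel color.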

For plain text, 300 ~ 600 dpi is fine; for color printing, 600 ~ 1200 should be good enough for most people.