DPI and some other scanner talk

February 01, 1999 | By Mike Himowitz

Whenever I write about scanners, the e-mail starts pouring in. And from the tone of the messages, people are fascinated by the technology but find it hard to understand.

When I reviewed the Canon FB620 a couple of weeks ago, the mail fell into two categories. The first group complained that they couldn't find the $99 scanner in local stores. I checked with a Canon spokeswoman, who said the unit has been very popular and that many retailers sold out of their first shipments quickly. More are on the way.

The second group had questions about scanning and digital images. One reader said he had decided to buy a scanner with a resolution of 600 dots per inch but backed off when a friend told him that nobody who knew anything would buy a scanner with less than 1,200-dpi resolution.

"Is a 1,200-dpi scanner really twice as good?" he asked.

The answer: not really, at least for home and small office use. Here's why.

A scanner works by dividing an image into zillions of little dots, or pixels. It records color and intensity information about each dot, while your scanning software processes that information and stores the image in a format that can be recognized by other programs running on your computer (such as photo editing software, Web browsers, or desktop publishing suites).

A scanner's resolution is determined by the number of dots per inch that it can record. More dots mean greater detail, which is great, up to a point. The problem is that more dots require more computer memory to process, and they take up space on your hard drive. And these requirements grow with the square of the resolution - double the dots per inch and you quadruple the number of dots.

Here's the arithmetic - bear with me here, it's not very hard. Let's consider a one-square-inch picture. Scanned at 300 dots per inch, it requires 90,000 dots (that's 300 x 300) worth of storage. If you "double" the resolution to 600 dpi, you now need 360,000 dots (that's 600 x 600), which is four times as much as the original image. At 1,200 dpi, we're talking 1,440,000 dots, or 16 times as much space as the original.
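If you'd rather let the computer do the arithmetic, a few lines of Python will run the same multiplication (the language here is just a convenient calculator - a spreadsheet would do the job just as well):

    # Dots needed to cover one square inch at each scan resolution
    for dpi in (300, 600, 1200):
        dots = dpi * dpi  # dots across times dots down
        print(f"{dpi:>5} dpi: {dots:>9,} dots ({dots // (300 * 300)}x the 300-dpi scan)")

Run it and you'll see the same numbers: 90,000 dots at 300 dpi, four times that at 600 dpi and 16 times that at 1,200 dpi.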

Consider that your computer has to record very precise information about the color and intensity of each dot. To reproduce the range of colors that the human eye can detect, it requires 24 bits (digital ones and zeros) of information for each dot, or pixel. Many scanners use 30 or 36 bits per pixel for additional information that makes it easier to reproduce shadow detail. When you multiply all this, you come up with some very big numbers.

To test this, I scanned a 4-by-5.5-inch photo of my son twice - once at 300 dpi and once at 600 dpi. I saved the files in TIFF format (an industry standard that uses some compression to save space but doesn't lose any detail). At 300 dpi, the photo occupied 5.8 megabytes of disk space. At 600 dpi, it chewed up more than 25 megabytes of my hard drive. This is enough space to store more than 20 average novels, so the old adage that a picture is worth a thousand words is really an understatement.
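If you're curious where those file sizes come from, you can estimate them yourself. The sketch below (Python again, and assuming a plain 24-bit image with no allowance for the TIFF format's own bookkeeping) multiplies the photo's dimensions by the resolution and by three bytes per dot:

    # Rough uncompressed size of a 4-by-5.5-inch photo scanned at 24 bits per dot
    width_inches, height_inches = 4, 5.5
    for dpi in (300, 600):
        dots = int(width_inches * dpi) * int(height_inches * dpi)
        size_mb = dots * 3 / 1_000_000  # 24 bits = 3 bytes per dot
        print(f"{dpi} dpi: {dots:,} dots, roughly {size_mb:.1f} million bytes")

That works out to roughly 5.9 million bytes at 300 dpi and 23.8 million at 600 dpi - the same ballpark as the files on my hard drive. The real TIFFs differ a bit because the format adds its own overhead and its compression doesn't shave much off a photograph.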

Then I created a one-page document using Microsoft Publisher and imported both pictures - or tried to. The 300-dpi image loaded with no problems, but Publisher couldn't handle the 600-dpi image at all. It crashed the first time and created a blank picture box the second time.

So I went back to my photo editing program (Ulead PhotoImpact) and cropped the pictures to 2 inches by 3 inches so that they occupied less space. This time, both loaded properly into Publisher, but when I printed them out side by side on a Hewlett-Packard DeskJet 890C, they were virtually identical. In fact, when I asked my son whether he could tell which was which, he couldn't do it.

There's a good reason for this, and it illustrates why resolution isn't everything. First, every image has to be printed or displayed on some device - whether it's an inkjet or laser printer, a photo imagesetter (used by newspapers and magazines), or your screen. Second, these images are viewed by humans, whose brains are pretty good at interpreting visual data. Given the resolution of my printer (the number of dots per inch it can reproduce), and my brain's ability to turn all those dots into an image, the difference between the 300- and 600-dpi images wasn't enough to matter.

In short, the resolution you really need depends on what you'll be doing with the image. For most color photos, 300 dots per inch is plenty if you're using an ink jet printer and decent paper, although you might want a 600 dpi image if you're working with glossy, so-called "photo" paper. Here at The Sun, we generally scan photos at 200 dots per inch, the best match for our imagesetters and newsprint.

It's hard to imagine a home or small-office application that would need more than 600-dpi scanning, unless you're planning to reproduce large-scale photos for a glossy magazine or book. If you're primarily interested in posting photos on a Web page, 72 or 96 dpi will do fine - that matches the resolution of most monitors.

There are some exceptions to this rule, according to professionals I've talked to. If you want to store your images in the compressed JPEG format - which shrinks file sizes at a cost of some quality - a higher-resolution scan will be degraded less by compression than a low-resolution one.

But for most users, a scanner's ease of use, color fidelity and speed are more important than a bump up to the next higher resolution.
