Precision of computer-chip circuitry gives rise to exquisite patterns

MICRO ART ON DISPLAY

March 30, 1992 | By Josh Hyatt, Boston Globe

Boston -- Forgive me for saying it, but the IBM 1986 logic chip really speaks to me.

I think it's something about the details of that last metal layer, those spellbinding blue patterns formed like musical instruments. To me, those guitar and mandolin shapes look like they jumped right out of a Picasso.

Not that I don't appreciate some of the other microprocessor designs now on display at the Massachusetts Institute of Technology Museum. It is hard to resist, for instance, the playfully skewed perspective of the stark IBM dynamic random-access memory chip or the psychedelic overtones of AT&T Bell Laboratories' CRISP microprocessor. Peter Max himself couldn't have done any better.

Indeed, it's hard to believe all 31 of the computer-generated diagrams were designed for the sole purpose of powering computers. Sponsored by the Intel Corp. Foundation and organized by New York's Museum of Modern Art, "Information Art: Diagramming Microchips" features circuits by about a dozen manufacturers and universities. It will be on view at MIT through April 5.

Integrated circuits, also called microchips, are no larger than a thumbnail, but they are designed using computer diagrams that are as much as 200 times bigger. The show's presenters say this effort -- with the designs framed and labeled -- represents the first exhibition to examine the computer chip as a work of art.

The art world has historically been rocked by such movements as cubism and abstract expressionism. But is it ready for an emerging school that might best be called -- mirroring its cinematic counterpart -- Revenge of the Nerds?

The answer is a thundering yes. At the exhibition's opening reception, several viewers made compelling arguments that, like any lasting art, the chip designs served as emotional Rorschach tests, reflecting and reshaping the sensibilities that viewers brought to them.

Vincent Galluccio, for instance, allowed himself a misty moment while viewing Intel Corp.'s 8086 chip, developed in 1978. Like a nostalgic Norman Rockwell print, it evoked a simpler time. "I was around for the inception of this technology," recalled Mr. Galluccio, who recently retired after 28 years at International Business Machines Corp. "It's amazing what microprocessors have done for us. They've come a long way since we were tapping little vacuum tubes."

Mr. Galluccio gradually drifted into a computer-generated reverie. "The first time I saw a computer was at Prudential Insurance, and it took up a whole room," he said. "That was in 1963. Now they can take 1,000 of those rooms and put them on a single chip."

Some were less struck by the chips' technology than by the underlying humanity of the designers, which often shone through. Several people peered into the Intel 386 chip; snuggled amid its 229,000 transistors were initials, among other things.

Like avid New York Times readers scouring Hirschfeld caricatures for hidden NINAs, viewers let out a collective squeal when they discovered, for instance, the initials E.T. next to a crude outline of a phone. "This is not cold technology," observed Jerry Wheelock, director of management information systems at Goodwill Industries. "There's a little bit of personality in there, as well as silicon."

Indeed, more than a few folks were anxious to perform armchair psychoanalysis on the designers who can stand to devote their lives to managing such complexity.

"I'll bet if you magnify some of these things, it says 'Satan Lives' in there or something like that," offered Mark Roy, a network operations consultant at John Hancock Financial Services. "Those guys think that way."

As with any crowd, there were those who tried to take leadership roles by offering their interpretations as the definitive exegesis. With all the boisterousness of Gen. H. Norman Schwarzkopf tracing the path of a so-called "smart bomb" for the press, one man argued that Digital Equipment Corp.'s 1975 chip was actually a map of Kuwait, replete with mosques.

Others agreed that the IBM 1986 logic chip would make a nice shopping mall.

I found it hard to shake the resemblance of LSI Logic's 1988 chip to Grant Wood's "American Gothic" -- minus the two people, the pitchfork and most of the farmhouse. The chip looks exactly like two windows of the house, with their shades drawn.

For purists who found it inexcusable to separate form from function, the experimental neural net developed by Synaptics Inc. was a favorite.

Kurt Hasselbalch couldn't help but stare at it; oddly enough, it seemed to feel the same way about him. The chip was designed to mimic some of the functions of the human retina. "I like the fact that it's used to help people see. It looks a lot more organic than the rest of them," said Mr. Hasselbalch, curator of MIT's Hart Nautical Collection. "It looks real. Then again, I've never seen my own retina."

Any art critic with half a grasp of modern cultural history couldn't help but embrace the Synaptics chip; in "Terminator 2," the bionic Arnold Schwarzenegger warns his dastardly nemesis that he is an advanced processor "equipped with a neural net CPU."

Not that anyone attending the show need bring such a sophisticated frame of reference. The chips could be understood on many, many levels.

"They are not art in the sense that they were made by people who thought of themselves as artists," said Penny Noyce, whose father, Robert Noyce, co-invented the integrated circuit and later co-founded Intel. "But they are art because they are pleasing intellectually, as well as in shape, form and color. The requirements for precision in design make them beautiful."
