Alan Perlis: The First Computer Scientist
I was a teenager in 1957 when the Russians launched Sputnik. In the national reaction to it I was inspired to pursue science. I was all set to go to the Massachusetts Institute of Technology or the California Institute of Technology to become a physicist, when the Carnegie Institute of Technology offered me a full scholarship.
Since my father had just died, and I was my mother’s only child, going to Carnegie Tech was a better choice. It was also better for my career, as it turned out.
Even though I could walk to Tech, I lived in a fraternity in order to have a semblance of college life. One fall day I was enjoying a beer and watching TV when one of my fraternity brothers said, “The professor in my class said that anyone who was in the rat-race for grades could simply write the grade they wanted on the front of their exam paper, and they’d get it.” I said, “That sounds like a fascinating subject. What is it?” The subject was computer programming, and the professor was Alan Perlis.
I enrolled in his spring programming course and joined hundreds of geeky students for the first class. Perlis had a striking appearance with no hair on his head and no eyebrows. In an enthusiastic, glib voice he started telling us how to program. But, for the first of many times, I couldn’t comprehend what he said. I learned that Perlis’s idea of lecturing was to talk about whatever was amusing or perplexing him that day.
The homework involved creating programs by punching holes in cards and putting them in a drawer. They would be carried into a glass-enclosed computer room. An operator, or sometimes Perlis himself, would feed the cards into the computer. Hours later, I would get back some sheets of paper, usually telling me my program had been rejected because of some niggling syntax error. Everyone else seemed to think this was fun.
Perlis assigned whimsical homework problems, such as “Design a language for describing motel layouts,” apparently because he had recently stayed in a motel. I thought all this was pretty stupid compared to understanding the physics of the universe.
My friends kept telling me that computing was the future, so I tried another course by a teacher whose instructions were clearer. He said, “Write a program to solve any problem you like.” I decided on a program to schedule an elevator. Perlis dropped by the class one day and suggested to me the possibility of kicking people off the elevator before they reached their floor, confirming my skepticism about him.
On the Wednesday afternoon before Thanksgiving, sitting in my dorm room, I started to write the program before heading out to meet my high school buddies. The next thing I knew it was 11 PM. I had finished a draft of the program, missed dinner, missed my buddies, and was hooked on programming, if not Alan Perlis.
In most universities, computers were a precious resource often sequestered in labs, like scientific instruments. Perlis, who ran the computer center and had been a student at Tech himself, made student computing his priority. He was like the Pied Piper, leading students to this new adventure. He had random, swashbuckling ideas. He seemed to be saying, “Here is an open field that is going to grow, so do whatever you like. It will probably be useful. You don’t have to be a genius; just go for it.”
Perlis served on international committees that defined new programming languages. He was the president of computing’s first academic association. He was the first winner of the Turing Award, which became computing’s equivalent of the Nobel Prize. I began to think, “If this doofus is a big cheese in computing, there can’t be much competition, especially compared to physics with its Newtons and Einsteins. Computing looks like a good career choice.” I spent the rest of my time at Carnegie Tech becoming a computer nerd. Many of my future colleagues, notably Butler Lampson and John Reynolds, who did great theoretical computer science, had also bailed out of physics.
While I was in graduate school at MIT, Perlis, who had been an MIT graduate student, came to give a presentation. He talked about his programming language, Formula Algol, which combined algebraic manipulation with numerical computation. At one point, he bragged that the system could apply L’Hopital’s rule, an obscure calculus trick. The room started to buzz, and someone asked how often it had to be used. Without missing a beat, Perlis said “Once!” and continued his presentation, which ended abruptly as he rushed out to catch a plane. I believe Perlis often tried to say the most unexpected thing he could think of.
Sputnik had also inspired the U.S. Department of Defense to create the Advanced Research Projects Agency, later known as DARPA, which began funding professors at favored universities, urging them to teach people about computers. The recipients of this largesse at Carnegie Mellon University, which Carnegie Tech became, were Perlis, Herbert Simon, and Allen Newell. Perlis became the head of a new computer science department because Simon and Newell much preferred concentrating on research. Actually, Perlis didn’t appear to be managing anything either. He held no meetings. He simply made seat-of-the-pants decisions as they arose. His motto seemed to be, “Ready, fire, fire, fire.” The three of them built this great department without squabbling or even discussing much.
Perlis introduced the reasonable person principle, which says, “We can’t have rules to cover every situation, so people should do what they think is reasonable; and we’ll worry about the consequences later.” The DARPA grants allowed him to support Ph.D. students without regard to which advisors they chose to work with. At an early stage, some students complained about the way the graduate program was organized. Perlis said, “Okay, show me an alternative.” They did, and he adopted it.
By the summer of 1971, I was married and teaching at the University of California, Berkeley. While I was introducing my wife, Susan, to Europe, we attended a computer science conference in Bavaria. One of the organizers had been in the German Army during World War II, and some people refused to participate. One might have thought Perlis, a descendant of Russian Jews from Pittsburgh, would be one of them; but he plunged in with great enthusiasm, bringing his wife and young daughter. We spent many evenings with them, drinking beer and kibitzing. He loved to tell jokes, many containing folk wisdom. Here’s my favorite:
On a cold winter day a baby bird falls out of its nest and a passing peasant comes upon it. He carries the bird for a while and then happens upon a steaming cow pie. He pokes a hole in the feces and deposits the bird. The warmth revives the bird who begins to sing. A passing fox hears the bird, pounces on it and devours it. THE MORAL: It’s not always your enemies that put you in it. It’s not always your friends who get you out of it. But when you’re in it up to your neck, don’t sing!
The conference featured a recurring debate between Perlis and Edsger Dijkstra, a forbidding Dutch intellectual. Their differences about computer science were rooted in their outlooks. Perlis was a pragmatic optimist who believed in human progress; Dijkstra was theoretical, pessimistic, and nasty. There was an ironic contradiction between their appearances and personalities: Perlis could pass for an extraterrestrial, while Dijkstra looked warm and fuzzy with a beard and glasses.
Perlis resigned from Carnegie Mellon and moved to Yale University to head its computer science department. But he always considered Carnegie Mellon his home. He came back to celebrate our department’s 10th anniversary. He gave an inspiring talk about keeping the fun in computer science.
Years later, in a hallway discussion with my colleagues at Xerox’s Palo Alto Research Center, I was talking about Perlis’s unique contribution:
Perlis realized, earlier than most, that programming was where the action was. Most computer science pioneers were just applying their previous specialties—digital engineering, mathematics, psychology, whatever—to computing. Perlis had the vision of software as a new, world-transforming technology. He once remarked, “You’re talking like mountaineers arguing how to climb Mount Everest most efficiently, when the real problem is how to transport thousands of people a day to the top of hundreds of mountains.” That’s why he made programming languages a focus for computer science.
As I continued, I was interrupted by Perlis’s voice coming out of a nearby office that he was visiting: “Do I hear my name being mentioned?” Embarrassed, I said, “Oh, hello Alan.” Today, I’m happy he heard my words that way. It would have been awkward to say them to his face.
At the 25th anniversary of the department he started, I gave a laudatory speech about him at a banquet attended by his wife and children, then organized a fund-raising drive to create an endowed chair named after him.
James H. Morris is a retired professor of computer science and dean of the West Coast campus of Carnegie Mellon University. In this series of blogs for Pittsburgh Quarterly he writes about some of the computing pioneers he encountered during his career.