It’s the end of the world as we know it

I’ve had this feeling coming over me in the last week that something significant has been happening in the computer world. To some observers it may not seem like it. With the exception of stuff moving to the web, things are “same as it ever was,” to quote David Byrne. The reason I bring this up is that I’ve been hearing more and more signs lately that computer science in academia is losing its vibrancy, at least in the U.S. The decline began several years ago, with the end of the Y2K crisis and the dot-com bust. Enrollment in computer science at universities nationwide fell off dramatically. One article I read a few years ago said that enrollment was the lowest it had been since the 1970s.

About a week ago I found an interview with Alan Kay (AK) by Stuart Feldman (SF) in the Dec. 2004/Jan. 2005 issue of ACM Queue. Kay had been getting some publicity in ’04 and ’05, since he was given the Turing Award in 2003 for his work on Smalltalk 30 years earlier. The interview was enlightening in many ways. Kay has been saying these things for the last several years, but I didn’t totally understand until now why they’re important. I think it’s time people listened. I encourage you to read the article, but I will put some salient quotes below:

(AK) One could actually argue—as I sometimes do—that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects.

You could think of it as putting a low-pass filter on some of the good ideas from the ’60s and ’70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.

So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.

SF So Smalltalk is to Shakespeare as Excel is to car crashes in the TV culture?

AK No, if you look at it really historically, Smalltalk counts as a minor Greek play that was miles ahead of what most other cultures were doing, but nowhere near what Shakespeare was able to do.

If you look at software today, through the lens of the history of engineering, it’s certainly engineering of a sort—but it’s the kind of engineering that people without the concept of the arch did. Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves.

The languages of Niklaus Wirth have spread wildly and widely because he has been one of the most conscientious documenters of languages and one of the earlier ones to do algorithmic languages using p-codes (pseudocodes)—the same kinds of things that we use. The idea of using those things has a common origin in the hardware of a machine called the Burroughs B5000 from the early 1960s, which the establishment hated.

SF Partly because there wasn’t any public information on most of it.

AK Let me beg to differ. I was there, and Burroughs actually hired college graduates to explain that machine to data-processing managers. There was an immense amount of information available. The problem was that the DP managers didn’t want to learn new ways of computing, or even how to compute. IBM realized that and Burroughs didn’t.

The reason that line lived on—even though the establishment didn’t like it—was precisely because it was almost impossible to crash it, and so the banking industry kept on buying this line of machines, starting with the B5000. Barton was one of my professors in college, and I had adapted some of the ideas on the first desktop machine that I did. Then we did a much better job of adapting the ideas at Xerox PARC (Palo Alto Research Center).

Neither Intel nor Motorola nor any other chip company understands the first thing about why that architecture was a good idea.

Just as an aside, to give you an interesting benchmark—on roughly the same system, roughly optimized the same way, a benchmark from 1979 at Xerox PARC runs only 50 times faster today. Moore’s law has given us somewhere between 40,000 and 60,000 times improvement in that time. So there’s approximately a factor of 1,000 in efficiency that has been lost by bad CPU architectures.

The myth that it doesn’t matter what your processor architecture is—that Moore’s law will take care of you—is totally false.

SF It also has something to do with why some languages succeed at certain times.

AK Yes, actually both Lisp and Smalltalk were done in by the eight-bit microprocessor—it’s not because they’re eight-bit micros, it’s because the processor architectures were bad, and they just killed the dynamic languages. Today these languages run reasonably because even though the architectures are still bad, the level 2 caches are so large that some fraction of the things that need to work, work reasonably well inside the caches; so both Lisp and Smalltalk can do their things and are viable today. But both of them are quite obsolete, of course.

I’m sure there are Lispers and Smalltalkers out there who would disagree. To clarify, Kay said here in 2003:

“Twenty years ago at PARC,” Kay says, “I thought we would be way beyond where we are now. I was dissatisfied with what we did there. The irony is that today it looks pretty good.”

Back to the ACM Queue interview:

AK A commercial hit record for teenagers doesn’t have to have any particular musical merits. I think a lot of the success of various programming languages is expeditious gap-filling. Perl is another example of filling a tiny, short-term need, and then being a real problem in the longer term. Basically, a lot of the problems that computing has had in the last 25 years comes from systems where the designers were trying to fix some short-term thing and didn’t think about whether the idea would scale if it were adopted.

It was a different culture in the ’60s and ’70s; the ARPA (Advanced Research Projects Agency) and PARC culture was basically a mathematical/scientific kind of culture and was interested in scaling, and of course, the Internet was an exercise in scaling. There are just two different worlds, and I don’t think it’s even that helpful for people from one world to complain about the other world—like people from a literary culture complaining about the majority of the world that doesn’t read for ideas. It’s futile.

I don’t spend time complaining about this stuff, because what happened in the last 20 years is quite normal, even though it was unfortunate. Once you have something that grows faster than education grows, you’re always going to get a pop culture. It’s well known that I tried to kill Smalltalk in the later ’70s. There were a few years when it was the most wonderful thing in the world. It answered needs in a more compact and beautiful way than anything that had been done before. But time moves on. As we learned more and got more ambitious about what we wanted to do, we realized that there are all kinds of things in Smalltalk that don’t scale the way they should—for instance, the reflection stuff that we had in there. It was one of the first languages to really be able to see itself, but now it is known how to do all levels of reflection much better—so we should implement that.

SF If nothing else, Lisp was carefully defined in terms of Lisp.

AK Yes, that was the big revelation to me when I was in graduate school—when I finally understood that the half page of code on the bottom of page 13 of the Lisp 1.5 manual was Lisp in itself. These were “Maxwell’s Equations of Software!” This is the whole world of programming in a few lines that I can put my hand over.

I realized that anytime I want to know what I’m doing, I can just write down the kernel of this thing in a half page and it’s not going to lose any power. In fact, it’s going to gain power by being able to reenter itself much more readily than most systems done the other way can possibly do.

All of these ideas could be part of both software engineering and computer science, but I fear—as far as I can tell—that most undergraduate degrees in computer science these days are basically Java vocational training. I’ve heard complaints from even mighty Stanford University with its illustrious faculty that basically the undergraduate computer science program is little more than Java certification.

[Make a note of this point. I’m going to refer to it later–Mark]
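
Before moving on, a quick aside on the “Maxwell’s Equations of Software” remark above. What Kay is describing is a metacircular evaluator: a half page of Lisp that evaluates Lisp. Here is a rough sketch, in Python rather than Lisp, of the shape of such an evaluator. It’s my own illustration, not the Lisp 1.5 code, and it leaves plenty out (real Lisps have proper environments, more special forms, and so on):

```python
# A toy evaluator for a tiny Lisp-like language, with expressions written as
# nested Python lists. This is only an illustration of the idea Kay points at,
# not the actual Lisp 1.5 code.

def evaluate(expr, env):
    if isinstance(expr, str):                      # a symbol: look it up
        return env[expr]
    if not isinstance(expr, list):                 # a number or other literal
        return expr
    op, *args = expr
    if op == 'quote':                              # (quote x) -> x, unevaluated
        return args[0]
    if op == 'if':                                 # (if test then else)
        test, then_branch, else_branch = args
        return evaluate(then_branch if evaluate(test, env) else else_branch, env)
    if op == 'lambda':                             # (lambda (params) body)
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)                         # otherwise: apply a function
    return fn(*(evaluate(a, env) for a in args))

if __name__ == '__main__':
    import operator
    global_env = {'+': operator.add, '*': operator.mul, '<': operator.lt}
    # ((lambda (x) (* x x)) 7)  =>  49
    print(evaluate([['lambda', ['x'], ['*', 'x', 'x']], 7], global_env))
```

The point isn’t the details; it’s that the whole semantics of the language fits in something you can “put your hand over,” and the language can describe itself in its own terms.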

SF What do you think a programming language should achieve and for whom, and then what is the model that goes with that idea?

AK Even if you’re designing for professional programmers, in the end your programming language is basically a user-interface design. You will get much better results regardless of what you’re trying to do if you think of it as a user-interface design. PARC is incorrectly credited with having invented the GUI. Of course, there were GUIs in the ’60s. But I think we did do one good thing that hadn’t been done before, and that was to realize the idea of change being eternal.

I wanted to share some pieces from the article to give you a sense that while Kay is critical of where things are today, he’s also discussing ideas for how to rejuvenate computer science.

Referring back to what Kay said about the current state of affairs in computer science programs at some universities, that they’re little more than Java vocational training: I’ve been hearing this from several quarters. One of them is Justin James, a blogger I read frequently. He recently wrote an entry called “Ripoff Educations”. Quoting from it:

I have recently been talking to someone who is in the process of studying “Computer Science” at the University of South Carolina. I put “Computer Science” in quotation marks, because as far as I can tell, they are teaching “How to Program In Java” more than Computer Science. His education is so threadbare, he did not know what Perl was for (I am not saying he had to learn it there, but to never hear of it?), and had never heard of Lisp, Scheme, or even functional programming. In sum, they are teaching how to perform certain tasks in Java, subject by subject, and completely ignoring the fundamental scientific and mathematical underpinnings of programming.

Indeed, the coursework is so blindingly rudimentary that a course that was CS 112 (Data Structures, which I had to take in the elderly language known as “C”) at my alma mater (Rutgers College) is not introduced until the student is nearly complete with their degree, CS 350! Color me amazed. Without learning the basics (and a course in trees, hashes, sorting, etc. is pretty basic in my book) until the junior year, how can someone even claim that this is a serious program? In fact, not one course in formal logic is mandated. My experience with programming is that, at the end of the day, it is work with formal logic. Yet, it is not taught.

Well, it isn’t. It is a sad joke. I would not pay a dime for this “education”, and neither should the prospective students who are interested in learning to be a good programmer. It is a real shame that this is the norm, thanks to schools slapping together a shake-n-bake program of mediocrity to attract students.

Where I went to college, texts like Knuth’s works were used. K&R was a textbook. A typical student graduated having worked with at least 5 different languages, having enough UNIX experience to call themselves a junior systems administrator, and knowing enough math to practically have minored in it without trying.

I met with a friend recently who had just dropped out of a computer science master’s program at his (and my) alma mater after one semester. It wasn’t that it turned out not to be to his taste, or that it was too hard; rather, it appeared that the faculty he was dealing with did not care to teach, and in one case was incompetent. While he said there were a few good faculty doing good work there, he found that most of the faculty were disengaged from the task of teaching. Freshman enrollment in CS was down dramatically from when we were undergrads at this university.

Meanwhile, “across the way” at the Business College of the same university, things were hopping. Their CIS program has improved dramatically from what it was almost 20 years ago, and enrollment is strong. Many of the freshmen coming into the CIS program are students who probably would’ve enrolled in computer science in the past. CIS used to teach COBOL, some C, and MVS JCL. Now it has a strong Java program, with some computer science theory, but the emphasis is on hands-on experience. Further, they teach the soup-to-nuts process of building web applications. The computer science program does not do this. It teaches Java and computer science theory, but it doesn’t appear to teach a thing about the web, except the network architecture of Ethernet. I’m not necessarily complaining here, but I think the program is mired in the past. It’s losing favor but doesn’t appear to know what to do with itself.

My friend speculated that perhaps what’s going on at other universities is that computer science departments are seeing this migration of students and are copying the Business College’s CIS program in an attempt to compete. Such unoriginal thinking, as Justin James points out, cheats the students, because they could get trained in Java anywhere, and for much less. It also cheats the discipline of computer science.

What would I do? I’d explore taking a risk. I’m not responsible for any computer science program at a university, so I imagine this is easier said than done. Why not try dredging up and using some excellent computer science that was done and proven in the past, but has been largely ignored or marginalized? “Ignored?” you might say. Yes, ignored.

Here is what I think is going on. Thirty years ago Dr. Edsger Dijkstra complained about the data processing mindset that was pervading computer science. It’s still there. Paul Murphy, a blogger at ZDNet, has complained about it frequently. This mindset has run its course. That’s a major reason the CIS program at my alma mater is kicking the pants off the CS program: the technology that was part and parcel of that mindset has matured. The only reason people will come back to computer science at this point is if it blazes new trails, and even that’s relative. Computer science departments could put a major focus on dynamic languages and the curriculum would seem “new,” because they’ve been marginalized for so long. The discipline must focus less on training students for what’s currently out there, and more on preparing students for research, fostering entrepreneurship, and technology transfer. “This won’t be attractive. We’ll just have the same problem we have now,” you say. Not if it’s exciting and new. What’s exciting? How about developing programming languages for the web that dramatically reduce the amount of code people have to write to get something done? What about languages that help programmers express their intent clearly? What about rethinking human-computer interaction in GUIs and web interfaces, and the way information is stored and retrieved? That may not seem like computer science, but it’s not beyond technical study.

Whatever happened to dynamic programming languages? Whatever happened to teaching the mathematics and formal logic of computer science? I’m not saying it’s gone everywhere, but it seems to be fading. Computer science was originally an offshoot of math. What about rethinking the hardware architecture–delving into the von Neumann machine itself? What about studying the architectural and software decisions that went into making the OLPC laptop? A lot of scientific innovation went into it. Why not study what makes a virtual machine tick? Why not give up the traditional data processing model (or phase it out), and make these things the basis of a new computer science program? The master’s and Ph.D. programs could build from there, encouraging students to “think new thoughts” about the science of computing. The foundations for this transformation are already out there, sitting around. They’ve been there for 30 or 40 years, waiting to be used. Yes, they failed once, and I’m not saying a program should just teach those languages and technologies and say, “Go forth and use this.” I’m saying use them as a basis for building something new. Computer science, now more than ever, needs to nurture new ideas. I’ve already talked about computer science luminaries who have worked hard to innovate in academia. All it would take is looking at what they’ve published.
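
To give a sense of what I mean by studying what makes a virtual machine tick, here’s a toy sketch (in Python, and entirely my own illustration, not taken from any real VM) of the dispatch loop that sits at the heart of every bytecode virtual machine:

```python
# A toy stack-based bytecode interpreter. Real VMs (the JVM, Smalltalk VMs,
# etc.) layer garbage collection, object models, and just-in-time compilation
# on top of a loop like this.

PUSH, ADD, MUL, PRINT, HALT = range(5)

def run(program):
    stack = []
    pc = 0                                # program counter
    while True:
        op = program[pc]
        pc += 1
        if op == PUSH:                    # push the next literal onto the stack
            stack.append(program[pc])
            pc += 1
        elif op == ADD:                   # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == MUL:                   # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == PRINT:                 # pop and print the top of the stack
            print(stack.pop())
        elif op == HALT:
            return

# (2 + 3) * 4  =>  20
run([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT])
```

Each of the layers a real VM adds on top of this loop is a rich topic for a course in its own right.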

A change like this might, however, have greater success at the top universities in the country; prestige gets people to pay attention to what you’re doing. Computer science used to be okay with following trends. It’s time for it to lead.

Even so, putting more of a focus on dynamic programming languages in the curriculum, for example, is actually not too risky at this point, even if a program is concerned with producing graduates who can go out and find jobs. These languages are coming into wider use; Python and Ruby are examples. Even Java and .Net are getting into the game. Sometime this year or next year Microsoft is scheduled to ship its Orcas release, which upgrades C# and VB.Net with dynamic language features. I’ve talked about this in a past post. Groovy for Java was just released last month; it’s a FOSS dynamic language written in Java.
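
For readers who haven’t worked with one of these languages, here’s a small Python example (my own illustration, not tied to any particular curriculum or product) of the kind of “dynamic” features I mean: types are checked at run time, classes stay open to modification while the program runs, and code can inspect itself.

```python
# A few "dynamic language" features in miniature.

class Account:
    def __init__(self, balance):
        self.balance = balance

def describe(self):
    return f"Account with balance {self.balance}"

# Add a method to an existing class at runtime -- no recompile, no restart.
Account.describe = describe

acct = Account(100)
print(acct.describe())             # "Account with balance 100"

# Duck typing: anything with a describe() method works, no interface required.
class Toaster:
    def describe(self):
        return "A toaster"

for thing in (acct, Toaster()):
    print(thing.describe())

# Introspection: the program can examine its own objects as it runs.
print([name for name in dir(acct) if not name.startswith('_')])
```

None of this requires a compile-and-restart cycle, which is part of what made Smalltalk and Lisp environments feel so malleable in the first place.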

I would like to see computer science departments start by reconsidering what computers are to society and how they can best make their contribution to it.

I’ve referred to Lisa Rein’s Tour Of Alan Kay’s Etech 2003 Presentation before, but if you haven’t seen it (the videos of the presentation are on the site), I’d encourage you to look at it. I’ve watched it a few times, and it’s inspiring. You’ll come away thinking of computing in a whole new way. This, in my opinion, is what many computer science programs have been missing.

Edit 2/14/07: Early in the first excerpt of the interview, Alan Kay says that the commercialization of computing produced a retrograde effect on computer science. I think this is true; however, I have mixed feelings about it. Had commercialization not happened when it did, I might not have entered the computer field at all. I got caught up in the “pop culture” he talked about while I was growing up. So on the one hand I’m sorry to see the effects of this pop culture on computer science, but on the other I see it as a good thing: it’s what got me interested in all of this in the first place.

Edit 2/16/07: I took out some of the quoting I did of the ACM Queue article, since I felt it distracted from my main argument. What I took out is in the ACM article. Follow the link.

This is one of a series of “bread crumb” articles I’ve written. To see more like this, go to the Bread Crumbs page.
