
Archive for October, 2007

Compute! Magazine

Cover of Compute! Magazine

Back to reminiscing! I’m dedicating this part in the series to this magazine, because I think it was that good. Compute! was published from 1979 to 1994. Though it started out focusing exclusively on computers that used the MOS 6502 CPU or some variant, like the Apple II, the Atari 8-bits (400, 800, XL, and XE series), and the Commodore 8-bit computers (the PET, VIC-20, and C-64), it made a real effort to cover a variety of computer platforms. They added “support,” I’ll call it (regularly featured articles and type-in programs), for the Radio Shack Color Computer (a Motorola 6809 8-bit model) and the Texas Instruments TI-99/4A (with the TMS9900 16-bit CPU) early on. Later they added the Atari ST and the Commodore Amiga, which were 16-bit Motorola 68000 models. Last, they added the IBM PC to their roster.

They occasionally had some news items on the Apple Macintosh, but they didn’t cover it much, and never published type-in programs for it.

In Compute!’s early days they covered a bunch of kit computers. Kit computers came as a collection of parts (boards, chips; possibly coming with a case, keyboard, disk drive(s), etc.) that adventurous consumers could buy. Using supplied instructions, solder, and a soldering iron, they literally built the machine themselves. Some were simple, meant to be educational, showing how a computer worked. Others were full-fledged computer systems. These machines dropped off the radar of the magazine by the time I discovered it in 1983. Eventually the Commodore PET and the Color Computer were dropped from it as well.

Appeal

As you can see, the style of the magazine was welcoming. I’d even call it “pedestrian.” It didn’t come off as an unapproachable technical magazine, but rather a magazine for “Home, Educational, and Recreational Computing.” As a teen it drew me in. The general format within matched this style for the most part. The articles and supplemental sections that were included in each issue guided you step by step on what you needed to do to enter the programs it contained, and use them. The overall feel of the magazine was “user friendly.”

It was literally everywhere, too. I’d often see it in the grocery store magazine aisle. Other computer magazines like Byte and Creative Computing were there, too. My grandmother got me gift subscriptions to Compute! out of the Publishers Clearing House sweepstakes whenever she’d enter.

Type-ins

One of the things I thought was great about the magazine was its type-in programs. The internet existed then, but most people didn’t have access to it. If people were plugged into a network it was a proprietary one, like CompuServe, and it was expensive. People paid for connection time by the hour, and in some cases by the minute. Flat-rate pricing for this kind of access didn’t exist then. So software was distributed on diskettes (floppy disks), or ROM cartridges. In most cases the source code for the software didn’t come with it. You were just expected to use it.

Compute! published the full source code to programs in their magazine. They published utilities, occasionally applications, and plenty of games. If you wanted to use the program you had to type it in first. They eventually added “disk” editions of their magazine, for a premium, which contained the most prominent programs they published, ready to run.

They kept things simple. They published programs in BASIC or machine code; they rarely published code in other languages. In many cases they intermixed machine code with BASIC, to achieve optimal speed for something. Machine code was encoded either as bytes in a string (the listings had notation showing you how to enter them), which made the code relocatable, or as decimal numbers in DATA statements in BASIC which were POKE’d into fixed addresses. Later they added MLX, a utility that allowed you to type in a complete machine code program in decimal without having to write a BASIC program to create the machine code.
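To give a feel for the DATA/POKE idiom, here is a modern Python sketch of the pattern, not anything Compute! published. A BASIC loader looked something like `10 FOR I=0 TO 8: READ B: POKE 49152+I,B: NEXT : SYS 49152`; the nine sample bytes below are a well-known 6502 routine that sets the C-64 border and background colors to black.

```python
# Hypothetical sketch of the BASIC DATA/POKE loader idiom, in Python.
MEMORY = bytearray(64 * 1024)   # the 8-bit machine's 64 KB address space

# 6502 machine code, as decimal bytes like a DATA statement held them:
# LDA #$00 / STA $D020 / STA $D021 / RTS
DATA = [169, 0, 141, 32, 208, 141, 33, 208, 96]
BASE = 49152                    # $C000, a common safe spot on the C-64

# The equivalent of FOR I=0 TO 8: READ B: POKE BASE+I,B: NEXT
for i, b in enumerate(DATA):
    MEMORY[BASE + i] = b

# A real C-64 would now execute the routine with SYS 49152; here we can
# only verify the bytes landed at the fixed addresses the code expects.
```

The fixed BASE address is why these listings, unlike the string-encoded kind, were not relocatable: the bytes had to be POKE’d to exactly the addresses the machine code was assembled for.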

Some background

Every computer back then, with the possible exception of the Mac, came with a programming language, usually some form of BASIC. The operating system was almost invisible on most 8-bit computers. More often than not the BASIC language was your user interface, even if you wanted to do something like list the files you had on disk. There were OS-level APIs, but they were minimal, and you had to use machine code to access them. Sometimes the BASIC language you used contained built-in commands that would do what you wanted, adding a layer of abstraction. In many cases you had to POKE values into specific memory locations to get a desired result, like create a “text window” of a specific size on the screen that would word wrap, or custom configure a screen for certain graphics characteristics. On some models you couldn’t create sound without doing this. Compared to today things were extremely primitive.

Windowing systems for 8-bits like GEOS, and Microsoft Windows 1.0 didn’t come along until later. People by and large used these computers in “modes.” There were one or more text modes, and a couple or several graphics modes. It was possible to mix text and graphics. Depending on what you wanted to do, in which mode, you might have it easy, or you might have to jump through a lot of hoops to make it work in terms of programming the computer to do it.

One thing was for sure. If speed was your top priority you had no choice but to program in machine code (just typing in numbers for opcodes and data) or assembly language. Most commercial software for 8-bit computers was programmed this way.

For their time, these machines were rather advanced. They had graphics and sound capabilities that some of the more powerful machines of the day didn’t have, because the people who made them didn’t think that sort of thing was important. The majority of earlier computer platforms were harder to use, and less user friendly.

Compute!’s evolution

In the early days Compute! didn’t publish much machine code. They mostly published programs in BASIC, because they wanted something that was easy to learn, and could be understood by mere mortals. In the early editions of the magazine they explained the programs to you.

Later on they began to add machine code. Usually it was in little dribs and drabs. Most of the program would be written in BASIC, so not much of the educational value was lost.

As time passed, the accompanying articles got less educational. This didn’t matter too much to me. I could get an idea of what was going on by going through the BASIC code, and learn some new ideas about programming.

For some reason they never got into assembly language. They could’ve gotten the speed of machine code and written something educational at the same time. I think they compromised. It seemed like their main principle was they always wanted to publish programs in the language they knew came with the machine, so that the reader didn’t have to go out and buy another language or an assembler in order to enter their programs and get them to run.

As time passed, they tended more and more to publish programs that were written entirely, or almost entirely in machine code–in decimal or hexadecimal, depending on the platform. They had a short program you could type in and use called MLX that allowed you to type in the numbers in sequence, save it as a binary file, and then run it. This was a real retrogression. Professional programmers used to program in machine code in a manner similar to this as a matter of course in the 1950s. It took all of the educational value out of it, unless you wanted to buy a disassembler and read the code after entering it.

I suspect this happened because more readers were buying the “disk” edition of the magazine, so it made no difference to the reader what language the program was in, unless they were interested in actually learning programming. Maybe Compute! came to see itself as a cheap way for people to get software without having to work for it.

In 1988 their flagship Compute! magazine quit publishing type-ins entirely. I remember I was heartbroken.

I had a subscription to the magazine for years, and getting each new issue was the highlight of each month for me. I read it cover to cover. I would often find at least a few programs I liked in each one, and I would eventually get around to typing them in. The magazine did a fairly good job of giving you enough of a description of the program so you could decide whether it was worth the effort. Often they included screenshots, so you could see what it should look like when you were done. It was always a joy to see it run.

Bugs

When I first discovered the magazine I learned quite a bit about debugging. You would type a line in, and the BASIC editor would accept it (it was syntactically valid), but it was logically wrong. I had to figure out what went wrong. That was part of the process at first.

Eventually Compute! added checksums to their type-ins, and they gave you a short program called “Automatic Proofreader” that you could type in, and it would checksum the lines you typed for any program. You would enter the line into the editor, you would get the checksum, and you could compare it to the checksum they had by the same line in the published program. If they didn’t match, you knew you did something wrong. What made life interesting was their checksum program didn’t always help you. The Atari 8-bit version (the kind I used sometimes) wasn’t sophisticated enough to check for transposed characters. So it was still possible to type something wrong and not know until you ran it.
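To see why transposed characters could slip through, consider a purely additive checksum. This is a hypothetical sketch for illustration, not the Proofreader’s actual algorithm: summing a line’s character codes ignores their order, so swapping two characters produces the same result.

```python
def line_checksum(line):
    # Hypothetical additive checksum: sum of character codes, mod 256.
    # Order-insensitive, so it cannot detect transposed characters.
    return sum(ord(c) for c in line) % 256

# Transposing "LO" to "OL" leaves the checksum unchanged...
assert line_checksum('100 PRINT "HELLO"') == line_checksum('100 PRINT "HELOL"')
# ...while dropping a character does change it.
assert line_checksum('100 PRINT "HELLO"') != line_checksum('100 PRINT "HELL"')
```

A checksum that weights each character by its position in the line would catch transpositions, which is presumably what the more sophisticated versions did.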

You had to be wary or patient with type-ins. Every month the magazine published code errata in a column called “Capute!” (pronounced “kah-PUT”).

Even though I looked forward to each issue, and would pick out what I wanted, I was often behind. Typing in those programs took a good amount of time, a couple hours at least. I was busy writing my own software sometimes, and I had a life outside of this stuff. So I’d usually get the corrections to something they’d published before I typed it in. It was kind of maddening trying to keep track of this. I literally had a list I updated regularly of programs I wanted to enter, what issue they were in, along with the issues for the corresponding errata. It would’ve been nicer if they had tested their stuff more thoroughly before publishing it, but maybe that was expecting too much.

Compute!’s amazing editors

Programmers who wanted to get their stuff published sent submissions to Compute!’s editors. Usually a submission was for a single platform, like the Atari 800 or the Commodore 64. If the editors liked the program they 1) paid the author, and 2) usually ported the original program to a bunch of different platforms. I never figured out how they did this. If they got a program that was originally written for the C-64, for example, they would port it to some or all of the following: the Apple II, the Atari 8-bit, the TI-99/4A (sometimes), the Atari ST, the Commodore Amiga, and/or the IBM PC. They usually published these versions in the same issue, so it was easy to compare them side by side. And with rare exception they did this every issue! Amazing, particularly since the platforms were totally incompatible with each other, and they implemented things like graphics and sound totally differently. The editors didn’t try to make each version an exact clone of the original, either. They tried to keep the overall theme the same, but each version would display its editor’s own style. About the only time a program would port easily was if it was just an algorithmic exercise whose input and output were all text. That was rare.

Compute!’s franchise

The publisher did not just produce Compute!. They had a more popular magazine called Compute!’s Gazette that focused exclusively on the Commodore computer line. It followed the same format as the flagship magazine: articles and type-in programs. Later they added platform-exclusive magazines for the Apple II, the Atari ST, the Commodore Amiga, and the IBM PC & PCjr (the PCjr was a consumer model IBM produced for a brief time–the magazine was also short-lived), again following the same format. The platform-specific magazines continued publishing type-ins for at least a few years after the flagship Compute! stopped.

Compute! changed its format to focus exclusively on the needs of computer users in 1988. It continued on this path for about another 6 years, when it finally went under.

News & Reviews

Another thing I always looked forward to was Compute!’s coverage of what was going on at the trade shows: the Consumer Electronics Show (CES), and Comdex. They covered them every year. I think I came upon Compute! at just the right time for this. The period from 1983-1985 was a very exciting time for personal computers. As I think I mentioned before in another post, everybody and his brother was coming out with a machine, though the bubble burst on this party within a year or two.

Reading about these events, and their reviews of newly released computers, I got a sense that I was watching history in the making. In hindsight I’d say I was right.

1984 saw the release of the first Apple Macintosh. Compute! covered the unveiling, describing the event to a tee, but printed no pictures of the event or the computer. For a bit there I didn’t even know what it looked like. For years I remembered some of the description of the event from this article. I was gratified to finally catch some video of it in Robert X. Cringely’s “Triumph of the Nerds” documentary, which was broadcast in 1996.

1985 saw the release of the Atari ST and the Commodore Amiga. My mind was blown by what I was reading about them. It was so exciting. These computers blew the doors off of what came before. Only the Macintosh could compare.

The best of Compute!

Since I’ve talked a lot about why the magazine was so great, I figured I should show you their best stuff. Here are what were, in my opinion, the magazine’s best type-in programs. I wanted to do more video, but I’ve had to struggle to show this much. So most of the depictions below are still images of them.

(Update 9/25/08: I’ve managed to add a few more videos below, replacing the still shots I had, and I added one more old favorite.)

“Superchase”, by Anthony Godshall, on Atari 8-bit, October 1982, p. 66

This game was a thriller way back when. You played a treasure hunter, going through caves, leaving tracks as you went. A monster is chasing after you. You try to get all of the treasure before the monster catches up to you. The thrilling part is the maze-generating algorithm sometimes forces you to backtrack and hide out in hopes that the monster will go right past you so you can escape behind it. The monster follows your tracks, but my guess is it doesn’t necessarily follow the direction of the tracks. If it sees tracks going in two different directions it looks like it makes a random guess about which way to go. You see this a bit in the video.

There was supposed to be a trick you could use to slip past the monster if it was coming right at you. The article said if you “shook the joystick” back and forth real fast you could get the monster to accelerate, and you could breeze right by it. I tried that at the end of the video, but no joy…

“Closeout!”, by L. L. Beh, on Atari 8-bit, March 1983, p. 70

Even though it doesn’t look like it, the game had the feel of Pac Man. The gist was you’re a shopper in a store, picking up “sale items” (the dots on each level) in aisles, and baddies were chasing after you. You have a gun, and so do the baddies. What makes it interesting is the bad guys each had guns of different range, so you could get away from one at a short distance, but not from the others. The bad guys had decent AI, too. Fun game.

Caves of Ice

“Caves of Ice”, by Marvin Bunker and Robert Tsuk, on Atari 8-bit,
September 1983, p. 50

As best I can remember this was the only 3D game they published. It was based on “QuintiMaze”, by Robert Tsuk, written for the Apple II. QuintiMaze was published in Byte Magazine. Bunker and Tsuk collaborated on this version. The goal was to escape from a 5x5x5 3D maze. The maze was randomly generated each time you played. There were no bad guys you had to watch out for, or time limits. It kept track of how much time it took you to escape.

Maze games and maze-generating algorithms were popular around this time in Compute!. A bunch of games of this type had been published before this, all 2D of course.
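A common way to generate such mazes, then and now, is the depth-first “recursive backtracker”: carve a passage from the current cell to a random unvisited neighbor, and back up when none are left. I don’t know which algorithm each type-in actually used (the originals were BASIC); this Python sketch is just representative:

```python
import random

def generate_maze(w, h, seed=None):
    """Carve a w-by-h maze with the depth-first 'recursive backtracker'."""
    rng = random.Random(seed)
    # Every cell starts with all four walls up.
    walls = {(x, y): {'N': True, 'S': True, 'E': True, 'W': True}
             for x in range(w) for y in range(h)}
    opposite = {'N': 'S', 'S': 'N', 'E': 'W', 'W': 'E'}
    step = {'N': (0, -1), 'S': (0, 1), 'E': (1, 0), 'W': (-1, 0)}
    visited = {(0, 0)}
    stack = [(0, 0)]
    while stack:
        x, y = stack[-1]
        # Unvisited neighbors of the current cell.
        options = [(d, (x + dx, y + dy)) for d, (dx, dy) in step.items()
                   if (x + dx, y + dy) in walls
                   and (x + dx, y + dy) not in visited]
        if not options:
            stack.pop()           # dead end: back up
            continue
        d, nxt = rng.choice(options)
        walls[(x, y)][d] = False  # knock down the wall between the cells
        walls[nxt][opposite[d]] = False
        visited.add(nxt)
        stack.append(nxt)
    return walls
```

Because each cell is carved into exactly once, the result is a “perfect” maze: exactly one path between any two cells, so an escape route always exists.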

“Worm of Bemer”, by Stephen D. Fultz, on Atari 8-bit, April 1984, p. 74

I liked this game because it had the kind of polish that I typically saw in commercial games at the time. The goal was to eat mushrooms and escape each room. Each time you ate a mushroom you’d grow longer. The catch was you were always moving forward, no stopping. You had to be careful not to trap yourself. The game also made it difficult to escape, making the exits very narrow. I don’t know if it was deliberate, but the lag from when you moved the joystick to when the game actually responded was noticeable. You had to think one or two moves ahead, or else you’d screw up.

“Acrobat”, by Peter Rizzuto, on Atari 8-bit, February 1985, p. 56

The point of this game was to dodge obstacles in all sorts of ways. You had to jump, flip, and slide to get over and under stuff coming at you. It was a sideways scroller. It was unique because it had a moving background to convey motion, and the action with the obstacles got complicated. It was a “thinking” action game. You had to think fast on your feet.

SpeedScript

“SpeedScript”, by Charles Brannon, on Atari 8-bit
(Image from Wikipedia.org)

SpeedScript was originally written for Commodore 8-bit computers by Charles Brannon, and published in the January 1984 issue of Compute!’s Gazette (see the link). It was later ported to other platforms.

They also published “SpeedCalc,” a spreadsheet program.

Years later I got SpeedScript off of a BBS. I wasn’t crazy enough to type in the whole program. It was long. When I got my own 8-bit computer I bought a commercial word processor along with it. I found SpeedScript quite handy for viewing text files, though. It had a small footprint on disk, and was quick to load.

You can find the following issues here.

Commodore 64 version, March 1985, p. 124

Commodore VIC-20 version, April 1985, p. 100

Atari 8-bit version, May 1985, p. 103

Apple II version, June 1985, p. 116

“Biker Dave”, by David Schwener, on Atari XL/XE (8-bits),
November 1986, p. 38

The action in this game reminded me of some coin-op video games I saw at the time, though this was nowhere near coin-op quality. You’re a stunt biker trying to jump a bunch of cars. You had to get your bike up to the right speed to make the jump, or else you’d crash. You also had to make sure you didn’t accelerate too quickly, or else you’d wipe out before you made the jump. Each time you successfully made the jump, another car was added, so you had to increase your speed for the jump each time. Neat game!

“Laser Chess”, by Mike Duppong, June 1987, p. 25
Atari XL/XE translation by Rhett Anderson, Assistant Editor

The above video is my meager attempt at a demo, playing both sides. It’s not that interesting in terms of strategy (I barely had one for each side), but it shows some of the action. I didn’t play a complete game, because one side’s laser got blasted. I figured after that the end was inevitable, and not very interesting.

I referred to this game in another post on learning Squeak, since Stephan Wessels had published a tutorial whose end product looked similar to it.

Laser Chess was one of the few programs Compute! published that had a lasting legacy. It was originally published in a 1987 edition of Compute!’s Atari ST Disk & Magazine. Mike Duppong had won a $5,000 programming contest put on by the magazine, with this game. He originally wrote it in Modula-2. It was adapted by Compute!’s editors for the Amiga, Commodore 64, Apple II, and the Atari XL/XE, using BASIC and machine code.

Update 12-21-2013: There used to be online versions of this game, but I haven’t found them recently. Other versions of it are out there, if you look for it.

I could see that there was something special about this game, but it was not one that I got into much. I was never that good at chess to begin with. Not that chess strategy was necessary here; the game evoked its own strategy, and it was complicated in its own way. In regular chess you strategize based on the position of pieces and how they can move. Here, you strategize mostly by the position and orientation of pieces, though it’s possible to capture a piece by moving yours onto an opponent piece’s square, as in regular chess. The difference here is each player gets two moves per turn, and every piece can only move one space in a lateral direction (forward, back, or sideways) per move. You could move one space diagonally by taking a shortcut that combined a vertical and a sideways move, using up a whole turn. You could also rotate a piece, which counted as one move.

Orientation added another dimension, because most pieces have a reflective surface, and each player has a laser. If you positioned your pieces in the right orientation at the right time you could blast an opponent’s piece off the board! Firing the laser counted as one move, and each side could only fire its laser once per turn.

You had to be careful setting up your laser shots. Your opponent could use your pieces’ reflective surfaces against you. You could also accidentally kill one of your own pieces if you didn’t carefully consider where the laser beam went.
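The core of the laser mechanic is a simple beam trace: the shot travels in a straight line, turning 90 degrees at each mirrored face, until it hits something unmirrored or leaves the board. Here’s a toy sketch of that trace (my own illustration, using '/' and '\' for mirror faces and '#' for a vulnerable piece; the real game’s board and pieces were richer than this):

```python
def trace_laser(grid, x, y, dx, dy):
    """Follow a beam from (x, y) heading (dx, dy); y grows downward.
    Returns (path, hit): the cells traversed, and the piece hit if any."""
    h, w = len(grid), len(grid[0])
    path = []
    while 0 <= x < w and 0 <= y < h:
        path.append((x, y))
        c = grid[y][x]
        if c == '/':             # mirror: right -> up, down -> left, etc.
            dx, dy = -dy, -dx
        elif c == '\\':          # mirror: right -> down, up -> left, etc.
            dx, dy = dy, dx
        elif c == '#':           # unmirrored piece: the beam destroys it
            return path, (x, y)
        x, y = x + dx, y + dy
    return path, None            # beam left the board

# A shot fired east from the top-left corner bounces off the '\'
# and destroys the piece below it.
grid = ["..\\",
        "...",
        "..#"]
path, hit = trace_laser(grid, 0, 0, 1, 0)   # hit == (2, 2)
```

Note the trace doesn’t care whose piece it hits, which is exactly the friendly-fire hazard described above.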

Each side also had a “hypercube” piece. When you moved it onto another piece’s square it randomly placed that piece on an open space on the board. I didn’t use it in the above video. I think it was the hollow square piece.

Edit 5-13-2012: I couldn’t resist this. Here’s a hilarious scene from “The Big Bang Theory” TV show where the gang plays “secret agent laser obstacle chess.” 😀 It’s from Season 2, Episode 18, called, “The Work Song Nanocluster.”

Crossroads

“Crossroads”, by Steve Harter, on the Commodore 64,
Compute!’s Gazette, December 1987

This game was published exclusively in Compute!’s Gazette. I only know about it from a friend who used to use Commodore computers. This game is one of the greats. It’s still remembered by the people who played it.

I’ve played it a bit. You’re a guy trying to pick up “shields” (they look a bit like swirling Japanese flying stars), and meanwhile beasties are trying to chase after you and eat you. Picking up shields makes you more resistant to attack. You have to pick up a certain number to go on to the next level. The baddies can pick up the shields as well, and gain strength from them. You have to kill them to get their shields.

What was unique was that the beasties would go after each other as well (that’s the mayhem you see in the screenshot above). Sometimes you could provoke them to do this, to distract them. It was a pretty involved game. I would’ve liked to have posted a video of this, but no matter what I tried it didn’t turn out well.

“Screen Print,” by Richard Tietjens, April 1988, p. 64 – This was a utility (so there’s no picture for it), but I thought it really deserved a mention. Screen Print was one of the last programs they published for the Atari 8-bit. It was one of the most useful utilities they ever published, in my opinion. I could literally give it any Atari graphics file format, it would decode it properly, and display it on the screen. If I wanted it to, it would also print out a nice copy on my dot matrix printer.

Something was curious about it though. As I typed it in I noticed a couple of code sections labeled “Poker” in the comments. At the time I had no idea what it stood for.

I had gotten my own Atari 8-bit computer and a modem around this time (I explain this in Part 1), and I began to explore BBSes. Somewhere along the line I found that people had uploaded graphics images from the computer game Strip Poker. Duh! I was a teenage boy. Of course I downloaded them! I tried looking at them with a simple bitmap viewer I had written, and all I got was garbage. They were encoded. Made sense. They didn’t want people peeking. You had to win some hands in the game to see the naughtier pictures. It occurred to me one day, remembering back to when I typed in “Screen Print,” “Hmm. There were those sections called ‘Poker’. I wonder…” I loaded it up, tried it out, and sure enough it loaded those Strip Poker images just fine. 😉 I wonder if the editors at the magazine knew about this. They certainly didn’t mention it.

Strip Poker was a computer game that had been out for a long time, made by Artworx. I remember seeing (tasteful) ads for it in my earliest issues of Compute!, going back to 1983. The earliest versions of it ran on 8-bit computers: Atari, Commodore, and Apple, using artistically rendered graphics (no digitizing). Later versions of it were made for the Atari ST, and Amiga, and eventually the PC. Amazingly, Artworx is still hanging on, making the same product!

Honorable mentions

There were other games Compute! published that I enjoyed, but I put them in a bit of a lesser category. It’s subjective:

Outpost, by Tim Parker, June 1982, p. 30 – A character-based (as in, ASCII), turn-based game where you had to defend yourself against computer-controlled attacking ships. It reminds me of the old Star Trek game, but you were stationary.

Goldrush, July 1982 – You mined for gold with explosives. You had to watch out for cave-ins, and try not to get trapped with no dynamite left.

Ski! (Slalom), by George Leotti and Charles Brannon, February 1983, p. 76 – You skied down a mountain, went through gates, avoided obstacles, and picked up points.

The Witching Hour, by Brian Flynn, October 1985, p. 42 – It was published for Halloween. It was witches vs. ghosts, kind of like checkers.

Switchbox, by Compute! Assistant Editor Todd Heimark, March 1986, p. 34 – Kind of like Pachinko, but since it was in BASIC, much slower.

Laser Strike, by Barbara Schulak, December 1986, p. 44 – A clone of Battleship.

Chain Reaction, by Mark Tuttle, January 1987 – A unique turn-based game. You played on a board with “explosives.” It really functioned more like a combination of a nuclear chain reaction and Reversi. You played against an opponent. Each space on the board had a different “critical mass” for explosives. If your space exploded, it would shoot explosives of your color into the adjoining spaces, changing the color of all explosives in that space to your color, and adding the new explosive to the ones already there. This could set off–you guessed it–a chain reaction. You tried to reach “critical mass” with your explosives in just the right places at just the right time to gradually change all of your opponent’s colored explosives to your color. It was an easy game to play. You literally could set up chain reactions that would go on for about a minute. But then, this was mostly due to the slowness of BASIC.
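The rule in Tuttle’s game maps naturally onto recursion: a space’s critical mass is its number of orthogonal neighbors, and reaching it sends one explosive of the mover’s color into each adjoining space, which can in turn push those spaces over their own thresholds. Here’s a sketch of that rule, reconstructed from the description above (not the published BASIC; the names are mine):

```python
from collections import defaultdict

def neighbors(x, y, w, h):
    """Orthogonally adjacent spaces that lie on the w-by-h board."""
    return [(nx, ny) for nx, ny in ((x-1, y), (x+1, y), (x, y-1), (x, y+1))
            if 0 <= nx < w and 0 <= ny < h]

def add_explosive(count, color, x, y, player, w, h):
    """Drop one explosive of the player's color on (x, y).

    If the space reaches its critical mass (the number of adjoining
    spaces), it explodes: one explosive shoots into each neighbor,
    converting it to the player's color, possibly recursively."""
    count[(x, y)] += 1
    color[(x, y)] = player
    crit = len(neighbors(x, y, w, h))
    if count[(x, y)] >= crit:
        count[(x, y)] -= crit
        for nx, ny in neighbors(x, y, w, h):
            add_explosive(count, color, nx, ny, player, w, h)
```

On a nearly full board the cascade can run on and on, which matches the minute-long chain reactions described, though in BASIC the slowness did most of the work.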

There’s a site dedicated to Compute! articles. They’ve gotten permission to publish many of them on the web. There are many they haven’t obtained permission for, and so you see them mentioned, but no links to articles. When Compute! went under, all copyrights reverted to the original authors. So anyone wanting to republish articles legally has had to try and track down the authors and get their permission. A real pain in the keester. 😦

Edit 5-11-2012: Ian Matthews has digitized every issue of Compute! into searchable PDF format. Now you can read full issues, all the articles, see all the programs with full source code, and see what was selling and what it was selling for (yikes!). Hopefully Compute! will live on in posterity, where everyone who wants to see it can find it. He broke up many of the issues into sections, as the PDFs get pretty large to download.

I updated all of my Compute! links in this post, so that you can look at the articles for yourself. This way you can finally see these issues as I saw them. Matthews has requested that if people want to view more than a few issues on his site that they purchase a DVD of the complete set, as downloading a bunch of issues will cost him in bandwidth. So I’m not linking directly to his PDFs.

I envisioned doing something like this about 15 years ago, but felt squeamish about it 1) because of the copyright issue, and 2) when I learned that the best way to do it would destroy every issue I had. I’d have to separate every single page from its binding (using something like a razor blade) and then run it through the scanner. I’m glad someone else had the courage to do it right. 🙂 Here’s a video Ian made, showing every single Compute! cover, from the first to the last issue.

Conclusion

The time while Compute! lived was a happy time. I still have fond memories of it. I guess today young, aspiring programmers are getting the education I got by working on open source software. Compute! was the open source software of the time, as far as I was concerned. It wasn’t the only magazine that did this, but in my opinion it was the best. There were other magazines publishing type-ins for Atari computers, like Antic, and A.N.A.L.O.G. I switched to Antic when Compute! stopped publishing them. I can’t remember, but I may have kept on with them until they went under. I found out about Current Notes in college, and came to really like it. No type-ins, but it had interesting articles on all things Atari.

Compute! was a key part of my education as a programmer. The thing I loved about them was they always had a focus on making computing fun. Sure it was frustrating to spend hours typing in a program, and debugging it, but when you were done, you got what you wanted, and you learned some things. Secondly, they had the notion that everyone could share in the experience. They hardly ever published a program for just one computer. They made sure that most of their subscribers could enjoy a program even if the original author only wrote it for one platform. That was mighty generous of them. I imagine it was hard work.

In closing I’d like to thank the editors of Compute!: Robert Lock, Richard Mansfield, Charles Brannon, Tom R. Halfhill, and anyone else I’ve forgotten. You helped make me the programmer I am today, and my teen years something special.

—Mark Miller, https://tekkie.wordpress.com


I saw that lispy wrote about this. I happened to spot the original speech by Richard Stallman on reddit. The title intrigued me: “My Lisp Experiences and the Development of Emacs”. I’ll go through some pieces of it, because there are some interesting stories in here.

My first experience with Lisp was when I read the Lisp 1.5 manual in high school. That’s when I had my mind blown by the idea that there could be a computer language like that.

This reminded me of a quote from Alan Kay’s (AK) interview with Stuart Feldman (SF) at ACM Queue from a few years ago, which I’ve cited a few times before:

SF If nothing else, Lisp was carefully defined in terms of Lisp.

AK Yes, that was the big revelation to me when I was in graduate school—when I finally understood that the half page of code on the bottom of page 13 of the Lisp 1.5 manual was Lisp in itself. These were “Maxwell’s Equations of Software!” This is the whole world of programming in a few lines that I can put my hand over.

I realized that anytime I want to know what I’m doing, I can just write down the kernel of this thing in a half page and it’s not going to lose any power. In fact, it’s going to gain power by being able to reenter itself much more readily than most systems done the other way can possibly do.

All of these ideas could be part of both software engineering and computer science, but I fear—as far as I can tell—that most undergraduate degrees in computer science these days are basically Java vocational training.

Recently I watched a speech by Ian Piumarta on his COLAs (Combined Object-Lambda Architectures) computing system. He is working with Alan Kay at Viewpoints Research Institute on the project of building a complete end-user system within 20KLOC (PDF). There’s another article about it here. At the start of his talk he said that in order for the audience to understand what he was presenting, they needed to read the Lisp 1.5 manual. It’s sounding like the bible of symbolic computing.

It’s a system model that is not based on the traditional machine language 3-address code (opcode, operand, operand) way of doing things, but rather on symbol manipulation (function, parameters), where “parameters” is made up of atoms and/or lists. It’s a system that is not based on module binding (early- or late-bound), but late binding to objects (Lisp has a notion of objects). This is something I’ve been seeking to understand, so I began reading this book recently. You can find the illustrious Lisp 1.5 manual here (PDF).
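Since I've been trying to understand this model myself, here's how I picture it, as a little sketch in Python (my own hypothetical illustration, not the actual Lisp 1.5 evaluator): a program is just an atom or a (function, parameters) list, and evaluating it is a short recursive walk.

```python
# A minimal sketch of Lisp-style evaluation: a program is a list whose
# head names a function and whose tail holds the parameters, which are
# themselves atoms or lists. (Hypothetical illustration only.)

def evaluate(expr, env):
    """Evaluate an atom or a (function, parameters) list in env."""
    if isinstance(expr, str):          # an atom: look up its value
        return env[expr]
    if not isinstance(expr, list):     # a number evaluates to itself
        return expr
    fn, *args = expr                   # head names the function
    if fn == "quote":                  # quoted data stays unevaluated:
        return args[0]                 # programs are data, data are programs
    if fn == "if":                     # a special form, evaluated lazily
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    values = [evaluate(a, env) for a in args]
    return env[fn](*values)            # apply the function to its parameters

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b, "x": 3}
print(evaluate(["+", ["*", "x", "x"], 1], env))  # (+ (* x x) 1) -> 10
```

Even a toy like this hints at what Kay was getting at: the whole evaluation model fits in a handful of lines, and because programs are just lists, the evaluator can be fed to itself.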

Stallman goes on to talk about how he created Emacs. It was originally written in assembly language for the DEC PDP-10. From the beginning people had the ability to extend Emacs while they were using it. Originally this ability was provided through a command language that was designed for the TECO editor. He said it “was an extremely ugly programming language”. I’ve never seen it, but from what I’ve read it had a lot of features and its command language was extremely cryptic. One typing mistake could hose hours of work. I remember reading a joke saying that programmers would sometimes try typing their name into the command prompt for TECO to see what it would do. Despite this, people added extensions to the language to make it more powerful. Stallman saw it as unsustainable. A solution had to be found, and in the process they discovered a useful lesson about teaching programming:

The obvious lesson was that a language like TECO, which wasn’t designed to be a programming language, was the wrong way to go. The language that you build your extensions on shouldn’t be thought of as a programming language in afterthought; it should be designed as a programming language. In fact, we discovered that the best programming language for that purpose was Lisp.

It was Bernie Greenberg, who discovered that it was. He wrote a version of Emacs in Multics MacLisp, and he wrote his commands in MacLisp in a straightforward fashion. The editor itself was written entirely in Lisp. Multics Emacs proved to be a great success — programming new editing commands was so convenient that even the secretaries in his office started learning how to use it. They used a manual someone had written which showed how to extend Emacs, but didn’t say it was [programming]. So the secretaries, who believed they couldn’t do programming, weren’t scared off. They read the manual, discovered they could do useful things and they learned to program.

So Bernie saw that an application — a program that does something useful for you — which has Lisp inside it and which you could extend by rewriting the Lisp programs, is actually a very good way for people to learn programming. It gives them a chance to write small programs that are useful for them, which in most arenas you can’t possibly do. They can get encouragement for their own practical use — at the stage where it’s the hardest — where they don’t believe they can program, until they get to the point where they are programmers.

Dr. Caitlin Keller, one of Dr. Randy Pausch's former students, suggested using Alice in conjunction with a course focused on storytelling, rather than programming, to "fool" students into learning programming: teach students programming while they're doing something else. It's apparently worked before. Just don't use that dreaded "P" word at the start.

Another thing to notice is that Greenberg used Lisp for a non-AI project. Lisp is a real general-purpose programming system, and it always has been. I think one of the injustices that's occurred in computer science is that CS professors have categorized Lisp exclusively as an artificial intelligence language. When I took CS in college, we worked with Lisp for a few weeks, and that was it. Courses in AI were part of an optional curriculum, and students would get further exposure to it there. If students didn't take them, most of them spent the rest of their CS major programming in Pascal and C. This was in the late 1980s and early 1990s.

Lisp has been used quite a bit for AI, but by no means is that all it’s good for. As Stallman went on to talk about further into his speech, Lisp is capable of implementing a fully functional operating system. That’s what created the impetus for a few people at the MIT AI lab to found two companies: Lisp Machines Inc., and Symbolics. Both produced systems that ran Lisp down at the hardware level. Both are now defunct. Symbolics managed to make it to 1995 before it died out.

An interesting story that came into play was James Gosling's involvement with Emacs. Gosling had developed "GosMacs" (also known as "gmacs"), which was written in C and was the first version of Emacs to run on Unix, but:

I discovered that Gosling’s Emacs did not have a real Lisp. It had a programming language that was known as ‘mocklisp’, which looks syntactically like Lisp, but didn’t have the data structures of Lisp. So programs were not data, and vital elements of Lisp were missing. Its data structures were strings, numbers and a few other specialized things. [my emphasis]

I concluded I couldn’t use it and had to replace it all, the first step of which was to write an actual Lisp interpreter. I gradually adapted every part of the editor based on real Lisp data structures, rather than ad hoc data structures, making the data structures of the internals of the editor exposable and manipulable by the user’s Lisp programs.

Stallman doesn’t say whether Gosling wrote mocklisp, but I infer from his story that he probably did. I think it explains a lot about why Java turned out the way it did.

Stallman said he thought about creating a GNU Lisp operating system, but he decided against it because it required specially programmed processors to run efficiently. Instead he’d focus on creating a Unix-like OS as part of the GNU project. This inspired him to create GNU Emacs, based on GosMacs.

Around 1995, due to actions taken by Sun Microsystems, Stallman decided to create Guile, a version of Scheme, as an extension language for all GNU projects. I like his vision for it, because he describes what I think is sad about the current state of affairs:

Our idea was that if each extensible application supported Scheme, you could write an implementation of TCL or Python or Perl in Scheme that translates that program into Scheme. Then you could load that into any application and customize it in your favorite language and it would work with other customizations as well.

As long as the extensibility languages are weak, the users have to use only the language you provided them. Which means that people who love any given language have to compete for the choice of the developers of applications — saying “Please, application developer, put my language into your application, not his language.” Then the users get no choices at all — whichever application they’re using comes with one language and they’re stuck with [that language]. But when you have a powerful language that can implement others by translating into it, then you give the user a choice of language and we don’t have to have a language war anymore. That’s what we’re hoping ‘Guile’, our scheme interpreter, will do.

Amen to that! I really do wish that we didn't have to make choices about which language we're going to use for a project based on what VM architecture someone has already chosen. This is gradually being resolved on the popular VMs, but there are still speed bumps to overcome, because the VMs weren't designed for dynamic languages. Microsoft has at least started to address this issue.

It’s interesting to learn that Stallman has long had an affinity for Lisp. He strikes me as a pragmatist, since from my experience a lot of GNU software is written in C. Computers are clearly powerful enough now that every GNU project could be written in Lisp or Guile, but I assume there are other considerations. C is still a popular language, and it would disenfranchise most programmers if all the projects were rewritten in these more powerful languages. Plus, some people are still running slow hardware, and the GNU project wouldn’t want to penalize them.

I've said this before, but I think this is where computer science education is getting off on the wrong track. Python is taught in certain classes in many CS programs, but otherwise powerful, dynamic languages are still relegated to backwater status. C++ and Java are the main languages taught now. I don't think computing is going to rise to the next level, so to speak, until it's recognized that we need more powerful programming metaphors and representations, and we need greater flexibility in our development systems.

The reason this is not generally recognized is that computing is largely a tool-using culture. As Alan Kay said once in one of his “The Computer Revolution Hasn’t Happened Yet” speeches (though he may have been quoting someone else), most people are instrumental reasoners, meaning they only find a tool or idea interesting if it meets a need they have right in front of them. Only a minority of people tend to be interested in ideas for their own sake.

So what might bring about this next step is the issue of programming for multiple cores, and parallel processing. The traditional languages don’t deal with this very well, requiring programmers to manually control threads in code. There’s been talk that dynamic languages might provide the best answer for this. So that might provide the incentive to advance.

—Mark Miller, https://tekkie.wordpress.com

Read Full Post »

Love it!

Trivia question: Who’s that staring back at you in the poster?

I agree with one of the comments: They should make one up with Dijkstra’s mug in it. 🙂

Read Full Post »

I was prompted by another blog post I read to check up on the situation with indquery.com, which I had complained about earlier. It looks like it’s been resolved. I checked the site and all I saw was an ASP.Net directory structure. So it looks like it’s down.

I was getting ready to take whatever action I could to help with this, but I was running up against preparations for a trip I had scheduled months earlier, so I didn't get to it. Perhaps they only planned to keep the site up temporarily; I don't know how these scams work. Or perhaps other folks did the legwork of getting Google to remove the ads from the site owner's "honeypot". They had copied other people's stuff, too. If so, thank you very much, whoever you are. It was distressing to see articles I'd put effort and thought into blatantly copied and reattributed to the site's owner just so they could make a Deutsche Mark (the site was German).

I’ve been signing my posts as a matter of course since then, so hopefully even if my posts are copied again, people will be able to know who wrote them and how to get here.

—Mark Miller, https://tekkie.wordpress.com

Read Full Post »

Apple ][ 

The most commonly found computers in public schools in the early to mid-1980s were the Apple ][ plus and //e. These were 8-bit computers built around MOS 6502 CPUs (or some variant) running at 1 MHz. They typically had anywhere from 48 to 64 kilobytes of RAM. They were the models Apple made before the Lisa and Macintosh.

When they finally installed a computer lab at my junior high school, around 1984, I got myself acquainted with Applesoft BASIC, which was a version of Microsoft BASIC. What was somewhat unique about it was that, in addition to being an interpreter used for program development, it functioned as a command line for the computer's operating system. After booting with a floppy diskette containing Apple DOS, you could do every disk management function by typing commands into it, rather like the MS-DOS command line.

Programming

I mostly used BASIC with the Apples, though I got into Apple Pascal my senior year of high school. As with Atari BASIC (which I mentioned in Part 1), each segment of Applesoft BASIC code (which could contain several commands) had to have a line number for sequencing and labeling.

Fortunately over the last several years I’ve been able to scrounge together some images of my old Apple II disks that I can run in an emulator, so I can show some of the stuff I did while I was in school.

The music you hear in the next two videos is just some MODule background music I picked, playing on my PC. The track for the video below is called "Leave the brain at home." The video shows a computerized version of Mad Libs.

Ad Libs

From about 5th grade on, my friends and I loved to play Mad Libs. The problem was we'd always write our substitution words in the books we'd get. If we wanted to play them over again we'd have to erase what we wrote in before, eventually wearing out the paper. I thought this was wasteful, so I tried to computerize the game. I wrote a suite of programs when I was in junior high school: Ad Libs Creator, Ad Libs Displayer (what you see in the video above), and Ad Libs Editor. Creator and Editor were rudimentary. I didn't have a very sophisticated concept of text at the time. The best I could do with the skills I had was to have the user enter each word and blank of their Mad Lib one word/blank at a time, at prompts. Not user-friendly. Displayer was rather popular with my friends. When I was in the computer lab I'd sometimes hear them laughing off in some corner. One day I went over and asked them what was so funny. They said they were playing with my Ad Libs program. How gratifying! 🙂 That was the idea, to make the game fun to play.

Trying to get published

Compute! was my favorite magazine. I had a subscription to it for years during the 1980s. I looked forward to each issue with bated breath, and I read it cover to cover. The most exciting things for me were the type-in programs. Each issue had programs people had submitted to the magazine. The editors would pick some and publish them in BASIC, with complete source code. The original idea, I think, was to create an educational vehicle. Readers were expected to type in the programs, and when done, use them. The early issues had articles that explained how the programs worked, part by part. I found them very educational.

I updated my Ad Libs programs some in high school. I felt as though I had finally gotten them debugged and in a form I liked. I decided to try to get them published in Compute!. I spent what felt like months writing and editing my article that would go with them. Once I felt that was done, I took my shot. I put my programs on disk, printed out my article, and sent them to the magazine. Some months later I got a rejection letter. I vaguely remember one of the things it said was they did not take unsolicited material. It also said that regardless, my article and programs didn’t fit in with their editorial focus, or something. Sigh… Anyway, that was one lesson learned: Pass the idea by one of the editors first. Give them a summary of what your submission will be about before you go through the trouble of writing it. That way if they don’t like it you haven’t wasted the effort. I didn’t write these programs to get published, but I figured I’d try.

Trying again, or…not

The background music for this video is a MODule called “Earthquake”.

Week-In-Advance, written in BASIC, compiled using Einstein BASIC compiler

In the 1980s it was common for computer shows to take place at local malls, about once a year. I looked forward to them every time. I could spend almost a whole day at those things. One of those times I saw an Apple Lisa on display. It was around 1985, or later. I had the opportunity to try it out. I remember I was running a calendaring application on it. It would show me a month in the traditional calendar layout. I could click on individual days in the month, and it would bring up a form where I could enter new events for the day. This was so it could post reminders about upcoming events for the user. It made quite an impression on me. I must’ve thought, “Gosh, I could use something like this!” When I was in junior high and high school I often wrote down due dates for assignments and tests in a pocket notebook I carried with me. It got confusing and messy sometimes. I wanted something that was better organized. The only computers I could program on in the school were Apple IIs. I don’t remember when, but at some point while I was in high school I resolved to write my own scheduling program. Inspired by the way the Lisa worked, I wanted to create my own windowing interface. I made an effort to make it easy to use. From the outset I wanted to try (again) getting it published in Compute!.

Not being that great with graphics, I decided to do it all in text mode. It's hard for me to remember how long it took; from the time I started, it was anywhere from 6 months to a year before I finished. I'd spend 1 or 2 hours a day on it if I didn't have a heavy homework load. It was a real learning experience. There were some hard skills I had to learn in order to accomplish what I wanted.

I tried out a couple different interface designs, which I threw out, before I finally figured out how to create expanding and shrinking windows. The main problem I had with the other two designs was they were too slow.

One of the things I realized was the only way I could manage the amount of code I was creating was to break it up into “procedural” chunks that I could GOSUB to and RETURN from. If I had done it in a language like Pascal it would’ve been easier, I think.

The next realization I made was that I needed to be able to edit events, not just add and delete them. So I created my own little text editor, which you can see a couple of times in the video. It shows up as a subwindow inside the larger events window, where events are listed for a day. I don't show all of its features in this video; you just see me appending text to an event. I could put it into "insert mode", or delete characters, through a couple of key combinations.

The last realization I made was that I couldn't use the program all the time. I didn't have time to make it to the lab every day to view the things I had coming up, so I needed a way to print out my schedule. I added that (which I couldn't demonstrate in the video). Finally, in February 1988, I got it all done.

I used it for real scheduling for about 5 months, until I graduated. It worked well. It was my first experience in creating something that was actually useful. Too bad I didn’t start on it sooner.

By the time I was done I had sworn off trying to get it published. It had grown way beyond the size I was anticipating, to about 800 LOC (physical lines on the display, not BASIC line numbers). The print columns in the magazine were narrower than the columns displayed for a listing on the screen, so I figured it would be too large to publish. It may not have made a difference. Compute! would stop publishing type-ins in May 1988, though I think their platform-specific magazines (they had one for Apple IIs) continued to publish them. Anyway, that's water under the bridge.

Looking back on it now the interface design had some flaws in it, but I felt very satisfied with it at the time.

Apple II music video

I thought this video deserved an honorable mention. Someone wrote a BASIC program in text mode to create a computerized music video for a song. The music you hear is not being output by the computer (it didn’t have the power for that), but the computer’s display is synchronized to the music using wait loops.

At the end you see some of the code that was used to generate the display, and the Apple II computer itself. The square box to the left of the computer is the 5-1/4″ floppy disk drive for the machine.

Games

There were many classic games for the Apples. These are some that made an impression on me. I put up the next 3 videos on YouTube. Karateka was put up by someone else.

Star Blazer, by Tony Suzuki, Star Craft, Inc.

You go on 3 missions: 1) bomb a radar dish, 2) bomb a fast tank while ships fly at you, and 3) bomb a radar dish while balloon-launching ships try to blow you up (the balloons have bombs attached). It's been years since I played this game, so as you can tell, my technique is not so good. I remember what I used to do with the tank is I'd get some "yo-yo" action going with it, back and forth, and then I'd fly up and forward, get a little bit in front of the tank, then drop down fast and launch a bomb. This would cause the bomb to drop faster than usual, hopefully on the tank. Timing was crucial.

One of the challenges is going after your objective while also picking up fuel packets that are dropped by parachute by an orange craft that flies across the top of the screen every once in a while.

Drol, by Aik Beng, Broderbund Software

Choplifter, by Dan Gorlin, Broderbund Software

Karateka, by Jordan Mechner, Broderbund Software

This was one of the few adventure games I saw on the Apples. It has some properties that appear in today's fighting games, like a real-time health indicator for each fighter. The goal was to kill all the fighters, including the "boss" character at the end, and rescue the princess. Getting past the fighters was pretty straightforward for me, though as you see in the video, you had to remember to get into the fighting position to defend yourself, otherwise you'd get killed in one punch. There were a couple of tricks in the game. One was a razor-sharp doorway that looked like a set of vertical bars (like in a jail). You had to figure out how to get past that. Once you got past the "boss" you had to be careful how you approached the princess. If you went into her cell in fighting position, she'd kill you in one kick. You were supposed to stand up straight and run in. Once I figured these things out the game was pretty easy to beat. It was challenging enough that it was fun, but after a while it got boring except for the graphics and sound effects.

As you’ll notice most of the games I cite here were from Broderbund. On the Apple they were quite the game company. They had many excellent titles.

There were some other games I played, like Sabotage, Russki Duck, Apple Panic, Cannonball Blitz (a game like Donkey Kong), and Horizon V. I'm familiar with Aztec, Robot Wars by Muse Software, Castle Wolfenstein by Muse Software (the inspiration for Wolfenstein 3D, and Return to Castle Wolfenstein on the PC), Bilestoad, and the Ultima role-playing series, because I used to watch other kids play them.

In terms of graphics a lot of these were not high quality, but I think all of them really utilized the hardware to its potential, and had interesting gameplay. Even though the sound quality was not good (all you had was the internal speaker), a lot of times these games had sound effects that did not appear in the Atari 8-bit versions for some reason. Apparently not as much effort was put into those versions.

Here’s a long video of me playing Robot Odyssey.

Robot Odyssey, by Mike Wallace and Leslie Grimm,
The Learning Company

This was a sequel to another educational game called “Rocky’s Boots,” which put the learner through a series of challenges, one per room. You solved them by building circuits in the rooms out of logic gates. This game is similar, except the player is programming robots, and they work autonomously. The interiors of robots are really like “rooms.” Even though they’re all the same size, you can put robots inside each other. The player has to pick up certain items and go through mazes throughout the game. S/he is usually prevented from doing so by “sentries.” In these cases the player needs to program a robot to go get items or take him/her through mazes.

I don’t show you all the action. I tried to just show the interesting stuff (not always succeeding). The audio gets behind the video towards the middle. That’s the fault of my screen capture software.

When I was a teen I found this game fascinating. It was the most advanced gaming environment I'd seen. Things happened in real time. Even if you walked out of a room, it didn't mean everything had stopped. In other 8-bit games of this style, things only happened in a room while you were in it. In this game you can get a robot going on some task, leave the room to do something else, and then come back. Everything will have happened as if you were in the room with the robot the whole time. I learned a lot about building logic circuits through "Rocky's Boots" and this game.

Graphics resolution was a problem. It was hard building anything but basic circuits, because if you built something complex you’d get a jumble of gates and connections, and it was difficult to sort out. Fortunately the game gives you a “chip lab” where you can build your circuits inside a prototyping chip, “burn some EPROMS” from it, save the chips, and use them in the game. This keeps circuitry manageable.

My high school had this game and I played it every chance I got. I never finished it. Since I have an emulator now I’ve taken time to play it from time to time. The gameplay is not so nice anymore. I’m used to better environments, but I still hope to finish it. 🙂 There’s nothing else like it out there today.

The original Flight Simulator, by subLogic

This was the predecessor to Flight Simulator II, which was bought and developed further by Microsoft. Flight Simulator was extremely simple. It had no sound effects. The only graphics you had were wireframe. You couldn’t hit anything except the ground, and you could land anywhere. You could “declare war” and “enemy planes” (dots) would come up to fight you. I found this mode more frustrating than fun. You could shoot at them (I think), but you had no targeting crosshair or tracer bullets, so you had no idea where you were firing.

I could never get used to FS II, particularly on the 8-bits. It was too sluggish. You would start into a turn, and realize 5 seconds later how deep a turn you had gone into. I found it impossible to maneuver. Even though this version is also slow, it ran at a fast enough speed that I didn't get too surprised by what I saw happening. The instrument panel made more sense to me, too. The graphics were crap. Most personal computer users of the time probably didn't realize it, because it was the best 3D graphics people knew how to do on such a limited platform.

I had a lot of fun with the Apples. I did some of my most creative work with them as a teenager.

—Mark Miller, https://tekkie.wordpress.com

Read Full Post »

James Robertson is no fan of Microsoft, but yesterday he had to call out the machinations of the EU as “stupid”. I agree.

First, the EU demanded that Microsoft sell a version of Windows in the EU market without Windows Media Player. Microsoft complied with "Windows n" (for (n)o media player), a configuration that became an instant flop. Fortunately they were allowed to sell the regular configuration as well, which continued to sell decently. The reason "n" was a flop was that Microsoft was allowed to sell it for the same price as the regular configuration. What do you know? Customers noticed, and decided to get the one that was full-featured. Something also pointed out by Microsoft proponents was that Windows Media Player was one of the few (if not the only) free media players on the market that came with no ad-ware or spyware. Maybe people preferred that, too.

Now, the EU is considering a proposal by their Globalization Initiative that says all PCs in the EU should be sold without an operating system. Maybe they mean the OS should not come bundled; rather, the customer would tell the vendor which OS to install. That wouldn't be too bad, though I imagine it would still confuse a lot of consumers who are not into studying which OS they need. I think a good compromise would be if the PC vendors were allowed to have a "default" option, such that if a customer just orders a PC, without specifying anything else, they'll get the vendor's standard configuration, keeping things simple. If the customer specifies a preference, that's what they'll get, without having to install it themselves.

Here’s hoping the EU doesn’t go off the deep end and take their computer industry back 25 years.

Read Full Post »

“Idiocracy”

Mike Judge, who created the movie "Office Space", came out with a movie that's on DVD now called "Idiocracy". The name and the cover drew my attention. It shows the classic "Ascent of Man" progression from ape to Homo sapiens, and then shows man devolved into something less. (Update 10-3-07: I should point out this movie was rated R for language.)

The first 10 minutes of the movie are priceless. If you’re at all aware of population trends in the U.S., or anywhere in the Western world, really, you’ll get the joke immediately. Being a comedy it puts things in stark, absurd terms. It begins with some background.

As the 21st century began, human evolution was at a turning point. Natural selection: the process by which the strongest, the smartest, the fastest reproduced in greater numbers than the rest; a process which had once favored the noblest traits of man [here it shows pictures of Einstein, Beethoven, Darwin, and works by famous Renaissance artists] now began to favor different traits [here it shows images of a “skank chic” 20-something, Joey Buttafuoco, WWF wrestling, and a female boxer]. Most science fiction of the day predicted a future that was more civilized, and more intelligent [here it shows a mural of gleaming, sleek futuristic cities, monorails, sleek jet cars and flying personal craft]. But as time went on, things seemed to be heading in the opposite direction, a dumbing down. How did this happen? Evolution does not necessarily reward intelligence. With no natural predators to thin the herd, it began to simply reward those who reproduced the most, and left the intelligent to become an endangered species.

Then it plays on stereotypes. It shows two families. The first is a cautious, white, well-to-do, highly educated couple talking at first about the financial difficulties of having children, and then years later, the fertility problems. The other is an economically and educationally disadvantaged white husband and wife with a few children who realize they’re “pregnant again”. Not only that, he’s been sleeping around, and has produced more children. The same goes for his eldest son, the football star. As I watched this I couldn’t help but chuckle. I’ve had much the same thoughts and images running through my head from time to time for a few years now, probably induced by the media images I see. I think what it really shows is being educated and thoughtful has its downside. You can tend to overthink problems and issues, which leads to indecisiveness and paralysis. I’ve certainly experienced that in my life. What it also says is trying to control your life too much leads to having no legacy to pass to anyone, whereas those who just take life as it comes, and don’t think about it that much, do, for good or for ill.

The part about science fiction predicting a more civilized, intelligent, and technologically advanced society, and contrasting it with an exaggerated present reality, really hit me. I hadn't examined this, but I must have held this expectation somewhere in the back of my mind, and been disappointed that we haven't come even close to this idealized goal yet. It may be another few hundred years before this vision becomes reality. I have thought from time to time that 2001: A Space Odyssey (the book) predicted we would have sent humans to Saturn by now. We haven't even sent anyone to the Moon since the early 1970s. It's depressing to realize that the manned missions to the Moon were little more than Cold War political stunts, with advancements in electronics and planetary science being a side effect. I hear these bold pronouncements occasionally about new human missions to the Moon, and to Mars, but somehow I doubt NASA will be sending people back anytime soon. We'll have better luck with the private sector. Government is going to be spending the next 30 years dealing with Social Security and Medicare, not to mention the current War on Terror.

A particularly relevant scene follows the introduction: a military librarian, Joe Bauers (played by Luke Wilson), is taken off his job because "no one comes down here anymore". He's put on a new assignment: to be a test subject in a cryogenic "skilled human preservation" experiment. One thing that's pointed out is that he is totally alone. He has no parents, is not married, and has no children. He gets picked because if the experiment goes awry, no one will care. That's the reasoning, anyway.

He and another test subject, a prostitute named Rita (played by Maya Rudolph), think they’re only going to be in stasis for a year, but because of a military scandal the project is forgotten, and the two test subjects end up in stasis for 500 years. Don’t these stories always end up this way? They should be paying royalties to the people who made Buck Rogers in the 25th Century. Anyway, that’s how the story starts.

Joe wakes up in a trashed out, dumbed down society that looks like it was made by the people from WWF and Jerry Springer. Rita is awakened at the same time. The movie is a farce, so take what follows here through that lens.

If you’ve ever wondered what our modern society would look like if we re-entered the Dark Ages, this would be a good example. Unfortunately the movie wastes a half-hour making fun of “all the stupid people”. It’s not funny. I was worried it was going to turn out like “Howard The Duck”, where the introduction was good but the rest of the movie sucked. The movie eventually picks up again with some good material. It’s even interesting to look at from an anthropological perspective once the boring stuff is over with. It’s kind of a rip-off of H. G. Wells’s The Time Machine, but without the Morlocks. There are several references to a “time machine” in the movie. Perhaps it was an inside joke?

Joe is hired by the government (also made up of dimwits) to fix the nation’s crop problem: nothing grows. What gradually dawns on him is that societal decisions have been heavily influenced by certain corporations. Of course the government officials are so out of it they don’t realize this has happened. They’ve swallowed the corporate marketing hook, line, and sinker. He realizes nothing is growing because they’re not irrigating the crops with water. Instead they’re using a sports drink called “Brawndo”, made by a company of the same name. He tries to get them to use water, but to them water is only used in toilets; they don’t even drink it. They have a serious problem understanding why he wants to use “toilet water” to irrigate the crops. Even though the evidence is staring them in the face, they’re stuck on the idea that Brawndo is good for the crops. This made me smile. Being in the computer field, I can recognize this kind of cognitive dissonance in myself and others. You know, when the only tool you know is a hammer, everything looks like a nail? It happens in this field all the time. It’s a sign of cultural backwardness.

An aspect that keeps getting harped on is how “gay” Joe sounds. He’s just articulate. This causes the others to not take him seriously. Even while he’s trying to explain to them what they need to do, they just laugh him off. He gets so discouraged. He can’t believe he’s the smartest man on Earth, and he can’t believe everyone else is so dumb. At one point Rita asks him, “You think Einstein walked around thinking everyone was a bunch of dumbs__ts?” Interesting question.

Despite them blowing him off, he manages to get water to the crops, but then a conflict arises over a classic public interest problem: employment vs. public health. Since they’re no longer using Brawndo on the crops, thousands of workers at the company get laid off. It illustrates the kind of corporate-interest-vs.-public-interest battle that I’m sure goes on in government all the time.

The movie satirizes our popular culture today, but the creators may have done that just to create something that seems familiar. I think they lost a good opportunity to make a point about it like, “Hey, we’re capable of being smart, but our culture is making us dumb,” something of that sort. For the most part it was just “smart people” vs. “dumb people”, as if never the twain would meet.

The point of the story is the importance of doing the work of maintaining civilization, and not taking it for granted. If necessary, do the work of making yourself smarter on your own. We can’t slouch on the job. I can appreciate that message, even in a comedy. I talked about this topic here.

Overall I’d give “Idiocracy” 2-1/2 stars out of 5. The acting was OK. I think the concept was good, but it was hindered by mediocre writing in some spots. Even so I got some good laughs out of it, and it conveys a message that’s rarely heard in our popular culture.

Edit 2/28/08: I watched the movie again, and thought I’d brush up this review.
