Reminiscing, Part 4

Atari ST

While in college, around 1992, I got an Atari Mega STe. It had a 16-bit Motorola 68000 CPU, soft-switchable between 8 MHz and 16 MHz. I got it with 1 MB of RAM, eventually upgraded it to 4 MB, and added an internal 40 MB hard drive, which was about the average size at the time.

Some history

Jack Tramiel, the founder of Commodore Business Machines (the maker of the Commodore 8-bit and 16-bit computers), dramatically left his company and bought Atari from Warner Communications in 1984. Commodore and Atari were fierce competitors, so he was now competing against his own handiwork. After he took over, Atari produced new models: the XE 8-bit series, and the ST series. The first ST models came out in 1985, just a year after the Apple Macintosh was introduced. They ran on the Motorola 68000 CPU at 8 MHz, using Digital Research’s Graphical Environment Manager (GEM) for the desktop interface. The ST and the Commodore Amiga came out around the same time. Both featured GUIs with a similar “look and feel” to the Macintosh, the main difference being that they had color displays, whereas the Mac ran in monochrome. An exciting feature of the Amiga was its pre-emptive multitasking; it was the first personal computer I knew of to have that, though I recently discovered that the Apple Lisa, released a couple years earlier, could multitask as well.

Here’s a Computer Chronicles episode from 1985 that profiled the Atari ST (and Commodore Amiga):

The ST developed a niche with musicians. I wasn’t a musician, so I didn’t buy one for that reason, but I used to hear about famous musical artists who used STs in the recording studio and in live concerts. The most prominent of them was Tangerine Dream. The ST also had a 3-voice Yamaha synthesizer chip. From what I understand it was similar to the sound chip used in the TI-99/4A personal computer sold in the early 1980s.

The STe series had 8-bit stereo PCM sound capability as well, similar to what was available on the Commodore Amiga, though the Amiga had more sound channels.


At first I used my STe mainly for connecting to the school’s computers. I finally got decent 80-column ANSI terminal emulation. I also used it to write a few papers, using what was known at the time as a “document processor,” a program called “Word Up.” My main accomplishment on it was writing an article for a magazine called Current Notes.


After I got my bachelor’s degree I did some C coding on it, using Sozobon C. Of all the computers I had used up to that point, I did the least amount of programming on it. I always intended to get into programming GEM, but I never got around to it.

I started into something that really interested me though. A few years earlier I had been introduced to the idea that the solid inner planets in the solar system had been formed by asteroid collisions. I was curious about exploring this idea, so I created a “gravity simulator” in C, using Newtonian physics to see if I could recreate this effect, or just see what would happen. I got some basic mechanics going, creating orbits with dots on the screen, but it wasn’t very exciting. For the first time the speed limitations of the hardware were holding me back. It was able to handle maybe 10 or 20 gravitational objects on the screen at once, and after that it really started to bog down. I continued this exploration in C++ when I got my first Windows machine some years later, to better effect.

MiNTing an Atari

In the meantime I had found out about an open source adjunct kernel for the ST called MiNT (a recursive acronym for “MiNT is Not TOS”), written by Eric Smith. The term “open source” didn’t exist back then. There was just GNU, and “free software” with source code available. “TOS” was the name of the ST’s operating system. It literally stood for “The Operating System.” Original, huh? MiNT offered a Unix work-alike environment with a complete C/Unix API. It came with a lot of the amenities, too, like the Bourne shell, GCC, UUCP, NNTP, rn, mail, man pages, the X Window System, etc. A lot of the GNU software had ports that worked on the ST. I remember taking the semester project I had written in a graduate-level course on compilers, and compiling it on MiNT using GCC, Flex, and Bison.

It implemented pre-emptive multitasking. This only worked inside the MiNT environment, and it only went so far. I quickly realized that even though I could run multiple simultaneous processes, if I ran too many (more than two or three) it would grind down the machine. Since it was an adjunct kernel, I could still run a GEM program at the same time. GEM only offered a single-tasking environment. MiNT didn’t help there.

GEM had a feature for a special kind of program called a “desk accessory.” The Macintosh had them, too. You could call one up from the “Atari”/”Fuji” menu. A desk accessory called “TOSWin” was my friend. It enabled me to run a MiNT and GEM session simultaneously. For some reason GEM allowed any desk accessory to run side by side, simultaneously with a GEM application, as if it was multitasking, but it didn’t allow you to run more than one GEM application at a time. I remember one of the things I did with TOSWin was I’d sometimes call a BBS that was really busy. Instead of running my GEM terminal program and letting it redial continuously, making it impossible for me to do anything else, I wrote a shell script in MiNT that would redial the number for me until I got a connection. In the meantime I could run another task in GEM while I waited.
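I don’t have the original script anymore, but the redialer amounted to a small Bourne shell loop along these lines. The dial command here is a stand-in stub (the real thing would drive the modem, e.g. by sending a dial string to the serial port and waiting for CONNECT), and the number is made up:

```shell
#!/bin/sh
dial() {
    # Stub for the sketch: pretend the line is busy on the
    # first two attempts. The real version would send a dial
    # string to the modem and wait for CONNECT or NO CARRIER.
    [ "$TRIES" -ge 2 ]
}

NUMBER=5551234        # made-up BBS number
TRIES=0

until dial "$NUMBER"
do
    TRIES=`expr $TRIES + 1`
    echo "No carrier (attempt $TRIES), redialing..."
    sleep 1           # in practice, a longer pause between dials
done
echo "Connected after $TRIES attempts."
```

While this looped in a TOSWin window, the GEM side of the machine stayed free for other work.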

The only thing MiNT really lacked was virtual memory. It was not able to page to disk, so processes were limited to running in my 4 MB of physical RAM. For what I did with it I don’t think I ever hit a memory ceiling. Programs could be loaded anywhere in memory, since they used relative addressing. MiNT also had a clever way of handling multiple instances of the same program: it created a new data area and a new program counter for each instance, but all the instances shared a single in-memory copy of the program code. This avoided redundant copies and saved memory.

It eventually got to the point where I was working in MiNT most of the time, and I was using GEM hardly at all.

Soon after I got into this stuff I realized that MiNT was not unique on the 68000. A Unix-like OS called OS-9 had been around for years on the Radio Shack Color Computer, which used Motorola’s 8-bit 6809 CPU. There was also a version of OS-9 for the 68000. MiNT was a lot like it.

Just a note of trivia: Eric Smith came to work at Atari around 1991/92 and worked on the operating system for the Falcon 030. The end result was MultiTOS. MiNT was the new kernel (the acronym was reinterpreted as “MiNT is Now TOS”). A new GUI layer, called XAES, was written on top of it, which enabled users to run multiple GEM programs at once. The OS was loaded from disk instead of ROM.

A bit about the Falcon

The Falcon was a pinnacle achievement for Atari. It had some things going for it. Like the TT, their first 32-bit model, it produced VGA-compatible video, though it had 16-bit color; the TT had 8-bit color. One of its features that attracted a lot of developer interest initially was its digital signal processing (DSP) chip, the same one that came in NeXT computers. The problem was it was too little, too late. The market for Atari computers was dying by the time it was released. Atari underpowered the Falcon, forcing its 68030 CPU to run at 16 MHz so it wouldn’t compete with the TT, which ran at 32 MHz. Atari made the Falcon for about a year. They stopped producing computers altogether in late 1993.


These are the games I can remember playing a lot on my STe.

Llamatron, by Jeff Minter, Llamasoft

This video was put up by someone else. The game was a version of Robotron: 2084, except you’re a…um, a llama. It was freeware, though Minter took donations. It was ported to a few platforms; I know it ran on the Amiga and the PC as well. I thought it was incredibly creative. Each level was something new. A great feature was that Minter knew the game would beat you, so he put in several “Do you want to continue?” prompts whenever you lost your last life, letting you get a new set of lives and continue. The point was to have fun with the game, not to lose, get discouraged, and have to start all over.


MIDI Maze, by Xanth Software F/X and Hybrid Arts

I just had to mention this one, even though I only played it once. Oh my god! This game was SO MUCH FUN! 🙂 It was the first 3D multiplayer networked first-person shooter I ever played. It was released in 1987. Every player was a round smiley face of a different color. The only opportunity I got to play it was at a computer show at a mall. I can’t remember who set it up; maybe a user group. A bunch of STs were set up, all networked together via their MIDI ports (hence the name of the game). The early STs had no built-in networking capability, but someone figured out how to use the ST’s MIDI ports as network ports. No special adapters were required. Just plain ol’ MIDI cables were used to daisy-chain a bunch of STs together.

We had to find each other in the maze and shoot at each other. All the mazes were single-level. Scoring was done on the musical staff you see in the upper-right corner of the screen. Once you got up to the G note, you won. 🙂

Fire and Ice, by Graftgold

I only had a demo of this from a disk I got out of ST Format magazine. I played it often anyway. You play a character named “Cool Coyote.” It was an entrancing platform game. The map I played is the one you see in the video above. It kind of felt like you could take your time. Even the music was calming. You’re moving in kind of slow motion through the water, freezing and then smashing sea creatures to get pieces of a key you need to escape the map. Along the way you pick up score items like gems, stars, and dog bones. The audio is a bit out of sync with the video. I couldn’t help that, unfortunately.

F-15 Strike Eagle II

F-15 Strike Eagle II, by Microprose

This was my flight simulator. It was a port of the version from the PC, so it wasn’t great. Everything was represented by non-shaded, non-texture-mapped polygons, which was how most games did 3D on the ST series. You went on bombing missions of enemy targets, taking off and landing from various locations. Along the way you had to evade SAMs and enemy jet fighters trying to shoot you down. The most “exciting” location you could land at was an aircraft carrier. I hated it. I could land the plane fine, but unless I hit the deck right where the landing strip began, I risked going all the way across it as I slowed to a stop, thinking I had made a successful landing, and then tipping over the opposite edge “into the drink.” I could have a completely successful mission, and then “fail” on the landing like this. I couldn’t tell if this was a bug, or if they made it this way on purpose.

The ones that got away

I always admired Starglider and Starglider II. I saw others play them. For some reason I never got them for myself. Maybe they had gone out of print by the time I got my STe.


Like I was saying in Part 1, I really liked the demos that were works of art. They were usually created at “demo meets” where coders would get together for annual conventions, and they’d either prepare the demos beforehand, or make them on-site. They were released for free on the internet. I used to download tons of them. They’d usually come compressed into disk images, which I’d have to decompress to one or more floppy disks, and then I’d try to run them. That was the thing. Sometimes they’d use tricks that were so specific to the machine they coded them on I had to be using the exact same model to get it to work.

The demos were about showing mastery of the hardware technology, showing just how much computing power you could squeeze out of the machine. I don’t know what went into creating them, because that was never my thing, but I’m sure there was a lot of assembly coding involved, maybe some C, and a lot of math. To the best of my knowledge there were no multimedia construction sets around for the ST, so each demo was a unique creation from hours of coding. Of course, they were all from Europe.

These are a few of the best demo examples I could either find on YouTube, or put up myself. “High Fidelity Dreams” and the “Synergy megademo” ran on most STs.

High Fidelity Dreams, by Aura, 1992

Edit 5-11-2017: There was a demo I’d mentioned before that I couldn’t get a screencast of because my computer was too slow. Someone else posted a video of it playing on an Atari STe, and I’ve added it here. Fair warning if you are easily offended, there is a little mature language in this video towards the end when the credits roll.

Grotesque demo, by Omega, 1992

You wouldn’t know it by watching this, but in my opinion this was the best Atari ST/STe demo I’d seen for many years, in terms of technical accomplishment. I remember it sounded really cool if I ran the audio through an amplifier! There are some strobe effects in the demo (probably not good to watch if you have epilepsy!).

When the credits roll the author gets into some of the technical specs for the demo. Right towards the end the author put up a 3D cube that obscures some of the text. So here it is, misspellings and all:

It took me three weeks for the first version. One weeks extra for the final. I made an extra version because I found some bugs in the demo. And to make a new version more interesting, I added the vector objekts under this text. Just to show the speed of my polygon routine. Even I need to boast…

I have even made up some ideas for future demos. But I would be surprised if Omega released more demos on the Atari… Sadly to say, but the new Atari computers doesn’t live up to my ideals. And even more sad is, that my ideals are not unrealistic high… So for the moment I think that the Archimedes is the best choise. It only have 4096 colors, eight 8 bit sound channels, but the Archimedes 5000 model runs on 20 mips! It can move a half meg in one VBL. The Falcon moves about 56K in the same time…

But time will show us what computers (or media…) Omega will turn up next time.

Bye. Micael (TFE) Hildenborg


Brain Damage demo on the Atari STe, by Aggression and Kruz, 1993

Megademo intro., by Synergy, 1993

All of these demos came out in the early 1990s. I think they’re exceptional for the time period. The soundtracks rock! But that’s just my opinion. 🙂

I always enjoyed the Synergy Megademo intro. As the members of the group introduced themselves, they flashed words on the screen that were on their minds. I thought this was unique and intriguing. It told you a little about them, but left you guessing. Some of them always gave me a smile, because of the sequencing, like: “hairy,” “Sharon,” “Stone.” Yech! What an image! 😛

I can explain some of the stuff in it:

“No Precalc”: A common trick in demos was coders would put precalculated tables or coordinates into the graphical parts, so it would look impressive, but the computer wasn’t working very hard to show it to you, just rendering data. Coders got more respect if they had their demo calculate the graphical effects, because it showed true optimizing cred.

“MULS”: A 68000 opcode for “signed multiply” of 16-bit integers.

“NO CARRIER”: This was a phrase computer users of the time were intimately familiar with, but you barely see anymore. It’s the message you used to get in a terminal program from an aborted phone call when using a modem.

“YM 2149”: This was the model number of the ST’s Yamaha sound chip.

“DTP”: This is an acronym that could’ve stood for many things. My guess is “desktop publishing.” There were a couple professional DTP packages for the ST.

“Autoexec”: There was a folder you could set up on an Atari disk, called AUTO, where you could put executables, and the computer would automatically run them when you booted the machine.

“GFA”: This stood for GFA Basic, a programming language.

“133 BPM”: I thought at first this meant “bits per minute,” but it really means “beats per minute” for music.

“Front 242”: A lot of ST demos put the name of the industrial music group “Front 242” somewhere in their demo, usually in the credits. You see it in this intro. I tried listening to some Front 242 music to see what was so inspiring about it, but I couldn’t get into it. The music in these demos was usually real good, and I kept thinking it was from the group.

There’s also a graphic from an “Art of Noise” album cover, a group that was popular in the 1980s.

Edit 5-22-2015:


Summer Delights on the Atari STe, by Dead Hackers Society, 2011

An interesting thing about this one is it combines use of the STe’s Yamaha and PCM sound chips.

Edit 5/11/2017:

Sweetness Follows on the Atari STe, by KÜA, 2015

All the audio in this demo is coming from the computer.

Edit 1-5-2020: I’ve got some more…

Thunderdome – Making Of & Lost Scenes on Atari STe, by Checkpoint, 2015

Everything you see and hear is playing on an Atari.

Bad Apple on the Atari ST or STe, by StackDesign, 2016

Again, everything you see and hear is playing on an Atari.

There are a ton of versions of this demo running on all sorts of platforms. It seems like demo coders of all stripes took up making this digital music video as a kind of challenge.

Edit 3-29-2010: I’ve looked for Falcon 030 material to put here, but a lot of it has been lackluster. I just found the following demos, which were ported to a modified Falcon from the Amiga 4000 a few years ago. These demos play on a Falcon 030 with an accelerator card, containing a Motorola 68060 running at 66 MHz. They’re some of the best I’ve seen on an Atari computer. I said earlier that in my opinion the best graphics & sound demos are works of art. These take what had been my concept of computerized art up to another level. They were created by a group called “The Black Lotus” (TBL). According to the descriptions, they contain some features unique to the Falcon versions. All the music you hear is coming from the Falcon as the demo runs. Enjoy.



Ocean Machine

Edit 5-22-2015: The Black Lotus does it again. They came out with a new demo for the Amiga 4000, and once again ported it to an enhanced Atari Falcon030 (with a 66 MHz 68060 accelerator board). It came out in January 2015.


Edit 5/11/2017: I found yet another good enhanced Atari Falcon demo (run with a 68060 accelerator). Once again, it was originally written for the Amiga in 2012, and was ported by the same group for the Falcon, in 2013.

Kioea, by Mad Wizards

I finally found a couple good stock (unmodified) Atari Falcon demos.

In2ition, by Mystic Bytes, 2013

Electric Night, by Dune and SMFX, 2016

(The part that looks like it’s coming from a VCR at the beginning is coming from the demo.)

Atari popular in Europe

Atari had its success with its computers in Europe, not in the U.S. The ST was able to create some buzz in the computer industry the first couple years after its introduction, but after that its influence gradually faded away.

I went to the UK in 1999. While in Salisbury I visited an internet cafe. As I entered I noticed a shelf unit running along a wall. Piled on the shelves were old Atari STs and Commodore Amigas. I’m not sure why they were just sitting there. I talked to the guy running the shop, saying I was glad to see them. They brought back memories. I think he said people in the neighborhood dropped them off because they were getting rid of them. I can’t remember if the guy said he was giving them away, or what. Anyway, I’d venture to guess this would be an unusual sight in the U.S. The Atari ST didn’t make it big here. It might have gotten to 1-2% market share at most, and even that’s a rough guess. A lot of the people who bought them here were programmers and musicians.

The end of an era

There were 4 generations of Atari 16- and 32-bit computers before Atari (the consumer electronics company) faded away: 1) the original ST line, released in 1985; 2) the Mega ST line, which came out in the late 1980s; 3) the STe line, and the TT (their first 32-bit model), which came out in 1990/91; and 4) the Falcon 030 (the 2nd and last 32-bit model), which came out in 1992.

Atari had a complex history, but the company that was named Atari, which produced the home console gaming machines and computer lines, stopped making computers in 1993. They tried to focus exclusively on their video game systems. Atari faded away to nearly nothing. The company was bought by a disk drive manufacturer, JTS, in 1996. The purchase of the company was really a way to sell Atari’s intellectual property. The company ceased to exist for all intents and purposes. There were no employees left to speak of, and the Tramiels did not come over to JTS. A couple years later the IP was bought by Hasbro, and they brought out updated versions of some old Atari classic video games like Pong and Frogger. A few years after that the IP was bought by Infogrames. Infogrames officially changed its name to “Atari” in 2003. In name, Atari still exists, but it’s not the same company. The only legacy that still lives on from its former existence is its video games.

In my opinion Atari computers were a casualty of the internet, though I could be wrong. The PC was finally surpassing them in capability as well. When the internet began to get popular, Atari was caught flat-footed. You could always hook up a modem to an Atari, but it had no TCP/IP stack at the time, and no web browser or e-mail applications. This was true even on the Falcon. The Mega STe and TT had LAN capability, but that was used for peer-to-peer connections, if anything. I remember hearing in the mid-90s that someone had developed a free TCP/IP stack as a desk accessory. I believe it was called “STiK.” I don’t recall hearing about a web browser at that time, though. The only browser that may have existed for it was probably Lynx, an open source, text-mode web browser.

I kept using my STe until 1997, when I got my first Windows PC.

The passing of Atari and Commodore (they declared bankruptcy in 1994) from the computer scene marked the end of an era.

The first generation of personal computers

Beginning in the late 1970s many companies got in on the “personal computer craze.” By the early 1980s everybody and his brother was in on it, though this mass proliferation of hardware designs flamed out within a few years. When the shakeout was over, the only companies still in the business in the U.S. were Commodore, Atari, Apple, and IBM. Commodore and Atari made low-end and mid-range consumer machines, and Apple and IBM made high-end business machines. At some point in the late 1980s, IBM retreated from the personal computing business, largely due to the PC clone manufacturers who undercut it in price. It didn’t completely leave the business until a few years ago, when Lenovo bought its PC manufacturing operations.

The era was characterized by an emphasis on what the hardware could do. The operating system was incidental. Different computers had different capabilities. This began to change with the introduction of GUIs and then multitasking to personal computers. The operating system became more visible, and you communicated with it more, rather than some programming interface. In the early machines it was always possible to access the hardware directly. In the 1990s this became less possible. The operating system became your “machine.”

Edit 11/17/2012: I used to like this idea of the operating system becoming your “machine,” but I now realize it gave no semblance of a computing environment, as the older machines did. You still had an “instruction set” of sorts, but it wasn’t as nice.

A common theme in the 1980s was that each system was incompatible with the other. You couldn’t exchange software or data, because each computer stored stuff in a different format. Each computer used a different CPU and chipset. Attempts at compatibility were tried through emulation, but it was rare to see an emulator that ran fast enough to make it truly useful to most people. Computers were not powerful enough then for software emulators to provide a satisfying experience. The best emulators were hardware-based, containing some of the original chip technology, like an Intel CPU, for example.
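The cost of software emulation came from the interpreter loop itself: every guest instruction required a fetch, a decode, and an execute on the host, often dozens of host cycles per emulated cycle. As a rough illustration (this 3-opcode machine is invented for the sketch and resembles no real CPU), the core of such an interpreter in C looks like:

```c
/* A toy fetch/decode/execute loop illustrating why pure software
 * emulation was slow: each emulated instruction costs many host
 * instructions. The opcodes here are invented for the sketch. */
#include <stdint.h>
#include <stddef.h>

enum { OP_HALT, OP_LOADI, OP_ADDI };

/* Interpret the byte-coded program; return the accumulator. */
int run(const uint8_t *prog)
{
    int acc = 0;
    size_t pc = 0;
    for (;;) {
        uint8_t op = prog[pc++];            /* fetch  */
        switch (op) {                       /* decode */
        case OP_LOADI: acc  = prog[pc++]; break;   /* execute */
        case OP_ADDI:  acc += prog[pc++]; break;
        case OP_HALT:  return acc;
        default:       return -1;           /* illegal opcode */
        }
    }
}
```

Hardware-based emulators sidestepped this loop entirely by letting a real CPU, such as an Intel chip on a plug-in board, execute the guest code directly.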

Eventually standardization set in. IBM became the standard business machine, with Microsoft becoming the standard bearer of the operating system that ran on it, and its clones. The Apple II and then the Macintosh became the standard computer used in schools and creative businesses, such as publishing. The Commodore 64 became the “standard” computer most people bought for the home, though the PC eventually took over this role once it came down in price. The same has happened with the schools, with PCs taking over there as well.

Returning to what I said earlier, what really changed the game in the 1990s was the internet. The computing platforms that have survived are the ones that best adapted to it. The ones we use today are a product of that era, though the GUIs we use are the lasting legacy of the 1980s.

I’ve got “one more part in me” for this series, coming up.

—Mark Miller,

9 thoughts on “Reminiscing, Part 4”

  1. I played Starglider and Starglider II extensively.

    The first was one of the first games released for the ST. I remember being at an Atari “fair” where they had demos going of Starglider, Mudpies, and Time Bandit. Starglider blew everything else away. It had digitized music of a real band playing: “Staaaaaaar Glider… (do da de da dooda do da deedee da) … From Rainbird (Rainbird rainbird rainbird….)”

    The game play was fantastic. Every wireframe bad guy had different behavior. The Stomper, the Walker, and the Starglider all had different “personalities.” You weren’t just blowing up clay pigeons. Also cool was the way you launched missiles: you’d switch to the missile’s perspective and guide it to the target. If the ST was the spiritual heir to the Commodore 64, then Starglider was the spiritual heir to the 8-bit Star Raiders.

    Starglider II upgraded to filled-in polygon graphics. The bad guys were much less interesting, but you got an entire solar system to explore. This game had more of a graphic-adventure feel to it. This is one of the few games that I beat without using a cheat book. The game came with a tape of synthesizer music that I played over and over and over….

    (To complete your picture of Atari history, you should definitely mention the Lynx and the Jaguar.)

    I can think of a couple of dozen games that were extremely significant on the 8-bit Atari computer. There were only a few for the ST, though. And for the Jaguar and Lynx, even fewer. The big shift in computing for me is when you move to a platform where it’s impossible for a lone teenager to accomplish something significant up in his bedroom. (I completely lost interest in anything to do with computer RPGs when they ceased to be more-or-less the product of one person’s work. That would be about Ultima IV or so, I think.)

    Another thing about 8-bit computers– if it looked or sounded “digital” it was somehow cooler. We loved tapes of beepy versions of Bach and so forth…. If a poster looked like a bad dot matrix printer made it, it was somehow better– because it was “computery.”

    I think that one thing that underlies my interest in Lisp (and even Smalltalk and Emacs) is the fact that I still carry with me a feeling that the wrong platform won the home computer wars. Why learn machine language or assembler for the x86 computer when it’s clearly the wrong hardware?!

  2. Hi Lispy.

    I don’t remember when Starglider I came out. As I recall, it came out for 8-bit computers first. I’ve seen versions of it for the C-64 and Apple II. It wasn’t ported to the Atari 8-bit for some reason. Yep, I remember the music for it on the ST version. 🙂

    I remember really being blown away by Starglider II. The 3D animation seemed reasonably fluid, and the fact that you could fly between planets was awesome! You’d literally be doing stuff on one planet, take off from it, and then you were up in space, doing stuff there. It felt like a complete, realistic universe.

    I’ve looked at Starglider II recently and I don’t have the excitement about it I used to have. I think it was because at the time it was the best game I’d ever seen. Now I’ve seen better in terms of graphics and action. It’s definitely aged.

    My focus in this series was on the evolution of personal computers, talking about the best accomplishments that were made on them, from what I can remember. It’s pretty subjective.

    The only time I remember being really into pure gaming platforms (consoles) was when I was a kid. I remember I wanted an Atari 2600 really bad in the late 70s/early 80s. So I can relate to when you say “anything computery was cool”. That was the appeal of Tron, though there was something more about it to love when I finally saw it. Looking back on it, it was a “pop culture”.

    I’ve been a believer in the personal computer as a platform. When my Atari STe was showing its age, and the web/internet was growing in popularity I glommed on to the Wintel PC, because I felt as though it was the only popular platform left that I could relate to. It gave some vestigial connection to the past. What I realized when I discovered Alan Kay’s work much more in depth last year was that he was basically the “founder” of the personal computer. He would say there were others who came before him who did groundbreaking work as well, but he’s the most visible figure and champion for that vision. And from this perspective, the Apple Macintosh was actually closer to “the real thing” (though not all the way there) than what I’ve been using. What I also realized in a profound way was that while he had laid out a vision of what personal computing, and programming could be, it was ignored, or only partially adopted by the industry. So I’m in agreement with you that the wrong platform won out. 🙂 Alan Kay explained it this way in an interview he did with ACM Queue in 2004:

    Perhaps it was commercialization in the 1980s that killed off the next expected new thing. Our plan and our hope was that the next generation of kids would come along and do something better than Smalltalk around 1984 or so. We all thought that the next level of programming language would be much more strategic and even policy-oriented and would have much more knowledge about what it was trying to do. But a variety of different things conspired together, and that next generation actually didn’t show up. One could actually argue—as I sometimes do—that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects.

    You could think of it as putting a low-pass filter on some of the good ideas from the ’60s and ’70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.

    So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.

    And so we come full circle… Gosh, there’s so much in those 3 paragraphs. Several of the posts I’ve written have been based on different aspects of this interview, elaborating on them and interpreting them, based on my own experience.

    It’s ironic that even though there was a significant educational aspect to Smalltalk when it was invented, I don’t think it ever made it into the schools. I never saw it. What we saw was Logo, from one of Alan Kay’s mentors, Seymour Papert, and even then the schools didn’t get it. We were taught it as a way to introduce us to programming in a procedural fashion. That was never the point. The point was to teach children programming so that they could learn Calculus using linear algebra! Somewhere along the way the message got lost. I couldn’t tell you how it happened. I know now that Papert wrote the book Mindstorms, which talked about his experience using Logo with kids. It sounds like most educators never read it. Maybe it should’ve been shipped with every copy of Logo. Maybe the people who ported Logo to different platforms never understood its purpose either. It’s as if we tried to introduce rocketry to a tribal people who had never seen or heard of it before by just giving them the technology, without telling them what it’s really for, and all they learned to do was build spears in the shape of fuselages, because whoever gave them the technology forgot to give them the knowledge Wernher von Braun or Robert Goddard had about it.

    There was a book I started reading a while ago on Kay’s recommendation, The Myth of the Machine: Technics and Human Development, by Lewis Mumford. (You can find Kay’s reading list here.) Mumford said one of the great myths of our society is that technology is what’s brought us progress. The true cause of progress is the evolution of our own ways of thinking. Kay says that’s what school is for: to teach those inventions of thought so that we don’t have to repeat the cycle of relearning them every thousand years.

    Papert and Kay worked to bring Logo into the schools in the early 1980s, and they both eventually realized what I see now: that teachers were missing the point. Perhaps they learned the hard way what Mumford was talking about.

    Lately I’ve gone back to learning about Lisp, reading the Lisp 1.5 Programmer’s Manual, because it’s come up a few times as a foundational document for symbolic computing. I’ve already learned something that mystified me for years about Lisp: conses and lists are NOT the same thing! Secondly, it’s helpful to think in pairs in Lisp. Now I’m wondering why I wasn’t shown this when I was (briefly) learning Lisp in college, for cripes sake! I don’t know for sure, but my guess is it would’ve saved me a lot of confusion.
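    To spell out the distinction for anyone who hasn’t hit it: a cons is just a pair of two values, and a list is the special case of conses chained through their second slot and terminated by NIL. Here’s a rough Python sketch of the idea — modeling conses as 2-tuples, with None standing in for NIL, is my own choice of representation, not anything from the manual:

```python
# A cons cell is just a pair: (car, cdr). Model it as a 2-tuple.
NIL = None  # stand-in for Lisp's NIL / the empty list

def cons(car_val, cdr_val):
    return (car_val, cdr_val)

def car(c):
    return c[0]

def cdr(c):
    return c[1]

# A "proper list" is the special case: conses chained through
# their cdr slots, ending in NIL.
lst = cons(1, cons(2, cons(3, NIL)))   # the list (1 2 3)

# A cons whose cdr is NOT a list is still a perfectly good cons;
# Lisp prints it as a "dotted pair", e.g. (1 . 2).
pair = cons(1, 2)

def is_proper_list(c):
    """True only for NIL-terminated chains of conses."""
    while c is not NIL:
        if not isinstance(c, tuple):
            return False
        c = cdr(c)
    return True
```

So every list is built out of conses, but not every cons is part of a list — which is exactly the point the manual makes with its dotted-pair notation.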

    Anyway, enough rambling for now.

  3. I read Mumford and Papert when I was in college in the early nineties. (I would waste huge amounts of time looking at stuff not on the curriculum.) You can count on teachers to mangle anything they get their hands on, though. “Build spears in the shape of fuselages….” Heh. With the wasteland that is public high school on one side and the “Java-school” CS departments on the other, I’m decidedly pessimistic about how things are going to play out from here. It’s the kids who write “Caves of Ice” on their own time who will find a way, in spite of all the grown-ups who think they’re trying to help them.

  4. Re: “Build spears in the shape of fuselages…”

    I was going to say they would “build bombs out of the rocket fuel”, but somehow I felt as though that wasn’t as realistic. I can imagine the “spears” thing happening. It would be pragmatic for their current level of knowledge, if they were not taught anything else. Maybe they’d eventually get around to the “bombs”.

    I don’t know, but my suspicion is that the reason things happened the way they did was that despite teachers saying they were interested in knowledge for its own sake, they had their own sense of pragmatism, and what was important for students to learn in order to make it in the world. There’s also the problem of just basic computer literacy with teachers. It’s been a joke for years that it’s the kids who understand this stuff, not the adults, and it’s not far from reality.

    In their minds, teaching 10-year-olds Calculus has no practical application, because they won’t encounter it again in the normal curriculum for at least another 7 years. Plus, the teachers probably didn’t take Calculus themselves, or if they did, didn’t like it. And what if the kids got it and they didn’t? Wouldn’t that make them feel stupid?

    There’s a CS professor at Georgia Tech whom I reference from time to time in my blog, Mark Guzdial, who I think has a decent level of vision, more than many. Recently he was somewhat critical of eToys (an educational package in Squeak) in the way Alan Kay has tried to use it, to teach kids differential equations. He made a pragmatic argument. He wondered, first, whether the kids were really learning DiffEq, and secondly, since the eToys experiment didn’t cover algebra at all, he doubted that even if they did learn DiffEq, the kids would be able to map their knowledge onto the algebraic notation of it when they first encountered it in college several years later. They’d probably forget what they learned by then, he figured. He therefore thought this particular use of eToys was kind of pointless. His thinking was, “eToys is not going to put a dent in the overall school curriculum. Therefore you’re wasting your time.” I think he’s wrong. I had a direct experience showing it’s possible kids will remember these lessons over a period of time, though maybe the timing is significant. Maybe it would be more effective within the current curriculum if it were introduced later, though I’m sure the message Kay was trying to convey is “the math curriculum is obsolete”.
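    For context on what “teaching kids DiffEq in eToys” means: as I understand Kay’s demos, the kids write a little script that runs every tick, incrementing a speed by an acceleration and a position by the speed — which is a first-order (Euler) integration of the equations of motion, even though the kids never see the algebraic notation. A hypothetical Python sketch of the same idea (the names and numbers are mine, not eToys’):

```python
# A falling ball, eToys-style. Each "tick" the script does:
#   speed  <- speed + acceleration * dt    (dv/dt = a)
#   height <- height - speed * dt          (dy/dt = -v)
# which is Euler's method for integrating the ODE, done
# tick by tick instead of symbolically.

def drop(height, acceleration=9.8, dt=0.1):
    """Simulate a ball dropped from `height` meters;
    return the (approximate) time until it hits the ground."""
    speed = 0.0
    t = 0.0
    while height > 0:
        speed += acceleration * dt
        height -= speed * dt
        t += dt
    return t
```

Analytically the answer is t = sqrt(2h/a), about 4.52 s for h = 100 and a = 9.8; the tick-by-tick version lands close to that, which is the whole pedagogical trick — the kid discovers the behavior of the equation without ever writing the equation.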

    I participated in a couple of ACM programming contests while I was in college. At one of them, as we were being transported to the contest site, I eavesdropped on a conversation between a student and our sponsoring professor, who taught computer graphics. The student was telling him how he had been taught to draw circles in Logo, and that he was taught that what they were doing was a form of Calculus (I can’t remember what it was called now). I felt rather betrayed. I felt like, “Hey, I had Logo. We drew circles in it. Why wasn’t I told this??” I don’t know when the student was taught this; maybe in high school.
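    For anyone curious what the circle lesson looks like: the classic Logo idiom is REPEAT 360 [FORWARD 1 RIGHT 1], approximating the curve by many tiny straight steps, each turning a little — the same “limit of small steps” idea that underlies calculus. Here’s a small Python translation of the turtle math (no graphics, just the coordinates; my own rendering, not actual Logo):

```python
import math

def turtle_circle(steps=360, step_len=1.0):
    """Trace REPEAT steps [FORWARD step_len RIGHT 360/steps]
    and return the list of visited (x, y) points."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    turn = 2 * math.pi / steps   # RIGHT 360/steps, in radians
    for _ in range(steps):
        x += step_len * math.cos(heading)  # FORWARD step_len
        y += step_len * math.sin(heading)
        heading += turn                    # RIGHT turn
        points.append((x, y))
    return points

pts = turtle_circle()
# The path closes up on itself (last point ~ first point), and the
# 360-sided polygon approximates a circle whose circumference is
# steps * step_len, i.e. radius ~ 360 / (2*pi) ~ 57.3 units here.
```

The connection the professor was presumably drawing: as the steps get smaller, the polygon converges on the true circle — which is precisely the limiting process a calculus course formalizes.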

    I’ve read characterizations of what Papert and Kay tried to do which said they tried to “un-school” education. So anyone who believed staunchly in “school” as it existed fought it, and found ways to integrate what was being introduced into the “school” model, which effectively stripped out everything that was powerful about it.

    And there are people who staunchly believe the school model as it exists is the best model for producing wonderfully educated people. I’ve seen examples of this especially when the subject of school reform comes up. All the teachers seem able to talk about is the institution, not the students. They swear up and down that they’re teaching the students well, and I’m sure they believe it. It would hurt their pride to admit otherwise.

    I used to say that teachers were doing a good job and deserved to be paid more. From the sound of things, though, that’s looking less and less true, given the rate at which professionals are choosing private schools, or so-called “semi-private schools” (public schools in exclusive, wealthy districts), over public schools, even in places that used to be known for having excellent schools, like New York City. I read one account a few years ago from someone, now in his 60s, who revisited a school he attended in New York and found that kids were now learning in 9th grade what he had learned in 8th.

    An interesting perspective Alan Kay offers is that a lot of the knowledge taught in schools now is from the 19th century and earlier. Little of what was discovered in the 20th century is being taught. Obviously some of it is; continental drift, for example, is a 20th-century discovery. Maybe he’s talking more about math and modes of thought.

  5. Just watched the Computer Chronicles episode. Wow, that takes me back. The opinion guy basically said to ignore the ST and Amiga, but the CGA program that got rave reviews looked awful, and the guy specifically said how cute the opening screen was!

    The ST was the obvious choice: “Power without the price.” (I wonder how Stoneware did in the long run… and if they could afford big-haired California girls to ship out their stuff for much longer….) I’m probably biased.

    Did you see on the hard-drive they had Forth and Logo and other good programming stuff? 🙂

  6. No, I didn’t notice the programming stuff. I know their systems used to come with ST BASIC, which I think was a direct port of GEM BASIC, or something, from the PC version of GEM. That’s the version of BASIC Compute! used for it.

    By the time I got my Mega STe all the programming stuff was gone. I think I got something that was called a “language disk”, but it had no programming languages on it. I don’t remember what it had.

    I can’t remember the “opinion guy’s” name, but I think he gave a realistic analysis of the situation. He didn’t say that the ST and Amiga should be ignored. He said that the ST and Amiga were likely to suffer from lack of software support, because the majority of developers were working on stuff for the PC and the Mac, and were unlikely to change from those two.

    A while back I wrote a blog post called “Great Moments in Modern Computing”, or something like that, where I linked to another Computer Chronicles episode that covered the Apple Macintosh a year after its release (1985). Paul Schindler (he’s the guy who reviewed the CGA program, which I think was called “Stock Trader” or something) gave a review saying he didn’t think the Mac was going to make it with business types, because it was slow and graphics-based, had hardly any software support at the time, and lacked color. He turned out to be right, perhaps because of the software support issue alone.

    Yep. The PC was the dominant platform then. IBM was the dominant company; “No one ever got fired for buying IBM” was the mentality. It was viewed like Microsoft is now, big and bad, but it set the standard. In fact, the government had pursued antitrust charges against IBM for years, in a case that was eventually dropped.

    In the Stoneware segment I noticed that it looked like they had ONE programmer! There was a guy sitting in front of an ST, and you could see he was working with code, though the video was too blurry to tell what language it was in. Whatever environment he was working in, it didn’t look like an IDE. How primitive the stuff was then. I’m sure to a lot of developers Turbo Pascal on the PC seemed like a dream.

  7. Um… Turbo Pascal on the PC *was* a dream! To this day, it is still one of the best coding environments (relative to the task at hand) I have ever worked in. I would rather use Turbo Pascal to write DOS programs than Visual Studio (pre-2003) to write Windows apps, or pre-2005 to write web apps. 🙂


  8. Pingback: The 150th post « Tekkie

  9. Pingback: Jack Tramiel has passed away « Tekkie
