Looking at the rare Apples

I came upon some videos from a vintage computer collector named Rudolf Brandstötter. He goes by the handle “alker33” on YouTube. He also has a blog here.

I was gratified to come upon these videos, because they allowed me to take a look at these old computers. The Apple models I’d used growing up were the Apple II+, the IIe, the 512K “Fat” Macintosh, either the Lisa 1 or Lisa 2, and the Mac Plus.

The size of each video window below is so small that if you want to take a good look at what’s going on on the video screen of each computer, you need to blow the video up to fullscreen. You do this by hitting the “broken square” icon in the lower-right corner of the video window. To take it back to normal size, hit the “shrink” icon in the lower-right corner, or hit the ESC key.

The Apple I

This was a kind of mythical machine back when I was a teenager in the early 1980s. People didn’t talk about it much, but when they did, there was some admiration for it, because it’s what started the company, and it was the first computer Steve Wozniak built. Even back then they were collector’s items, fetching (as I recall) tens of thousands of dollars. I don’t think I got an idea of what it actually was until the early 1990s, for it was not sold as a consumer computer. It was just a motherboard. It came out in 1976, and famously sold for $666.66 ($2,691.95 in today’s money). Steve Wozniak was asked about the price recently, and he said it wasn’t a marketing gimmick. There was no satanic meaning behind it. He said they looked at the cost it took to make it, and ran it through a standard profit margin formula, and it just happened to come out to that number!

It did not come with a keyboard, monitor, power supply, or storage device. It came with a MOS Technology 8-bit 6502 microprocessor running at 1 MHz, and 4 kilobytes of memory, upgradeable to 16K. It had composite video output built in. The only software it contained in Read-Only Memory (ROM) was a machine language monitor that it booted into when you turned it on (that was the operating system!). The owner could use it to enter and run programs in hexadecimal via the keyboard. This was an improvement over earlier microcomputers like the Altair, which had their owners entering programs one byte at a time using a panel of switches. A tape recorder interface card had to be purchased separately to store and load programs. The card came with Steve Wozniak’s Integer Basic on cassette tape.
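If you’re curious what that hex-entry workflow was like, here’s a small toy sketch in Python of the examine/deposit interaction the monitor offered. To be clear, this is just a model of the idea, not the actual Wozmon code (which was 6502 machine code), and the example bytes are arbitrary.

```python
# A toy model (illustrative only, not the real 256-byte monitor): type a hex
# address followed by a colon and hex bytes to deposit them into memory, or
# just an address to examine what's stored there.
memory = bytearray(65536)   # a full 64K address space, far more than a stock Apple I had

def monitor_line(line: str) -> None:
    line = line.strip()
    if ":" in line:                              # deposit: "0300: A9 00 AA"
        addr_text, data_text = line.split(":", 1)
        addr = int(addr_text, 16)
        for token in data_text.split():
            memory[addr] = int(token, 16)
            addr += 1
    elif line:                                   # examine: "0300"
        addr = int(line, 16)
        print(f"{addr:04X}: {memory[addr]:02X}")

# Example session, roughly the shape of hand-entering a short program:
monitor_line("0300: A9 00 AA 20 EF FF")
monitor_line("0300")    # prints "0300: A9"
```

As I understand it, the real monitor also let you run a program by typing an address followed by “R”.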

An Apple I in a homemade case, from Wikipedia

Brandstötter, as I recall, got his from someone who had framed the motherboard. Some of the chips had to be replaced, but otherwise it was in working condition, as you can see in this video.

An original Apple I

I have to admit, watching this is anticlimactic. There’s not much to see, but I was glad I got to see it anyway. Finally, I could look at this machine that I’d heard rumors about for decades.

Brandstötter takes you on a “tour” of the machine, and shows the Apple I manuals. The computer boots into the machine language monitor, from which he can either program in hexadecimal, or load machine code from a cassette tape. Brandstötter does one of the standard tests of typing in a “hello world” program in hexadecimal. You can read about the built-in operating system (all 256 bytes of it!), and the assembly mnemonics of this little test program here. Then he loads Integer Basic from tape (after which he types in a short Basic program), and then he loads a program that displays Woz’s and then Steve Jobs’s mugs as ASCII art. The Apple I did not have a graphics mode.

Lest one think that maybe Apple used some sort of digitizer to get their faces into binary and saved them to tape, I learned not too long ago that it was common back in the 1950s and 1960s among those who were into digital media to hand-digitize photographs in ASCII. That may have been what happened here.

What’s special about this, as Brandstötter notes in the video, is that he owns one of 6 working original Apple I motherboards in the world. Modern Apple I replicas have been made that work exactly like the original. What I read in the discussion for this video (or one of the other Apple I videos I found) is that when Apple came out with the Apple II (the next one I show below), they had a trade-in program where people could turn in their Apple I motherboards. The reason they did this was that it saved them on customer support costs. So there aren’t that many vintage motherboards around. (I’ve read a couple of claims that there are between 40 and 60 of them in the whole world.)

The Apple II

The video below was an interesting find. I remember hearing from somebody years ago that there was such a thing as an Apple II before the II+. This is the first time I’ve seen one of them. Just from how it runs, it seems no different from a II+, though there were some minor differences (I derived this information from this Wikipedia page).

The II came with the same 6502 CPU as the Apple I, with configurations from 4 kilobytes up to 48K of memory. In its lowest configuration it sold for $1,298 ($4,921 in today’s money). The 48K configuration sold for $2,638 ($10,002 in today’s money). It came as a complete unit, with its own case (no assembly required), ready to be hooked up to a monitor or TV. It had several internal expansion slots, Woz’s Integer Basic, and an assembler/disassembler built into ROM. If you just booted into the ROM it would take you to Integer Basic by default. When it came out in 1977, owners still could only load/save programs on cassette tape. A disk interface created by Steve Wozniak, along with a disk drive and a Disk Operating System (DOS) written by a company called Shepardson, came out for it a year later. Applesoft Basic (written by Microsoft), which handled floating-point, also came out for it later on tape. As you’ll see in the video, the II had a graphics mode. What you don’t see (due to the monochrome monitor) is that it was capable of displaying color.

The II+ came out in 1979, with 16, 32, or 48K configurations, and could be expanded to 64K. It had a starting price of $1,195 ($3,777 in today’s money). It came with Applesoft Basic in ROM (replacing Integer Basic as the standard language). The assembler/disassembler was removed from ROM to make room for Applesoft Basic, though it retained a machine language monitor that users could enter using a “Call” command from Basic. Owners could load Integer Basic from disk. In addition, the graphics capabilities were enhanced.

The Apple II (not the II+)

Brandstötter types in a brief Basic program that prints numbers across the screen. He copies some files from one disk to another to show that both disk drives work. He brought back memories by loading up Apple Writer, one of the first word processors I learned to use in Jr. high school. Lastly, he loads up a game called “Bug.”

The Apple III

Back in the day, I only saw Apple III’s used as props on a TV show called “Whiz Kids.” The computer came out in 1980, and was designed as a business machine. It used a Synertek 8-bit 6502A CPU running at 1.8 MHz. I think the model Brandstötter uses in the video has 128K of memory (it was capable of going up to 256K). It sold for $4,340 to $7,800 ($12,084 to $21,718 in today’s money). The OS it booted into was called SOS, for “Sophisticated Operating System.” As you’ll notice, it defaults to an 80-column display (the prior Apple II models had 40-column displays). Interestingly, the OS runs through a menu system, not a command-line interface. It’s reminiscent of ProDOS, which I remember running on the Apple IIe sometimes.

The Apple III was designed to either run off of a floppy drive, or a 5 MB hard drive Apple sold called “Profile” (or both). You don’t see the Profile in this video, though you’ll see it in the video I have below on the Lisa.

The Apple III

Here’s Steve Jobs in 1980 describing how the company got started with the Apple I, his philosophical outlook at the time with the Apple II, and what he was looking forward to with “future products.” I really liked his perspective on what the computer enabled at the time. I get the sense he had a very clear idea of what its value was. With regard to “future products,” I think you can read from some of his answers that he was talking about the Apple Lisa, and possibly the Macintosh, though he was being tight-lipped about getting into specifics, of course. Unfortunately there are some glitches in the video tape, and there’s a part that’s unintelligible, because it’s too badly damaged.

The Lisa

This is the Apple GUI computer that preceded the Macintosh. It came out in 1983. It used a 16-bit Motorola 68000 processor running at 5 MHz, and came with 1 MB of memory, expandable to 2 MB. It sold for $9,995 ($23,022 in today’s money). The OS featured pre-emptive multitasking, enabling the user to run more than one application at the same time.

Here’s the Wikipedia article on it.

The Apple Lisa

Brandstötter tells an interesting story about the “Twiggy” floppy drives. They’re the two 5-1/4″ drives on the right side of the case. They were named after a 1960s fashion model who was famously thin. From a get-together Jobs had with Bill Gates 7 years ago, where Jobs mentioned them, I thought he was talking about the 3-1/2″ drives that ended up on the Macintosh, but in fact he was talking about these drives. Brandstötter says that Apple tried using it on the Mac during its development, but they ended up going with the Sony 3-1/2″ drive, because these 5-1/4″ drives were so unreliable.

They used special floppy disks that were only made for use on this drive (talk about lock-in!). They stored 870K per disk. Brandstötter shows them to the camera, and my goodness! They have two “windows” per side where they can be accessed by two read-write heads, in a double-sided fashion. (Normal 5-1/4″ disks only had one “window” per side.) The two read/write heads (one on top, one on the bottom) were positioned on opposite sides of the spindle, and moved in tandem across the disk, because the designers were concerned about head wear in a conventional double-sided disk configuration (with two heads opposing each other, on the same side of the spindle). Each head was opposed by a pad to press the disk against the head.

Apple ended up having buyers trade in their Lisas for Lisa 2’s (which came out in 1984), which had the more reliable 3-1/2″ drive. However, Brandstötter shows off the fact that his “Twiggy” drives work. After a reboot, he shows a bit of what LisaDraw can do.

This was a really interesting romp through some history, most of which was before my time!

Postscript:

As I was doing my research for this post, I noticed that there was some talk of Apple I software emulators, enabling people to experience using this vintage machine on their modern computer. If you desire to give it a whirl, here’s a page with several Apple I and Apple 8-bit series emulators that run on various platforms. I haven’t tried any of them out. It looks like there might be a little setup necessary to get the Apple I emulator running. I noticed there was a tape PROM file (for, I assume, accessing tape files, to simulate loading/saving programs). Usually this just involves putting files like the PROM file in a known location in your storage, and directing the emulator to where it is when you first run it. Also, here’s the Apple I Operating Manual, which contains the complete hex and assembly listing of the machine monitor, and a complete electronic schematic of the motherboard. It’s up to you to figure out what to do with it. 🙂

I leave it up to readers to find emulators for the other platforms. I know there are at least Apple II emulators available for various platforms. Just google for them.


Remembering Steve Jobs and Apple Computer

Looking at the retrospectives on Steve Jobs after news of his death, I noticed they mostly recalled the last 10 years of his life: the iPod, the iPhone, and finally the iPad. Most of the images show him as an old man, an elder of the digital age “who changed the world.” This is not the Steve Jobs of my memories, but it is the Steve Jobs of our modern era, one who created digital devices that run software in tightly controlled environments for the consumer market.

My fond memories of Apple began with some of the company’s first technology products, back in the early 1980s. I didn’t know who Steve Jobs was at first, but I found out rather quickly. My memories of him are as a vibrant young man who believed that small, personal computers were the wave of the future, though “small” meant something that would fit on your desk…

I can say that I “experienced” Steve through using the technology he helped develop.

Apple’s first big success: The Apple II

My first experience with their technology was the Apple II Plus computer.

The Apple II Plus, from Wikipedia.org

The electronics were designed by Steve Wozniak, who has been called a “Mozart” of electronics design. Through the creative use of select chips and circuitry, he was able to pack a lot of functionality into a small space. Jobs designed the case for the computer. At the time that the first Apple II came out, in 1977, it was one of the first microcomputers that looked like what we might expect of a computer today. Most computers of the time were constructed by hardware hackers. The Apple was different, because you didn’t have to worry about building any part of it yourself, if you didn’t want to. The thing I heard that was really appealing about the Apple when it was launched was that the company was very open about how it worked. They wouldn’t talk about 100% of everything in it, because some of it was proprietary, but they’d tell hackers about everything else. They said, “Go to town on the darn thing!” That was the reason it got an early lead on everyone else, because Jobs recognized that its market was mainly software hobbyists. It was appealing to people who wanted to do things with the electronics, to expand upon what was there, but it was targeted at people who wanted to manipulate the machine through software.

My first encounter with a II Plus was at my Jr. high school in 1982. The school had just 3 of them. One was in the library, and students had to sign up for time on it. The other two were owned by a couple teachers, and were kept in their offices. The following school year my school got a computer lab, which was filled with Apple IIe’s. That same year the local public library made an Apple II Plus available. Most of the programming I did in my teen years was done on the II.

It was a very simple, but very technical, machine by today’s standards. When you’d start it up, it would come up in Applesoft Basic (written by Microsoft), a programming language environment that doubled as the computer’s command-line interface/operating system. All you’d see was a right square bracket on the lower-left side of a black screen, and a blinking square for a cursor.

Applesoft Basic, from Wikipedia.org

It offered an environment that allowed you to run an app., and manage your files on disk, by typing commands. What I liked about it was that it offered commands that allowed me to do commonsensical things. With other 8-bit microcomputers I had used, I had to go through gyrations, or go to a special program to maintain disk files. If I wanted to do some graphics, the commands were also right there in the language. With some other popular computer models, you had to do some complicated maneuvers to get that capability. It offered nothing for sound, though. If you wanted real sound, you had to get something like a Mockingboard add-on that had its own synthesizer hardware. The computer had several internal expansion slots. It was not designed for sound out of the box. If you had nothing else, you had to go into machine language to “tweak” the computer’s internal speaker to get that, since it was only designed to beep. This was not “fixed” until the Apple IIGS, which came out in 1986. Regardless, Apple games tried their best to get sound out of the computer’s internal speaker.

Basic was considered a beginner’s programming language. It was less intimidating to learn, because the commands in the language looked kind of like English. Even though it was looked down upon by hackers, Basic was what I used on it most of the time. It was technology from an era when learning something about how the computer operated was expected of those who bought them.

To really harness the computer’s power you had to program in assembly language, or type bytes into the machine directly, using what was called the computer’s built-in machine monitor. The square bracket prompt would change into an asterisk (“*”), and you were in no man’s land. The Basic environment was turned off, and you were on your own, navigating the wilds of the machine’s memory, and built-in routines, giving commands and data to the machine in hexadecimal (a base-16 numbering system). This is what the pros used. You had total command of the machine. You also had all of the responsibility. There was no protected memory, and the machine gave you as much rope as you needed to hang yourself. If your program wandered off into disk operating system code by accident, you might corrupt the data you had on disk. Most of the commercial software written for the Apple II series was written in this mode, or using a piece of software called an assembler, which allowed the programmer to use mnemonic codes, which were easier to deal with. Programs written in machine code ran faster than Basic code. Basic programs ran in what’s called an interpreter, where the commands in the program were translated into executable code as the program ran, which was a slower process. As a consolation, some people used Basic compilers to translate their programs into a machine code version in one go, so they’d run faster.
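To make the interpreter-overhead point concrete, here’s a minimal sketch in Python (nothing like Applesoft’s actual internals, just an illustration) of a line-numbered dispatch loop. Every statement is re-examined and dispatched each time it runs, which is the per-statement cost that compiled machine code avoids by translating the program once, up front.

```python
# Minimal sketch of line-numbered interpretation (illustrative only).
# Each pass through the loop re-inspects the current statement and dispatches
# on it; that repeated work is the overhead compiled machine code doesn't pay.
program = {
    10: ("PRINT", "HELLO"),
    20: ("GOTO", 10),
}

def next_line(after):
    later = [n for n in sorted(program) if n > after]
    return later[0] if later else None

def run(max_steps=6):
    line = min(program)
    steps = 0
    while line is not None and steps < max_steps:  # cap the loop so the demo halts
        op, arg = program[line]                    # re-read and dispatch every time
        if op == "PRINT":
            print(arg)
            line = next_line(line)
        elif op == "GOTO":
            line = arg
        steps += 1

run()
```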

If you wanted to run a commercial app., you would insert a disk that contained the app. into the floppy disk drive, and reboot the machine. The app. would automatically load off of disk. If you wanted to run a different app., you’d typically remove the disk from the disk drive, and insert a new one, and repeat the process. There was no desktop environment to operate from. You booted into each program, and you could only run one program at a time.

This was pretty much the world of the Apple II. Once graphical interfaces became popular, Berkeley Softworks came out with a piece of software called GEOS that gave the II a workable graphical interface, though I think most Apple users thought it was a novelty, because most of the applications people cared about didn’t run on it.

Another big market for the II was in the public schools. For many years it was the de facto standard in computing in America’s schools. A lot of educational software was written for it.

Stickybear on the Apple II, from atarimagazines.org

A third big market opened up for it when VisiCalc (Visible Calculator) came out in 1979, written by Dan Bricklin and Bob Frankston. It was the world’s first commercial spreadsheet, and it came out first on the Apple II. It was the II’s “killer app,” a piece of software so sought after that people would buy the computer just to be able to use it.

VisiCalc, from Wikipedia.org

I first learned what a spreadsheet was by using VisiCalc. Modern spreadsheets use many of the same basic commands, and offer the same basic features that it pioneered. The two main features it lacked were macros and graphing. Each cell could contain a formula that could draw values from other cells into a calculation. Other than that it was not programmable.
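As a rough illustration of that cell-and-formula model, here’s a toy version in Python. It isn’t VisiCalc’s actual syntax or recalculation logic; it just shows the idea of a cell holding either a plain value or a formula that draws on other cells.

```python
# Toy model of the spreadsheet idea described above (not VisiCalc's syntax).
cells = {
    "A1": 120,                                  # value cells
    "A2": 80,
    "A3": lambda get: get("A1") + get("A2"),    # formula cell referencing others
}

def value(name):
    cell = cells[name]
    return cell(value) if callable(cell) else cell   # formulas pull other cells' values

print(value("A3"))   # 200
```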

An interesting bit of history from this era is that some of the software from it lives on. Castle Wolfenstein, by Muse Software, one of the popular games for the Apple II, has had quite a lot of staying power into our modern era. Remember Wolfenstein 3D by Id Software, and Return to Castle Wolfenstein on the PC? Wolfenstein started on the Apple II in 1981. The following video is from its sequel, Beyond Castle Wolfenstein, which came out in 1984. It gives you the same flavor of the game. Unlike its modern translations, it was a role-playing game. The object was to pick up items that would help you get to your objective. Shooting was a part of the game, but it wasn’t the only thing you had to do to get through it. As I remember, the ultimate goal was to blow up the castle.

Beyond Castle Wolfenstein

Another Apple original that has had a lot of staying power is Flight Simulator. It was originally written by a company called subLogic. They came out with Flight Simulator II, which was ported to a bunch of different computers, including the IBM PC. This was the second version of the product, which was a huge improvement on the original. It featured realistic maps of cities (as realistic as you could get with such a low-resolution display), colorized landscapes (rather than the wireframe graphics in the original), realistic weather conditions you could select, and a variety of aircraft you could fly. Later, expansion disks came out for it that featured maps of real cities you could fly to. Microsoft purchased the rights to Flight Simulator II, and developed all of its subsequent versions.

The original Flight Simulator on the Apple II

Their flops

Apple had some early flops. The first was a now-little-known computer called the Apple III, which came out in 1980. It was a slightly faster version of the II, using similar technology. It was designed and marketed as a business machine. Unlike the II it had an 80-column text display. The II had a 40-column text display, though in the early 1980s there were 80-column expansion cards you could get for the IIe. The III had a higher memory capacity, and it was backward-compatible with the II through a compatibility mode.

The Apple III, from Wikipedia

Their next flop came soon after, the Apple Lisa, which came out in 1983.

The Apple Lisa, from Wikipedia

A screen from the Lisa, from Wikipedia

It was also marketed as a business computer. Most people give props to the Macintosh as being Apple’s first computer with a graphical user interface, and a mouse, but it was the Lisa that had that distinction. This was Apple’s first crack at the idea. It had some pretty advanced features for microcomputers at the time. The main one was multi-tasking. It could run more than one application at a time. Its biggest problem was its price, about $10,000. Unlike the Apple III, the Lisa had some staying power. Apple marketed it for the next few years, trying variations on it to try to improve its appeal.

I had the opportunity to spend a little time with a Lisa at a computer show in the mid-1980s. It had a calendaring desk accessory that was a revelation to me. It was the first of its kind I had seen. In some ways it looked a lot like iCal on OS X. My memory is it functioned just like it. It would give you a monthly, weekly, and daily calendar view. If you wanted to schedule an event for it to alert you about, you entered information on a form (which looked like a conventional dialog box), and then when that date and time came up, it would alert you with a reminder.

When I was in Jr. high and high school, I used to carry around with me a pocket spiral-bound notebook so I could write down assignments, and when I had tests coming up. It looked pretty messy most weeks. I really wanted a way to just keep my schedule sane. The Lisa demonstrated that a computer could do that. I didn’t have regular access to a Lisa computer, though, and there was absolutely no way my mother could afford to get me one, especially just to give me something with a neat calendar! So in high school I set out to create my own weekly planner app. on an Apple II, using Basic. I didn’t own one, but the school had lots of them. I figured I could use them, or the one at the local public library, which I used regularly as well. I wrote about the development of it here. I called my app. “Week-In-Advance,” and I wanted it to have something of the feel of the Lisa calendar app. I saw. So I set out to create a small “graphical interface” in text mode. I succeeded in my efforts, and it showed me how hard it is to create something that’s easy to use! It was the biggest app. I had written up to that point.

The Macintosh

If you’re a modern Mac user, this was kind of its great-granddaddy… Anyway, it’s related. I’ll explain later. It came out in 1984, and was Steve Jobs’s baby.

The first Macintosh, from Wikipedia

I had the thought recently that Jobs invented the idea of the “beige case” for a computer with the Macintosh, which PC makers followed for years during the 1990s, and everyone got tired of it.

This was almost Apple’s third flop. It created a sensation, because of its simple design, and ease of use. Steve Jobs called it “The computer for the rest of us.” It was targeted at non-techie users who just wanted to get something done. They didn’t want to have to mess with the machine, or understand it. The philosophy was it should understand us, and what we wanted.

My local public library got a Mac for people to use a year after it came out. So I got plenty of time using one.

It was a cheaper version of the Lisa, so it was more accessible, but there wasn’t a whole lot you could do with it at first. The only applications available for it at its launch were from Apple: MacPaint, a drawing/painting program (rather like Microsoft Paint today, except with only two colors, and a bunch of patterns with which you could paint on the screen), and MacWrite, a word processor. Just from looking at it, you can tell that no Apple II programs would run on it, and I don’t think you’d want that anyway.

As you can see, it had a monochrome display. It could only display two colors, white and black. This drew some criticism, but it was still a useful machine. The Mac wouldn’t have a color display until the Macintosh II came out in 1987. Incidentally, other platforms had color graphical interfaces a year after the Mac first came out. There was GEM (Graphics Environment Manager) by Digital Research (which was mainly for the IBM PC), the Atari ST (which used GEM), and the Commodore Amiga, not to mention Version 1.0 of Microsoft Windows.

The Mac was probably the first computer Apple produced that represented a closed design. The first Macs were hardly expandable at all. You could expand their memory capacity, but that was it. It had a built-in floppy drive, but no internal hard drive, and no ability to put one inside. The Mac popularized the 3-1/2″ floppy disk format. Before it came along the standard was 5-1/4″ disks. It had an external connector for a hard drive, so any hard drive had to exist outside the case. It had some external ports so it could be hooked up to a printer, and I believe a phone modem.

In that era we were all used to floppy drives making noises. The first Mac’s floppy drive was also a bit noisy, but it had a “hum” sound to it, as it spun the disk. It sounded like it was “humming” some tune that only it knew. Bill Gates and Steve Jobs, when they talked about the development of the Mac at their D5 Conference appearance (below), called it a “twiggy” drive. The reason for this sound it made, I later discovered, is it used a data compression technique that Steve Wozniak had developed for the Apple II’s disk drives, called Group Code Recording (GCR). The drive was developed in an effort to store data uniformly on the disk, so as to fit more on it and make floppies a more reliable storage medium. The reason for the sound it made is they varied the speed of the drive, depending on where the read/write head was on the disk. You have to understand a little something about physics to get why they did this.

(Update: 10-21-2011: I realized after doing some research that I held a mistaken notion that the variable-speed drive was a way of achieving data compression. The reason they varied the speed of the drive is the format they used, called GCR, used a fixed sector size. The GCR format compressed data on the disk. Since they varied the speed, they were able to get fewer sectors closer in to the center than towards the outer edge. In effect, they inverted the “normal” way of storing data, and so more evenly distributed data on floppy disks. This article explains it in more detail, under Technical Details:Formatting:Group Code Recording.)

Other computers’ disk drives packed data more densely on the inner tracks of a disk than on the outer tracks. The reason was the disk was always spun at the same speed in those drives, no matter where the read/write head was on the disk, and the read/write head always read and stored data at a constant rate. In physics we know, though, that when you’re rotating anything at a constant rate, the part near the center moves at a slower speed than parts that are farther from the center. As a result, the disk drive will end up packing more data into a smaller space near the center of the disk than it will near the outside of it. The “density” of the data will vary depending on how far from the center it’s stored. Imagine dropping bits of material on a piece of paper that’s sliding beneath your hand. If you speed up the paper, the bits of material will be more spread out on it. I’m not sure how this was done on the Apple II drives, but on the Mac, the way they tried to deal with this issue was to spin the disk faster when the head was moved towards the center, and slower as the head moved to the outer edge, thereby generating different “motor” sounds as it sped up and slowed down the disk.

Their GCR format compressed data, which made it possible to store a little more data per disk than on most other drives. Looking back on it, this technique might seem trivial, given the amount of data we store today, but back then it was rather significant. A conventional double-sided, double-density 3-1/2″ disk drive could store 720K on a disk. But a Mac could store 800K on the same disk, providing 11% more space per disk.
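Here’s a quick back-of-the-envelope check of those capacity figures. The zone layout in this sketch is the commonly cited one for the Mac’s variable-speed format (an assumption on my part, not something stated above): 80 tracks per side, grouped into five 16-track speed zones holding 12, 11, 10, 9, and 8 sectors of 512 bytes as you go from the outer edge inward.

```python
# Back-of-the-envelope check of the 800K figure, using an assumed zone layout:
# 80 tracks per side, five 16-track speed zones, with 12, 11, 10, 9, and 8
# sectors of 512 bytes per track going from the outer edge toward the center.
SECTOR_BYTES = 512
TRACKS_PER_ZONE = 16
SECTORS_PER_TRACK_BY_ZONE = [12, 11, 10, 9, 8]

sectors_per_side = TRACKS_PER_ZONE * sum(SECTORS_PER_TRACK_BY_ZONE)   # 800 sectors
total_kb = 2 * sectors_per_side * SECTOR_BYTES // 1024                # double-sided

print(total_kb, "KB")                                                 # 800 KB
print(f"{total_kb / 720 - 1:.0%} more than a 720K disk")              # 11%
```

By comparison, a constant-speed 720K disk holds 9 sectors on each of its 80 tracks per side: 80 × 9 × 512 × 2 = 737,280 bytes.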

Back then most computer users didn’t have a lot of data to store. Applications and games were small in size. Documents might be 30K at most. Most people didn’t think of storing large images, and certainly not videos, on these early machines. The amount of data being passed around on networks tended to be pretty small as well. So even what seems like a piddly amount of extra disk space now was significant then.

Edit 7/9/2019: I found a video from YouTube user “Jason’s Macintosh Museum,” using a 128K Macintosh, running its Guided Tour disk. It gives you a feel for what running the original Mac was like.

Edit 10-17-2011: A minor point to add. You’ll notice in the picture of the Mac that there’s no disk eject button. This was because the computer apparently did some housekeeping tasks using your disk, and it would be damaging to data on your disk, and/or cause the system to crash, if you could eject the disk anytime you wanted. I have no idea what these housekeeping tasks were, but whenever the user wanted to eject the disk (which you could do by selecting a menu option, or pressing a command key combination, or dragging the disk icon to the trashcan icon on the desktop), the computer would spend time writing to the disk before ejecting it via a servo mechanism. Sometimes it would spend a significant amount of time doing this. It may have been an operating system bug, but I saw instances where it would take 5 minutes to eject the disk! All the while the disk drive would be running…doing something you knew not what. In some rare instances the computer wouldn’t eject the disk at all, no matter what you tried. In those situations it had a pinhole just to the right of the disk slot, which you can see in the picture if you look closely, that you could stick the end of a paperclip into, to manually force the disk drive mechanism to eject the disk. I remember seeing a vendor once that sold nice colored “disk ejectors,” which had a handle that looked like a USB thumb drive, and a pin at one end that you’d stick into this hole. A paperclip did the trick just fine, though.

A major difference between the Mac and other computers of the day was it did not come with a programming language. There were programming languages available for it, but you had to get them separately. It was a bold departure from the hacker culture that had influenced all the other computers in the marketplace.

In contrast to the computers I had used prior to using the Mac, including the Apple II, the experience of using it was elegant and beautiful. It had the highest resolution of any computer I had used before. I loved the fact that I could look at images in greater, finer detail. On the desktop interface, windows appeared to “warp” in and out of view, except it was really the window’s frame that moved. It didn’t have enough computing power to move a whole window’s contents. The point is they didn’t just appear. You get that effect on the modern Mac UI as well, and it looks a lot neater.

The main drawback was that it could only run one app. at a time. Even so, it was possible to cut, copy, and paste across applications. How was this done? Well, the Macintosh System software (as the operating system was called at the time) had what we’re all familiar with, a clipboard function. It also had a desk accessory called “Scrapbook” that allowed you to add multiple images and document clippings to it. You could add clippings by using the cut, copy, and paste feature in an application. Scrapbook would automatically cache these clippings, possibly saving them to disk (this is reminding me that I used to do a lot of disk swapping with the old Macs, which was a reason that more financially well-endowed users bought a second floppy drive, or a hard drive). The scrapbook was finite in size, and would eventually cycle out old clippings, as I recall. Anyway, when you’d load a new application, what was in the clipboard would remain available, so you could paste it. In addition, if you put clippings in the scrapbook, those were also available. Desk accessories could be run at all times, and so you could open Scrapbook while you were using an app., go through its clippings, and grab anything else you wanted to paste. Needless to say, cutting and pasting across applications was an involved process.
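Here’s a toy model in Python of the mechanics just described, purely to illustrate the arrangement rather than how the System software actually implemented it: a single clipboard slot that survives switching applications, plus a bounded scrapbook that cycles out its oldest clippings.

```python
# Toy model of the clipboard/Scrapbook arrangement described above
# (illustrative only, not the real System software).
from collections import deque

clipboard = None                  # single slot, survives quitting one app and launching another
scrapbook = deque(maxlen=5)       # finite size; the oldest clipping eventually falls out

def copy(clipping):
    global clipboard
    clipboard = clipping

def paste():
    return clipboard

def add_to_scrapbook(clipping):
    scrapbook.append(clipping)    # once full, the oldest entry is discarded

# Copy something in one app, "quit," launch another app, and paste it there.
copy("a MacPaint doodle")
add_to_scrapbook("a MacPaint doodle")
print(paste())                    # -> a MacPaint doodle
```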

This was soon fixed by a system utility written by Andy Hertzfeld called Switcher, once the Mac was given more memory (the very first model came with a total of 128 kilobytes of memory, and so wasn’t such a great candidate for this). The idea was to enable users to load multiple apps. into memory, and allow them to switch between them without having to quit out of one to get to the others. It enabled you to go back to the desktop to launch an app. without having to quit out of the apps you had already loaded. It was rather like how apps. on mobile devices work now.

I read up on the history of Switcher a few years ago. Microsoft was very enthusiastic about it at the time, because they recognized that users would buy more apps. if it was easier to launch more than one at a time. It was really nice to use. It was like using OS X’s multiple desktop feature, except that you could only see one app. on the screen at a time. It had the same effect, though. When you’d switch, the app. you were using would “slide” out of view, and the new one would slide on right behind it. It was like you were shifting your gaze from one app. to the other. It worked really well for early Mac apps., because there was no reason to be doing background processing with what was available. It created the illusion that all apps. were “running” at the same time, when they really weren’t. All the other apps. were in suspended animation when they weren’t in view. Copying clippings and pasting between apps. became a breeze, though.

It was said of Steve Jobs that he had high standards that drove engineers at Apple nuts, but it seems to me he was willing to compromise on some things. The original Mac had a monochrome display, which I’m sure he knew wasn’t as exciting as color. It was a single-tasking machine, so in the beginning, people were running and quitting out of applications a lot. It had a small amount of memory for what it did, and so you couldn’t load multiple apps., which made multi-tasking impractical. You couldn’t cut and paste things between applications easily without the Switcher add-on, which came out about a year after the Mac was released. I’m sure all of these compromises were made to keep the price point low.

The Mac had its critics early on, calling it a “child’s toy.” “Productive people don’t need cute icons and menus to get work done. They get in the way.” There were a lot of advocates for the command-line interface over the GUI in those days. In a way, they were right. Alan Kay said years later that the reason they came up with an iconic, point-and-click graphical interface at Xerox PARC, which the Lisa and Mac drew inspiration from, was to create an easy-to-use environment for children. Not to say that a graphical interface has to be for children, but this is the model Apple drew from.

Jobs leaves Apple

The unthinkable happened at Apple in 1985. Jobs was ousted from the Mac project by the CEO he had hand-picked, John Sculley, and Jobs left. I remember reading about it in InfoWorld, and being kind of shocked. How could the man who started the company be ousted? How could they think that they could just get rid of this creative person, and keep it the same company? Would Apple survive without him? It felt like something was wrong with this. I wasn’t a big Apple fan at the time, but I knew enough to know that this was a big deal.

After this, I lost track of Jobs for a while. Apple seemed to just move along without him. As I mentioned earlier, they came out with newer, better computers. The Apple Mac had grown to 10% market share by the end of the 1980s, an impressive feat when PCs were growing in dominance by leaps and bounds. In hindsight, the only thing I can point to as a possible problem is they made only incremental improvements to the Mac. They coasted, and for a few years they got away with it.

The only thing about this period that I thought sucked at the time was Apple was suing every computer maker in sight that had a graphical interface, claiming they had violated its copyrights. It had the appearance that they were trying to kill off any competition. The only company they won against was Digital Research, with their GEM operating system, and then only on the PC version, which died out shortly thereafter. It was getting so bad that the technology press was calling on Apple to quit it, saying they were stifling innovation in the industry. It didn’t matter anyway, because Microsoft Windows was coming along, and it would eventually eat Apple’s lunch. Microsoft might’ve actually had Apple to thank for Windows’s success. Apple probably weakened Microsoft’s other competitors.

Nevertheless, Apple seemed to be succeeding without Jobs for a time. It was only when Sculley left in the early 1990s that things went downhill for them, from what I could see.

NeXT

I rediscovered Jobs a bit in college, when I heard about his new venture that created the NeXT computer in 1988.

The NeXT computer, from simson.net

The keyboard, monitor, and laser printer for the NeXT, from simson.net

The NeXTStep interface, from Wikipedia

Come to think of it, maybe Jobs was trying to communicate something in the name…

At the time that the NeXT came out, it seemed futuristic. The computer was shaped like a cube. The case was made out of magnesium, and it featured a removable magneto-optical drive (removable in the sense that you could take the magneto-optical disk, which was housed in a cartridge, out of the drive). Each disk held 256 MB, which was a lot in those days. Most people who had hard drives had 1/4 of that storage capacity at most. The disk was made out of a metal. The way the drive worked was a laser would heat a spot on the disk that it wanted to write to, to what’s called the Curie Point (a specific temperature), so that the magnetic write head could change its polarity. Pretty complicated! I guess this was the only way they could find at the time to achieve that kind of rewritable storage capacity on a removable disk, probably because it afforded a relatively large or imprecise read/write head. Only the part of the disk that was heated by a narrow laser beam would change. So the only part you had to worry about being terribly precise was the laser (and the read head).

Out of the gate, the NeXT computer’s operating system was based on Unix, using the Mach kernel, as I recall. It used Objective C as the standard development language for the system, and was accompanied by some visual user interface design tools. The display system used a technology called Display PostScript to create a good WYSIWYG (What You See Is What You Get) environment.

In 1990, NeXT made a splash by announcing a slimmer, sleeker model, nicknamed the “pizza box,” because, nice as it looked, that’s basically what it resembled. The magneto-optical drive was gone. It was replaced by a hard drive, and a high-density floppy drive. The big feature that got developers’ attention was a Motorola digital signal processor (DSP) chip that was built into it. One of the ways it was used was to calculate complex mathematical equations at high speed, taking the load for that off of the main processor.

The second-generation NeXT computer, from Wikipedia

I got only a few chances to use a NeXT, for a brief time. Again, the computer was way out of my price range. It seemed nice enough. It had the same feel as the Mac, where it would do little things, just fine touches, so you didn’t have to think about them. I remember having an “information” dialog open on a file, and doing something to the file. Rather than having to refresh the information window, it updated itself automatically in the background. We take stuff like this for granted now, but back then I noticed stuff like that, because no other computer acted that way.

Doing some research in retrospect about a year ago, I found a demo video that Jobs had created about the second generation NeXT computer. I discovered that they had designed software for it so it could be used as an office platform. You could embed things like spreadsheets, sales presentations, and audio clips in e-mails you’d send to people. This was before most people had even heard of the internet, and long before the MIME protocol was developed. They had advanced video hardware in it so that you could watch high-quality digital color video, which was really hard for most computers to do then. They had also shown an example of a subscription app., demonstrating how you could read your daily issue of the New York Times online. This was done around the same time that the very first web browser was invented. If this rings a bell, though, that’s because Apple has done demos like this within the last few years, as had Microsoft, when they first introduced Windows Vista.

A little trivia: Tim Berners-Lee wrote the world’s first web server and the first web browser on a NeXT workstation.

The world’s first web browser

Once I got out into the work world, in the mid-90s, I read that things weren’t going so well for NeXT. They eventually sold off their hardware division to Canon. However, things weren’t looking totally down for Jobs. I learned that he had also been heading up a company called Pixar. “Toy Story” came out, and it was amazing. The computer graphics were not as impressive an effect as “Jurassic Park,” which had come out a couple years earlier, but I was still pretty impressed with it, because it was the first feature-length movie made entirely with computer graphics. Mostly what appealed to me were the memories of the toys I had as a kid. The story was good, too.

It seemed like NeXT was on its last legs when Apple bought the company in late 1996. Apple wasn’t doing so hot, either, but it obviously had more cash on hand. The joke was on the people who did the deal, though, because in less than a year, Jobs was back on top and in charge at Apple.

In short order we had the iMacs, and amazingly they were selling like hotcakes. Apparently their shape and their color were what appealed most to customers, not what the computer actually did! No more beige boxes! Yay! Uh…and where did they come from? Eh, not important…

The first iMac, from Wikipedia

Jobs did some things that surprised me after he took over. He cancelled Hypercard, one of the most innovative pieces of software Apple ever produced. Hypercard was a multimedia authoring environment that enabled programming neophytes to write their own programs on the Mac. You didn’t even need to know a programming language. It was a visual programming environment. You just needed to arrange media elements into a series of steps (“cards” in a “deck”), and set up buttons or links to trigger some actions. The closest equivalent to it on modern Macs is a program called “Automator.” I’ve tried using it, though, and it feels clunky. Hypercard had long been treated as a neglected stepchild at Apple, so in a way Jobs was putting it out of its misery.

He cancelled the Newton, Apple’s PDA. It had become the most popular handheld computer used by hospitals. As a result, they all had to find some other mobile platform to use, and all their mobile software had to be rewritten, because the Newton’s operating system architecture and development language were proprietary.

Edit 10-13-2011: Thirdly, he cancelled Apple’s clone licensing program, which killed off Mac clone makers. This, to me, was more understandable from Apple’s perspective.

There had been efforts to make Mac clones in the 1980s. My memory is they were all the product of reverse-engineering. Apple finally allowed “genuine” Mac clones, under a license agreement, in the 1990s. Apple of course retained ownership over Mac OS. My memory is this happened after Sculley left. I could understand the appeal of this idea, since PCs (of the IBM variety) solidified their dominance in the market once clones came out. It didn’t work out the same way for Apple, however. A few things this strategy probably didn’t take into account. One is that Microsoft had to deal with a lot more variety in hardware in their operating system, in order to make the clone market work. I vaguely remember hearing about compatibility/stability problems with Mac clones. Secondly, the PC clone manufacturers had to accept much lower profit margins than IBM did when it owned the hardware market. Thirdly, Microsoft didn’t depend on hardware for its revenue. Apple’s business model did, and they were allowing competitors to undercut them on price. For Apple it was rather like Sun’s strategy with Java: have a loss-leader in a major, desirable technology, which the company owned the rights to, in hopes of gaining revenue on the back end in a market that was increasingly perceived as commoditized, which…didn’t really work out for them.

In a few years, NeXTStep would take over the Mac, with OS X. One of the things Jobs commented on recently was that after he left Apple in 1985, they just didn’t innovate like they had under his direction. The Mac needed to catch up. So transplant NeXT’s work of 1992 into the Mac of ten years later!

Even though the OS X interface looks a lot different from the NeXT, under the covers it’s NeXTStep. The display technology is derived from what was used on the NeXT. The NeXT operating system was Unix-based, as is OS X. Objective C was brought into the Mac platform as a supported language. In essence, OS X has been the “next” iteration of the NeXT computer. Like a phoenix, it rose again. This was apparently a part of the deal Apple made in buying the company. They recognized that some next-generation OS was needed for the Mac, since it was aging, and from the beginning they had planned to use NeXT technology to do that.

Old apps. written for Mac OS would no longer run on the new system, unless they were “carbonized.” This involved recompiling existing applications against a compatibility library. The problem was if you depended on a Mac OS app. written by a software company that was no longer in business, your best bet was not to upgrade.

Things were not so great in paradise. Apparently the transition from Mac OS to OS X was rough. I remember hearing vague complaints from Mac users about the switchover. They really didn’t get the memo that it was a whole new operating system that operated differently from what they had been used to for years. It may not have entirely been their fault, in the sense of not being willing to learn a new system. I remember hearing complaints about system instability as well.

To their credit, Apple quickly fixed a lot of the stability problems, from what I understand.

This was only for starters. Rather than focus solely on developing the desktop computer market, since Jobs said that Microsoft had “won that battle,” he took Apple in a whole new direction by saying that they should develop mobile devices “for the rest of us.” Apple has also been capturing the market for electronic publishing, with iTunes and the App Store. This combination has been the source of its meteoric success since then.

Unlike “the rest of the world,” I was never that enthused about Apple’s new direction. I haven’t owned an iPod, or any of their other mobile devices. I have an old Pocket PC, and a digital camera that I use. I bought my first Apple product, a MacBook Pro, in 2008, and aside from some kinks that needed to be worked out, it’s been a nice experience.

For the longest time I was not that big of an Apple fan. When I met other Apple users they often came across as elitist, like they had the bucks to buy the best technology, and they knew it. That turned me off. I used stuff from Apple from time to time, but I liked other technology better. That was because my priorities were different from most people’s. Nevertheless, there was something about Steve Jobs I liked. He had a creative, innovative spirit. I liked that he cared about quality. Ironically, Apple’s products always seemed more conservative than my tastes. It was an adjustment to use my current laptop. It’s allowed enough flexibility that I don’t feel totally hemmed in by its “ease of use,” but there are a few small things I miss.

Jobs was an inspiration. Like some other people I’ve seen around in my life, he was someone I followed and kept track of with interest for many years. He gave us technology that was worth our time to use. What I appreciated most about him was that he pushed beyond what was widely thought of as “the way things are” in computing. Unlike most Apple fans and followers, I haven’t seen that much in the way of original ideas out of him. What I credit him with is taking the best ideas that others have come up with, trying to pare them down so that the average person can understand them, having the courage to make products out of them when no one else would, and then marketing the hell out of them. Part of what mattered to him was what computers made possible, and the experience of using them. In the beginning of all this, it seemed like his dreams were far out ahead of where most people were with respect to technology. In his return to Apple, he stayed out ahead, but he seemed to have a keen sense of not getting too far ahead of customers’ expectations. I think he discovered that there’s no virtue, at least in business, in getting too far out ahead of the crowd you’re trying to impress.

There were a couple really memorable moments with Jobs in the last 10 years that I’d like to cover. The first was his 2005 commencement address to the students at Stanford. Here he reveals some things about his life story that he had kept close to the vest for years. He had some good things to say about death as well. It’s one of the most inspirational speeches I’ve heard.

Below is a really great joint interview with Jobs and Bill Gates at the D5 Conference in 2007. It was interesting, engaging, and funny. It covers some of the history that I’ve talked about here, and what we’ve seen from Apple and Microsoft in the present.

This was a rare thing. I think this was one of only three times where Jobs and Gates had appeared together in public, and contrary to the mythology that they were rivals who hated each other,…well, they were rivals, but they got along swimmingly.

Just a little background: the intro. music you hear is from “Love Will Find A Way,” by Yes. Mitch Kapor, who’s introduced in the audience, was the founder of Lotus Software. He developed the Lotus 1-2-3 spreadsheet for the PC (Lotus was bought by IBM in the 1990s). The last I heard, some years ago, he had become a major advocate for open source software.

Jobs quoted Alan Kay, saying, “People that love software want to do their own hardware.” Maybe he did say that, but the quote I remember is, “People who are really serious about software *should* make their own hardware.” When I first heard that, I remember thinking Kay was putting a challenge to programmers, like, “Real programmers make their own hardware,” but I later realized what he probably meant was that software developers should take control away from the hardware engineers, because the hardware they had created, which was being used in computers, was a crappy design. So what he was probably saying was that really good software people would be better at making hardware to run their software. The way Jobs expressed this is a shallow interpretation of what Kay said, because Kay was very critical of what both Motorola and Intel did in their hardware designs. Apple has used hardware from only these two companies for the main chipsets of its 16-, 32-, and 64-bit computers.

Note: There is a 7-minute introduction before the event with Jobs and Gates starts.

Gates said something towards the end that struck me, because it really showed the friendship between the two of them. He said, totally unprompted, that he wished he had Jobs’s taste. Jobs was famously quoted as saying in “Triumph of the Nerds”:

The only problem with Microsoft is they just have no taste. They have absolutely no taste. I don’t mean that in a small way. I mean that in a big way, in the sense that they don’t think of original ideas, and they don’t bring much culture into their product. And you say, “Why is that important?” Well, proportionally-spaced fonts come from typesetting and beautiful books. That’s where one gets the idea. If it weren’t for the Mac, they would never have that in their products. And so I guess I am saddened–not by Microsoft’s success. I have no problem with their success. They’ve earned their success, for the most part. I have a problem with the fact that they just make really third-rate products.

It felt like things had come full circle.

Saying goodbye

I was a bit shocked to hear of Jobs’s death last Wednesday. I knew that his health had been declining, but I thought he might live another year or so. He had only stepped down as Apple’s CEO in late August. In hindsight, though, it makes sense. He loved what he did. Rather than retire, and decline in obscurity, he held on until he couldn’t hold on any longer.

I’ve felt a little sad about his death at times. I know that Jobs’s favorite music was the Beatles and Bob Dylan, but on the day he died, this song, “It’s So Hard To Say Goodbye To Yesterday,” by Boyz II Men was running through my head. It expresses my sentiments pretty well.

Bye, Steve.

—Mark Miller, https://tekkie.wordpress.com

Apple changes its iOS developer terms again, allowing Squeak apps.

A few months ago Apple made some controversial changes to its App Store developer terms, which were seen as overly restrictive. The scuttlebutt was that the changes were aimed at banning Adobe Flash from the iPhone, since Apple had already said that Flash was not going to be included, nor allowed on the iPad. The developer terms said that all apps. which could be downloaded through the App Store had to be originally written only in C, C++, Objective-C, or Javascript compatible with Apple’s WebKit engine. No cross-compiled code, private libraries, translation or compatibility layers were allowed. Apple claimed they made these changes to increase the stability and security of the iOS environment for users. I considered this a somewhat dubious claim given that they were allowing C and C++. We all know the myriad security issues that Microsoft has had to deal with as a result of using C and C++ in Windows, and in the applications that run on it. A side-effect I noticed was that the terms also banned Squeak apps., since Squeak’s VM source code is written in Smalltalk and is translated to C for cross-compilation. In addition any apps. written in it are originally written in Smalltalk (typically), and are executed by the Squeak VM, which would be considered a translation layer. The reason this was relevant was that someone had ported Squeak to the iPhone a couple years ago, and had developed several apps. in it.

The Weekly Squeak revealed today that Apple has made changes to its App Store terms, and they just so happen to allow Squeak apps. As Daring Fireball has reported, Apple has removed all programming language restrictions. They have even removed the ban on “intermediary translation or compatibility layers”. The one caveat is that they do not allow App Store apps. to download code. So if you’re using an interpreter in your app., the interpreter and all of the code that will execute on it must be included in the package. (Update 9-12-10: Justin James pointed out that the one exception to this rule is Javascript code that is downloaded and run by the WebKit engine.) This still restricts Squeak some, because it disallows users or the app. from using something like SqueakMap or SqueakSource as a source for downloading code into a Squeak image, but it allows the typical stand-alone application case to work.

John Gruber, the author of Daring Fireball, speculates that these new rules could allow developers to use Adobe’s Flash cross-compiler, which Adobe had scuttled when Apple imposed the previous restrictions. John said, “If you can produce a binary that complies with the guidelines, how you produced it doesn’t matter.” Sounds right to me.

However, looking over the other terms that John excerpts from the license agreement gives me the impression that Apple still hasn’t figured out what it will and won’t allow in the future. It has this capricious attitude of, “Just be cool, bro.” So things could still change. That’s what would be disappointing to me about this if I were an iPhone developer right now. I got the impression when the last license terms came out that Apple hadn’t really thought through what it was doing. While I get a better impression of the recent changes, I still have a sense that they haven’t thought everything through. To me the question is why. I guess it’s like what Tom R. Halfhill once told me: Steve Jobs never understood developers, even back in the days when the company was young. Steve Wozniak was the resident “master developer” in those days, and he had Jobs’s ear. Once Wozniak left Apple in the 1980s, that influence was gone.

My journey, Part 4

See Part 1, Part 2, Part 3

The real world

Each year while I was in school I looked for summer internships, but had no luck. The economy sucked. In my final year of school I started looking for permanent work, and I felt almost totally lost. I asked CS grads about it. They told me, “You’ll never find an entry-level programming job.” They had all landed software testing jobs as their entrée into corporate software production. Something inside me said this would never do. I wanted to start with programming. I had the feeling I would die inside if I took a job where all I did was test software. About a year after I graduated, I was proved right when I took up testing duties at my first job. My brain became numb with boredom. Fortunately that’s not all I did there, but I digress.

In my final year of college I interviewed with some major employers who came to my school: Federal Express, Tandem, Microsoft, NCR. I wasn’t clear on what I wanted to do, which was a bit earth-shattering. I had gone into CS because I wanted to program computers for my career, but I didn’t face the “what” (what specifically did I want to do with that skill?) until I was about ready to graduate. I had so many interests. When I entered school I wanted to do application development. That seemed to be my strength. But since I had gone through the CS program and found some things about it interesting, I wasn’t sure anymore. I told my interviewer from Microsoft, for example, that I was interested in operating systems. What was I thinking? I had taken a course on linguistics and found it pretty interesting. I had taken a course called Programming Languages the previous year and had a similar level of interest in it. I had gone to the trouble of preparing for a graduate-level course on language compilers, which I was taking at the time of the interview. It just didn’t occur to me to bring any of that up.

None of my interviews panned out. In hindsight, it was good this happened, since most of them didn’t really suit my interests. The problem was, who did?

Once I graduated with my Bachelor’s in CS in 1993, and had an opportunity to relax, some thoughts settled in my mind. I really enjoyed the Programming Languages course I had taken in my fourth year. We covered Smalltalk for two weeks. I thoroughly enjoyed it. At the time I had seen many want ads for Smalltalk, but they were looking for people with years of experience. I looked for Smalltalk want ads after I graduated. They had entirely disappeared. Okay. Scratch that one off the list. The next thought was, “Compilers. I think I’d like working on language compilers.” I enjoyed the class and I reflected on the fact that I enjoyed studying and using language. Maybe there was something to that. But who was working on language compilers at the time? Microsoft? They had rejected me from my first interview with them. Who else was there that I knew of? Borland. Okay, there’s one. I didn’t know of anyone else. I got the sense very quickly that while there used to be many companies working on this stuff, it was a shrinking market. It didn’t look promising at the time.

I tried other leads, and thought about other interests I might have. There was a company nearby called XVT that had developed a multi-platform GUI application framework (for an analogy, think wxWindows), which I was very enthusiastic about. While I was in college I talked with some fellow computer enthusiasts on the internet, and we wished there was such a thing, so that we didn’t have to worry about what platform to write software for. I interviewed with them, but that didn’t go anywhere.

For whatever reason it never occurred to me to continue with school and get a master’s degree. I was glad to be done with school, for one thing. I didn’t see a reason to go back. My undergrad advisor subtly chided me once for not wanting to advance my education. He said, “Unfortunately most people can find work in the field without a master’s,” but he didn’t talk with me in depth about why I might want to pursue one. I had this vision that I would get my Bachelor’s degree, and then it was just a given that I was going to go out into private industry. That was simply my image of how things were supposed to go.

Ultimately, I went to work in what seemed like the one industry that would hire me: IT software development. My first big job came in 1995. At first it felt like my CS knowledge was very relevant, because I started out working on product development at a small company. I worked on adding features to, and refactoring, a reporting tool that used scripts for report specification (what data to get and what formatting was required). Okay, so I was working on an interpreter instead of a compiler. It was still a language project, and that’s what mattered. Aside from having to develop it on MS-DOS (UGH!), I was thrilled to work on it.

It was very complex compared to what I had worked on before. It was written in C. It created more than 20 linked lists, and some of them linked to other lists via pointers! Yikes! It was very unstable. Any time I made a change to it, I could predict that it was going to crash on me, freezing my PC and forcing me to reboot. And we think now that Windows 95 was bad about this… I got so frustrated with this that I spent weeks trying to build some robustness into it. I finally hit on a way to make it crash gracefully by using a macro to check every single pointer reference before it got used.
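Something along these lines illustrates the technique (a minimal sketch in C; the names CHECK_PTR, report_bad_pointer, and report_node are hypothetical, and this simplified version only catches null pointers): the goal is for a bad pointer reference to produce an error message and a clean exit instead of a frozen machine.

```c
#include <stdio.h>
#include <stdlib.h>

/* Called when a pointer check fails: report where the bad reference
   happened, then exit cleanly instead of letting the program run on
   and hang the machine. */
static void report_bad_pointer(const char *expr, const char *file, int line)
{
    fprintf(stderr, "NULL pointer: %s (%s, line %d)\n", expr, file, line);
    exit(EXIT_FAILURE);
}

/* Wrap a pointer expression so it is checked before it gets used.
   Note: this evaluates p more than once, so avoid side effects. */
#define CHECK_PTR(p) \
    ((p) ? (p) : (report_bad_pointer(#p, __FILE__, __LINE__), (p)))

/* A toy list node standing in for the reporting tool's linked lists. */
struct report_node {
    struct report_node *next;
    const char *field_name;
};

/* Every dereference goes through CHECK_PTR. */
static const char *second_field(struct report_node *list)
{
    return CHECK_PTR(CHECK_PTR(list)->next)->field_name;
}

int main(void)
{
    struct report_node tail = { NULL, "amount" };
    struct report_node head = { &tail, "title" };

    printf("%s\n", second_field(&head));   /* prints "amount" */
    printf("%s\n", second_field(&tail));   /* fails the check and exits */
    return 0;
}
```

Wrapping every dereference this way is tedious, but on a single-tasking system like MS-DOS, where a wild pointer could lock up the whole machine, failing loudly at the first bad reference beats rebooting after every crash.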

I worked on other software that required a knowledge of software architecture and the ability to handle complexity. It felt good. As in school, I was goal-oriented: give me a problem to solve, and I’d do my best to solve it. I liked elegance, so I’d usually try to come up with what I thought was a good architecture. I also made an effort to comment well, to make the code clear. My efforts at elegance usually didn’t work out, though; either the design was impractical or we didn’t have time for it.

Fairly quickly my work evolved away from product development. The company I worked for ended up discarding a whole system it had spent two years developing, and the reporting tool I worked on was part of that. We decided to go with commodity technologies, and I got more into the regular patterns of IT software production.

I got a taste of programming for Windows, and I was surprised: I liked it! I had already developed a bias against Microsoft software by then, because my compatriots in the field had nothing but bad things to say about their stuff. But I liked developing for an interactive system, and Windows had a large API that seemed to handle everything I needed, without me having to invent much of anything to make a GUI app. work. This was in contrast to GEM on my Atari STe, which was the only GUI API I had known before this.

My foray into Windows programming was short-lived. My employer found that I was more proficient at programming for Unix, and so pigeon-holed me into that role, working on servers and occasionally writing a utility. This was okay for a while, but I got bored with it within a couple of years.

Triumph of the Nerds

Around 1996 PBS showed another mini-series, on the history of the microcomputer industry, focusing on Apple, Microsoft, and IBM. It was called Triumph of the Nerds, by Robert X. Cringely. This one was much easier for me to understand than The Machine That Changed The World. It talked about a history that I was much more familiar with, and it described things in terms of geeky fascination with technology, and battles for market dominance. This was the only world I really knew. There weren’t any deep concepts in the series about what the computer represented, though Steve Jobs added some philosophical flavor to it.

My favorite part was where Cringely talked about the development of the GUI at Xerox PARC, and then at Apple. Robert Taylor, Larry Tesler, Adele Goldberg, John Warnock, and Steve Jobs were interviewed. The show talked mostly about the work environment at Xerox (how the researchers worked together, and how the executives “just didn’t get it”), and about the Xerox Alto computer. There was a brief clip of the GUI they had developed (Smalltalk), and Adele Goldberg briefly mentioned the Smalltalk system in relation to the demo Steve Jobs saw, though you’d have to know the history better to really get what was said about it. Superficially one could take away from it that Xerox had developed the GUI and Apple used it as inspiration for the Mac, but there was more to the story than that.

Triumph of the Nerds was also the first time I had seen footage of the unveiling of the first Macintosh in 1984. I had read about it shortly after it happened, but I had seen no pictures and no video. It was really neat to see, and Cringely managed to convey the significance of that moment.

Part 5