The pop culture
This is my last post in this series. In the other parts I’ve written positively about my experience growing up in the computing culture that our society has created. Now, bringing this full circle, I will be looking at it with a critical eye, because I’ve come to realize that some things have been missing. If you compare what I’ve been talking about with the work of Engelbart and Kay, from the 1960s and 70s, I think you’ll see that not much progress has been made since then. In fact, this business began as a retrogression from what had been developed. Some of what had been invented and advocated for has been misunderstood. Some of it was adopted for a time but pushed aside later. Some has remained in a watered-down form. I’ve made analogies before about the amazing things that were invented more than 30 years ago. I’m going to make another one. They were like Hero’s steam engine. The culture was not ready to understand them, and so they didn’t go much of anywhere, despite their tremendous potential.
Technological progress has been made pretty much in a brute force, slog-it-out fashion. Eventually things get refined, and worked out, but mainly by “reinventing the wheel”.
Quoting from an interview with Alan Kay in ACM Queue published in 2005:
In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.
So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.
It was a different culture in the ’60s and ’70s; the ARPA (Advanced Research Projects Agency) and PARC culture was basically a mathematical/scientific kind of culture and was interested in scaling, and of course, the Internet was an exercise in scaling. There are just two different worlds, and I don’t think it’s even that helpful for people from one world to complain about the other world—like people from a literary culture complaining about the majority of the world that doesn’t read for ideas. It’s futile.
I don’t spend time complaining about this stuff, because what happened in the last 20 years is quite normal, even though it was unfortunate. Once you have something that grows faster than education grows, you’re always going to get a pop culture.
The way he defined this pop culture is “accidents of history” and expedient gap-filling. He said, for example, that BASIC became popular because it was implemented on a GE time-sharing system, which GE decided to make accessible to many. This made BASIC very popular just by virtue of the fact that it was widely available, not because it had any particular merit. Another, more modern example he gave was Perl, which has been used to fill short-term needs but doesn’t scale. In a speech (video) he gave 10 years ago he called the web browser a terrible idea, because it locks everything down to a standardized protocol, rather than having data come with a translator, which would make the protocol more flexible. It got adopted anyway, because it was available and increased accessibility to information. Interestingly, contrary to the open source movement’s mantra of “standards”, I get the impression he doesn’t agree with this emphasis very much. That’s not to say he dislikes open source. I think he’s very much in favor of it. My sense is, with a few exceptions, he thinks we’re not ready for standards, because we don’t even have a good idea of what computing is good for yet. So things need to remain malleable.
I think part of what he’s getting at with his comments on the “pop culture” is a focus on features like availability, licensing, and functionality and/or expressiveness. Features excite people because they solve immediate problems, but since by and large we don’t look at architectures critically, this leads to problems down the road.
We tend not to think in terms of structures that can expand and grow with time as requirements change, though the web has been forcing some developers to think about scalability issues, in concert with information systems. Unfortunately, all too often we’re using technologies that make these goals difficult to accomplish, because we’ve ignored ideas that would make them easier. Instead of looking at those ideas, we’ve tried to adapt architectures that were meant for other purposes to new purposes, because the old model is more familiar. Adding on to an old architecture is seen as cheaper than trying to invent something more appropriate. Up-front costs are all that count.
I’m going to use a broader definition of “pop culture” here. I think it’s a combination of things:
- As Kay implied, an emphasis on “features”. It gets people thinking in terms of parts of computing, not the whole experience.
- An emphasis on using the machine to accomplish tasks through applications, rather than using it to run simulations on your ideas, though there have been some applications that have brought out this idea in domain-specific ways, like word processing and spreadsheets.
- It’s about an image: “this technology will define you.” Of course, this is an integral part of advertising. It’s part of how companies get you interested in products.
- A technology is imbued with qualities it really doesn’t have. Another name for this is “hype”.
The problem with the pop culture is it misdirects people from what’s really powerful about computing. I don’t see this as a deliberate attempt to hide these properties. I think most vendors believed the messages they were selling. They just didn’t see what’s truly powerful about the idea of computing. Customers related to these messages readily.
The pop culture has defined what computing is for years. It has infected the whole culture around it. According to Kay, there’s literally been a dumbing down of computer science as a result. This has continued on into the age of the internet. You hear some people complain about it with “Web 2.0” now. Personally I think LOLCODE is a repetition of this cycle. Why do I point this out? Microsoft is porting it to .NET…Great. Just what we need. Another version of BASIC for people to glom onto.
The truth is it’s not about defining who you are. It’s not about this or that feature. That’s what I’m gradually waking up to myself. It’s about what computers and software really can be: an extension of the mind.
The “early years”
I found a bunch of vintage ads, and a few that are recent, which in some ways document, but also demonstrate, this pop culture as it’s evolved. I’ve also tried to convey the multiplicity of computer platforms that used to exist. I couldn’t find ads for every brand of computer that was out there at the time. Out of these, only a couple give a glimpse of the power of computing. As I looked at them I was reminded that computer companies often hired celebrities to pitch their machines. A lot of the ads focused on how low-priced their computers were. The exceptions were usually IBM and Apple.
The pitch at that time was to sell people “a computer”, no matter what it was. You’ll notice that they had different target audiences: some for adults, some for businesspeople, and some for kids. There was no sense until the mid-1980s that software compatibility mattered.
I went kind of wild with the videos here. If a video doesn’t seem to play properly, refresh the page and try again. I’ve tried to put these in a semi-chronological order of when they came out, just so you can kind of see what was going on when.
Dick Cavett pitches the Apple II
Well, at least she’s using the machine to think thoughts she probably didn’t before. That’s a positive message.
William Shatner sells the Commodore Vic-20
Beam into the future…
The Tandy/Radio Shack TRS-80 Color Computer
The Tandy TRS-80 Model 4 computer
Tandy/Radio Shack (where the “TRS” moniker came from) had (I think) 4 “model” business computers. This was #4. The Model I looked more like the Color Computer in configuration, but all the other ones looked like the Model 4 shown here. I think they just had different configurations in the case, like the number of disk drives and the amount of memory they came with.
The IBM PC
The one that started it all, so to speak. Want to know who your Windows PC’s granddaddy was? This was it. Want to know how Microsoft made its big break? This was it.
These ads didn’t feature a celebrity, per se, but an actor playing Charlie Chaplin.
The Kaypro computer
As I recall, the Kaypro ran Digital Research’s CP/M operating system.
This reminds me of something that was very real at the time. A lot of places tried to lure you in by telling you the price of the computer–with nothing else to go with it…The monitor was extra, for example. So as the ad says, a lot of times you’d end up paying more than the list price for a complete system.
The Commodore 64
What’s funny is I actually wrote a BASIC program on the Apple II once that did just what this ad says. I wrote a computer comparison program. It would ask for specifications on each machine, weight them based on importance (subjective on my part), and make a recommendation about which was the better one. I don’t remember if I was inspired by this ad or not.
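The weighted-scoring idea behind that program can be sketched in a few lines. This is purely my own illustration (the machine names, spec fields, and weights here are invented for the example, not recovered from the original BASIC program):

```python
# A hypothetical sketch of a weighted computer-comparison program.
# Spec fields and weights are illustrative stand-ins.

def recommend(machines, weights):
    """Score each machine as a weighted sum of its specs and
    return the name of the highest-scoring one."""
    def score(specs):
        return sum(weights[key] * specs.get(key, 0) for key in weights)
    return max(machines, key=lambda name: score(machines[name]))

machines = {
    "Apple II":     {"ram_kb": 48, "colors": 16, "price_value": 2},
    "Commodore 64": {"ram_kb": 64, "colors": 16, "price_value": 5},
}
# Subjective importance weights, as the post describes.
weights = {"ram_kb": 1.0, "colors": 2.0, "price_value": 10.0}

print(recommend(machines, weights))  # → Commodore 64
```

The interesting part, then as now, is that the answer is only as good as the subjective weights you feed it.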
Bill Cosby pitches the TI-99/4a with a $100 rebate
Kevin Costner appears for the Apple Lisa
This ad is from 1983.
“I’ll be home for breakfast.” Okay, is he coming back in later, or did he just come to his desk for a minute and call it a day?
The Apple Macintosh
This ad is from 1984. I remember seeing it a lot. I loved the way they dropped the IBM manuals on the table and you could literally see the IBM monitor bounce a little from the THUD. 😀
Of course, who could forget this ad. Ridley Scott’s now-famous Super Bowl ad for the Macintosh.
Even though it’s a historical piece, this pop culture push for the Macintosh was unfortunate, because it made the Mac into some sort of “liberating” social cause (in this case, against IBM). It’s a piece that’s more useful for historical commentary (like this, I guess…).
Ah yes, this is more like it:
This ad gave a glimpse of the true potential of computing to the audience of this era–create, simulate whatever you want. Unfortunately it’s a little hard to see the details.
Alan Alda sells the Atari XL 8-bit series
Alan Alda did a whole series of ads like this for Atari in 1984. I gotta say, this ad’s clever. I especially liked the ending. “What’s a typewriter?” the girl asks. Indeed!
The Coleco Adam
This is the only ad video I could find for it. It features 4 ads for the ColecoVision game console and the Coleco Adam computer. Yep, the video game companies were getting in on the fun, too. Well, what am I saying? Atari got into this a long time before. I remember that the Adam came in two configurations. You could get it as a stand-alone system, or you could get it as an add-on to the ColecoVision game console. The console had a front expansion slot you could plug the “add-on” computer into.
What I remember is the tape you see the kid loading into the machine was a special kind of cassette, called a “streaming tape”. Even though data access was sequential, supposedly it could access data randomly on the tape about as fast as a disk drive could access data on a floppy disk. Pretty fast.
According to a little research I did, the Adam ran Digital Research’s CP/M operating system.
Microsoft Windows 1.0 with Steve Ballmer
This ad is from 1986. Steve Ballmer back when he had (some) hair. 🙂 This is the only software ad of any significance during this era. I don’t remember seeing this on TV. Maybe it was an early promo at Microsoft.
Atari Mega ST
I never saw TV ads for the Atari STs. I’ve only seen them online. In classic “Commodore 64” fashion, this one compares the Atari Mega ST to the Mac SE on features and price. Jack Tramiel’s influence, for sure.
The Commodore Amiga 500
A TON of celebrities for this one!
Pop culture: The next generation
As with the above set, there are more I wish I could show here, but I couldn’t find them. These are ads from the 1990s and beyond.
“Industrial Revelation”, by Apple
I like the speech. It’s inspiring, but…what does this have to do with computing?? It sounds more like a promo ad for a new management technique, and a little for globalization. This was a time when Apple started doing a lot of ads like this. It was getting ridiculous. I remember an ad they put on when the movie Independence Day came out, because the Mac Powerbook was featured in it: “You can save the world, if you have the right computer”. A total pop culture distraction.
IBM OS/2 Warp
The one that “started” it all–“Start me up!” What they didn’t tell us was “then reboot!”
“I Am” ad for Lotus Domino, from IBM
Wow. So Lotus Domino makes you important, huh? Neat ad, but I think people who actually used Domino wouldn’t have said it turns them into “superman”. What was nice about it was you could store pertinent information in a reasonably easy-to-access database that other people could use, rather than using an e-mail system as your information “database”. Unfortunately the ad never communicated this. Instead it tried to pass it off as some “empowerment” tool. Domino was okay for what it did. I think it’s a concept that could be improved upon.
IBM Linux ad
So…Linux is artificially intelligent? I realize they’re talking about properties of open source software, but this is clearly hype-inducing.
The following isn’t an ad, but it’s pop culture. It’s touting features, and it claims that technology defines us. It doesn’t really talk about computing as a whole, but it claims to.
The Machine is Us/ing Us
by Michael Wesch, Assistant Professor of Cultural Anthropology,
Kansas State University
I think it’s less “the machine” than “Google is using us”. Tagging is a neat cataloging technique, but each link “teaches the machine a new idea”? Gimme a break!
So what are you saying?
In short, for one, I’m saying, “Don’t believe the hype!”
Tron is one of my favorite movies of all time. I listened to an interview with its creator, Steven Lisberger, that was done several years ago. He said now that the public had discovered the internet, there was a “stepping back” from it, to look at things from a broader view. He said people were saying, “Now that we have this great technology, what do we do with it?” He said he thought it was a good time for artists to revisit this subject (hinting at a Tron sequel, which so far hasn’t shown up). This is what I loved. He said of the internet, “It’s got to be something more than a glorified phone system.”
For those who have the time to ponder such things, like I’ve had lately, I think we need to be asking ourselves what we’re creating. Are we truly advancing the state of the art? Are we reinventing what’s already been invented, as Alan Kay says often? Are we regressing, even? Are we automating what used to exist in a different medium? Think about that one. Kay has pointed this out often, too.
What is a static web page, a word processing document, or a text file? It’s the automation of paper, an old medium. What’s new with it is hypertext, though as I mention below, Douglas Engelbart had already done something similar decades before. Form fields and buttons are very old. The IBM 3270 terminal had these features for decades (there was a key for the “submit” function on the terminal keyboard).
What is a web browser? It’s like a 3270 terminal, a technology created by IBM in the 1970s, adapted to the GUI world, with the addition of componentized client-side plug-ins.
What is blogging? It’s a special case of a static web site. It’s a diary done in public, or at best an automation of an old-fashioned bulletin board (I’m talking about the cork kind), or newsletter.
What is an e-commerce site? It’s the automation of mail order, which has existed since the 19th century.
What about search? It’s a little new, because it’s distributed, but Engelbart already invented a searchable system that used hypertext in the late 1960s.
What is content management? It’s the automation of copy editing. Magazines and newspapers have been doing it for eons, even when you take internationalization into account. It takes it from an analog medium, and brings it to the internet where everything is updated dynamically. I imagine it adds to this the idea of content for certain groups of viewers (subscribers vs. non-subscribers, for example). A little something new.
What about collaborative computing over a network, with video? Nope. Douglas Engelbart invented that in the late 1960s, too.
What about online video? It’s the automation of the distribution of home movies. The fact that you can embed videos into other media is fairly new (a special case), but the idea that you could bring various media together into a new form existed in the Smalltalk system in the 1970s. It’s just been a matter of processing power that’s enabled digital video to be more widespread. Network streaming is new, though, to the best of my memory.
What about VOIP? It’s just another version of a phone system, or a PBX.
What about digital photography and video (like from a camcorder)? I don’t really have to explain this one, because you already know what I’ll say about it. The part that’s really new is you don’t have to wait to get your photos developed, and you can see pretty much how the video will look while you’re taking it. I’ll put these under the “something new” column, because it really facilitates experimentation, something that was difficult with film.
What about digital music? Same thing. The only things new about it are that it’s easier to manipulate (if it’s not inhibited by DRM), and your player doesn’t have to have any moving parts. The main bonus is that since it’s not analog, its quality doesn’t degrade with time.
What about photo and video editing? Professionals have done this in analog media for years.
And there’s more.
Are you getting the impression that this is like that episode of South Park called “The Simpsons Already Did It”?
South Park: Blocking out the Sun
There’s not a whole lot new going on here. It just seems like it because all of the old stuff is being brought into a new medium, and/or the hardware has gotten powerful and cheap enough to have finally brought old inventions to a level that most people can access.
The pop culture’s mantra: accessibility
The impression I get is that a lot of what people have bought into with computing is accessibility, but access to what? When you talk to a lot of people about open source software, they think it’s about accessibility as well, along with community.
Here’s what’s been going on: Take what used to be exclusive to professionals and open it up to a lot more people through technology and economies of scale. It’s as if all people have cared about is porting the old thing to the new cheaper, more compact thing just so we can pat ourselves on the back for democratizing it and say more people can do the old thing. Accessibility and community are wonderful, but they’re not the whole enchilada! They are just slices.
From what I’ve heard Alan Kay say, what’s not being invested in so much is creating systems where people can simulate their ideas, experiment with them, and see them play out; or look at a “thought model” from multiple perspectives. There are some isolated areas where this has been with us for a long time: word processing, spreadsheets, desktop publishing, and later photo and video editing. Some of these have been attempts to take something old and make it better, but the inspiration has always been some old medium.
One guess as to why Kay’s vision hasn’t become widespread is because there hasn’t been demand for it. Why? Because our educational system, and aspects of our society, have not encouraged us to think in terms of modeling our ideas, except for the arts, and perhaps the world of finance. Making this more broad-based will require innovation in thought as well as technology.
The following are my own impressions about what Kay is after. It seems to me what he wants to do is make modes of thought more accessible to the masses. This way we won’t have to depend on professional analysts to make our conclusions for us about some complex subject that affects us. We can create our own models, our own analyses, and make decisions based on them. This requires a different kind of education though, a different view of learning, research, and decision-making, because if we’re going to make our own models, we have to have enough of a basis in the facts so that our models have some grounding in reality. Otherwise we’re just deceiving ourselves.
What Kay has been after is a safe computing environment that works like we think. He sees computers as a medium, one that is interactive and dynamic, where you can see the effects of what you’re trying to do immediately; a medium where what it synthesizes is under our control.
He’s never liked applications, because they create stovepipes of data and functionality. In most cases they can’t be expanded by the user. He envisions a system where data can be adapted to multiple uses, and where needed functionality can be added as the user wants. Data maintains some control over how it is accessed, to free the user from having to interpret its format/structure, but other than that, the user is in control of how it’s used.
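To make that idea a bit more concrete, here’s a minimal sketch in Python. This is entirely my own illustration, not anything from Kay’s actual systems: the point is just that the data object answers requests about itself, so the user never has to interpret its internal format.

```python
# A toy example of data that controls its own access: callers ask for
# what they want in the terms they care about; the object alone knows
# that readings are stored internally as tenths of a degree Celsius.

class TemperatureSeries:
    def __init__(self, raw_tenths):
        self._raw = raw_tenths  # internal representation stays hidden

    def average_celsius(self):
        # The object does the format interpretation, not the caller.
        return sum(self._raw) / len(self._raw) / 10

    def as_fahrenheit(self):
        # The same data adapted to another use, on request.
        return [t / 10 * 9 / 5 + 32 for t in self._raw]

series = TemperatureSeries([215, 230, 198])  # 21.5, 23.0, 19.8 °C
print(series.average_celsius())
print(series.as_fahrenheit())
```

If the internal format later changed, only the object would need updating; every user asking it questions would keep working, which is the kind of malleability being described.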
It seems like the lodestar he’s been after is the moment when people find a way to communicate that is only possible with a computer. He’s not talking about blogging, e-mail, the web, etc. As I said above, those are basically automation of old media. He’s thinking about using a computer as a truly new medium for communication.
He believes that all of these qualities of computing should be brought to programmers. In fact everyone should know how to program, in a way that’s understandable to us, because programming is a description of our models. A program is a simulation of a model. The environment it runs in should have the same adaptive qualities as described above.
What’s hindered progress in computing is how we think about it. We shouldn’t think of it as a “new old thing”. It’s truly a new thing. In Engelbart’s and Kay’s view it’s an extension of our minds. We should try to treat it as such. When we succeed at it, we will realize what is truly powerful about it.
A caveat that Kay and others have put forward is that we don’t understand computing yet. The work that Engelbart and Kay have done has been steps along the way toward realizing it, but we’re not at the endpoint. Not by a longshot. There’s lots of work to be done.
—Mark Miller, https://tekkie.wordpress.com