The real computer revolution

For many years Alan Kay has been saying “The computer revolution hasn’t happened yet.” He sees the computer as a new medium, just as the book and printing press represented a new medium in the late Middle Ages. He’s said we will know the computer revolution has arrived when we learn to communicate in a way that is only possible with the computer. I can’t find the source for this, but I know he’s said it. I don’t believe he meant things like YouTube, or even blogging, though tagging would probably be a part of it. He meant something more socially transformative. Just as the book and printing press democratized knowledge, and enabled people to discuss ideas with each other using argument and logic instead of always appealing to authority (creating fertile ground for democratic movements), the computer has the same potential. Above all, the computer revolution, in his view, is not the “automation of paper.” Remember the “paperless office,” once touted as the vision of the future and the panacea of business computing? He says that’s not it.

I tried a little exercise after hearing this: I tried to think of what this unique form of communication would be. The only thought that came to mind was simulation: the ability to model a phenomenon or process, and to use that model to communicate ideas about it. When I thought about whether this is being done already, the first thing that came to mind was the debate over global warming and what’s causing it. Many of the arguments around it draw on atmospheric computer models built by scientists, though right now the process seems imperfect to me. I’m just using it as an example of something that’s starting to happen with computing as a medium, and of what that signifies.

I found a blog post recently by Mark Guzdial, called “Computing for Knowledge Transformation.” (Update 5-24-2013: This was on Mark’s Amazon author blog, which has since disappeared. He created a new blog a few years ago using WordPress, though I don’t believe he moved his old posts over.) In it he outlines some more ideas about what the true impact of computing on the future will be.

Guzdial has been exploring what computer science education means today, given the current state it’s in. An idea of his that’s been catching on where he works, at Georgia Tech, is Media Computation: applying computer science and programming to media, such as graphics, video, and audio. It’s been attracting a lot of interest. And get this: quite a few of the students in his class are female! People should take a look at this.

Anyway, he discusses a concept of writing I found interesting: that beginning writers “write to tell,” basically regurgitating knowledge, while experts “transform knowledge” in the process of their writing. The expert writer synthesizes a piece by drawing together varying sources of information, and is transformed in the process, gaining new knowledge. I could relate to this. Doing a little self-reflection, I’d say I’ve been doing quite a bit of the former on this blog, and a bit of the latter. He says the same can be said of computing: there’s “knowledge telling” (he cites PowerPoint as an example of this mode of computing; quite apt, I’d say), and then there’s “knowledge transformation.” The future of computing is the latter:

Computational scientists are using computing as a way of creating knowledge, of figuring out how to communicate it, and of a way of transforming their own knowledge in the process.  In fact, all forms of Computational-X (where X is journalism, photography, biology, chemistry, and so on) are about transforming knowledge in X through effective use of computation.

Programming is an important tool when using computing for knowledge transformation.  If you are using the computer as a way of creating new knowledge, you are almost certainly using the computer in a new way that others have not considered.  You cannot do an HCI process of task analysis when the task is “invent a process or representation that has never been created before.”

Note that “effective use” has nothing to do with software engineering.  Most computational science code that I’ve seen is pretty poor engineering quality. (Which does raise the interesting research question of what kind of program methodology and structuring makes the most sense for computational-X professionals and what will they actually use.)

“Computational Thinking” and “Computing for Knowledge Transformation” are both verb-like phrases — it’s what you do.  What do we teachers teach if we want people to learn computational thinking, to achieve computing for knowledge transformation? What computing for knowledge transformation is about includes:

  • Representations: Using computing to understand how information and models in a domain might be structured.
  • Mappings: Between representations, between models and their real-world counterparts.
  • Simulation: Execution of a model to test it, to gather data on it, to explore information in silico.
  • Integration: We don’t program in C4KT in order to build an artifact.  We program to explore ideas and create knowledge, where the output will probably be read into Excel and graphed, or will drive a script in Maya to generate an animation. Programming in C4KT is not about programming a blank sheet of paper to create a new killer app.  It’s about creating the data, models, and representations that can’t be created in existing tools, then moving back to the existing tools, perhaps for analysis and visualization.

(my emphasis in bold italics)
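Guzdial’s “Integration” point — program to create data and models, then hand the output off to existing tools for analysis — can be sketched with a toy example. Here’s a minimal, hypothetical Python sketch (the model, parameters, and file name are my own illustration, not from Guzdial’s post): simulate logistic population growth, then write the results to a CSV that could be opened in a spreadsheet and graphed.

```python
import csv

# Toy model: discrete logistic growth, x += r * x * (1 - x / K).
# Parameters are illustrative only.
r, K = 0.3, 1000.0   # growth rate, carrying capacity
x = 10.0             # initial population
rows = [(0, x)]
for t in range(1, 51):
    x += r * x * (1 - x / K)   # one discrete time step
    rows.append((t, x))

# Hand the results off to an existing tool (e.g., a spreadsheet):
with open("growth.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["t", "population"])
    writer.writerows(rows)
```

The program isn’t an end product; it exists to produce data and a representation that the existing tool (the spreadsheet) couldn’t have generated on its own, which is the pattern Guzdial describes.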

I’ve probably cited this before, but as I read this, I can’t help but think of the spreadsheet. It accomplishes some of these tasks for some domains. I think the key is to build on that model of what an application is, to create more opportunities for exploration. At least that’s a way that most people right now can relate to this. Kay would probably argue for the “total computing environment,” where programming and “tangible modeling” (you create models and manipulate them using an intuitive system interface) is emphasized more, and applications hold less sway.

What Guzdial discusses here sounds very scientific. I can imagine scholars and researchers using these sophisticated techniques to advance knowledge, but what about opportunities for this in the business community? What about opportunities for this among average people? Certainly spreadsheets are used by some average computer users. A more popular example might be photo editing software. It typically doesn’t involve programming, but it does delve into the realm of simulation (you can preview what your edit will look like before you finalize it), though in a simple form. It does at least allow exploration.

I’m not expecting Guzdial to come up with the answer for this (though if he’d like to, I wouldn’t mind hearing about it). This is sort of a train of thought post. I thought Guzdial’s post was very interesting, because it helped answer a question that’s been rattling around in my head ever since Alan Kay talked about this: What will the computer revolution really look like? I think Guzdial is on to the answer.


7 thoughts on “The real computer revolution”

  2. @Boberg:

    I reviewed the video. It sounds like the evidence is inconclusive at best at this point. I’ve heard concerns raised about wi-fi here in the U.S., though the voices raising them are not that loud at this point. For many years concerns were raised about cell phones, that they could cause cancer, etc. A few scientific studies thought they had found a link, but rigorous studies have been done, and each time no conclusive link has been found between cell phone use and cancer. The most that’s been found to happen is a slight heating of the brain on the side of the head where the cell phone is used.

    Wi-fi technology has actually been around for more than 10 years. I worked at a place that used it when creating system solutions for clients, back in the mid-90s. We didn’t use it ourselves though. I don’t think it was out of health concerns, just that the wired infrastructure was faster and more mature.

    The FCC here sets limits on electromagnetic radiation levels for electronic devices, computers included. They take health into account, but for the most part they try to keep the electromagnetic spectrum from becoming a “smog” of interference where signals for legitimate uses can’t be received. They’re not as restrictive as in Europe. I remember there used to be concerns about the radiation from CRT monitors, and that Europe had more stringent shielding requirements than we did. The only conclusive finding I had heard of was that pregnant women should not sit close to them, because prolonged exposure tended to cause problems with pregnancies. In one famous case from the 1980s, prolonged CRT exposure supposedly caused a pregnant nurse working in a hospital to miscarry.

    Some people I knew had concerns about radiation from computers themselves, though I’ve heard no scientific worries about that. Heck, there’s background radiation from the electrical wires in my home. Every time I take a plane trip (about once a year) I get exposed to higher levels of radiation from the Sun.

    To tell you the truth I don’t know whether to be concerned about it or not. There have been health concerns raised about all sorts of things related to technology, due to electromagnetic radiation, but so far I haven’t heard anything conclusive about the actual effects on biological functions. I’ve noticed no short-term health effects at least, though it doesn’t mean much for me to say it.

    The only guidelines we have about radiation that are really solid have to do with charged particles, things that can physically impact our cells. It seems to me that just as with dosages of exposure to substances, there should be guidelines for electromagnetic exposure, if that’s warranted. I assume it would mainly have to do with the intensity/power of a signal, rather than frequency, though as with electricity, frequency can have an effect on how much of an impact a signal would have on the body. Tesla proved that.

  3. After Marshall McLuhan’s death, his son, Eric McLuhan, completed a remaining work entitled Laws of Media. In it McLuhan lays down four laws as questions:

    1. What does it enhance or intensify?
    2. What does it render obsolete or replace?
    3. What does it retrieve that was previously obsolesced?
    4. What does it produce or become when pushed to an extreme?

    Marshall McLuhan called this a “tetrad” and said it could be applied to any human artifact.

    I think Kay has been thinking about computers from McLuhan’s viewpoint. He is of the opinion that up to now we have simply been retrieving old forms of media and using them with this new tool, or obsolescing current media. What Kay is wondering is: “What do computers actually enhance?”

    Personally, I think what we don’t realize about computers is that they are very primitive devices at this time. Until we are able to create computers that actually do physically emulate the brain we are still dealing with a purely mechanistic device. We will not be able to create computers that can tackle human logic, context and concepts until this physical emulation is achieved.

  4. @Czerepak:

    To a certain extent you’re right. That’s why we have programming languages. They’re a way to make the computer/machine do something for us, as opposed to actually having a conversation with it. If you’re talking about human augmentation with computers (i.e., computer implants; Paul Murphy has an interesting post on this here), the basic problem I see with it is that today’s computers are digital. Our brains are analog.

    Anyway, I think we are on the cusp of taking the next step in the evolution of computing. The old paradigm of computer science is wilting. I don’t know that it will disappear, but it’s much lower in significance than it used to be. People like Guzdial are trying out new approaches to CS that are closer to the vision that Kay has had: merging media (graphics and audio) with computation.

    Yes, computers were clunky in the past, primarily because they were too slow to make sophisticated programming easy. It took some sweat to do it. Now computers are fast enough that running programs written in languages like Ruby, Lisp, Smalltalk, etc. is within reach. It’s a feasible option. With this, the goals of the computer revolution Guzdial outlines become more achievable.

    As Kay has pointed out, we could’ve had this decades ago if the hardware companies had gotten their @#$^ together and actually innovated rather than just miniaturizing an outdated and flawed design.

    A major barrier now is perception. I’ve found that a lot of what we end up using computers for comes down to one question: what is it good for? The answer differs depending on who you talk to. Alan Kay’s vision is that it’s a medium for people to try out and explore ideas. They don’t have to be just mechanically oriented; they could be artistic. He’s targeted Squeak at education, though it can be used for all sorts of things (and is). He’s targeted Croquet (written in Squeak) at more grown-up pursuits. He’s not resting on his laurels: he’s now working on the OLPC laptop, and a next-generation programmable system.

  5. oh.. thats not what i need .. i need the reason why computer became a revolutionary medium. pls answer thanks and God Bless

  6. @Ruby:

    Well that’s the thing. It hasn’t really become a revolutionary medium yet. That’s more of an aspirational idea. The potential is definitely there, but it needs to be developed. After all these years it’s still “a work in progress”.

    I’ve written about other aspects of this in other blog posts. You could have typed “medium” into the search bar and my blog would’ve given them to you. Here are a couple:

    The Computer as Medium

    My journey, Part 6, where I refer to online video of a TV show that aired in 1992 called “The Machine That Changed The World” (look for the sub-section titled “About ‘The Machine That Changed The World'”). I quote the show in “My journey” where they talk about the computer being a new medium.
