The computer as medium

In my guest post on Paul Murphy’s blog, “The PC vision was lost from the get go,” I spoke to a concept Alan Kay has held since the 1970s: that the personal computer is a new medium, like the book at the time the technology of the printing press was brought to Europe, around 1439 (I also spoke about this in “Reminiscing, Part 6”). Kay came to this realization upon witnessing Seymour Papert’s Logo system being used with children. More recently, with 20/20 hindsight, Kay has spoken about how, as with the book historically, people have been missing what’s powerful about computing: like the early users of the printing press, we’ve been automating and reproducing old media onto the new medium. We’re even using it to automate old processes that were meant for an era that’s gone.

Kay spoke about the evolution of thought about the power of the printing press in one or two of his speeches entitled The Computer Revolution Hasn’t Happened Yet. In them he said that after Gutenberg brought the technology of the printing press to Europe, the first use found for it was to automate the process of copying books. Before the printing press books were copied by hand. It was a laborious process, and it made books expensive. Only the wealthy could afford them. In a documentary mini-series that came out around 1992 called “The Machine That Changed The World,” I remember an episode called “The Paperback Computer.” It said that there were such things as libraries, going back hundreds of years, but that all of the books were chained to their shelves. Books were made available to the public, but people had to read the books at the library. They could not check them out as we do now, because they were too valuable. Likewise today, with some exceptions to promote mobility, we “chain” computers to desks or some other anchored surface to secure them, because they’re too valuable.

Kay has said in his recent speeches that there were a few rare people during the early years of the printing press who saw its potential as a new emerging medium. Most of the people who knew about it at the time did not see this. They only saw it as, “Oh good! Remember how we used to have to copy the Bible by hand? Now we can print hundreds of them for a fraction of the cost.” They didn’t see it as an avenue for thinking new ideas. They saw it as a labor-saving device for doing what they had been doing for hundreds of years. This view of the printing press predominated for more than 100 years. Eventually a generation grew up not knowing the old toils of copying books by hand. They saw that with the printing press’s ability to disseminate information and narratives widely, it could be a powerful new tool for sharing ideas and arguments. Once literacy began to spread, what flowed from it was the revolution of democracy. People literally changed how they thought. Kay said that before this time people appealed to authority figures, whether the king, the pope, etc., to find out what was true and what they should do. When the power of the printing press was realized, people began appealing instead to rational argument as the authority. It was this crucial step that made democracy possible. This alone did not do the trick; there were other factors at play as well, but I think this was a fundamental first step.

Kay has believed for years that the computer is a powerful new medium, but in order for its power to be realized we have to perceive it in a way that enables it to be powerful to us. If we see it only as a way to automate old media (text, graphics, animation, audio, video) and old processes (data processing, filing, etc.), then we aren’t getting it. Yes, automating old media and processes enables powerful things to happen in our society via efficiency. It further democratizes old media and modes of thought, but that’s just addressing the tip of the iceberg. This brings the title of Alan Kay’s speeches into clear focus: The computer revolution hasn’t happened yet.

Below is a talk Alan Kay gave at TED (Technology, Entertainment, Design) in 2007, which I think gives some good background on what he would like to see this new medium address:

“A man must learn on this principle, that he is far removed from the truth” – Democritus

Squeak in and of itself will not automatically get you smarter students. Technology does not really change minds. The power of EToys comes from an educational approach that promotes exploration, called constructivism. Squeak/EToys creates a “medium to think with.” What the documentary “Squeakers” makes clear is that EToys is a tool, like a lever, that makes this approach more powerful, because it enables math and science to be taught better using this technique. (Update 10/12/08: I should add that whenever the nature of Squeak is brought up in discussion, Alan Kay says that it’s more like an instrument, one with which you can “mess around” and “play,” or produce serious art. I wrote about this discussion that took place a couple of years ago, and said that we often don’t associate “power” with instruments, because we think of them as elegant but fragile. Perhaps I just don’t understand at this point. I see Squeak as powerful, but I still don’t think of an instrument as “powerful.” Hence the reason I used the term “tool” in this context.)

From what I’ve read in the past, constructivism has gotten a bad reputation, I think primarily because it’s fallen prey to ideologies. The goal of constructivism as Kay has used it is not total discovery-based learning, where you just tell the kids, with no guidance, “Okay, go do something and see what you find out.” What this video shows is that teachers who use this method lead students to certain subjects, give them some things to work with within the subject domain, things they can explore, and then set them loose to discover something about them. The idea is that by the act of discovery through experimentation (i.e., play) the child learns concepts better than if they are spoon-fed the information. There is guidance from the teacher, but the teacher does not lead them down the garden path to the answer. The children do some of the work to discover the answers themselves, once a focus has been established. And the answer is not just “the right answer,” as is often called for in traditional education, but what the student learned and how the student thought in order to get it.

Learning to learn; learning to think; learning the critical concepts that have gotten us to this point in our civilization is what education should be about. Understanding is just as important as the result that flows from it. I know this is all easier said than done with the current state of affairs, but it helps to have ideals that are held up as goals. Otherwise what will motivate us to improve?

What Kay thinks, and is convinced by the results he’s seen, is that the computer can enable children of young ages to grasp concepts that would be impossible for them to get otherwise. This keys right into a philosophy of computing that J.C.R. Licklider pioneered in the 1960s: human-computer symbiosis (“man-computer symbiosis,” as he called it). Through a “coupling” of humans and computers, the human mind can think about ideas it had heretofore not been able to think. The philosophers of symbiosis see our world becoming ever more complex, so much so that we are at risk of it becoming incomprehensible and getting away from us. I personally have seen evidence of that in the last several years, particularly because of the spread of computers in our society and around the world. The linchpin of this philosophy is, as Kay has said recently, “The human mind does not scale.” Computers have the power to make this complexity comprehensible. Kay has said that the reason the computer has this power is it’s the first technology humans have developed that is like the human mind.

Expanding the idea

Kay has been focused on using this idea to “amp up” education, to help children understand math and science concepts sooner than they would in the traditional education system. But this concept is not limited to children and education. This is a concept that I think needs to spread to computing for teenagers and adults. I believe it should expand beyond the borders of education, to business computing, and the wider society. Kay is doing the work of trying to “incubate” this kind of culture in young students, which is the right place to start.

In the business computing realm, if this is going to happen we are going to have to view business in the presence of computers differently. I believe for this to happen we are going to have to literally think of our computers as simulators of “business models.” I don’t think the current definition of “business model” (a business plan) really fits what I’m talking about. I don’t want to confuse people. I’m thinking along the lines of schema and entities, forming relationships which are dynamic and therefore late-bound, but with an allowance for policy to govern what can change and how, with the end goal of helping business be more fluid and adaptive. Tying it all together I would like to see a computing system that enables the business to form its own computing language and terminology for specifying these structures so that as the business grows it can develop “literature” about itself, which can be used both by people who are steeped in the company’s history and current practices, and those who are new to the company and trying to learn about it.
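To make this concrete, here is a minimal sketch, in Python, of what late-bound, policy-governed entities and relationships might look like as a data structure. Every class, method, and policy name here is a hypothetical illustration of the idea, not an implementation of any existing system.

```python
# A minimal sketch of the "business model as simulation" idea: entities
# with late-bound attributes, relationships that can be rewired at
# runtime, and a policy hook governing what may change.

class Entity:
    def __init__(self, kind, **attrs):
        self.kind = kind
        self.attrs = dict(attrs)   # late-bound: attributes can be added later
        self.relations = {}        # relation name -> list of related entities

    def relate(self, name, other, policy=None):
        # The optional policy decides whether this relationship is allowed.
        if policy and not policy(self, other):
            raise ValueError(f"policy forbids {self.kind} -> {other.kind}")
        self.relations.setdefault(name, []).append(other)

# Example policy: a supplier may only be linked to an order.
def supplier_policy(a, b):
    return b.kind == "order"

supplier = Entity("supplier", name="Acme")
order = Entity("order", total=120.0)
supplier.relate("fulfills", order, policy=supplier_policy)
print(len(supplier.relations["fulfills"]))  # 1
```

The point of the late binding is that new kinds of entities, attributes, and relationships can be introduced while the model is running, with policy deciding which changes are legal.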

What this requires is computing (some would say “informatics”) literacy on the part of the participants. We are a far cry from that today. There are millions of people who know how to program at some level, but the vast majority of people still do not. We are in the “Middle Ages” of IT. Alan Kay said that Smalltalk, when it was invented in the 1970s, was akin to Gothic architecture. As old as that sounds, it’s more advanced than what a lot of us are using today. We programmers, in some cases, are like the ancient pyramid builders. In others, we’re like the scribes of old.

This powerful idea of computing, that it is a medium, should come to be the norm for the majority of our society. I don’t know how yet, but if Kay is right that the computer is truly a new medium, then it should one day become as universal and influential as books, magazines, and newspapers have historically.

In my “Reminiscing” post I referred to above, I talked about the fact that even though we appeal more now to rational argument than we did hundreds of years ago, we still get information we trust from authorities (called experts). I said that what I think Kay would like to see happen is that people will use this powerful medium to take information about some phenomenon that’s happening, form a model of it, and by watching it play out, inform themselves about it. Rather than appealing to experts, they can understand what the experts see, but see it for themselves. By this I mean that they can manipulate the model to play out other scenarios that they see as relevant. This could be done in a collaborative environment so that models could be checked against each other, and most importantly, the models can be checked against the real world. What I said, though, is that this would require a different concept of what it means to be literate; a different model of education, and research.
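As a toy illustration of that “form a model and watch it play out” idea (only an illustration; the model choice and numbers are mine, not anything from Kay), here is a discrete logistic growth model in Python, with the growth rate exposed as a knob for exploring alternative scenarios:

```python
# A toy model of a growth phenomenon: discrete logistic growth toward a
# carrying capacity. The parameters are knobs the reader can turn to
# "play out other scenarios" and see how the outcome changes.

def logistic_steps(p0, rate, capacity, steps):
    """Return the population at each step of a discrete logistic model."""
    p = p0
    history = [p]
    for _ in range(steps):
        p = p + rate * p * (1 - p / capacity)
        history.append(p)
    return history

run = logistic_steps(p0=10.0, rate=0.5, capacity=1000.0, steps=40)
print(round(run[-1]))  # approaches the carrying capacity of 1000
```

Changing `rate` or `capacity` and re-running is exactly the kind of scenario manipulation described above: rather than being told the conclusion by an expert, you can watch the model produce it, and then check the model against the real world.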

This is all years down the road, probably decades. The evolution of computing moves slowly in our society. Our methods of education haven’t changed much in 100 years. The truth is the future up to a certain point has already been invented, and continues to be invented, but most are not perceptive enough to understand that, and “old ways die hard,” as the saying goes. Alan Kay once told me that “the greatest ideas can be written in the sky” and people still won’t understand, nor adopt them. It’s only the poor ideas that get copied readily.

I recently read that the Squeakland site has been updated (it looks beautiful!), and that a new version of the Squeakland version of Squeak has been released on it. They are now just calling it “EToys,” and they’ve dropped the Squeak name. The main Squeak site is still up and running, and they are still making their own releases of Squeak. As I’ve said earlier, the Squeakland version is configured for educational purposes. The version from the main Squeak site is primarily used by professional Smalltalk developers. Last I checked it still has a version of EToys on it, too.

Edit: As I was writing this post I went searching for material for my “programmers” and “scribes” reference. I came upon one of Chris Crawford’s essays. I skimmed it when I wrote this post, but I reread it later, and it’s amazing! (Update 11/15/2012: I had a link to it, but it’s broken, and I can’t find the essay anymore.) It caused me to reconsider my statement that we are in the “Middle Ages” of IT. Perhaps we’re at a more primitive point than that. It adds another dimension to what I say here about the computer as medium, but it also expounds on what programming brings to the table culturally.

Here is an excerpt from Crawford’s essay. It’s powerful because it surveys the whole scene:

So here we have in programming a new language, a new form of writing, that supports a new way of thinking. We should therefore expect it to enable a dramatic new view of the universe. But before we get carried away with wild notions of a new Western civilization, a latter-day Athens with modern Platos and Aristotles, we need to recognize that we lack one of the crucial factors in the original Greek efflorescence: an alphabet. Remember, writing was invented long before the Greeks, but it was so difficult to learn that its use was restricted to an elite class of scribes who had nothing interesting to say. And we have exactly the same situation today. Programming is confined to an elite class of programmers. Just like the scribes, they are highly paid. Just like the scribes, they exercise great control over all the ancillary uses of their craft. Just like the scribes, they are the object of some disdain — after all, if programming were really that noble, would you admit to being unable to program? And just like the scribes, they don’t have a damn thing to say to the world — they want only to piddle around with their medium and make it do cute things.

My analogy runs deep. I have always been disturbed by the realization that the Egyptian scribes practiced their art for several thousand years without ever writing down anything really interesting. Amid all the mountains of hieroglyphics we have retrieved from that era, with literally gigabytes of information about gods, goddesses, pharaohs, conquests, taxes, and so forth, there is almost nothing of personal interest from the scribes themselves. No gripes about the lousy pay, no office jokes, no mentions of family or loved ones — and certainly no discussions of philosophy, mathematics, art, drama, or any of the other things that the Greeks blathered away about endlessly. Compare the hieroglyphics of the Egyptians with the writings of the Greeks and the difference that leaps out at you is humanity.

You can see the same thing in the output of the current generation of programmers, especially in the field of computer games. It’s lifeless. Sure, their stuff is technically very good, but it’s like the Egyptian statuary: technically very impressive, but the faces stare blankly, whereas Greek statuary ripples with the power of life.

What we need is a means of democratizing programming, of taking it out of the soulless hands of the programmers and putting it into the hands of a wider range of talents.

Related post: The necessary ingredients for computer science

—Mark Miller

13 thoughts on “The computer as medium”

  1. Mark –

    Excellent, excellent post! Thank you! Reading this cheered up my night immensely; I just had a very sharp pin poked into a balloon that I’ve spent about 6 months inflating and had a lot of pride and self-image riding on, and I am hoping that I can still salvage it.

    Also, hope my email from yesterday got through to you OK, let me know if it didn’t.



  2. I noticed that the Squeakers video featured Jerome Bruner and that alan kay recommended one of his books “Towards a theory of instruction” as outstanding

    I followed this up and found a great page (jerome bruner and the process of education) which summarises Bruner’s thinking. I notice how Bruner takes concepts from both sides of the conventional curriculum wars and welds them together; for instance, he thinks that both structure and intuition are important. I summarised his approach as briefly as possible as incorporating structure, readiness, intuition, motivation.

    I see this as the way forward – building a pyramid made up of bits from both sides of the curriculum wars.

    I thought the Crawford essay, while interesting, was different from alan kay’s approach. Crawford sees programming as the new alphabet whereas Kay stresses science more. I think Kay’s approach encompasses a wider group historically and represents a better strategy in how to proceed.

  3. @Bill Kerr:

    The part about making a connection between writing and programming was meant to be part of the “adult” section of my post, where I extrapolate the idea of the computer as medium from its use with children, to its use by older children and adults. Programming is a form of expression, but it does more than just convey static thoughts. It describes a model that can be put in motion. I don’t believe, though, that this has to be confined to the realms of science and math.

    I’ll refer you to a couple things Alan Kay said that I think are compatible with what Crawford said. In my post called “Redefining computing, Part 2”, I show some of what Kay talked about, having to do with the professional practice of programming. In his speech he said:

    “I think the main thing about doing OOP work, or any kind of programming work, is that there has to be some exquisite blend between beauty and practicality. There’s no reason to sacrifice either one of those, and people who are willing to sacrifice either one of those I don’t think really get what computing is all about. It’s like saying I have really great ideas for paintings, but I’m just going to use a brush, but no paint. You know, so my ideas will be represented by the gestures I make over the paper.”

    In the Early History of Smalltalk he said:

    Should we even try to teach programming? I have met hundreds of programmers in the last 30 years and can see no discernible influence of programming on their general ability to think well or to take an enlightened stance on human knowledge. If anything, the opposite is true. Expert knowledge often remains rooted in the environments in which it was first learned–and most metaphorical extensions result in misleading analogies. A remarkable number of artists, scientists, philosophers are quite dull outside of their specialty (and one suspects within it as well). The first siren’s song we need to be wary of is the one that promises a connection between an interesting pursuit and interesting thoughts. The music is not in the piano, and it is possible to graduate Juilliard without finding or feeling it.

    I have also met a few people for whom computing provides an important new metaphor for thinking about human knowledge and reach. But something else was needed besides computing for enlightenment to happen.

    Kay has tipped his hat some to the connection between writing and programming. In his NSF proposal from 2006 he said:

    A newer idea that is moving towards the mainstream is that specifications should be executable and debuggable. We want to go even further to “ship the specifications” – that is, the specifications should not just be a model of the meanings in a system, but should simply be the actual meanings of the systems.

    I drew this idea of writing and programming originally from Richard Gabriel. I wrote a blog post, called “Coding like writing”, based on an article he wrote. He talks about the process of building a model, and what it needs to be like. He said:

    Writing a beautiful text takes sitting before a medium [my emphasis] that one can revise easily, and allowing a combination of flow and revision to take place as the outlines and then the details of the piece come into view.

    He draws the connection by saying:

    Engineering is and always has been fundamentally such an enterprise, no matter how much we would like it to be more like science than like art. And the reason is that the requirements for a system come not only from the outside in the form of descriptions of behavior useful for the people using it, but also from within the system as it has been constructed, from the interactions of its parts and the interactions of its parts separately with the outside world. That is, requirements emerge from the constructed system which can affect how the system is put together and also what the system does. Furthermore, once a system is working and becomes observable, it becomes a trigger for subsequent improvement.

    So this is getting more into “how a computing environment should work”. I think this is also kind of what Crawford was talking about, but what he said has a lot more to do with us, how most of us think about computing.

    I do not know if you have experienced this yet, but both Kay and Gabriel talk about the idea that there can be beauty in programming. I know about this, because I have experienced ugliness in programming, working as a professional software developer. In my mind this idea of beauty in programming begins to get into the realm of literature. Programming code can express a model concisely and accurately, and can express the intent of the model’s designer. In the world of professional software development this is hardly ever experienced, because the tools and the languages used do not promote these ideas. Instead they present a computing model to the programmer that is somewhat better than programming with the hardware itself, and they promote a sense that you are telling the machine what to do. Today’s OOP languages (C++, Java, .Net) give a little bit of the sense that you’re constructing a model, but not much. EToys is of course the opposite of this. It promotes the idea that programming is the process of building models and creating relationships between them to ultimately build larger models.
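    As a small, hypothetical sketch of that “programming as building models” idea (the classes and names below are my own illustration, not EToys code):

```python
# A sketch of "programming as building models and relating them": each
# object models a thing in the world, and behavior is expressed against
# other models rather than as instructions to the machine.

class Room:
    def __init__(self, temperature):
        self.temperature = temperature

class Thermostat:
    def __init__(self, setpoint):
        self.setpoint = setpoint

    def wants_heat(self, room):
        # The thermostat's behavior is stated in terms of the room model.
        return room.temperature < self.setpoint

room = Room(temperature=17.5)
thermostat = Thermostat(setpoint=20.0)
print(thermostat.wants_heat(room))  # True
```

    The larger model emerges from the relationship between the two objects, which is the shift in mindset being described: constructing a model you can put in motion, rather than telling the machine what to do step by step.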

    I agree with you that with respect to children a computer’s best use is towards teaching math and science, but I think as people get older, the idea of the computer as medium can expand beyond this, using the principles of math, science, and engineering to create systems that serve us, and even allow us to express ourselves in ways that were not possible before.

    In Kay’s speeches titled “The Computer Revolution Hasn’t Happened Yet” (he did a series of these a few years ago) he’s said one of the things he looks forward to seeing is the day when people start using computers to communicate in ways that only computers make possible. He didn’t mean communicating just in terms of text, graphics, audio, video, etc., because these are just automations of old media. He meant it in the vein of J.C.R. Licklider’s philosophy of human-computer symbiosis, that computers will enable humans to think in ways that were impossible or very difficult before.

    Edit: I meant to address Crawford’s use of the term “alphabet”, because you brought that up. He said “programming is the new writing”. I don’t think this idea in and of itself in any way goes against the idea of using programming to teach math and science. The idea is, and Crawford gets into this in his essay (follow the link I put in my post), that computing and programming bring to mind new concepts, ideas of non-linearity (as opposed to linearity), precision and specificity in abstraction, looping (as opposed to “chunking”), and such. Programming is not merely a digital form of the old literature that comes from old writing on an old medium. It literally represents a whole new school of thought. Crawford said the then-current state of affairs (mid-1990s) was that programming was like using hieroglyphics, and that the typical programmer (in the professional realm, that is) is like an ancient Egyptian scribe who had nothing interesting to say. The scribes were a skilled, exclusive club, as is the case in the programming profession. He talked about developing an alphabet for the computing medium, because this is what led to widespread literacy and the Greek flowering of literature, philosophy, and history (i.e., interesting stuff). It opened reading and writing up to artists, philosophers, politicians, historians, architects, engineers, storytellers, ordinary people, etc. Crawford would like to see the same happen to programming, where most everyone in society is able to do it–to be computationally literate.

    I think this ties in well with Kay’s philosophy of what programming should become down the road. In his vision for the Dynabook he envisioned that it would usher in a new literacy, that children and adults of ordinary stature would be able to modify their own personal medium any way they liked to carry out scientific experiments, play games, carry out transactions, create personal or business projects, or customize the experience of reading other’s material. Today most people do this through software that’s written by programmers, which in most cases always works the same way whether the users like it or not, and which comes in pieces that cannot easily be modified to interact with other pieces of software. Kay envisioned that tools would exist on the Dynabook for carrying out common tasks, but they could be modified by the users (who are skilled as programmers themselves) if they needed them to act differently.

  4. hi mark,

    It’s a big topic. Some of the papers you mention I have read before. The alan kay quote about many computer programmers not being creatures of the enlightenment is one thing that got me thinking about this.

    What I meant to say but didn’t say (unfortunately, I just said science, it would have been better if I had said “scientific world view or outlook”) was more like this:

    As educators we need to take as our starting point a more philosophical / historical approach which attempts to situate computing in the longer history of maths and science, going back to the Greeks. You could call this a history of Big Ideas. Computing adds significantly to some of these Big Ideas, e.g., recursion, but they tend to come towards the end of a long journey. (I did read the Crawford paper)

    One thing that arises from that is that static type knowledge might be best done on paper, or, not much harm is done by doing it on paper. But dynamic type knowledge is best done using a computer. As a teacher of maths and science I definitely think that many students need a thorough grounding of discussing ideas and writing some of it down on paper before moving onto the computer. I remember arguing years ago that it was far better for students to develop tessellations on paper than on the computer. Some nice ones here

    This sort of outlook comes across clearly in Kay’s paper, The Early History of Smalltalk, where he talks about being strongly influenced by Plato’s Ideas and Leibniz’s monads, i.e., the influence of philosophy on the development of a new paradigm. Another way to look at it is through Kay’s articulation of the non universals

    One problem wrt educational development is that subjects are compartmentalised in instrumental fashion into science, maths, computing etc. and the whole notion of the historical development of Big Ideas (non universals) tends to be lost.

    I changed mid way through 2007 deciding not to go back to school and teach mainly computing. This decision arose mainly through studying some of alan kay’s papers. I wrote about this last year – just looked it up and notice that you had a fair bit to say in the comments!

    in general programmers are not creatures of the enlightenment

  5. @Bill Kerr:

    One problem wrt educational development is that subjects are compartmentalised in instrumental fashion into science, maths, computing etc. and the whole notion of the historical development of Big Ideas (non universals) tends to be lost.

    I’ve complained about the state of CS education in universities in the same way. CS education is also compartmentalized, though there’s an emphasis on math. The same thing occurs. There is a woeful lack of historical context. It’s extremely rare, if it exists at all, for a CS curriculum to talk about computers in a context that is remotely close to what I talked about in this post.

    A typical CS curriculum will put programming languages, operating systems, computer graphics, artificial intelligence, database theory, etc. into their own sets of courses. There’s no cross-pollination between them. I realized this when Alan Kay talked about Ivan Sutherland’s Sketchpad project, and the design of the Smalltalk system.

    I had heard about Sketchpad in a course I took on computer graphics when I was getting my Bachelor’s degree. I learned that it was the first system that allowed the user to interactively draw shapes using a light pen. We may have also learned that it was the ancestor to CAD (Computer-Aided Design) systems, which suggested a little of what Kay talked about WRT it. What Kay revealed is that Sketchpad was the first object-oriented system. This to me has more profound meaning in light of what the Xerox PARC team accomplished with Smalltalk.

    I was introduced to Smalltalk (just the language part) in a senior-level course on programming languages. Fortunately the teacher gave us a bit of context for it. It’s only been recently that I’ve learned that Smalltalk really is a unification of a bunch of CS concepts that have traditionally been compartmentalized. It’s an operating system. It’s a language. It’s an interactive environment. It is a kind of database. And as a system, it’s object-oriented, like Sketchpad was. I only got pieces of what these projects were meant to convey when I was in college. My professors only had the vaguest ideas about what these projects really represented.

    Another example I can give is Lisp. From the beginning this language, or “building material” as Kay likes to call it, was categorized as an “AI language”. At the university I went to students didn’t get much exposure to Lisp unless they chose an “AI track”. What I’ve come to learn recently is that Lisp is as much a universal programming environment as assembly language is on today’s typical hardware architectures. It just represents a very different computing model. Some at MIT realized this 30 years ago and they created two companies to manufacture Lisp machines, computers that ran Lisp at the hardware level. These died out in the mid-1990s.
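    To hint at what “a very different computing model” means here, the following is a deliberately tiny, toy Lisp-style evaluator, written in Python as my own sketch (not real Lisp): programs are just nested lists, i.e., data the evaluator walks directly.

```python
# A toy hint at why Lisp is called "building material": a program is a
# nested list, and a few lines of evaluator can execute it directly.

def evaluate(expr, env):
    if isinstance(expr, str):        # a symbol: look it up
        return env[expr]
    if not isinstance(expr, list):   # a literal number
        return expr
    op, *args = expr
    if op == "if":                   # special form: evaluate branches lazily
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    fn = env[op]                     # ordinary application
    return fn(*[evaluate(a, env) for a in args])

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b,
       "<": lambda a, b: a < b, "x": 4}
# The program (if (< x 10) (* x x) 0) as plain data:
program = ["if", ["<", "x", 10], ["*", "x", "x"], 0]
print(evaluate(program, env))  # 16
```

    Because the program is ordinary data, programs can build, inspect, or rewrite other programs, which is a large part of what makes Lisp a universal environment rather than just another language.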

    A counter-example is that when I went to jr. high school in the early 1980s, science teachers experimented with the idea of having computers in the science classroom. I have no idea though if they taught programming. I suspect they just ran educational software (“dissect the frog” and that sort of stuff). I expected that computers would always remain in “computer labs”, but some tried to buck that trend. I remember thinking this was odd at the time, but now I see it as a forward-thinking move on the part of these teachers.

    There is a private school where I live (I think it still exists) that comingles the different disciplines. I remember reading about it years ago, that for example they combined the activities of learning math with learning how to repair an automobile. They wanted to place math in a real world context to show its relevance, rather than it being just a theoretical concept.

    The different subjects in school often had a historical context to them. English literature, math, and science all taught some of their history. With the exception of English Lit., it was usually presented as "so-and-so discovered X." So it wasn't terribly rich, but it at least gave you a timeline to follow if you wanted to explore it further. There were opportunities to do that. Every once in a while we would be assigned research papers on a historical figure. One of the figures I studied was Archimedes; I can't remember where I first heard of him, probably in math class. He's really fascinating. The Archimedes Palimpsest that was found recently, containing a copy of one of his classic texts, is even more fascinating: apparently Archimedes was using a form of calculus more than 2,000 years ago.

    I agree. It would be nice if subjects that fit together naturally were more integrated, and given a historical context. I suspect that the reason the subjects are compartmentalized has more to do with the teachers. Kay said it himself: most experts, he suspects, are quite capable within their profession, but dull outside of it. In other words, they focus a lot on their specialty and don't explore outside of it to entertain new ideas. So while scientists could teach math, they tend not to (they just use it). Mathematicians, likewise, could teach science, but again, they tend not to. Many math teachers don't bring a sense of art and elegance into their courses, though they could, and it would be relevant. Professors of computer science could bring in all of these things, plus a rich sense of culture and history, but they tend not to.

    I don't know why that is. It may be a general cultural thing: as adults we're supposed to stop exploring and entertaining outside ideas, and just get better and better at our specialties so that we can put more credentials after our names, rise in seniority, and make more money. I vaguely remember Kay saying that this starts in school. All children start out as open-minded explorers, but this dies in school, because they're supposed to sit still, absorb what the teacher tells them, and regurgitate it on tests.

    What do you think? I have my theory, but you’re in the school environment. Have you looked into why it’s so compartmentalized?

  6. Regarding compartmentalization… it takes 4 years of college just to get a BA, let alone to be considered an "expert" in a topic. In fact, according to an article I read a while ago, it takes roughly 10 years of intensive study to become an "expert" (10 years seems to be a universal constant: chess, basketball, programming, you name it). 10 years is the time it takes for the brain to internalize the experiences in a way that you no longer need to consciously think about the subject to excel in it.

    So, with that kind of commitment needed, how many people will become "experts" in one subject, let alone many? Particularly since by the end of our peak learning years many of us have children or spouses, or many commitments to the community, and of course jobs, all of which limit the 10,000–20,000 hours of work that the article quotes as the minimum for an "expert." Myself, for example, I know that the only topics I may become an "expert" in until my son is out of college are fatherhood and husbandhood. When I am 50, I might have the time to take up a new subject!

    It's the time commitment needed, combined with the fact that fields have very deep bodies of knowledge, that makes this tough. Think about it… a physics textbook today most likely contains more confirmed, accurate knowledge than the entire field of physics had 200 years ago. Your typical BS in Physics knows more about the topic than Newton! A "steampunk enthusiast" most likely knows more metallurgy than all of Bethlehem Steel in the 1800s put together. My uncle the electrical engineer can no doubt run circles around Thomas Edison. And so on. As fields mature and develop, the shift for most students goes from learning a few principles and trying to discover more, to learning piles of already-known principles and applying them. In the environment of a mature field, it is insanely difficult to learn enough of a second field to be useful in a cross-disciplinary way. Where cross-disciplinary work is useful is when you leverage a new field to shed light on an old one (like using computers for accounting), or when you use an old field to inform a new one (like taking the techniques of filmmaking to make video games more interesting). In my opinion, the latter approach is typically more interesting, but it requires someone who is an expert in a mature field to have the vision and ability to see how to apply their existing knowledge to a new field. Not an easy combination to find!


  7. @Justin:

    I often use Alan Kay as an example of someone who is cross-disciplinary. He's an unconventional learner. From what I know of him, he was a voracious reader as a child. He often irritated his teachers because he would challenge their knowledge based on things he had read on his own.

    He only had (has) a Bachelor's degree, in molecular biology and mathematics, yet the biological knowledge in particular was critical for him to realize the potential of what he called "object-oriented programming." He had, however, done a lot of his own learning up until that point. I know he went through a master's program in "computer science," though from his description I suspect it was called something different then. From the stories I've read of him it's unclear whether he completed it (getting a degree). He's talked about entering a "master's program in ARPA," and from there he moved on to Xerox PARC. He's talked a lot about the culture at ARPA, and that they were what I think he'd call "real computer scientists," because they experimented with everything, right down to the hardware designs. They also had a wonderful research philosophy in which nothing was out of bounds for consideration.

    One of their objectives was to create architectures that scaled well. To do this they figured they had to find out how to distribute computing in a way such that the individual elements were self-sufficient, and did not lose their power, but the distribution of responsibilities did not slow down the overall system. They did this by continually looking at existing artifacts, deconstructing them down to their essence, and trying to figure out what architectures would implement the same things, but better. Out of this came a significant part of the computing world we know today: the internet, personal computers, OOP, etc. The central theme, it seemed, was decentralization.

    The philosophy of "man-computer symbiosis," put forward by Licklider and implemented by Engelbart, was also formed at ARPA. Seymour Papert took it in a different direction (I think outside ARPA), bringing forward the idea of teaching sophisticated math to children.

    Kay credits these philosophies with generating the greatest wave of innovation the computer industry has ever seen. It lived until the late 1970s, and we haven't seen its like again until recently. Kay claims that the NSF-funded project he's been working on since about 2006, to create an end-user personal computing environment in 20KLOC, brings back the old ARPA culture. If you read Kay's paper "The Early History of Smalltalk" (you can Google it easily), he says one of the critical factors in moving society forward in relation to the power of computing is culture. It's not just a matter of learning and continually developing skills. Skills are necessary, but what's more important, by virtue of the fact that there's so little of it surrounding computing, is culture, which embodies values, priorities, and ways of thinking.

    I'm guessing, but it sounds like the old ARPA culture was cross-disciplinary by its nature; it encouraged people to bring together the ideas of computing, science, engineering, art, philosophy, etc. It was open to all sorts of ideas. It also sounds like not everyone had to be an expert in all of these subjects. It was probably helpful that they each had some background specialty, but it sounds like everyone was expected to learn from each other.

    Kay learned much from Licklider, Sutherland, Deutsch, Papert, and Engelbart to ultimately formulate his idea that the computer is a new medium.

  8. I think that Alan Kay and others like him are a really rare breed. You see them buried at places like PARC and Microsoft Research. Unfortunately, few companies take a long-term view that allows them to spend money on exploratory research that is not directly tied to a project and cannot be immediately monetized. That's one reason why IT is driven by "innovation" (building upon existing knowledge) rather than "inventiveness" (developing new knowledge and building on it). "Innovation" only requires a minimum of cross-disciplinary work… you know, like the programmer asking the accountant how they would like the software to be improved. "Inventiveness" requires an astounding measure of cross-disciplinary work, like Alan Kay doing Smalltalk, or John McCarthy doing Lisp. Neither directly made much money, but both became underlying knowledge that allowed many others to make a lot of money.


  9. Correction on what I said earlier: I found some other biographical info on Alan Kay, and he did get his Master's and Ph.D., from the University of Utah.


    Alan Kay’s approach is interesting to me, partly because I get the sense he wants to unleash the genius in everybody. He sees the predominant educational system as being run by small minds (whose own genius was probably killed at a young age) who kill that genius in their students because they don’t know any better, and so he works where he can to use computing to create a better educational system.

    He's quite humble. You and I have both seen his presentations online. He gives a lot of credit to the people I listed in my last comment, and doesn't accept much credit for himself. So I don't think he sees himself as a genius above all others or anything. I think what makes him unique is that he studies cognition and epistemology: how people know what they know. He tries to question what he himself knows as well. It's not an admission that he knows nothing (he's not a post-modernist), but rather an admission that it's hard to know the truth about much of anything with absolute certainty. And ironically, I've found that this attitude makes one less ignorant, not more. Perhaps what he's trying to communicate is that the only thing we can truly understand is how we understand things: what's going on in our own heads.

    In relation to his knowledge of computing, he had the advantage of being at the right place at the right time with the right people to get a very deep education in it. Having said that, the way he tells it, he just "fell into it," so it wasn't as if it was planned. Ivan Sutherland, for example, was at the University of Utah when Kay attended there. That culture is something he really wants to see restored, since, as I said, it disappeared after the late 1970s. Kay attributes its disappearance to the rise of the commercial PC industry, which occurred around the same time.

  10. If you haven't read any of Daniel C. Dennett's books, I think you would REALLY like them; I suggest "Consciousness Explained". While some of his stuff has been strongly argued against (he even argues against it himself sometimes!), where it shines is that he pulls material from all over the place and puts it together. It may be "philosophy", but I think anyone as interested as you are in the connections between how people interact with computers, AI, and what it truly means to be human would enjoy it. It is very, very cross-disciplinary work, and written in a way that you don't need a background in any of the subjects to get it, which is good because it covers so many things! You made me think of it when you mentioned cognition and epistemology. 🙂


  11. Pingback: The beauty of mathematics denied « Tekkie

  12. Pingback: My journey, Part 6 « Tekkie

  13. Though Smalltalk is not that popular these days, there is a new renaissance in Smalltalk development, thanks to Squeak. I went through many sites about Smalltalk and agree with all of its supporters. The more I learn about Smalltalk and Squeak, the more I'm impressed. In the process of my learning I have collected some good sites (more than 200) related to Smalltalk and Squeak (lessons, tutorials, and programming). If you are interested, take a look at the link below.
    200 sites to know about smalltalk programming
