
Archive for February, 2007

I’ve been wondering about this for a while. We hear about how the more popular web frameworks can scale with an n-tier architecture, but what about Seaside? Session state is maintained inside the Squeak image, and unlike other web frameworks, Seaside does not save session state to a database. I imagine it lacks that capability at this point.

The more popular web frameworks gain scalability by letting you configure them to save state to a database, or to some other form of persistence that is available to every web server in a farm. That way the same application can be installed on multiple servers, and a load balancer can route each client to whichever server is least busy, even while the user is in the middle of using the application, while still maintaining application state for that client. The point is to answer everyone’s requests in a reasonable amount of time without bringing any one server to its knees.

Ramon Leon at On Smalltalk has made a good start at answering how to scale Seaside. His approach is to run multiple Squeak images at the same time and have a load balancer choose which image each client is directed to. From that point on the client continues to interact with the image chosen for it, without switching to other images in mid-use. He calls this a “sticky session” scheme. He provides some details about how to set it up, but since his load-balancing scheme is very similar to how Ruby on Rails (RoR) is load balanced, he directs readers to the RoR instructions already on the web. RoR and Seaside are similar in that both run their own simple HTTP servers. I’m not sure if RoR’s server has a name, but Seaside’s is called “Comanche”. Since both have a similar HTTP setup, it’s possible to use the same load-balancing software with both.
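
Ramon has the real details on his blog, but to make the idea concrete, here is a rough sketch of the shape such a setup could take with Apache’s mod_proxy_balancer. The ports, route names, and the choice of Apache itself are my assumptions for illustration, not necessarily what he uses:

    # Two Squeak images, each serving Seaside on its own local port.
    # The route tag is what lets the balancer pin a client to one image.
    <Proxy balancer://seaside>
        BalancerMember http://127.0.0.1:9091 route=img1
        BalancerMember http://127.0.0.1:9092 route=img2
    </Proxy>

    # "stickysession" names the cookie or request parameter whose value
    # must carry the route (e.g. ending in ".img1") for pinning to work.
    ProxyPass /seaside balancer://seaside stickysession=BALANCEID

Getting the route into the session identifier is a detail I’m glossing over here; the point is just that the balancer, not the application, keeps each client glued to its image.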

He goes into detail describing his maintenance tools (shell scripts), since there are some reliability issues with the software he uses, though he has no major complaints about it.

I’m really happy Ramon talked about this. He is answering the questions anyone curious about Seaside would have about the viability of deploying it in a corporate environment. Keep up the good work, Ramon.


Lisberger gives an interview

Hat tip to Tron 2.0 News for this:

Steven Lisberger, the creator/director of the movie Tron, gave an interview to IGN Entertainment. Here is Part 1 and Part 2 of the interview. They talk about what’s happening now (no, there’s no Tron sequel in the making, unfortunately), reminisce about some funny things that are Tron-related, and talk a bit about the making of the film, among other things.

In my opinion the best part is when Lisberger talks about the philosophical basis for the movie:

Lisberger: There’s a metaphor in the film which is that you try to reach your program. But forget about technology for a second. What the movie is really saying is that for each of us, there’s a higher self… a potential self. We had to kill off Clu, Flynn’s program. God, are you really going to listen to this tape?

IGN: Yeah!

Lisberger: We had to kill off Flynn’s program so that he wouldn’t run into himself when he went into cyberspace, but we’re like programs! I like to think that somewhere on some dimensional level there’s a User for me. There is a version of me that is the best person I could be. Whether I live up to that and whether I communicate with that, it’s up to me. That’s why the disc-mandala that Tron uses to communicate with Alan is a symbol of self. Mandalas are always a symbol of self! His higher self — his User — puts the information he needs to succeed on that disk. So either you believe in the Users or you don’t. Either you believe that there is a potentially great version of you and your job is to communicate with it… and what is the force or the MCP in the real world that is standing between you and the best version of yourself that you would like to be?

In some way I always knew that the movie had this basic subtext, even if I wasn’t mature enough to understand it. You could tell, because in the real world (in the movie) they added effects that looked vaguely computer-like in some scenes. There’s the conversation between Dillinger and Dr. Gibbs where Gibbs says, “[O]ur spirit remains in every program we design for this computer!” There’s the scene where Tron approaches the I/O tower, which has a kind of “temple” look to it. The tower guard makes a brief spiritual-sounding invocation, and then Tron enters and makes contact with his user, Alan, in a world where users supposedly don’t exist. For a brief moment it’s almost like watching a spiritual experience, and then it gets technical. And then there’s the final scene where Flynn, back in the real world, greets his fellow travelers in victory and says, “Greetings, programs!”

This is what I always liked about this movie. Even though the storyline was not well developed, and the real reason a lot of people went to see it was the graphics (lots of eye candy), there was an aspect of it that had some deep meaning. Programs had users, but users had a higher connection with something as well, even if it was only implied. There was always the subtle suggestion that maybe we are “programs” too in some cosmic computer. Lisberger and the interviewer do get into “The Matrix” for a bit. It had a similar theme, though it explored the theme much more deeply, and in a darker way.

I have long felt a spiritual connection to computing, so I have an affinity for the idea that “our spirit lives in every program we create”. Sometimes it doesn’t feel that way, but I try to find ways to bring it back. I feel it has something to do with why I’m here on Earth.

Incidentally, I discovered an article yesterday that fills in some more details on the movie connection between Alan Kay and Tron. The article commemorates Kay’s induction into the CRN Industry Hall of Fame last December. Most of it is about his background and accomplishments. Toward the end it says that he met his wife, Bonnie MacBird, while she was doing research for Tron. She worked as a storywriter for the movie, along with Lisberger. It was Bonnie who named the “Alan” character in the movie after Alan Kay. Neat, huh? 🙂


Got math?

M. J. McDermott talks about math education in Washington state.

She talks about and demonstrates some “reform” methods for solving arithmetic problems that have been commonly taught in the 4th and 5th grades: cluster problems (taught in the TERC books), partial products and partial quotients, and the “lattice method” of multiplication (taught in books called “Everyday Math”). She says that these “reform” math curricula discourage teaching the standard algorithms for multiplication and division (the division one being what’s called “long division”).
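
For anyone who hasn’t seen these methods, here is what partial products looks like on a small problem (my own example, not one from the video). The factors are split into tens and ones, each pair is multiplied separately, and the pieces are added up:

    47 x 36:
      40 x 30 = 1200
      40 x  6 =  240
       7 x 30 =  210
       7 x  6 =   42
                ----
                1692

The standard algorithm reaches the same 1692 with just two rows (282 and 1410) plus a sum, with the bookkeeping hidden in the carries.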

In my own life I’ve used a method similar to cluster problems to do multiplication and division in my head when I don’t have pencil and paper, or a calculator, handy. The best tool I have for mental math is my own internal multiplication table. I find the standard algorithms difficult to do in my head, though they’re fine with pencil and paper, and when I’m not limited to my head I prefer them: I wouldn’t want to write out all the sub-answers the other methods require. I remember doing math drills in elementary and junior high school; it would’ve taken me forever to finish those problems using the “reform” methods she describes. By the way, she is for teaching the standard algorithms.

McDermott also illustrates how problems are posed. Students don’t do math drills; instead they plan vacations, working with geographic data. Group work is emphasized over students working alone.

She describes how, when she returned to college about 10 years ago to get a degree in atmospheric science, she encountered traditional-age students, just out of high school, who couldn’t do the math. She said they were unable to work alone and always felt they had to check their answers against others’. They lacked basic fluency in the symbolic language of math, and they were weak at thinking logically. They lacked basic math skills such as arithmetic (which I assume means things like order of operations), algebra, and trigonometry. I’m really curious about the reason behind the problems with algebra and trig, but she doesn’t explain. I wonder how they got into college, especially the atmospheric science program, without these skills. She said they were completely dependent on calculators.

She paints a rather scary picture. It illustrates something that Justin James, a tech blogger, talked about a while back (unfortunately I can no longer find the blog entry): that math is important for developing higher-order brain functions. Even if you don’t end up using the type of math you’ve learned, you gain mental abilities through the math work that you can use in other areas of life.

I think what this presentation illustrates is that it’s easy to take traditional math skills for granted, because we don’t realize what we’ve gained from them unless we use those skills directly. If we don’t use the math directly, we tend to assume we had the resulting cognitive skills all along. That looks like a misconception we should be wary of.

One of the areas she mentioned where the skill of thinking logically is critical is computer programming. I agree. I’ve written previously about the lack of interest in enrolling in computer science in college, and about students in non-computer-science courses actively trying to get out of them when computer programming is made part of them. Would this explain part of the aversion?

Edit 2/14/07: As I investigated this video further I found that some people were calling the demonstrated “reform methods” constructivist education. In some ways they may be right, just as people made the same connection when “whole math” and “whole language” were the reforms du jour of the late 1990s. However, I suspect that what’s been tried so far is not entirely representative of this education method. Sadly, I think that constructivism has been misunderstood by its practitioners in this country.

The idea is supposed to be that the teacher is an active participant in the education of their students, and that the students are an active part of their own education as well. What I used to hear about the older attempts at constructivist reform is that they put teachers in a passive role, leaving the learning entirely up to the students. Further, the teacher was not to correct the students. The curriculum shunned rote methods; anything that smacked of memorization or repetition was excluded. Students were supposed to come up with their own answers, and teachers were not supposed to judge their work for fear of damaging their self-esteem.

Why they insisted on excluding repetition is beyond me. When children play they do things repeatedly. Repetition is part of their natural learning process. I don’t think what the educational establishment has come up with is what the creators of constructivist theory had in mind.

My sense of this teaching style is that the overall goal is to teach students how to learn, by using their innate process of experimentation and play, while at the same time teaching them the real subjects they need to learn. The teacher should be a guide to the students. Yes, the students are to explore and experiment to arrive at their own answers to problems, but the problems are posed by the teacher, giving the students a goal to achieve. If the students are coming up with answers that are off, the teacher can introduce further problems that prod them to explore the subject more, until they arrive at a correct answer. The goal is to build understanding of the subject. Yes, students build their own model of the world through this process, but it does them a disservice if that constructed model is not compatible with the real world. Further, the teacher provides some concrete examples of the subject matter, giving the students a grounding in it, before they go off and explore it.

My impression is that the goal is to help students arrive at correct answers, but the process is emphasized so that it’s the students who obtain the answers, through their own discovery, without the teacher imposing a single way to arrive at them. This creates a sense of accomplishment and self-esteem in the students, encouraging them to continue the process. I may be wrong on some points.

I think there are times when rote methods are the best way to gain a grounding in a subject. One of the things I thought was insane about “whole language” was that it banned phonics from the reading curriculum. Phonics has consistently worked well as a way to teach children to read. I think the reason it’s essential is that language is largely standardized in our modern civilization. It provides a way of conveying information that is logically consistent from the disseminator to the receiver. It’s essential that people’s understanding of language be consistent; it’s how we make connections with each other and conduct transactions. Students who don’t get this are going to be handicapped.

I think the goal of the reform efforts has been noble. The problem has been in their execution. The goal was to get the schools out of the old Industrial Age model, which assumed students were blank slates upon which the teacher would impress their knowledge. In that model the teacher was pretty much the only active participant; the students would just passively receive the information (hopefully) and then either regurgitate it later or synthesize it into something new.

The goal with constructivist methods was to create active learners and explorers. That sounds good to me. The problem seems to be that the way the method is being applied leads to students who do not fully understand the subject matter, and who end up behind students taught with the older methods. I think the old methodology provides a useful benchmark: if a new methodology leaves students behind grade level, that indicates the methodology needs to be revised and some assumptions about it need to be questioned.


I’ve had a feeling coming over me in the last week that something significant has been happening in the computer world. To some observers it may not seem like it; with the exception of stuff moving to the web, things are, to quote David Byrne, “same as it ever was.” The reason I bring this up is that I’ve been hearing more and more signs lately that computer science in academia is losing its vibrancy, at least in the U.S. It began several years ago with the end of the Y2K crisis and the dot-com bust. Enrollment in computer science at universities nationwide fell off dramatically. One article I read a few years ago said that enrollment was the lowest it had been since the 1970s.

About a week ago I found an interview with Alan Kay (AK), conducted by Stuart Feldman (SF), in ACM Queue’s Dec. 2004/Jan. 2005 issue. Kay had been getting some publicity in ’04 and ’05, since he was given the Turing Award in 2003 for his work on Smalltalk 30 years earlier. The interview was enlightening in many ways. Kay has been saying these things for the last several years; I didn’t totally understand until now why they’re important. I think it’s time people listened. I encourage you to read the article, but I will put some salient quotes below:

(AK) One could actually argue—as I sometimes do—that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects.

You could think of it as putting a low-pass filter on some of the good ideas from the ’60s and ’70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were.

So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.

SF So Smalltalk is to Shakespeare as Excel is to car crashes in the TV culture?

AK No, if you look at it really historically, Smalltalk counts as a minor Greek play that was miles ahead of what most other cultures were doing, but nowhere near what Shakespeare was able to do.

If you look at software today, through the lens of the history of engineering, it’s certainly engineering of a sort—but it’s the kind of engineering that people without the concept of the arch did. Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves.

The languages of Niklaus Wirth have spread wildly and widely because he has been one of the most conscientious documenters of languages and one of the earlier ones to do algorithmic languages using p-codes (pseudocodes)—the same kinds of things that we use. The idea of using those things has a common origin in the hardware of a machine called the Burroughs B5000 from the early 1960s, which the establishment hated.

SF Partly because there wasn’t any public information on most of it.

AK Let me beg to differ. I was there, and Burroughs actually hired college graduates to explain that machine to data-processing managers. There was an immense amount of information available. The problem was that the DP managers didn’t want to learn new ways of computing, or even how to compute. IBM realized that and Burroughs didn’t.

The reason that line lived on—even though the establishment didn’t like it—was precisely because it was almost impossible to crash it, and so the banking industry kept on buying this line of machines, starting with the B5000. Barton was one of my professors in college, and I had adapted some of the ideas on the first desktop machine that I did. Then we did a much better job of adapting the ideas at Xerox PARC (Palo Alto Research Center).

Neither Intel nor Motorola nor any other chip company understands the first thing about why that architecture was a good idea.

Just as an aside, to give you an interesting benchmark—on roughly the same system, roughly optimized the same way, a benchmark from 1979 at Xerox PARC runs only 50 times faster today. Moore’s law has given us somewhere between 40,000 and 60,000 times improvement in that time. So there’s approximately a factor of 1,000 in efficiency that has been lost by bad CPU architectures.

The myth that it doesn’t matter what your processor architecture is—that Moore’s law will take care of you—is totally false.

SF It also has something to do with why some languages succeed at certain times.

AK Yes, actually both Lisp and Smalltalk were done in by the eight-bit microprocessor—it’s not because they’re eight-bit micros, it’s because the processor architectures were bad, and they just killed the dynamic languages. Today these languages run reasonably because even though the architectures are still bad, the level 2 caches are so large that some fraction of the things that need to work, work reasonably well inside the caches; so both Lisp and Smalltalk can do their things and are viable today. But both of them are quite obsolete, of course.

I’m sure there are Lispers and Smalltalkers out there who would disagree. To clarify, Kay said here in 2003:

“Twenty years ago at PARC,” Kay says, “I thought we would be way beyond where we are now. I was dissatisfied with what we did there. The irony is that today it looks pretty good.”

Back to the ACM Queue interview:

AK A commercial hit record for teenagers doesn’t have to have any particular musical merits. I think a lot of the success of various programming languages is expeditious gap-filling. Perl is another example of filling a tiny, short-term need, and then being a real problem in the longer term. Basically, a lot of the problems that computing has had in the last 25 years comes from systems where the designers were trying to fix some short-term thing and didn’t think about whether the idea would scale if it were adopted.

It was a different culture in the ’60s and ’70s; the ARPA (Advanced Research Projects Agency) and PARC culture was basically a mathematical/scientific kind of culture and was interested in scaling, and of course, the Internet was an exercise in scaling. There are just two different worlds, and I don’t think it’s even that helpful for people from one world to complain about the other world—like people from a literary culture complaining about the majority of the world that doesn’t read for ideas. It’s futile.

I don’t spend time complaining about this stuff, because what happened in the last 20 years is quite normal, even though it was unfortunate. Once you have something that grows faster than education grows, you’re always going to get a pop culture. It’s well known that I tried to kill Smalltalk in the later ’70s. There were a few years when it was the most wonderful thing in the world. It answered needs in a more compact and beautiful way than anything that had been done before. But time moves on. As we learned more and got more ambitious about what we wanted to do, we realized that there are all kinds of things in Smalltalk that don’t scale the way they should—for instance, the reflection stuff that we had in there. It was one of the first languages to really be able to see itself, but now it is known how to do all levels of reflection much better—so we should implement that.

SF If nothing else, Lisp was carefully defined in terms of Lisp.

AK Yes, that was the big revelation to me when I was in graduate school—when I finally understood that the half page of code on the bottom of page 13 of the Lisp 1.5 manual was Lisp in itself. These were “Maxwell’s Equations of Software!” This is the whole world of programming in a few lines that I can put my hand over.

I realized that anytime I want to know what I’m doing, I can just write down the kernel of this thing in a half page and it’s not going to lose any power. In fact, it’s going to gain power by being able to reenter itself much more readily than most systems done the other way can possibly do.

All of these ideas could be part of both software engineering and computer science, but I fear—as far as I can tell—that most undergraduate degrees in computer science these days are basically Java vocational training. I’ve heard complaints from even mighty Stanford University with its illustrious faculty that basically the undergraduate computer science program is little more than Java certification.

[Make a note of this point. I’m going to refer to it later–Mark]

SF What do you think a programming language should achieve and for whom, and then what is the model that goes with that idea?

AK Even if you’re designing for professional programmers, in the end your programming language is basically a user-interface design. You will get much better results regardless of what you’re trying to do if you think of it as a user-interface design. PARC is incorrectly credited with having invented the GUI. Of course, there were GUIs in the ’60s. But I think we did do one good thing that hadn’t been done before, and that was to realize the idea of change being eternal.

I wanted to give you some pieces from the article to show that while Kay is critical of where things are today, he’s also discussing ideas for how to rejuvenate computer science.

Referring back to what Kay said about the current state of affairs in computer science programs at some universities, that they’re nothing but Java vocational training: I’ve been hearing this from several quarters. One of them is Justin James, a blogger I’ve read frequently. He recently wrote an entry called “Ripoff Educations”. Quoting from it:

I have recently been talking to someone who is in the process of studying “Computer Science” at the University of South Carolina. I put “Computer Science” in quotation marks, because as far as I can tell, they are teaching “How to Program In Java” more than Computer Science. His education is so threadbare, he did not know what Perl was for (I am not saying he had to learn it there, but to never hear of it?), and had never heard of Lisp, Scheme, or even functional programming. In sum, they are teaching how to perform certain tasks in Java, subject by subject, and completely ignoring the fundamental scientific and mathematical underpinnings of programming.

Indeed, the coursework is so blindingly rudimentary that a course that was CS 112 (Data Structures, which I had to take in the elderly language known as “C”) at my alma mater (Rutgers College) is not introduced until the student is nearly complete with their degree, CS 350! Color me amazed. Without learning the basics (and a course in trees, hashes, sorting, etc. is pretty basic in my book) until the junior year, how can someone even claim that this is a serious program? In fact, not one course in formal logic is mandated. My experience with programming is that, at the end of the day, it is work with formal logic. Yet, it is not taught.

Well, it isn’t. It is a sad joke. I would not pay a dime for this “education”, and neither should the prospective students who are interested in learning to be a good programmer. It is a real shame that this is the norm, thanks to schools slapping together a shake-n-bake program of mediocrity to attract students.

Where I went to college, texts like Knuth’s works were used. K&R was a textbook. A typical student graduated having worked with at least 5 different languages, having enough UNIX experience to call themselves a junior systems administrator, and knowing enough math to practically have minored in it without trying.

I met with a friend recently who had just dropped out of a computer science master’s program at his (and my) alma mater after one semester. It wasn’t that the subject turned out not to be to his taste, or that it was too hard; rather, it appeared that the faculty he was dealing with did not care to teach, and in one case was incompetent. While he said there were a few good faculty members doing good work there, he found that most of the faculty was disengaged from the task of teaching. Freshman enrollment in CS was down dramatically from when we were undergrads at this university.

Meanwhile, “across the way” at the same university’s Business College, things were hopping. Their CIS program has improved dramatically from what it was almost 20 years ago, and enrollment is strong. Many of the freshmen coming into the CIS program are students who probably would’ve enrolled in computer science in the past. CIS used to teach COBOL, some C, and MVS JCL. Now it has a strong Java program, with some computer science theory, but the emphasis is on hands-on experience. Further, they teach the soup-to-nuts of building web applications. The computer science program does not. It teaches Java and computer science theory, but it doesn’t appear to teach a thing about the web, except the network architecture of Ethernet. I’m not necessarily complaining here, but I think the program is mired in the past. It’s losing favor but doesn’t appear to know what to do with itself.

My friend speculated that perhaps what’s going on with other universities is that computer science departments are seeing this migration of students, and they are copying the CIS program of the Business College in an attempt to compete. Such unoriginal thinking, as Justin James points out, cheats the students, because they could get trained in Java anywhere and for much cheaper. It also cheats the discipline of computer science.

What would I do? I’d explore taking a risk. I’m not responsible for any computer science program at a university, so this will be easier said than done, I imagine. Why not try dredging up and using some excellent computer science that was done and proven in the past, but has been largely ignored or marginalized? “Ignored?” you might say. Yes, ignored.

Here is the truth in what I think is going on. Thirty years ago Dr. Edsger Dijkstra used to complain about the data processing mindset that was pervading computer science. It’s still there. Paul Murphy, a blogger at ZDNet, has complained about it frequently. This mindset has run its course. That’s a major reason the CIS program at my alma mater is beating the pants off the CS program: the technology that was part and parcel of this mindset has matured. The only reason people will come back to computer science at this point is if it blazes new trails, and even that’s relative. Computer science departments could put a major focus on dynamic languages and the curriculum would seem “new”, because those languages have been marginalized for so long. The discipline must focus less on teaching students for what’s currently out there, and more on teaching students for research, fostering entrepreneurship, and technology transfer. “This won’t be attractive. We’ll just have the same problem we have now,” you say. Not if it’s exciting and new. What’s exciting? How about developing programming languages for the web that dramatically reduce the amount of code people have to write to get something done? What about languages that help programmers express their intent clearly? What about rethinking human-computer interaction in GUIs and web interfaces, and the way information is stored and retrieved? That may not seem like computer science, but it’s not beyond technical study.

Whatever happened to dynamic programming languages? Whatever happened to teaching the mathematics and formal logic of computer science? I’m not saying they’re gone everywhere, but they seem to be fading. Computer science was originally an offshoot of math. What about rethinking the hardware architecture, delving into the von Neumann machine itself? What about studying the architectural and software decisions that went into making the OLPC laptop? A lot of scientific innovation went into it. Why not study what makes a virtual machine tick? Why not give up the traditional data processing model (or phase it out), and make these things the basis of a new computer science program? The master’s and Ph.D. programs could build from there, encouraging students to “think new thoughts” about the science of computing. The foundations for this transformation are already out there, sitting around; they’ve been there for 30 or 40 years, waiting to be used. Yes, they failed once, and I’m not saying a program should just teach those languages and technologies and say, “Go forth and use this.” I’m saying use them as a basis for building something new. Computer science, now more than ever, needs to nurture the bringing forth of new ideas. I’ve already talked about computer science luminaries who have worked hard to innovate in academia. All it would take is looking at what they’ve published.

This might, however, have greater success at the top universities in the country. Prestige gets people to pay attention to what you’re doing. Computer science used to be okay with following some trends. It’s time for it to lead.

Even so, putting more of a focus on dynamic programming languages in the curriculum, for example, is actually not too risky at this point, even for a program concerned with producing graduates who can go out and find jobs. These languages are coming into popular use; Python and Ruby are examples, and even Java and .Net are getting into the game. Sometime this year or next, Microsoft is scheduled to ship its Orcas release, which upgrades C# and VB.Net with dynamic language features; I’ve talked about this in a past post. Groovy, a FOSS dynamic language written in Java, was just released last month.

I would like to see computer science departments start by reconsidering what computers are to society and how they can best make their contribution to it.

I’ve referred to Lisa Rein’s Tour Of Alan Kay’s Etech 2003 Presentation before, but if you haven’t seen it (the videos of the presentation on the site), I’d encourage you to look at it. I’ve done so a few times, and it’s inspiring. You’ll come away thinking of computing in a whole new way. This, in my opinion, is what many computer science programs have been missing.

Edit 2/14/07: Early in the first excerpt of the interview, Alan Kay says that the commercialization of computing had a retrograde effect on computer science. I think this is true; however, I have mixed feelings about it. Had commercialization not happened when it did, I might not have entered the computer field at all. I got caught up in the “pop culture” he talks about while I was growing up. So on the one hand I’m sorry to see what this pop culture has done to computer science, but on the other I see it as a good thing, because it’s what got me interested in all of this in the first place.

Edit 2/16/07: I took out some of the quoting I did of the ACM Queue article, since I felt it distracted from my main argument. What I took out is in the ACM article. Follow the link.


A few weeks back I went to a local launch party for Groovy, a FOSS dynamic programming language written in Java. The language was designed to be familiar to Java programmers. In fact, you can write straight, strongly-typed Java code in the Groovy environment, so you can use as little or as much of the dynamic typing as you like. The advantage of the dynamic typing features is brevity of code. As I’ve said before, it’s easier for developers to express their intent in code without it getting lost in a bunch of type declarations. The code you write is more powerful: you don’t have to spell out every single step, because the language can figure some of them out for you. This saves you time and helps you write code that’s easier to read.
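
As a small illustration of the kind of brevity I mean (my own toy example, not one from the party):

    // Print a word count for each line of a file. No type declarations,
    // no stream boilerplate; the closure passed to eachLine does the work.
    new File("notes.txt").eachLine { line ->
        println "${line.split(/\s+/).size()} words: ${line}"
    }

The equivalent Java of the day would need a BufferedReader, an explicit loop, and try/finally cleanup.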

You can either run Groovy code through an interpreter or compile it into classes that can be packed into JAR files and run on a JVM.
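
Concretely, assuming a standard Groovy installation, the two ways of running look like this:

    groovy Hello.groovy     # run the script directly through the interpreter
    groovyc Hello.groovy    # compile to Hello.class, ready to be jarred up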

One of the exciting parts of Groovy is Grails, the Rails-style framework for Groovy. I’m not that familiar with the Java web frameworks out there, but the demo showed that Grails works with Hibernate and Spring. The presenter pointed out that as of the first final release of Grails it’s not compatible with Struts, which, from what I heard at the meeting, is a major minus, because lots of places use that web framework.

Grails offers pretty much the same ease with which Ruby on Rails (RoR) creates web sites, but it does so in a way Java developers would find familiar. For one thing, rather than defining the data model in the database first, as with RoR, you define the data model in Groovy code, and Grails sets up the database tables. All the same, Grails sets up default web views for you, just as RoR does.
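
For example, a domain class might look something like this (a hypothetical sketch; the class and field names are mine):

    // A Grails domain class. Grails, via Hibernate, creates and maps the
    // corresponding database table; no SQL or XML mapping is written by hand.
    class Book {
        String title
        String author
        Date releaseDate

        static constraints = {
            title(blank: false)
            author(blank: false)
        }
    }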

As with RoR, Grails uses convention over configuration. All files that make up a Grails site go in certain prescribed directories, and this goes for any Hibernate and Spring files used in a Grails web app as well. It’s not as if the framework picks a directory on a certain hard drive and sets it in stone; you can set up your own application directory. But once you set up the main directory, every folder under it is prescribed by the framework.
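
From what I saw, the layout looks roughly like this (reconstructed from memory, so treat it as approximate):

    my-app/
        grails-app/
            conf/          configuration (data sources and the like)
            controllers/   controller classes
            domain/        domain classes, like Book above
            services/      service classes
            views/         GSP view templates
        web-app/           static resources (images, CSS, JavaScript)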

On the minus side, Groovy doesn’t make it easy to set up your own domain-specific language (DSL). It has some nice built-in DSLs for creating a web interface, opening files, executing shell commands, and dealing with XML, among other things (that’s just what I saw in a quick demo), but adding methods to standard types is clumsy. That’s a disappointment for those who like the ability to define their own DSL in a dynamic language. The mechanism works the same way “extension methods” will in the Orcas release of .Net: it adds the same method to every class that is used within a certain scope. This is unfortunate because you may want to add a method to just one standard type, but you’ll end up adding the method to every type used within the defined scope. I guess it’s a compromise. The way the presenter explained it, since Groovy, and Java for that matter, operates in a multi-threaded environment, this was the only way it was possible. He explained that Ruby does not have this problem because each instance of a web application runs in its own thread, making it easier to add methods to standard types. I’m not sure why this makes a difference, but I’ll take his word for it.
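
If I followed the presenter correctly, the mechanism he described corresponds to Groovy’s “category” feature. A rough sketch of how it looks (my own example, not one from the talk):

    // The static method's first parameter says which type receives the new
    // method; the method is visible only inside the use(...) block below.
    class ShoutCategory {
        static String shout(String self) {
            return self.toUpperCase() + "!"
        }
    }

    use (ShoutCategory) {
        println "hello".shout()   // prints HELLO!
    }

At least in this form, the added method disappears outside the use block, which matches the scope-based behavior described at the meeting.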

Overall I enjoyed the presentation. I’m happy to see that dynamic languages are “storming another beachhead” in the programming world.

aboutGroovy is an educational site for Groovy. You can view tutorials and preview some of the Groovy and Grails books that are out there.
