You come to nature with all her theories, and she knocks them all flat. -Renoir

Computer science is a misnomer. There is no series of steps to follow that will lead to a great application. There is no methodology that guarantees success. The science that we do have is what I would call "low level". We can prove that the running time of quicksort is n log n in the average case. We can find the maximum of a set of numbers in linear time. But I can't prove that my web application is not going to crash.
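To make that "low level" point concrete, here's the kind of thing we actually can prove: finding the maximum of n numbers takes a single pass, so it's provably linear time. A minimal sketch in Python (the function name is my own):

```python
def find_max(numbers):
    """Return the largest element, scanning the list exactly once."""
    if not numbers:
        raise ValueError("empty input")
    best = numbers[0]
    for x in numbers[1:]:  # n - 1 comparisons total: provably O(n)
        if x > best:
            best = x
    return best

print(find_max([3, 1, 4, 1, 5, 9, 2, 6]))  # 9
```

That's the whole proof, really: one comparison per remaining element, no more. Nothing remotely like that exists for "this web application won't crash."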

What makes the software engineering discipline so different from the other engineering disciplines is that absolutely all of the work happens in the design phase. When designing a building, coming up with the blueprints is just the first step. But with programming, the blueprints are everything. Naturally, I mean extremely detailed blueprints. That's really what a program is though, right? It's just a set of detailed instructions that the computer executes for us. When we're creating a building, the building is done when it's built. When creating an application, it's done when we've come up with all of the instructions needed. It's one more step removed. That's why, once we've created one copy of an application, it's trivial to create n copies.

Anyway, we slowly learn that there is no magic formula to write an effective application. There are guidelines, principles, best practices, but in the end, following all the rules doesn't guarantee a masterpiece.

This sounds a lot like the difficulties faced when trying to create a bestselling novel. There are guidelines and patterns for plot development that have worked well. But in the end, you can follow all the best practices that are out there and still not come up with a masterpiece.

I know I'm making it sound like both of these are very hit and miss. This is not true. Great authors consistently produce great works. Great programmers also consistently produce great applications. While it's hard to define what makes the works great, it's usually much easier to recognize. What's more, most programmers can agree on who the great programmers are, yet when asked to quantify why, they struggle to come up with an answer. We can all recognize it, but measuring it is difficult.

Universities, however, teach programming from a much more scientific point of view. We're taught the fundamentals: big O notation, operating system concepts, basic software engineering and process, and things along those lines. Yet it's rare to find a university (at least I haven't come across one) that studies the great masterpieces of programming. Or one that takes the worst of programming and criticizes it.

I think that this scientific approach to teaching programming is fundamentally wrong. We should be taking the hint from the liberal arts schools. They all study the great works of the past, which are endlessly discussed, analyzed, and criticized in great detail. Various writing techniques are dissected and emulated. Students are encouraged to stray from the path, to explore.

Maybe the reason universities take this approach is that most past software has been closed source. It's only fairly recently that open source has exploded in a big way. But that's slowly changing. More and more open source software is getting written, and at a faster and faster pace.

But what I'm seeing happen is that we don't have to wait for the universities. We as programmers are coming up with our own ways to learn more effectively. After all, we control what software gets written, so a lot of it is geared towards making our own lives easier. This also includes learning from others, and coming up with better ways of sharing and connecting information. It's this trend, which seems to indicate that we're getting better faster, that leaves me hopeful for the future.