Global Optimization

December 30, 2017

The software development process itself can be described as a global optimization problem. When we have questions about where we should be spending our time, or what practices we should follow, we need only to think about where we would realize the most value to determine what to focus on.

This is similar to how we would optimize part of a system, or a piece of code. First, identify the slowest part, the bottleneck. Next, determine the best course of action, and then implement it. That could mean using a different algorithm, trading time for space, or making a different set of trade-offs to realize the improvement.
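As a concrete sketch of the time-for-space trade, consider caching the results of an expensive pure function. The Fibonacci function here is just a hypothetical stand-in for whatever hot spot profiling actually turns up:

```python
# Trading space for time: memoize an expensive pure function.
# fib is a hypothetical stand-in for a bottleneck found by profiling.
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    # Naive recursion would be exponential; the cache makes it linear.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(80))  # completes instantly despite the naive definition
```

The point isn't the cache itself, but the process: measure first, then pick the trade-off that buys the most.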

Taking a step back and applying this to software development itself, we first need to find where the biggest gap to our goals lies, and work on addressing that in particular. At this point, looking to best practices or various tactics could prove useful for the task at hand. But the particular circumstances usually matter more than any general prescription.

This is a simple perspective to take, but one that often isn't taken. Typically time is our resource, and thinking about the problem from an optimization lens can be an effective way to determine the best way to spend it.

Handmade Hero

September 07, 2016

I've only recently been exposed to Handmade Hero. It's a project to create a complete game, recorded entirely on video, with no libraries. It's written in C++, although most of it is straight C with a very small subset of C++.

It's clear that the programmer, Casey Muratori, has written a lot of code and is very comfortable with it. He demonstrates simple yet effective ways of solving problems that are typically considered dealbreakers when considering C as a language. Memory management, data structures, and overall program management are discussed in great detail, with solutions presented that effectively manage the complexity at hand.

Another aspect that resonated with me is what he calls "compression oriented" development. It's a refreshingly straightforward way to approach writing code: write the usage code first, and then "compress" it when it gets repeated. He describes this as the first instinct when writing code, and urges that it should stay that way. Too often, programmers try to create the abstractions first, and then calling into the API becomes awkward when there is a mismatch.
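A minimal sketch of the idea (the clamping code is a hypothetical example, not from the series): write the usage code straight through, and only pull out a helper once the repetition appears.

```python
# First pass: write the usage code directly, duplication and all.
width, height = 640, 480
x = max(0, min(100, width - 1))
y = max(0, min(200, height - 1))

# Second pass: the clamping pattern repeated, so "compress" it
# into a helper whose shape is dictated by the real call sites.
def clamp(value, low, high):
    return max(low, min(value, high))

x = clamp(100, 0, width - 1)
y = clamp(200, 0, height - 1)
```

Because the API falls out of real call sites, there's no mismatch to work around later.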

Although the emphasis is on game programming, I found it particularly striking how broadly applicable a lot of the insights were. Yes, a lot of it is game specific, but the techniques used to solve most of the problems encountered can be adapted to many other domains. Performance is discussed as well, at a level that is atypical these days outside of focused use cases. I've found myself enjoying and learning a lot from these videos, and highly recommend them.

Vintage Tech

March 16, 2015
Weak things must boast of being new, like so many German philosophies. But strong things can boast of being old. -- G.K. Chesterton (@GKCDaily) December 7, 2012
Obviously Chesterton was talking about software; scholars are divided as to whether he was talking about lisp or Debian Stable. -- Phil Hagelberg (@technomancy) December 7, 2012

With respect to technology, things can move fast. It can be hard to sift through what's valuable, and what should be ignored. But there are some things that have not changed much. I think it's worth taking a step back and reviewing these, as it's likely that these are the "classics" of programming.

I just have to start with Lisp. It's been around since 1958, and yet it remains not only relevant, but continues to push the forefront of what's possible. The funny thing is that every time I look at it, it gets better.

Lisp is the most important idea in computer science. -- Alan Kay

Arguably, it's simultaneously the simplest and the most powerful programming language. In fact, its power is driven by its simplicity. Naturally, I'm referring to its consistent, uniform syntax, which engenders macros. Admittedly, new macros are rarely called for, but when they are necessary and implemented tastefully, it is incredibly difficult to imagine a better solution.

The greatest single programming language ever designed. -- Alan Kay (about Lisp)

It's not just that Lisp has survived; it has thrived in recent years under new forms. Clojure has taken Lisp and altered it in subtle but crucial ways. Not only has it become more approachable through small but important syntax alterations, it adds data structure advancements that, in my opinion, other languages will continue to emulate for years to come. Its persistent data structures simultaneously provide thread safety and facilitate a natural functional style. Either of these is beneficial; having both is transformative.
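A toy sketch of the structural sharing behind persistent data structures (Clojure actually uses wide trees, not linked lists, but the principle is the same):

```python
# Persistent singly linked list: "adding" builds a new head node that
# shares the entire old tail. Old versions remain valid and immutable,
# which is what makes them safe across threads and natural to use
# functionally.
from typing import NamedTuple, Optional

class Node(NamedTuple):
    value: int
    rest: Optional["Node"]

def push(lst, value):
    return Node(value, lst)  # the old list is never modified

def to_list(lst):
    out = []
    while lst is not None:
        out.append(lst.value)
        lst = lst.rest
    return out

v1 = push(push(None, 1), 2)  # the list (2 1)
v2 = push(v1, 3)             # the list (3 2 1), sharing all of v1
```

Both versions coexist: v2 reuses v1's nodes rather than copying them.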

You can reach a point with Lisp where, between the conceptual simplicity, the large libraries, and the customization of macros, you are able to write only code that matters. And, once there, you are able to achieve a very high degree of focus, such as you would when playing Go, or playing a musical instrument, or meditating. And then, as with those activities, there can be a feeling of elation that accompanies that mental state of focus. -- Rich Hickey

Having this level of direct feedback is hard to describe to anyone that isn't used to it.

Another item worthy of mention is SICP (Structure and Interpretation of Computer Programs). Amazingly, it goes from zero to forking time in just a few chapters, but there's one particular item that continues to blow my mind: being able to represent a list data structure with nothing but closures. This isn't a major topic in the book, but I think it's worth highlighting because it's so profound.

For those who aren't used to this, lists are often represented recursively in functional languages as either an empty list, or a value combined with a list. cons is typically the name of the function that combines a value with the rest of the list; car retrieves the first element, and cdr the remaining elements.

It turns out that this can be modelled using only functions; no data type facilities are necessary. This is known as "Church encoding". Here's the Scheme implementation:

(define (cons x y) (lambda (m) (m x y)))
(define (car z) (z (lambda (p q) p)))
(define (cdr z) (z (lambda (p q) q)))
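The same trick carries over to any language with closures; here's a rendering of the Scheme above in Python:

```python
# Church-encoded pairs: a pair is just a closure remembering x and y.
def cons(x, y):
    return lambda m: m(x, y)

def car(z):
    # Applying the pair to a selector extracts a component.
    return z(lambda p, q: p)

def cdr(z):
    return z(lambda p, q: q)

pair = cons(1, cons(2, None))  # the list (1 2), terminated by None
```

No class, struct, or tuple appears anywhere: the data structure is entirely made of functions.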

I'm not trying to argue that lisp is "the one true way". However, I think it would be foolish to ignore any programming technology that has persisted for so many years. We'll see what the next few dozen years will hold.

Software Quality

February 03, 2014
First intention, then enlightenment. - Buddhist maxim

With software, quality is paramount. When quality begins to suffer, everything else follows suit. We see higher defect rates, longer delays to implement new features, and generally speaking, a more fragile code base. It also leads to a broken window effect, where modifications and updates will tend to be poor in quality as well.

But what does it mean to say that a piece of software has high quality? I think it means the code should be easy to maintain. And what does it mean to be maintainable? Maintainable code has the property that its intent is clear. Meaning, it's easy to understand what the code is doing, what its goal is, and why it's there. Quality software has additional characteristics as well, but I believe that most of these fall out of its maintainability. One litmus test is to read a piece of code and think about how you would add some additional features to it. If it's clear, then that means the code itself is clear.

A different way of phrasing this is to say that high quality code doesn't violate the principle of least surprise. Code should behave like it looks. When it's surprising, that's a sign that it's not as clear as it should be. A corollary is that two pieces of code that look the same should behave similarly. If they look different, they should behave differently.

So, how do we go about writing code that's high in quality? In my opinion, the best way to improve software quality is simply to care about it. Great code is usually difficult to get right, and it often takes a few failed attempts before a cleaner design can emerge. But these epiphanies will only appear after deliberate thought. And these thoughts will only come to us if we care about improvement in the first place.

Certainly, great code emerges from great effort. There are differing opinions on how to get there, but asking the right questions is the crucial part. It's the intention to improve that leads to enlightenment.

Usually, a problem can be solved in multiple ways, each an effective solution. The same is true of programming languages: many problems can be solved with multiple languages, and in fact, in theory, any Turing-complete language can solve the same set of problems.

Nevertheless, we tend to stick to the language we are most comfortable with, whether that's the first language we learned or simply the one we currently work with most. Learning new languages requires investment, not just in the language itself, but in its libraries, idioms, and environments as well.

When we choose a solution to a problem, the choice can usually be boiled down to some combination of what I'm calling the language itself, the libraries, and the environment.

Language

Language here is meant to be the expression of the solution to problems. It is the syntax, but more importantly the vocabulary of concepts that can be used as mechanisms to construct solutions. In the general sense, this is the most crucial, because like with human language, it constrains the concepts that are available for solutions.

Libraries

These are the existing bodies of code that are available to be readily consumed. Most languages have a suitable standard library that covers the necessities. Naturally, the necessities vary a bit based on the audience and the problem, but certain operations are so ubiquitous that they are taken for granted. Beyond that, the breadth and depth of the libraries around a language can vary tremendously.

Environment

By environment, I refer to the tooling surrounding the language and libraries. This includes any IDEs that are used, but more importantly, it's the ecosystem supporting programmers' daily operations: debugging, build and package management, deployment, runtime characteristics, and many others.


The decision as to whether a language can be used involves weighing the combination of these three elements. We could have the perfect language, but it won't be picked up if there aren't libraries to support day to day operations. And if we have a great language and even better library support, it may be dismissed if it can't run in our production environment. Note that the opposite can also be true. Sometimes a library is selected because of what it can do and the language and environment simply come along for the ride.

There are social elements to consider as well. If our team is particularly well versed in a certain area, chances are good that we won't veer too far from our domain of expertise. Additionally, if some required knowledge is highly esoteric, that can play a major role in the decision too.

What's interesting to me is that it rarely comes out that the decision is merely the sum of its parts. There's a certain minimum score required in each area. The minimum varies based on context, but nevertheless, a champion in one but a flop in the others will most likely not be chosen.

Although in the abstract there are many factors to consider, in practice a select few typically emerge as the important players. It's important to identify which of these are significant early on, and focus on those when making our choices.