2013-06-11

Teaching a Developer to Fish

I write a lot about development philosophy here, and very little about technique. There are reasons for this, and I'd like to explain.

In my experience, often what separates an easy problem from an intractable one is method and mindset. How you approach a problem tends to be more important than the implementation you end up devising to solve it.

Let's say you're given the task of designing a recommendation engine: "people like you were interested in X, Y, and Z." Clearly this is an algorithmic problem, and a relatively difficult one at that. How do you solve it?

The algorithm itself isn't significant; as a developer, the algorithm is your output. The process you use to achieve the desired output is what determines how successful you'll be. I could talk about an algorithm I wrote, but that's giving a man a fish. I'd much rather teach a man to fish.

So how do you fish, as it were, for the perfect algorithm? You follow solid practices, you iterate, and you measure. That means you start with a couple of prototypes, you measure the results, and you whittle down the candidate solutions until a single best candidate remains; then you refine it until it's as good as it can get. Then you deploy it to production, you continue to measure, and you continue to refine it. If you've structured the code well, you can even A/B test multiple candidate algorithms in production and compare the results.
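
To make that last step a little more concrete without handing over the whole fish, here's a minimal sketch in Python of how users might be bucketed into candidate algorithms. The algorithm functions, the logger name, and the bucketing scheme are hypothetical stand-ins, not anything from a real system:

    import hashlib
    import logging

    logger = logging.getLogger("recommendations.abtest")

    # Hypothetical candidates; in practice these would be whatever
    # implementations survived the prototyping phase.
    def recommend_collaborative(user_id):
        return ["X", "Y", "Z"]

    def recommend_content_based(user_id):
        return ["A", "B", "C"]

    CANDIDATES = {
        "collaborative": recommend_collaborative,
        "content_based": recommend_content_based,
    }

    def assign_bucket(user_id):
        # Deterministic bucketing: the same user always gets the same
        # algorithm, so each variant's measurements stay clean.
        digest = hashlib.md5(str(user_id).encode("utf-8")).hexdigest()
        names = sorted(CANDIDATES)
        return names[int(digest, 16) % len(names)]

    def recommend(user_id):
        bucket = assign_bucket(user_id)
        results = CANDIDATES[bucket](user_id)
        # Record which variant served which results, so click-through (or
        # whatever metric you're measuring) can be compared per variant.
        logger.info("variant=%s user=%s results=%s", bucket, user_id, results)
        return results

The details will vary, but the point is that variant assignment and measurement are part of the code from the start, not something bolted on afterward.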

How do you fish for a fix to a defect? You follow solid practices, you iterate, and you measure. You start with visual inspection: checking for code quality and doing light refactoring to simplify the code, eliminate points of failure, and narrow down the possibilities. Often this alone will bring the root cause to the surface quickly, or even solve the defect outright. If it doesn't, you add logging and watch the results as you recreate the error in different ways, to assess the boundaries of the defect. If it's an edge case, what exactly defines the "edge" that's affected? What happens during each step of execution? Which code is executing and which code isn't? What parameters are being passed around?
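
As an illustration of the kind of logging I mean (the function, the logger name, and the Coupon object are all made up for the example), the statements should answer those questions directly: what came in, which branch ran, and what went out:

    import logging
    from collections import namedtuple

    logger = logging.getLogger("billing.discounts")

    # A stand-in for whatever domain object the real code uses.
    Coupon = namedtuple("Coupon", ["code", "rate", "expired", "expires_on"])

    def apply_discount(order_total, coupon):
        # Log the inputs first: recreating the defect in different ways
        # shows exactly which parameters define the "edge" that's affected.
        logger.debug("apply_discount: order_total=%r coupon=%r",
                     order_total, coupon)
        if coupon is None:
            logger.debug("no coupon supplied; returning total unchanged")
            return order_total
        if coupon.expired:
            logger.debug("coupon %s expired on %s; skipping discount",
                         coupon.code, coupon.expires_on)
            return order_total
        discounted = order_total * (1 - coupon.rate)
        logger.debug("coupon %s applied: %r -> %r",
                     coupon.code, order_total, discounted)
        return discounted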

In my experience, logging tends to be a far more effective debugging tool than a step-wise debugger. With a strong logging framework, you can leave your logging statements in place in production with negligible performance impact (as long as debug logging is disabled), and fine-grained controls let you turn up verbosity for just the code you're inspecting, without turning all logging on and destroying the signal-to-noise ratio of your logging output.
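
What that looks like depends on your framework; as a small sketch using Python's standard logging module (the logger name matches the hypothetical example above), the default level stays at INFO in production, and you raise one specific logger to DEBUG only while you're investigating:

    import logging

    # Production default: INFO and above. Debug-level calls are filtered on
    # the level check, before any message formatting happens, so leaving
    # them in the code costs almost nothing.
    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s %(name)s %(levelname)s %(message)s",
    )

    # Turn up verbosity only for the code under inspection; everything else
    # stays quiet, preserving the signal-to-noise ratio of the output.
    logging.getLogger("billing.discounts").setLevel(logging.DEBUG)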

You follow solid practices, you iterate, and you measure. If you use the right process, with the right mindset, you'll wind up at the right solution.

That's why I tend to wax philosophical instead of writing about concrete solutions I've implemented. Chances are I wrote the solution to my problem, not your problem; and besides, I'd much rather teach a man to fish than give a man a fish.
