June 5, 2019

Debugger usage as a bad smell?


A lot of the time development is not straightforward: you hit a wall and don't know what else to do to fix a problem. In those moments I've seen many developers do one of the following things:
  1. Start the debugger right away and go step by step into the problem
  2. Sketch all possible scenarios that would lead to the problem, then println-debug each scenario (see the sketch right after this list)
  3. Take a walk for 10 mins to think about the problem
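
For option 2, the idea looks roughly like the sketch below, a minimal Java example with a made-up parseAmount helper and made-up scenarios: enumerate the paths that could produce the bad result, print which one actually runs for the failing input, and only then decide where to fix.

    public class ScenarioDebug {
        // Made-up helper used only to illustrate println-debugging each scenario.
        static int parseAmount(String raw) {
            if (raw == null || raw.isBlank()) {
                System.out.println("scenario A: empty input, defaulting to 0");
                return 0;
            }
            if (raw.trim().startsWith("-")) {
                System.out.println("scenario B: negative amount: " + raw);
                return Integer.parseInt(raw.trim());
            }
            System.out.println("scenario C: plain parse: " + raw);
            return Integer.parseInt(raw.trim());
        }

        public static void main(String[] args) {
            parseAmount("  42 ");  // the failing input: which scenario fires?
        }
    }
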
While all of these approaches have worked for me, over time I've come to reach for the debugger less and less. The reason can be summarized as:

   "By using a debugger, I'm always avoiding to first understand the problem, but instead patching it with a single-scenario fix"

In the past I used debuggers a lot, firing one up as soon as a problem arose. But then a pattern started to show up: fixes were coming back with other problems that the first fix didn't cover.

It took a while to understand this and slowly move away from debuggers. It all started when some production applications began to fail and we had no way to debug in that environment; all we had were the logs.

Reading the code, checking the possible inputs and flows, creating a test case (hopefully automated), and then attempting different fixes resulted in more consistent patches, fewer round trips with customers, and a more efficient team.
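
To make that workflow concrete, a reproducing test might look like the sketch below. Everything in it is hypothetical (the BulkDiscountTest class, the inlined totalWithDiscount method, the numbers) and it assumes JUnit 5; the point is simply to pin the failing input down in an automated check before attempting any fix.

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import org.junit.jupiter.api.Test;

    class BulkDiscountTest {

        // In a real project this method would live in production code; a simplified
        // (hypothetical) copy is inlined here so the sketch stays self-contained.
        static double totalWithDiscount(int quantity, double unitPrice) {
            double total = quantity * unitPrice;
            return quantity >= 100 ? total * 0.9 : total;  // 10% bulk discount
        }

        @Test
        void bulkDiscountIsAppliedExactlyOnce() {
            // Input reproduced from the failing report: 120 units at 9.99
            // should total 1078.92 after a single 10% discount.
            assertEquals(1078.92, totalWithDiscount(120, 9.99), 0.01);
        }
    }

Once a test like this fails reliably, every attempted fix can be judged against it instead of against a one-off debugging session.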

So, whenever someone starts fixing a bug, my first instinct is to create a test case that reproduces the problem before reaching for the debugger gun.