“TDD Is Not a Silver Bullet”

As a writer, I’ve left a few incomplete efforts by the wayside, promising I would someday return and complete them. The one I feel most guilty about is an interview with a promising young developer named Tim.

[Image: it’s a gold bullet... get it?]

In 2006, I helped train and coach a development team in TDD at a large corporation’s headquarters in the U.S. In fact, the training and coaching were separated by a month or so. I initially worked with a cool manager who brought me in to run training, but by the time I returned for the follow-up coaching, that manager had departed. In his place was a humorless, dictatorial manager. While working with the team, I focused primarily on helping the individuals see how TDD might help them build quality software. Unfortunately, the temperament of the new manager meant I departed with few hopes for the success of agile or TDD at the team level.

When pairing with Tim, I could tell he was very sharp but also very skeptical and resistant to TDD. Two years later, I received an email from him. He had some interesting things to say. I asked him some questions on TDD, with the intent of shaping his answers into an interview I would publish. He asked that I not mention the company at the time. Well, four years later, I still think these are wonderful answers, and so I plan on publishing them across a few blog entries. (Tim–sorry I took so long!)

Here’s one of the questions and answers from Tim.

Q #5. What did you finally see or recognize in TDD that you initially couldn’t see?

A.  I have somewhat of a loaded answer to this question.  I’ll preface with this… I didn’t go out and research TDD on my own.  I had never heard of it and didn’t know what it meant.  I wasn’t chomping at the bit to change the way I had been writing code for the 5+ years before that.  It was directed from the top down in my organization, and my team was a test bed, so to speak.  Right off the bat, I had preconceived notions as to why I had to learn this approach–but why should I?  So I had no expectations as to what I was supposed to see other than tangible things that management probably thought were the only byproduct of TDD (i.e. JUnit/HttpUnit test cases).

So I can only answer your question with what I have found with using the TDD approach.  It’s rather simple, and yet not so simple.  I have to bullet them:

  • I have found myself writing cleaner, more efficient, maintainable, and better “architected” code.
  • I have a better understanding of topics, from ones as simple and innate as the Java specs to more complex ones such as patterns, writing frameworks, and abstraction.
  • Some would laugh, but I never followed the whole “code to the interface, not the implementation” statement.  I cannot imagine not following this now.
  • My skills with the editor I use have become extremely advanced.  So much so that I would challenge anyone who claims TDD will slow you down: I can write the same thing simpler, faster, and tested.
  • I automatically have a spec/documentation (the test cases) since that’s where I start.  Any developer who says they keep up with documentation is flat out lying!
  • I have more confidence.
  • I have the feeling of “getting it”.
  • And I have more fun.

I can admit that I didn’t fully understand TDD until a couple of years ago. So if you do the math, it took about 2 years to really sink in, but I know I am better now than I was then (and I thought I was pretty damn good).  I can only surmise that in 2 more years, I will feel the same compared to now.

By no means am I saying that I can solve any and every development endeavor and never get frustrated or challenged, all because of TDD. Development is challenging. If I didn’t get challenged, then I would probably really hate my job. I could do without the frustration, but that goes with the territory… and builds character.  I feel like I have to say this because (and I think I mentioned it before) some developers I talk to think that TDD is a silver bullet to solve these issues just because you have some test cases and a build server. As you can see, I feel it’s much more.

One thing that stands out for me is Tim’s notion that he has “more fun.” More on that in an upcoming post.

A Smackdown Tool for Overeager TDDers

[Image: smackdown! Source: https://commons.wikimedia.org/wiki/File:Cross_rhodes_on_gabriel.jpg]

I’ve always prefaced my first test-driven development (TDD) exercises by saying something like, “Make sure you write no more code than necessary to pass your test. Don’t put in data structures you don’t need, for example.” This pleading typically comes on the heels of a short demo where I’ve mentioned the word incremental numerous times.

But most people don’t listen well, and do instead what they’ve been habituated to do.

With students in shu mode, it’s OK for instructors to be emphatic and dogmatic, smacking students upside the head when they break the rules for an exercise. It’s impossible to properly learn TDD if you don’t follow the sufficiency rule, whether you break it deliberately or not. Trouble is, it’s tough for me to smack the heads of a half-dozen pairs all at once, and some people tend to call in HR when you hit them.

Incrementalism is such an important concept that I’ve introduced a new starting exercise to give me one more opportunity to push the idea. The natural tendency of students to jump to an end solution is one of the harder habits to break (and a frequent cause of students’ negative first reaction when they actually try TDD).

I present a meaty first example (latest: the Soundex algorithm) where all the tests are marked as ignored or disabled, a great idea I learned from James Grenning. In Java, the students un-@Ignore the tests one by one, simply getting each to pass in turn. The few required instructions are in the test file, meaning students can be hitting this exercise about two minutes after class begins.

Problem is, students have a hard time following the rules, and almost always implement too much. As I walk around, I catch them, but often a little too late. Telling them that they need to scrap their code and back up isn’t what they want to hear.

So, I built a custom test-runner that will instead fail their tests if they code too much, acting as a virtual head-smacking Jeff. (I built a similar tool for C++ that I’ve used successfully in a couple C++ classes.)

Here’s the (hastily built) code:

import org.junit.*;
import org.junit.internal.*;
import org.junit.internal.runners.model.*;
import org.junit.runner.*;
import org.junit.runner.notification.*;
import org.junit.runners.*;
import org.junit.runners.model.*;

public class IncrementalRunner extends BlockJUnit4ClassRunner {

   public IncrementalRunner(Class<?> klass) 
         throws InitializationError {
      super(klass);
   }

   @Override
   protected void runChild(
         FrameworkMethod method, RunNotifier notifier) {
      EachTestNotifier eachNotifier = 
         derivedMakeNotifier(method, notifier);
      if (method.getAnnotation(Ignore.class) != null) {
         runIgnoredTest(method, eachNotifier);
         return;
      }

      eachNotifier.fireTestStarted();
      try {
         methodBlock(method).evaluate();
      } catch (AssumptionViolatedException e) {
         eachNotifier.addFailedAssumption(e);
      } catch (Throwable e) {
         eachNotifier.addFailure(e);
      } finally {
         eachNotifier.fireTestFinished();
      }
   }

   private void runIgnoredTest(
         FrameworkMethod method, EachTestNotifier eachNotifier) {
      eachNotifier.fireTestStarted();
      runExpectingFailure(method, eachNotifier);
      eachNotifier.fireTestFinished();
   }

   private EachTestNotifier derivedMakeNotifier(
         FrameworkMethod method, RunNotifier notifier) {
      Description description = describeChild(method);
      return new EachTestNotifier(notifier, description);
   }

   private void runExpectingFailure(
         final FrameworkMethod method, EachTestNotifier notifier) {
      if (runsSuccessfully(method)) 
         notifier.addFailure(
            new RuntimeException("You've built too much, causing " + 
                                 "this ignored test to pass."));
   }

   private boolean runsSuccessfully(final FrameworkMethod method) {
      try {
         methodBlock(method).evaluate();
         return true;
      } catch (Throwable e) {
         return false;
      }
   }
}

(Note: this code is written for JUnit 4.5 due to client version constraints.)

All the custom runner does is execute tests that are still @Ignored and expect them to fail. (I think I was forced into completely overriding runChild to add my behavior in runIgnoredTest, but I could be wrong. Please let me know if you’re aware of a simpler way.) To use the runner, you simply annotate your test class with @RunWith(IncrementalRunner.class), as in the sketch below.
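Here’s roughly what a student-facing test class looks like with the runner attached. This is only a sketch: the Soundex class and its encode() method are hypothetical stand-ins for the actual exercise code, and the real exercise contains many more tests.

import static org.junit.Assert.*;
import org.junit.*;
import org.junit.runner.RunWith;

// Sketch of a student-facing test class. The Soundex class and its
// encode() method are hypothetical stand-ins for the exercise code.
@RunWith(IncrementalRunner.class)
public class SoundexTest {
   // The first test starts out active; students make it pass.
   @Test
   public void retainsSoleLetterOfOneLetterWord() {
      assertEquals("A000", new Soundex().encode("A"));
   }

   // Later tests start out @Ignore'd. The runner executes them anyway
   // and fails them if they pass–that is, if students have built more
   // than the active tests require.
   @Ignore
   @Test
   public void replacesConsonantsWithAppropriateDigits() {
      assertEquals("A100", new Soundex().encode("Ab"));
   }
}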

To effectively use the tool, you must provide students with a complete set of tests that supply a definitive means of incrementally building a solution. For any given test, there must be a possible implementation that doesn’t cause any later test to pass. It took me a couple tries to create a good sequence for the Soundex solution.
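To make that constraint concrete, consider the two hypothetical tests in the sketch above. An implementation like the following suffices for the active one-letter-word test without accidentally passing the still-ignored consonant test:

// Hypothetical first increment: encode("A") returns "A000" as the
// active test expects, but encode("Ab") returns "Ab000" rather than
// "A100", so the ignored test keeps failing, as the runner requires.
public class Soundex {
   public String encode(String word) {
      return word + "000";
   }
}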

The tool is neither fool-proof nor clever-proof; a small bit of monkeying about and a willingness to deliberately cheat will get around it quite easily. (There are probably a half-dozen ways to defeat the mechanism: for example, students could un-ignore tests prematurely, or they could simply turn off the custom test runner.) But as long as they are not devious, the test failure from building too much gets in their face and smacks them when I’m not around.

If you choose to try this technique, please drop me a line and let me know how it went!

Infinitest

I recently re-downloaded Ben Rady’s Infinitest, having finally gotten back into some extended Java development work. Infinitest runs tests continually, so that you need not remember to proactively run them.

The last time I played with Infinitest, perhaps over two years ago, there was no Eclipse plugin. As I recall, on save, tests would run in a separate JUnit window external to Eclipse.

At the time, I had had a brief chance to sit with Ben and look at some of the code while at Agile 2007. I thought Infinitest was a cool idea. But I didn’t use it extensively, partly because I’d begun to move into lots of C++ work, but also because I had a hard time fitting it into my very ingrained test-code-refactor cycle habits. Having to deal with an external window seemed disruptive.

Infinitest today is another matter entirely. Plugins have been developed for Eclipse and IntelliJ. In Eclipse, tests run as soon as you save. But they don’t run in the JUnit view–instead, test failures are reported in the Problems window, just like compilation errors. Immediately, this means that Infinitest is not disruptive, but is instead built into the normal flow of development.

Showing test failures on save is an extremely important transition, one that will help change the outlook on what TDD means. No longer are tests an optional afterthought, something a developer might or might not remember (or choose) to run. With Infinitest, you don’t think at all about running the tests; they just run themselves. Tests are continual gating criteria for the system: If the code doesn’t pass all of its tests, Eclipse continues to point out the problem. This forces the proper mentality for the TDDer: The system is not complete unless the tests pass.

Infinitest uses a dependency tree to determine which tests must be run on save. The nice part for a die-hard TDDer like me is that my tests–my saves–will execute very quickly. This is because the die-hard TDDer knows how to design systems so that the unit tests run fast. I do wonder, however, about what this feels like for the developer on a system with lots of poor dependencies and slow-running tests. I’ll have to experiment and see how that feels.

I hate to dredge up an annoying buzzphrase, but Infinitest is really a no-brainer. I’ve only used it regularly for a few days now, but I can’t see doing TDD in Java without it.

Eclipse Auto Insert

In Eclipse, I use Ctrl-1 and Ctrl-Space so much that the wear on those three keys is noticeably greater than on the others.

By default, adding a new (JUnit 4 style) test is a matter of typing “test,” then hitting Ctrl-Space to bring up the appropriate code template. Problem is, Eclipse brings up a number of choices, so you must ensure the right one is selected (it’s not always in first position) and then press Enter. I’ve lived with this so far and didn’t think much about it until last night, when I created over 50 tests in a 90-minute session, and really resented the overhead.

I edited the template and figured checking the “Auto Insert” box would solve the problem. No dice–Eclipse presents other choices when I type “test,” such as one that just inserts the lone @Test annotation. What if I rename the “test” code template to a unique name, like “tst”? No, I get a list with two things–my template, and an option to create a method stub named “tst.” No auto insert! I tried a couple other things, such as introducing a special character so that Eclipse would think it couldn’t possibly create a method with that name. That didn’t work (I don’t recall specifically what happened).

I know auto insert works in Eclipse, because I can go into the body of a method, type “sysout,” and press Ctrl-Space; Eclipse immediately auto-completes it to “System.out.println()”. It requires no selection on my part. This tells me that auto insert works only if there’s one and only one possible selection. Unfortunately, it looks like that will never be the case outside a method but within a class (i.e., anywhere one can create a “Java type member”): any template name could also represent the name of a new method stub.

What about assigning keystrokes, then? Aha. On the General->Keys preference page, I filtered for “content assist.” The choice named simply Content Assist is the one triggered by Ctrl-Space, but another choice, Content Assist (type: Template Proposals), is by default assigned no keyboard shortcut. I assigned Ctrl-\ to content assist for template proposals. Now I just type “test” and press Ctrl-\, and I have my test method stub.

I didn’t really want another key combination, mostly because I prefer to go by IDE defaults. My pairing sessions run me through a wide variety of development environments (at last count, within the past year or so, 10 different C++ editors and 3 different Java IDEs), so it’s easiest to not create dependencies on my own personal reworking of the key mappings.

Ah well, I suppose there are a few key mappings I’ll just have to remember to configure. This is the second one I will want universally for Eclipse; the other is Rerun JUnit Test.

My wish for Eclipse is improved pairing support. The simple idea: you make your key mappings available via a URI. The first time you pair with someone, you add that URI to Eclipse. From there, a universally defined keystroke allows you to toggle through or select from the keymaps.
Comments:

Type the template name you would like to use and hit Ctrl+Space twice. Eclipse will bring up the list with choices filtered down to templates.

Auto insert works if the template you typed is the only possible choice. That may work if you delete the old-style “test” template. However, it still requires you to press Ctrl+Space twice.

Thanks Igor. I wasn’t aware of the Ctrl-Space double-press. You still have to hit Enter! So it’s Ctrl-Space-Space-Enter. I would have thought this wouldn’t be required since it’s configured as auto-insert. The double key press would be OK, but requiring a third sucks. I like my solution better, except that it’s not default behavior.

That’s why I used to live with it 😀

I don’t so much love it, but I’ve changed the templates so that the one I want (JUnit 4 in my case) is “tst”.

Works. 🙂
MB

Right, that was my first thought, and I’m just bugged by the need to look at an unnecessary popup.

You’re right, bugs me too. Would love the “syso” behavior. I’ve been messing around with Eclipse plugins recently, maybe I’ll dig around a bit.

Cheers, MB

Violating Standards in Tests

Should your test code be subject to the same standards as your production code? I believe there should be different sets of standards. I am collecting interesting idioms that are normally shunned in good production code, but are acceptable and advantageous in test code.

An obvious example is the use of macros in tools like James Grenning’s CppUTest. The testing framework (an updated version of CppUnitLite) requires programmers to use macros to identify test functions:

TEST(HelloWorld, PrintOk)
{
   printHelloWorld();
   STRCMP_EQUAL("Hello World!\n", buffer);
}

No self-respecting C++ programmer uses macros anymore to replace functions; per Scott Meyers and many others, doing so is fraught with all sorts of problems. But in CppUTest, the use of macros greatly simplifies a developer’s work by eliminating the need to manually register each new test with a suite.

Another example is the use of import static in Java. The general rule of thumb, suggested by Sun themselves, is not to overuse import static. Omitting the type information for statically scoped methods and fields can obscure understanding of the code. It’s considered appropriate only when use of the static elements is pervasive. For example, most developers faced with coding any real math do an import static on the java.lang.Math functions.
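As a quick illustration of that accepted case (a sketch; the Distance class here is just an invented example):

import static java.lang.Math.*;

public class Distance {
   // With the static import, the formula reads sqrt(...) and pow(...)
   // rather than Math.sqrt(...) and Math.pow(...).
   public static double between(double x1, double y1,
                                double x2, double y2) {
      return sqrt(pow(x2 - x1, 2) + pow(y2 - y1, 2));
   }
}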

However, I use static import frequently from my tests:

import static org.junit.Assert.*;
import org.junit.*;
import static util.StringUtil.*;

public class StringUtilCommaDelimitTest {
   @Test public void degenerate() {
      assertEquals("", commaDelimit(new String[] {}));
   }

   @Test public void oneEntry() {
      assertEquals("a", commaDelimit(new String[] {"a"}));
   }
   ...
}

Developers unquestioningly use import static for JUnit 4 assertions, as they are pervasive. But the additional use here is for the method commaDelimit, which is defined as static in the target class StringUtil. More frequently, I’ll have the test refer to (statically defined) constants on the target class. For a test reader, it becomes obvious where that referenced constant would be defined.
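For completeness, here’s a sketch of what the statically imported utility might look like. The post doesn’t show StringUtil’s implementation, so this is only an assumed version that would satisfy the two tests above:

package util;

// Hypothetical implementation of the statically imported utility; this
// is just one version that satisfies the degenerate and one-entry tests.
public class StringUtil {
   public static String commaDelimit(String[] entries) {
      StringBuilder result = new StringBuilder();
      for (int i = 0; i < entries.length; i++) {
         if (i > 0)
            result.append(",");
         result.append(entries[i]);
      }
      return result.toString();
   }
}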

What other coding standards are appropriate for tests only, and not for production code?
