Unbalanced test suites
Having spoken to many developers and development teams over the years, I've recognized several recurring patterns when it comes to software testing. For example:
- When the developers use a framework that encourages, or sometimes even forces, them to marry their code to the framework code, they only write functional tests: with a real database, making (pseudo) HTTP requests, etc., using the test tools bundled with the framework (if you're lucky, anyway). For these teams it's often too hard to write proper unit tests: it takes too much time, or is too difficult to set up.
- When the developers use a framework that enables or encourages them to write code that is decoupled from the framework, they have all these nice, often pretty abstract, units of code. Those units are easy to test. What often happens is that these teams end up writing only unit tests, and don't supply any tests or "executable specifications" proving the correctness of the behavior of the application at large.
- Almost nobody writes proper acceptance tests. That is, most tests that use a BDD framework still focus solely on technical aspects (verifying URLs, HTML elements, rows in a database, etc.), while to be truly beneficial they should be concerned with application behavior, defined in a ubiquitous language shared by the technical staff and the project's stakeholders.
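To make that last point concrete, here is a minimal sketch in Python (the domain objects and names are hypothetical, not taken from any particular project): the first test inspects technical details of the code, while the second reads like a sentence from the ubiquitous language.

```python
# Hypothetical shopping-cart domain, for illustration only.

class Product:
    def __init__(self, name, in_stock):
        self.name = name
        self.in_stock = in_stock

class Order:
    def __init__(self, product, quantity):
        self.product = product
        self.quantity = quantity

    def is_confirmed(self):
        # An order is confirmed only when enough stock is available.
        return self.product.in_stock >= self.quantity

# Technically focused: inspects internal attributes of the objects.
def test_order_object_has_expected_attributes():
    order = Order(Product("book", in_stock=5), quantity=2)
    assert order.quantity == 2
    assert order.product.name == "book"

# Behavior-focused: the name and body describe what a stakeholder cares about.
def test_a_customer_can_order_a_product_that_is_in_stock():
    a_product_in_stock = Product("book", in_stock=5)
    the_order = Order(a_product_in_stock, quantity=2)
    assert the_order.is_confirmed()
```

Both tests may pass against the same code; the difference is that only the second one documents the behavior in terms a stakeholder would recognize.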
Please note that these are just some very rough conclusions; there's nothing scientific about them. It would be interesting to do some actual research though. And probably someone has already done this.
I think I've never really encountered what you would call a "well-balanced test suite", with a reasonable number of unit tests (mainly used to support development), integration tests (proving that your code integrates well with external dependencies, including external services) and acceptance tests (proving that your application does what its stakeholders expect it to do). [Edit: in fact, former co-workers at Ibuildings - Reinier Kip and Scato Eggen - have created a project with a well-balanced test suite. It was a technically sound project, delivered within budget, well on time. If you see them, please ask them to write a blog post about their experiences.]
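As a rough illustration of how the first two layers differ, here is a sketch in Python (all names are hypothetical, an in-memory SQLite database stands in for the real external dependency, and a real suite would of course use a proper test runner):

```python
import sqlite3

# Domain logic under test (hypothetical example).
def total_price(unit_price_cents, quantity):
    return unit_price_cents * quantity

# A repository that talks to a real database.
class OrderRepository:
    def __init__(self, connection):
        self.connection = connection
        self.connection.execute(
            "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, total INTEGER)"
        )

    def save(self, total):
        cursor = self.connection.execute(
            "INSERT INTO orders (total) VALUES (?)", (total,)
        )
        return cursor.lastrowid

    def count(self):
        return self.connection.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

# Unit test: fast, no I/O, mainly supports development.
def test_total_price_multiplies_unit_price_by_quantity():
    assert total_price(250, 4) == 1000

# Integration test: proves the code actually works against a real database.
def test_saved_orders_can_be_counted():
    repository = OrderRepository(sqlite3.connect(":memory:"))
    repository.save(total_price(250, 4))
    assert repository.count() == 1
```

An acceptance test would drive the whole application through its public interface, which is hard to compress into a snippet; the point is that each layer answers a different question.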
I have also seen many developers who still didn't write any (or a sufficient number of) tests.
When I was still working as CTO at Ibuildings I wrote several articles to explain to developers that you can't get away with not writing tests anymore. Though I think it's sad that a lot of software still gets released without any tests at all, I wrote my articles in a very kind manner, so as not to offend anyone. I really don't like scaring people into testing, or pushing feelings of guilt onto them. We all need to understand that starting to test should not be a matter of peer pressure. You need the insight that it will make your life and that of your team members easier, and you need to be internally motivated to do it. Also, you need some strong arguments for it, in case your environment (co-workers, managers, customers) tries to convince you to stop "wasting time" by writing tests.
If you recognize yourself or your team in any of the things I've said above, please use the following articles as a way to revive the discussion about testing inside your team or your company. Keep the testing spirit alive. And for now and ever, may this day be remembered as Testing Awareness Day! Just kidding.
1. Why write tests?
The world of "software testing" is quite a confusing one and it takes several years to understand what's going on and how to do things "right". In no particular order:
- Developers use different words for different types of tests, but also for different types of test doubles.
- Developers are looking to achieve widely varying goals by writing tests.
- Developers have divergent views on the cost of writing and maintaining a test suite.
- Developers don't agree on when to test software (before, during, after writing the code).
If you're a programmer it's your inherent responsibility to prove that the production software you deliver:
- ... is functioning correctly (now and forever)
- ... provides the functionality (exactly, and only) that was requested
I don't think anybody would disagree with this. So, taking this as a given, I'd like to discuss the above points in this article and its sequel.
Continue reading Why write tests? »
2. Does writing tests make you go faster or slower?
The price you pay for writing automated tests consists of the following parts (and probably more):
- You need to become familiar with the test frameworks (test runners as well as tools for generating test doubles). This will initially take some time or make you slow.
- If your code is not well designed already, or if you're not producing clean code yet, writing the tests will be a pain. This will slow you down and may even lead you to give up.
- If you're not very familiar with writing tests, you'll likely end up with tests that are hard to maintain. This will slow you down during later stages of the project.
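The second point deserves an example. The sketch below (in Python, with invented names) shows why poorly designed code makes testing painful: when a dependency is created inside the method, a test cannot substitute a test double, while injecting it makes the test trivial to write.

```python
# Imagine this mailer opens a real SMTP connection.
class RealMailer:
    def send(self, to, message):
        raise RuntimeError("would talk to a real mail server")

# Hard to test: the dependency is hard-wired inside the method,
# so every test run would hit the real mail server.
class HardToTestSignup:
    def register(self, email):
        RealMailer().send(email, "Welcome!")

# Easier to test: the dependency is injected, so a test double fits.
class Signup:
    def __init__(self, mailer):
        self.mailer = mailer

    def register(self, email):
        self.mailer.send(email, "Welcome!")

# A simple hand-written test double that records what was sent.
class FakeMailer:
    def __init__(self):
        self.sent = []

    def send(self, to, message):
        self.sent.append((to, message))

def test_registering_sends_a_welcome_message():
    mailer = FakeMailer()
    Signup(mailer).register("alice@example.com")
    assert mailer.sent == [("alice@example.com", "Welcome!")]
```

The design change (constructor injection) is small, but it is exactly the kind of change that untestable code resists, which is where the slowdown comes from.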
Continue reading Does writing tests make you go faster or slower? »
3. When to write tests?
Now we need to answer one other important question that should guide you in your daily quest to improve the quality of the code you deliver: when should you write a test?
Continue reading When to write tests? »