Lesson 2

Introduction to Test Driven Development

Introducing automated tests and the concept of TDD


Introduction to Test Driven Development

Test Driven Development is all about using automated tests to drive the development of your application, rather than writing your application's code first and writing tests after. In its simplest form, the process for Test Driven Development looks like this:

  1. Write a test for functionality that you want to build
  2. Check that the test fails
  3. Implement code to satisfy the test
  4. Check that the test passes
  5. Repeat

Before we get into the specifics of Test Driven Development, let's take a few steps back and discuss what exactly automated tests are.

What are Automated Tests?

An automated test, put simply, is code that tests your code. We generally use a testing framework to help us write and run these tests, but the essence is that you write blocks of code that trigger the code you want to test, and then check the result.

Let's see what that would look like with just plain JavaScript (i.e. without a framework that helps build/run tests):

// Test that `incrementTotal()` increases the total by 1

// `SomeObject` is assumed to be defined elsewhere in the application
const myTestObject = new SomeObject();
const oldTotal = myTestObject.getTotal();

myTestObject.incrementTotal();

const newTotal = myTestObject.getTotal();

if (newTotal === oldTotal + 1) {
  console.log('test passed!');
} else {
  console.log('test failed!');
}

The above code tests whether the incrementTotal() function increases the total on SomeObject by 1. If the code is working as it should, we will see "test passed!" logged to the console; if it isn't, we will see "test failed!". This test doesn't add any functionality to our application; it is purely for checking that parts of our application are doing what they are supposed to be doing.

In practice, we don't just use vanilla JavaScript to write tests. We will be using some tools to make writing and running tests easier, but at a basic level that is the concept: writing code to test your code.
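For example, with a testing framework the same check might look something like this (the describe/it/expect syntax shown here is Jest-style and is used purely as an illustration; the specific tools we will be using are introduced later in the module):

// A sketch of the same test written with a Jest-style testing framework
// `SomeObject` is still assumed to be defined elsewhere in the application
describe('SomeObject', () => {
  it('should increase the total by 1 when incrementTotal() is called', () => {
    const myTestObject = new SomeObject();
    const oldTotal = myTestObject.getTotal();

    myTestObject.incrementTotal();

    expect(myTestObject.getTotal()).toBe(oldTotal + 1);
  });
});

The framework takes care of running the tests and reporting which ones passed or failed, so we don't need to write the if/else and console.log checks ourselves.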

Types of Tests

In this module we will be primarily focusing on two types of tests: Unit Tests and E2E Tests. When executed correctly, these will give us a high degree of confidence that our code is working as intended, but keep in mind that they are not the only types of tests.

We will be touching on some general theory and best practices for testing in this module, but the main focus will be on practicality - we won't be worrying too much about strict definitions.

Unit tests are perhaps the most common and most talked about type of test. A unit test is responsible for testing a single "unit" or chunk of code in isolation - again, what some people consider to be a single "unit" or "chunk" of code might be stricter or looser than your own definition. If we use a door as an analogy, unit tests for the door might include:

  • Pushing down on the handle should unlatch the closing mechanism
  • Turning the deadbolt switch should extend the deadbolt

The important thing to consider about unit tests is that just because individual units work, it does not mean the system as a whole will work as intended. In the example above, even if the "pushing down on the handle should unlatch the closing mechanism" unit test passes, that doesn't necessarily mean that the user will be able to open the door (maybe the door is too big for its frame and gets stuck).

While a unit test will test that one thing works in isolation, an End-to-End (E2E) test will test the system as a whole. These tests will be more end user oriented, and might cover things like:

  • If the door is locked, the user should be able to unlock it with a key and pass through the door
  • If the door is rammed whilst the closing mechanism is engaged, the door should not open

For the purpose of this module, you can consider any tests that we create for an individual component or provider/service to be a unit test, and any tests that use the browser to interact with the application user interface as a user would to be an E2E test.
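To make that distinction a little more concrete, here is a rough sketch of what each type of test might look like. The syntax (Jest-style for the unit test, Cypress-style for the E2E test), the GroceryList class, and the selectors are all just assumptions made for the sake of illustration:

// Unit test: exercises one piece of logic in isolation
it('should add a grocery item to the list', () => {
  const groceries = new GroceryList(); // hypothetical class, for illustration only
  groceries.addItem('milk');
  expect(groceries.getItems()).toContain('milk');
});

// E2E test: drives the real application in a browser, as a user would
it('should allow the user to add a grocery item', () => {
  cy.visit('/');
  cy.get('input.item-name').type('milk'); // selectors are assumptions
  cy.get('button.add-item').click();
  cy.contains('milk');
});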

Why Use Test Driven Development?

Now that we have a little bit of background on what automated tests are, let's get back to Test Driven Development specifically. There's really not that much more to explain other than the process I mentioned above:

  1. Write a test for functionality that you want to build
  2. Check that the test fails
  3. Implement code to satisfy the test
  4. Check that the test passes
  5. Repeat

Basically, if you want to write some code to implement a feature, you should write a test for that code before you start implementing the feature. It's important to consider the why of this, though, because on the surface it just seems annoying and like a lot of work. It doesn't immediately seem like an appealing option to mess around with writing a test when all you want to do is jump in and get the functionality built.
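To make the loop more concrete, here is a rough sketch of a single cycle, reusing the incrementTotal() example from earlier (again with Jest-style syntax purely for illustration):

// Step 1: write the test first, before incrementTotal() has been implemented
it('should increase the total by 1 when incrementTotal() is called', () => {
  const myTestObject = new SomeObject();
  const oldTotal = myTestObject.getTotal();

  myTestObject.incrementTotal();

  expect(myTestObject.getTotal()).toBe(oldTotal + 1);
});

// Step 2: run the test and check that it fails
// (it will, because incrementTotal() does not exist yet)

// Step 3: implement just enough application code to satisfy the test
class SomeObject {
  constructor() {
    this.total = 0;
  }
  getTotal() {
    return this.total;
  }
  incrementTotal() {
    this.total += 1;
  }
}

// Step 4: run the test again and check that it passes, then repeat for the next piece of functionality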

Let's consider some of the key benefits to Test Driven Development:

  • Tests can't be skipped
  • Test coverage is high since a test needs to be written before code is created
  • The tests document the requirements of the application
  • It forces you to plan what your code will do before writing it
  • Issues can be identified and resolved during the development of the application
  • It is easier to identify what tests need to be written
  • A test first approach signals to your team what you are about to build before you spend time building it

Really, the main benefit of Test Driven Development (over testing your application without a TDD approach), to me, is that it forces you to test properly - the whole process of testing just makes a lot more sense.

Consider that you've developed an entire application, or are at least part way through doing so, and you haven't been using a TDD approach. You figure: hey, I should probably start adding some tests to this application. If you are new to testing, it's really hard to identify what to test or how to test it. Maybe you create a few tests for a few specific functions, but how do you know when you have done enough testing? It leaves you with the feeling that you're not really achieving much, and that there's a bunch of stuff in your application that is completely missed by tests.

When using TDD the path is much clearer. You don't get into the situation where you are way behind on your tests, because you have to write them first. You don't need to worry about what to test, because you test whatever it is that you are currently building. You don't need to worry about whether your tests cover an adequate amount of your application, because you know there are tests for all of the functionality you have created. This still doesn't mean that your application is going to work under all circumstances, but it does give you a much higher degree of confidence.

The Cost of Testing

The hardest part of testing to justify, to yourself or to your bosses, is the time cost. In the beginning, it is going to take a lot longer to build an application with an appropriate testing methodology than it would to just build it without testing. Testing can seem "too expensive", but in reality it is not testing that will likely end up being more expensive.

There are some cases where you might not get much benefit from tests, and there are some cases where there will be huge benefits. There are always different scenarios and there is no one size fits all solution or explanation, but in general this is what both approaches will look like:

  • Without testing: Fast and cheap initially, slow and expensive in the long run
  • With testing: Slow and expensive initially, fast and cheap in the long run

When you first start developing an application, things generally go pretty smoothly and you have basically the whole application in your brain's "working memory". Things are easy to keep track of and debug, and tests will just slow you down. But as the codebase grows it becomes messier, and as you add a new feature or fix an existing one you might end up breaking some other feature. You might not even know you broke that other feature until a live user reports it. Bugs become difficult to fix because the code is something you, or someone else, wrote a year ago and you have no idea what is going wrong. The benefits of the initial speed that led to a poorly architected application wear off as technical debt builds up. Adding new features becomes extremely difficult, and the risk of breaking things becomes extremely high.

On the other hand, if you start with tests in mind from the beginning, things will move slowly. You might spend 50% or even 100% more time building out a feature that you could have finished much more quickly without tests. Part of the reason writing tests is slower is, of course, that you have to actually write the tests, but being able to test your code efficiently also forces you to write better code. Where you might otherwise reach for some kind of dirty hack just to get things working, with tests you might take the time to rethink how to implement a particular feature. It takes longer to build up your application, but then the tests start to pay off. Adding new features becomes a breeze, primarily for two reasons:

Your codebase is designed better overall, so it is just easier to work within
  2. You know that you have appropriate tests covering existing functionality, so you can code with confidence knowing that your tests will warn you about regressions (with a detailed view of what is actually going wrong and where)

These are the kinds of applications that will actually be good and that users will enjoy. Not writing tests for a large application usually results in that scenario where the application is "80% done, we just need a consultant to come in and finish the last few features and squash some bugs". That last 20% will end up being insanely expensive because of all the garbage that has been swept under the rug, and it might just be that you end up starting from scratch with tests.

Test Driven Development vs Behaviour Driven Development

This series will not strictly be about Test Driven Development (TDD) over Behaviour Driven Development (BDD); we will use a mix of both. The difference between the two approaches is subtle, and gets back into that argument of "what exactly constitutes X". For the purpose of learning the basics of testing, I don't think it is important.

Both TDD and BDD are based on the idea that you write tests to drive the development of the application; the only difference is what those tests look like.

In general, tests created with TDD are more granular and are concerned with the inner workings of the code (i.e. the writer of the tests knows or assumes how the code will be implemented), whereas tests created with a BDD approach treat the code more as a black box and are only concerned with what should happen, not how it is achieved.

A test written with a TDD approach might be described as:

  • It should remove the last element in the array

whereas a test written with a BDD approach might be described as:

  • It should remove the last grocery item added

Both of these are testing the same thing, but the TDD approach has knowledge of the internals of the implementation, whereas the BDD approach is only testing for the desired behaviour.
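As a rough illustration of the difference (the GroceryList class, its methods, and its internal items array are all hypothetical, made up purely for this example), those two descriptions might translate into tests like this:

// TDD-style: the test knows the list is stored as an array internally
it('should remove the last element in the array', () => {
  const list = new GroceryList();
  list.items = ['milk', 'bread']; // reaches into the internal array directly
  list.removeLastItem();
  expect(list.items).toEqual(['milk']);
});

// BDD-style: the test only describes the behaviour, via the public interface
it('should remove the last grocery item added', () => {
  const list = new GroceryList();
  list.addItem('milk');
  list.addItem('bread');
  list.removeLastItem();
  expect(list.getItems()).toEqual(['milk']);
});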

There are benefits to both approaches. With TDD it may be easier to spot why tests are failing because they are more granular, but if the implementation of certain functions changes then the tests will need to be updated to reflect the new implementation (which isn't as much of an issue with BDD).

Throughout this module we will not be too concerned with these differences; we will use whatever style of test seems most appropriate. In general, the style will likely be closer to BDD than TDD.

Summary

Test Driven Development ensures that we reap all the benefits of having automated tests in our applications, and it provides a structured way to create those tests. It can seem like a lot of work upfront, but it is well worth the effort in the long run - every hour spent setting up tests now will likely save you many more hours in the future.