At Automattic we use Mocha to run all tests written for the Calypso project, which powers WordPress.com. This also includes end-to-end tests, which live in their own repository. We have been using this setup for over three years now, and I think it is a good moment to revisit this choice. I found this unit testing tools comparison very helpful when evaluating alternatives, and I strongly agree with the conclusions Martin Olsson shared in his article.
In this post I would like to highlight the features of a testing utility called Jest that make it a tempting alternative. We have briefly discussed Jest internally on several occasions, mostly focusing on its unique approach to testing with the snapshots feature. In my opinion snapshots are only the icing on the cake, which is why I will focus on other advantages that I find more important.
Proof of concept for Calypso
I created two pull requests to investigate the effort required to migrate our tests from Mocha to Jest. It was quite easy to get to the point where all unit tests that verify code executed on the server work properly with Jest. I also rewrote test files to use the Jest API to show how writing tests compares with Mocha. I was able to go even further and integrate Jest with Circle CI and PhpStorm. It didn't take much time to set up code coverage and watch mode, either. You can see more details in this PR, which is almost ready to land.
I also experimented with the tests that run on the client. Again, getting the test runner working was mostly copy and paste with a few small changes. Unfortunately, only 550+ out of 700+ test suites were green. I took a shortcut and removed all failing tests to compare the performance of Mocha and Jest. Long story short, I didn't discover any out-of-the-box improvements, and I even noticed that Jest is a bit slower with what we have. My explorations can be seen in this PR.
An interesting finding is that those 550+ passing suites execute in 10 seconds with Mocha, while the remaining 150+ contribute another 40 seconds. In the case of Jest it's 20 seconds, plus another 50 seconds to run all the failing tests. It's hard to predict how Jest would perform if we replaced mockery with Jest mocks and the useFakeDom helper with Jest's browser environment flag, but I would expect to see some improvements.
Another really intriguing thing is that we are talking about seconds here, whereas other companies were able to reduce their execution time by minutes when they migrated to Jest. You can check the recently published Airbnb case study.
I heard about Jest for the first time last year, but it has been used by Facebook engineers for years. These days it is still developed and maintained by Facebook, but there is also a growing group of external contributors.
We already use Jest at Automattic. It tests code in the Delphin project which powers https://get.blog where users can register .blog domains. Another project where code is verified with Jest is Simplenote for Electron.
Jest is simple yet powerful. It has built-in support for the following features:
- Flexible configuration – e.g. uses glob patterns to detect test files.
- Setup and Teardown – also includes scoping.
- Matchers – let you validate different things using expect.
- Testing asynchronous code – support for promises and async/await.
- Mock functions – let you modify or spy on the behavior of a function.
- Manual mocks – allow you to override a module dependency when testing code.
- Fake timers – help you control the passage of time.
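To give a flavour of what mock functions provide, here is a toy sketch of what a helper in the spirit of Jest's jest.fn() does under the hood: it records every call so a test can assert on the calls later. This illustrates the concept only; it is not Jest's actual implementation.

```javascript
// Toy mock function: records arguments and results of every call.
// Illustration only; Jest's jest.fn() offers far more (mockReturnValue,
// mockImplementationOnce, automatic reset, and so on).
function makeMockFn(implementation = () => undefined) {
  const mockFn = (...args) => {
    mockFn.calls.push(args);
    const result = implementation(...args);
    mockFn.results.push(result);
    return result;
  };
  mockFn.calls = [];
  mockFn.results = [];
  return mockFn;
}

// Hypothetical code under test: notifies a callback for every item.
function processItems(items, onItem) {
  items.forEach(onItem);
}

const spy = makeMockFn(item => item.toUpperCase());
processItems(['a', 'b'], spy);

console.log(spy.calls.length); // 2
console.log(spy.calls[0][0]);  // 'a'
console.log(spy.results[1]);   // 'B'
```

With the real API you would write `const spy = jest.fn();` and assert with matchers such as `expect(spy).toHaveBeenCalledTimes(2)`.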
There is even more. I'm going to discuss some of these features in more detail in the following sections.
Performance and isolation
From the Jest documentation:
Jest parallelizes test runs across workers to maximize performance. Console messages are buffered and printed together with test results. Sandboxed test files and automatic global state resets for every test so no two tests conflict with each other.
This is completely different from Mocha, which runs all tests in one process. To achieve simulated isolation between tests we had to introduce several test helpers that take care of proper cleanup. This isn't ideal, but it works in 99% of cases, because tests run in sequence.
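With Jest, much of that cleanup can be delegated to configuration instead of hand-written helpers. A hypothetical jest.config.js fragment (option names per the Jest configuration docs) might look like this:

```javascript
// Hypothetical jest.config.js fragment; these options ask Jest to reset
// shared state between tests instead of relying on manual cleanup helpers.
module.exports = {
  clearMocks: true,   // reset mock call data before every test
  resetModules: true, // give each test a fresh module registry
};
```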
Immersive watch mode
The fast interactive watch mode can run only the test files related to changed files and is optimized to give a signal quickly. It is very easy to set up and offers a few other options. You can also filter tests by their file names or test names. We have watch mode working with Mocha, but it isn't nearly as powerful. We had to build our own solution to run a specific test folder or file, which is something Jest offers for free.
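For illustration, these are the kinds of invocations Jest supports out of the box (flag names per the Jest CLI documentation; the path and test name are made up):

```shell
jest --watch                  # interactively re-run tests related to changed files
jest client/state             # run only the test files under a given path
jest -t "returns site title"  # run only tests whose name matches the pattern
```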
Jest has built-in code coverage reports which are a breeze to set up. It is possible to collect code coverage information from entire projects, including untested files. We haven't managed to figure out how to achieve the same result with Mocha so far. It might be because we didn't spend too much time trying.
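Covering untested files is a matter of configuration. A hypothetical jest.config.js fragment (option names per the Jest docs; the glob patterns are made up) could look like this:

```javascript
// Hypothetical jest.config.js fragment for coverage collection.
module.exports = {
  collectCoverage: true,
  // report on every matching file, including files no test imports
  collectCoverageFrom: ['client/**/*.js', '!**/node_modules/**'],
  coverageReporters: ['text-summary', 'html'],
};
```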
I got Jest integrated with PhpStorm using the same config files that are used when running from the CLI and on continuous integration (CircleCI). It is possible to navigate with one click to a given test and also to re-run a single test. This is something my coworkers have often asked for. We had Mocha integration with IDEs working in the past, but it has been broken since we introduced our custom mechanism for collecting test files.
Custom reporter integration
We only need a custom reporter to improve integration with Circle CI. This is possible with Jest using jest-junit-reporter, and it works almost exactly the same as with Mocha.
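For illustration, a JUnit-style reporter is typically wired in as a test results processor (a sketch only; see the reporter package's README for the exact package name and output options):

```javascript
// Hypothetical Jest config fragment: hand test results to a processor
// that writes a JUnit XML report for the CI server to pick up.
module.exports = {
  testResultsProcessor: 'jest-junit',
};
```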
When I saw Jest snapshots in action for the first time, I felt a bit sceptical about the concept, because it goes against the test-first approach. Then again, I also had mixed feelings about JSX and Redux when I first heard about them. Based on that experience, I bet I will join the group of snapshot admirers as soon as I use them in action. In the end, they help us make sure the UI does not change unexpectedly. This is how the feature is described in Jest's documentation:
The aim of snapshot testing is not to replace existing unit tests, but providing additional value and making testing painless. In some scenarios, snapshot testing can potentially remove the need for unit testing for a particular set of functionalities (e.g. React components), but they can work together as well.
If you want to learn more about Jest snapshots you can watch a presentation from Rogelio Guzman captured at React Conf 2017:
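To make the idea concrete, here is a toy illustration of the mechanism behind snapshot testing, not Jest's real implementation: the first run serializes a value and stores it, and later runs compare against the stored serialization and fail on any difference.

```javascript
// Toy snapshot store; Jest persists snapshots in __snapshots__/ files
// next to the tests instead of an in-memory Map.
const snapshots = new Map();

function toMatchSnapshot(name, value) {
  const serialized = JSON.stringify(value, null, 2);
  if (!snapshots.has(name)) {
    snapshots.set(name, serialized); // first run: record the snapshot
    return { pass: true, written: true };
  }
  // subsequent runs: compare against the stored snapshot
  return { pass: snapshots.get(name) === serialized, written: false };
}

// Hypothetical serialized UI output, e.g. a rendered React element tree.
const ui = { tag: 'button', props: { label: 'Save' } };

console.log(toMatchSnapshot('save button', ui).written); // true (recorded)
console.log(toMatchSnapshot('save button', ui).pass);    // true (unchanged)

ui.props.label = 'Submit';
console.log(toMatchSnapshot('save button', ui).pass);    // false (UI changed)
```

In real Jest code this is simply `expect(tree).toMatchSnapshot()`, and an intentional change is accepted by re-running with the snapshot update flag.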
Multi project runner
At the moment we have 4 setups that execute tests in different configurations. We need that to be able to run integration tests and unit tests; the latter need to work with code executed in the browser, code executed on the server, and some of the test helpers themselves. The Jest team recently introduced a way to run multiple projects together, which would greatly simplify using watch mode and integration with IDEs. I haven't managed to make it work properly with our codebase yet, but it looks very promising.
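The multi-project setup boils down to one top-level config listing the sub-projects, each with its own configuration. A hypothetical fragment (the paths are made up for illustration; see the Jest docs for the "projects" option):

```javascript
// Hypothetical top-level jest.config.js: one watch mode session and one
// IDE integration covering several differently configured test setups.
module.exports = {
  projects: ['<rootDir>/client', '<rootDir>/server', '<rootDir>/test-helpers'],
};
```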
New test framework
Jasmine always made it hard for us to move fast. Since we don't own the codebase, it is hard to introduce new features, fix existing bugs, make design changes, or just debug the code. On top of that, Jasmine's codebase is not Flow typed, which makes the integration harder.
The goal of this PR is to replace Jasmine with a framework that mirrors its functionality but at the same time simplifies things as much as possible.
The Jest team also forked Jasmine 2.5 into Jest's own test runner and rewrote large parts of Jasmine as part of the latest Jest 20 release.
The future of Jest looks super exciting! Observing how fast new changes are released, I have a feeling that it will soon become a tool of choice for most of the projects in the whole React ecosystem.
In my opinion we should migrate to Jest. Here is a list of things that we should take into consideration when making the final decision:
- Simpler API, less boilerplate code.
- Flexible and easy configuration.
- Test files executed in isolation.
- Advanced watch mode.
- Snapshot support = an easier start with testing.
- Code coverage.
- Another migration.
- Mocha still has a bit better performance (according to my quick tests).